Lost amid the conversation about big data and the accelerating advancement of nearly every aspect of enterprise information-management software are the things that hold it all together. Yet this is critical: information-management components must come together in a meaningful fashion, or the result is needless redundancy, waste, and missed opportunities. Because optimizing the information asset goes directly to the organization's bottom line, it behooves us to play an exceptional game with our technology building blocks, not a haphazard one.
This presentation was given at the Open Universiteit in Amsterdam on the occasion of the farewell of Prof. Lex Bijlsma. It gives an overview of enterprise architecture, the use of ArchiMate, and the need to think first and only then act.
Unified data management is becoming strategically important for companies to gain insights from large and diverse data in real time. Effective data management solutions can support business operations and analytics to improve processes and decision making. However, developing a unified strategy is challenging and requires collaboration between IT and business users. When both perspectives are incorporated into creating governance policies and selecting tools, companies can better integrate, access, and leverage their data to increase competitiveness.
The document defines enterprise architecture and discusses its key components and levels of detail. It also reviews major industry trends like big data, data analytics, mobility, and cloud computing that enterprise architects should focus on. The benefits of enterprise architecture are outlined as more efficient IT operations, reduced business risk, and faster time to market. Customer relationship management (CRM) aims to increase profitability through solidifying customer satisfaction and loyalty. True CRM provides a holistic view of customers to inform business decisions.
Building a strong Data Management capability with TOGAF and ArchiMate – Bas van Gils
This is the deck that I used for my presentation at the EAM conference in 2013. It gives a high-level overview of the need for a solid data management capability before giving an overview of how enterprise architecture methods can be used to build this capability.
Department of Homeland Security (DHS) Chief Information Officer (CIO) Luke McCormack recently submitted testimony to a US Senate Subcommittee [1]. This case study, which is based on CIO McCormack's testimony, demonstrates how enterprise architects using the ArchiMate® language [2] can quickly capture business situations using viewpoints defined in the ArchiMate specification. These viewpoints are templates for views that address particular sets of stakeholder concerns. This case study contains views based on and named after standard templates.
The document discusses product information management (PIM) for HP Printing and Personal Systems. It outlines the challenges of managing vast amounts of product data across departments and systems. It then describes how a PIM solution could address these challenges by providing a single source of truth for product information through capabilities like data integration, governance and a centralized repository. The paper also provides details on how HP could implement a PIM architecture using a transactional hub model to manage master product data.
Robotic Process Automation in East Coast - Katprotech (Katprotech1)
Robotic Process Automation in East Coast, Machine Health Monitoring Solution for Manufacturing, Accounts Process RPA Implementation, PowerApp Development, Katprotech
This document discusses implementing a single view of customer data across an enterprise. It begins by outlining common barriers such as a lack of digital experience strategy, silos between teams, and challenges measuring ROI. It then proposes using MongoDB as a flexible data platform to integrate new and existing data sources. Pentaho is recommended for blended analytics across data silos. The approach aims to provide a single customer view, resolve technology skills gaps, and iteratively define strategies by starting small projects and engaging stakeholders.
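The single-view idea above can be sketched in a few lines. This is an illustrative Python sketch, not the MongoDB/Pentaho implementation the document proposes: records from several source systems are folded into one document per customer, keyed on a shared customer ID. All field and system names are hypothetical.

```python
def build_single_view(sources):
    """Merge per-system customer records into one document per customer ID."""
    view = {}
    for system, records in sources.items():
        for rec in records:
            cust = view.setdefault(
                rec["customer_id"],
                {"customer_id": rec["customer_id"], "systems": []},
            )
            cust["systems"].append(system)  # track which silos contributed
            for field, value in rec.items():
                if field != "customer_id" and value is not None:
                    cust.setdefault(field, value)  # first non-null value wins
    return view

# Two hypothetical silos holding partial views of the same customer:
crm = [{"customer_id": 1, "name": "Ada Lovelace", "email": None}]
billing = [{"customer_id": 1, "email": "ada@example.com", "balance": 42.0}]
single = build_single_view({"crm": crm, "billing": billing})
# single[1] now combines name, email, and balance from both systems
```

A real implementation would add matching rules for records that lack a shared key, which is where a flexible document store earns its keep.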
Slides from the impulse talk "Data Strategy & Governance" at BI or DIE LEVEL UP 2022.
Recording of the talk: https://www.youtube.com/watch?v=705DfyfF5-M
The Enterprise Architecture Toolkit (EATK) is a solution accelerator that introduces new capabilities to simplify and consolidate enterprise architecture design. It leverages existing tools like Microsoft SharePoint, Office, and Visio by providing architecture templates, a portal, and processes. The EATK aims to enable transparency, create a proactive architecture process, and empower architects by surfacing relevant information through its architecture repository.
User-centric enterprise architecture (EA) focuses on developing useful and usable information products and governance services for end users. It captures key business and technical information across an organization to support better decision-making. User-centric EA provides information that is relevant, easy to understand, and accessible to all stakeholders, not just IT. The author developed a user-centric approach at the Secret Service and Coast Guard by focusing EA products on clear user needs and ensuring information is presented at multiple layers of detail and perspectives to be understandable and useful for a wide audience.
Master data management: executive MDM buy-in business case (2) – Maria Pulsoni-Cicio
The document provides guidance on gaining executive support for master data management (MDM) projects. It recommends quantifying the hidden costs of bad data, conducting interviews with stakeholders across business units to understand data issues, and analyzing the findings to build a business case that shows the specific financial benefits of implementing MDM. Key steps include identifying stakeholders in IT and business functions, preparing interview questions tailored to different roles, interviewing a wide range of staff, and using the results to quantify savings and improved revenues from reducing data problems.
This document provides an introduction and overview of master data management (MDM). It begins with defining MDM as managing an organization's critical data. The agenda then outlines an overview of MDM, how it helps businesses succeed, and risks and challenges. It provides examples of master data and how MDM systems work. Key benefits of MDM include a single source of truth, reduced costs, and increased customer satisfaction by avoiding duplicate or inconsistent data across systems. Risks include data inconsistencies from mergers and acquisitions. Challenges involve determining what data to manage, ensuring consistency, and establishing appropriate data governance and information systems.
The document introduces the Microsoft Enterprise Architecture Toolkit (EATK), which aims to help organizations address challenges in enterprise architecture. The EATK provides tooling to assist enterprise architects, solution architects, and domain architects in performing functions that span IT, business, and operations to optimize an enterprise and fulfill its strategic mission. This is done within governance frameworks, operating models, and review boards to produce information supporting enterprise decisions, define processes for new technologies, create an architecture roadmap, and develop standards and patterns.
The use of an architecture-centered development process for delivering information technology began with the introduction of client/server systems. Early client/server and legacy mainframe applications did not provide the architectural flexibility needed to meet the changing business requirements of the modern publishing organization. With the introduction of object-oriented systems, the need for an architecture-centered process became a critical success factor. Object reuse, layered system components, data abstraction, web-based user interfaces, CORBA, and rapid development and deployment processes all provide economic incentives for object technologies. However, adopting the latest object-oriented technology without an adequate understanding of how this technology fits a specific architecture risks the creation of an instant legacy system.
Publishing software systems must be architected to deal with the current and future needs of the business organization. Managing software projects using architecture-centered methodologies must be an intentional step in the process of deploying information systems, not an accidental by-product of the software acquisition and integration process.
Data-Ed Webinar: Design & Manage Data Structures – DATAVERSITY
This document discusses different data structures and their appropriate usage. It begins with an overview of data structures and how they enable efficient data storage and organization. The webinar will cover various available data structures and when each should be used, with the goal of helping attendees apply the correct structures to fit their business needs and maximize business value. Learning objectives include understanding how different structures create different business value and applying the right structures to business requirements. The webinar will be presented on July 8, 2014 by Dave Marsh and Peter Aiken.
Data-Ed Online: Unlock Business Value through Reference & MDM – DATAVERSITY
In order to succeed, organizations must realize what it means to utilize reference and MDM in support of business strategy. This presentation provides you with an understanding of the goals of reference and MDM, including the establishment and implementation of authoritative data sources, more effective means of delivering data to various business processes, as well as increasing the quality of information used in organizational analytical functions, e.g. BI. We also highlight the equal importance of incorporating data quality engineering into all efforts related to reference and master data management.
Learning objectives include:
What is Reference & MDM and why is it important?
Reference & MDM Frameworks and building blocks
Guiding principles & best practices
Understanding foundational reference & MDM concepts based on the Data Management Body of Knowledge (DMBOK)
Utilizing reference & MDM in support of business strategy
Data Architecture Process in a BI environment – Sasha Citino
The document discusses the role of data architects in a business intelligence (BI) environment. It begins with an introduction to the author and their experience. It then provides an overview of what BI is and how it relates to data warehousing. The main roles and responsibilities of data architects are then outlined, including dimensional modeling, integration design, and defining architecture standards. Finally, it describes the typical steps in the data architecture process, from requirements gathering to data profiling, conceptual modeling, and physical design.
The Importance of Master Data Management – DATAVERSITY
Despite its immaterial nature, data has a tendency to pile up as time goes on, and can quickly be rendered unusable or obsolete without careful maintenance and streamlining of processes for its management. This presentation will provide you with an understanding of reference and Master Data Management (MDM), one such method for keeping mass amounts of business data organized and functional towards achieving business goals.
MDM’s guiding principles include the establishment and implementation of authoritative data sources and effective means of delivering data to various business processes, as well as increases to the quality of information used in organizational analytical functions (such as BI). To that end, attendees of this webinar will learn how to:
Structure their Data Management processes around these principles
Incorporate Data Quality engineering into the planning of reference and MDM
Understand why MDM is so critical to their organization’s overall data strategy
Discuss foundational MDM concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK)
Analytics Organization Modeling for Maturity Assessment and Strategy Development – Vijay Raj
The paper discusses Business Intelligence Organization Modeling as a concept, along with practical implementation aspects, with reference to Analytics and Business Intelligence Strategy in large enterprises. BI organization modeling revolves around the ability to model the patterns of BI prevalent within a corporate structure to assess organizational capability and maturity, thereby contributing to BI strategy development and implementation. The paper also details Analytics & BI organization modeling in a predominantly SAP-based enterprise ecosystem, demonstrated with BI systems based on SAP NetWeaver Business Warehouse (BW) using data discovery and machine learning techniques. The data discovery process is carried out using the SAP Lumira data visualization tool connected to an SAP NetWeaver BW-based global enterprise data warehousing and reporting system.
The document discusses various aspects of IT governance including:
1) IT governance provides a framework to link IT resources and information to business goals and strategies and institutionalizes best practices.
2) IT governance is important for effective enterprise governance as businesses are increasingly dependent on IT.
3) Good IT governance establishes a framework to assess costs/benefits of IT investments and has oversight committees review projects and hold parties accountable.
This document discusses BP's data modelling challenges and solutions. BP has over 100,000 employees operating in over 100 countries with 250 data centers and over 7,000 applications. Their challenges included decentralized management of data modelling, lack of standards and governance, and models getting lost after projects. Their solution included a self-service DMaaS portal for ER/Studio licensing and model publishing. It provides automated reporting, judicious use of macros, and a community of interest. Next steps include promoting data modelling to SAP architects and expanding training, certification and the online community.
Reference and master data management:
Three categories of structured data:
Master data: data associated with core business entities such as customer, product, and asset.
Transaction data: the recording of business transactions, such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
Reference data: any kind of data used solely to categorize other data found in a database, or solely to relate data in a database to information beyond the boundaries of the enterprise.
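A minimal sketch of the three kinds of structured data defined above, using hypothetical retail records in Python: master data describes entities, transaction data records events that reference them, and reference data holds the codes used to categorize everything else.

```python
master_data = {  # core business entities
    "customer:17": {"name": "Ada Lovelace", "country_code": "GB"},
    "product:SKU-9": {"title": "Analytical Engine Manual", "category_code": "BOOK"},
}

transaction_data = [  # recorded business events, referencing master data by key
    {"order_id": 1001, "customer": "customer:17", "product": "product:SKU-9", "qty": 2},
]

reference_data = {  # code tables used solely to categorize other data
    "country_code": {"GB": "United Kingdom", "NL": "Netherlands"},
    "category_code": {"BOOK": "Books and manuals"},
}

# Resolving one transaction against master and reference data:
order = transaction_data[0]
customer = master_data[order["customer"]]
country = reference_data["country_code"][customer["country_code"]]
# country == "United Kingdom"
```

The separation matters because each category changes at a different rate and is governed differently: reference data rarely, master data occasionally, transaction data constantly.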
This document discusses trends in business intelligence (BI) and how adopting an agile approach can help address challenges in BI initiatives. It identifies a lack of flexibility as a key reason why many BI initiatives fail despite investments. The document advocates for adopting agile BI best practices like having automated and unified BI technologies that are pervasive and limitless. It recommends that organizations structure themselves to support agile BI with a hub-and-spoke model and business ownership of governance. Overall, the document argues that agility will be crucial for BI over the next decade to enable flexibility in responding to changing business needs.
A Reference Process Model for Master Data Management – Boris Otto
This document presents an overview of a reference process model for master data management. It includes an introduction discussing business requirements for master data and challenges in managing master data quality. It also describes the research methodology used to develop an iterative reference process model. The results section provides an overview of the reference process model and discusses its evaluation through three case studies. The conclusion recognizes the model's contribution in explicating the design process for master data management organizations.
In business, master data management is a method used to define and manage the critical data of an organization to provide, with data integration, a single point of reference.
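The "single point of reference" can be made concrete with a hedged sketch: deduplicating customer records from multiple systems into one golden record per customer. Matching on a normalized email address and a first-non-empty-value survivorship rule are deliberate simplifications of real MDM matching and survivorship policies; the records are hypothetical.

```python
def golden_records(records):
    """Collapse duplicate customer records into one golden record each."""
    merged = {}
    for rec in records:
        key = rec["email"].strip().lower()  # simplistic match rule: normalized email
        gold = merged.setdefault(key, {})
        for field, value in rec.items():
            if value:  # survivorship rule: first non-empty value wins
                gold.setdefault(field, value)
    return list(merged.values())

# The same customer as seen by two systems, with conflicting detail:
dupes = [
    {"email": "Ada@Example.com", "name": "A. Lovelace", "phone": ""},
    {"email": "ada@example.com ", "name": "Ada Lovelace", "phone": "+44 20 1234"},
]
golden = golden_records(dupes)
# One record survives, keeping the first name seen plus the only phone number
```

Production MDM replaces both rules with configurable match algorithms and per-attribute survivorship, but the shape of the problem is the same.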
The document discusses the importance of IT architecture for organizations. It states that IT architecture defines a company's IT infrastructure and provides a framework for technology investments and use. It also describes the key components of an effective IT architecture, including an inventory of hardware, software, data and communication links, as well as the functional use of IT and a strategic plan for technology use.
Applying reference models with ArchiMate – Bas van Gils
This is the slide deck for a webinar that I presented for The Open Group. It presents a high-level overview of the use of reference models in the field of EA. I also present some tips on how to use BiZZdesign Architect to effectively implement reference models in organizations.
This document discusses analytics and information architecture. It begins by describing how analytics workloads are moving away from data warehouses to more specialized platforms. It then discusses what distinguishes analytics from reporting, including that analytics involve complex summaries of information and linking analyses to business actions. The document examines various data platforms used for analytics and contends that ParAccel Analytic Database is well-suited for analytics workloads due to its columnar structure, compression, SQL support, and ability to utilize Hadoop data without replication. It concludes by proposing an information architecture with Hadoop for big data, ParAccel for analytics, and data warehouses for operational support.
Making Information Management The Foundation Of The Future (Master Data Manag...) – William McKnight
More complex and demanding business environments lead to more heterogeneous systems environments. This, in turn, results in requirements to synchronize master data. Master Data Management (MDM) is an essential discipline to get a single, consistent view of an enterprise's core business entities – customers, products, suppliers, and employees. MDM solutions enable enterprise-wide master data synchronization. Given that effective master data for any subject area requires input from multiple applications and business units, enterprise master data needs a formal management system. Business approval, business process change, and capture of master data at optimal, early points in the data lifecycle are essential to achieving true enterprise master data.
The document discusses the importance of IT architecture for organizations. It states that IT architecture defines a company's IT infrastructure and provides a framework for technology investments and use. It also describes the key components of an effective IT architecture, including an inventory of hardware, software, data and communication links, as well as the functional use of IT and a strategic plan for technology use.
Applying reference models with archi mateBas van Gils
This is the slidedeck for a webinar that I presented for the opengroup. It presents a high-level overview of the use of reference model in the field of EA. Even more I present with some tips on how to use BiZZdesign architect to effectivdely implement reference models in organizations
This document discusses analytics and information architecture. It begins by describing how analytics workloads are moving away from data warehouses to more specialized platforms. It then discusses what distinguishes analytics from reporting, including that analytics involve complex summaries of information and linking analyses to business actions. The document examines various data platforms used for analytics and contends that ParAccel Analytic Database is well-suited for analytics workloads due to its columnar structure, compression, SQL support, and ability to utilize Hadoop data without replication. It concludes by proposing an information architecture with Hadoop for big data, ParAccel for analytics, and data warehouses for operational support.
Making Information Management The Foundation Of The Future (Master Data Manag...William McKnight
More complex and demanding business environments lead to more heterogeneous systems environments. This, in turn, results in requirements to synchronize master data. Master Data Management (MDM) is an essential discipline to get a single, consistent view of an enterprise\’s core business entities – customers, products, suppliers, and employees. MDM solutions enable enterprise-wide master data synchronization. Given that effective master data for any subject area requires input from multiple applications and business units, enterprise master data needs a formal management system. Business approval, business process change, and capture of master data at optimal, early points in the data lifecycle are essential to achieving true enterprise master data.
The Academia de Español D'Amore offers intensive Spanish language immersion programs that include small group classes, cultural activities, homestays, and excursions. Their methodology focuses on communicating in Spanish from the first day of class through activities and total immersion. They provide beginner through advanced levels of instruction tailored to individual student needs and Spanish proficiency. Specialized courses are also available in areas like business, law, and medical Spanish.
Data quality is important for business success. This document outlines a 6-step approach to measuring data quality ROI: 1) Inventory systems relying on data, 2) Determine data quality rules, 3) Profile data to measure rule compliance, 4) Score each rule and system, 5) Measure impact of improved data quality, 6) Implement improvements. The approach is demonstrated by analyzing a targeted marketing system and identifying areas of non-compliance to improve data quality and ROI.
The Lightning Whelk is a species of large predatory sea snail found along the southeastern coast of North America. It lives in sandy or muddy substrates in shallow coastal waters, where it preys primarily on bivalves. Lightning Whelks have a distinctive shell that is sinistral, or left-handed coiled, with low spires and dull knobs. They can reach sizes over 5 inches in length and have been an important resource for Native American communities.
Information management is key to business growth. It is a competitive advantage with the same merit as product knowledge and inventory availability. These once-held corporate competitive advantages are now considered “tickets to entry” and rather indistinguishable. Regulatory protections are largely gone, and when comparing your company’s features and functions, “demo parity” is the norm, especially within the larger industries.
Spanish Language Programs For Teens In Manuel Antoniodamoreschool
The document describes a Spanish language immersion program in Manuel Antonio, Costa Rica for teens and high school students. It offers Spanish classes from Monday to Friday with small class sizes of 5 students or less. Students live with local host families and are immersed in Spanish. The program includes cultural activities on weekends and aims to provide a full Spanish learning experience through language classes and cultural experiences. It provides details on program dates and costs.
The Lightning Whelk is a species of large predatory sea snail found in southeastern North America. It lives in sandy or muddy substrates in shallow coastal waters, where it preys primarily on bivalves. Lightning Whelks have been around for thousands of years and were used as tools and food by Native Americans. They have a left-handed or sinistral shell with low spires and knobs that are usually dull in coloration.
The document summarizes La Academia de Español D'Amore, a Spanish language school located in Manuel Antonio, Costa Rica. It offers group and private Spanish classes at six proficiency levels in a beautiful tropical setting. Students are immersed in Spanish through four hours of daily classes, homestays with local families, and cultural activities. The school uses a total immersion teaching methodology to help students achieve fluency.
China Medical City is a planned 25 square kilometer area in Taizhou, China focused on developing the biological medicine industry. It includes five functional zones: an R&D zone with centers focused on areas like biochemistry and cell therapy; a production zone with industrial bases for vaccines, medical devices, and biological products; an exhibition and trade zone; a healthcare and medical treatment zone with teaching hospitals and medical facilities; and an integrated matching projects zone to provide services and infrastructure support. Since being founded in 2006, China Medical City has established over 100 companies and research institutes and seen 100 pharmaceutical research results declared.
Compiling a Monolingual Dictionary for Native Speakersmostlyharmless
The document discusses compiling a monolingual dictionary for native speakers. It covers topics such as what constitutes a lexical database, the social functions of dictionaries for native languages, researching words and their histories, and incorporating corpus data while allowing for other types of information. The role of dictionaries is to provide an authoritative inventory of a language while explaining meanings and usages.
This paper details the process of DMC at eight different organizations while capturing the keys to success from each. These case studies were specifically selected to demonstrate several variations on the concept of consolidation. While there is no such thing as a �cookie-cutter� DMC process, there are common best practices and lessons to be shared.
White Paper-2-Mapping Manager-Bringing Agility To Business IntelligenceAnalytixDataServices
The document discusses how AnalytiXTM Mapping ManagerTM can help organizations build Business Intelligence solutions in an agile way. It does this by managing requirements, generating logical data models, automating source to target mappings, versioning changes, and providing visibility into the data integration process. This allows organizations to focus on high priority requirements, prove solutions early, and rapidly deploy Business Intelligence while managing changes.
1. Information architecture is the organization, labeling, and navigation of information on websites, intranets, and software to support users and business goals.
2. The TOGAF framework provides a standard process for developing an enterprise architecture, including information architecture. It includes the Architecture Development Method (ADM) and models for technical and information infrastructure.
3. Applying information architecture and frameworks like TOGAF can help businesses reduce costs, increase customer satisfaction and support innovation by providing a systematic approach to organizing online information.
The document discusses the role and structure of enterprise architecture. Enterprise architecture provides a conceptual blueprint that defines an organization's structure and operations, with the goal of helping an organization effectively achieve its current and future goals. It examines the current state, helps develop and evaluate designs, and creates a vision for the future. Popular enterprise architecture frameworks include the Zachman Framework and TOGAF, which provide standardized approaches to understanding an organization holistically.
Data Architecture, Solution Architecture, Platform Architecture — What’s the ...DATAVERSITY
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
Data center engineering is a critical function for businesses, and great data center engineers play a significant role in ensuring that data centers operate efficiently and effectively. They must possess the technical expertise, attention to detail, problem-solving skills, communication skills, flexibility, and passion to excel in their role. If you're interested in pursuing a career in data center engineering, these qualities are essential to develop to become a great data center engineer.
Building An Information Technology And Information SystemsNicole Savoie
Enterprise Systems Architecture Of An Organization
Enterprise Systems Architecture (ESA) is the overall IT architecture of an organization that manages and evolves business operations. It consists of individual system architectures and their relationships. ESA provides guidelines for implementation and makes people
The document discusses the need for business modeling tools that go beyond traditional business intelligence (BI) capabilities like reporting and data access. While BI has improved data availability, tools for analyzing and manipulating data have not progressed as quickly. Spreadsheet use remains high despite data warehousing investments. The document argues that effective business modeling requires separating physical and semantic data models to make the data more understandable and usable for business users. It also requires the ability to create and update models over time in a standardized, integrated way.
This draft paper throws light on data center technology trends of 2016. This paper also suggest ways to enhance the competitiveness of Data Center. We have tried to carve out a strategy that can help decision makers to decide whether a technology adoption will prove beneficial for them or they will end up spending more without any significant ROI.
This document describes the process of designing and implementing a data warehouse and business intelligence application as part of a group project. Key aspects discussed include selecting team members, analyzing requirements, choosing a top-down or bottom-up design approach, resolving conflicts, and implementing the final system architecture. The author reflects on learning outcomes from participating in the project, including technical skills gained and a reinforced understanding of data warehousing concepts.
IRJET- Search Improvement using Digital Thread in Data AnalyticsIRJET Journal
This document discusses the use of digital thread in data analytics to improve search and provide end-to-end visibility across product lifecycles. Digital thread is a communication system that connects manufacturing process elements and provides a complete view of each element throughout the lifecycle. It allows sharing of information across organizations and suppliers. Digital thread brings quality gains by managing large amounts of data and complex supply chains. It helps enterprises quickly redesign products and meet timelines while maintaining visibility of each component's journey. The document proposes using a Neo4j graph database hosted on AWS cloud to implement a digital thread that links product data. This would provide security, performance, and analytics benefits across the overall manufacturing process.
Unifying the big data analytics stack by enabling ETL, OLAP, visualization, and collaboration via a single interface. Get an End To End implementation of The Modern Analytics Architecture.
How Does Data Architecture Training Shape Information Infrastructure?EW Solutions
As data volume, variety, and velocity expand exponentially across systems, proactively governing architecture and infrastructure has become a strategic priority. While technologies and analytics promise valuable insights, realizing potential relies on standardized, interoperable information environments purpose-built to support use cases.
Watch full webinar here: https://buff.ly/2mHGaLA
What started to evolve as the most agile and real-time enterprise data fabric, data virtualization is proving to go beyond its initial promise and is becoming one of the most important enterprise big data fabrics.
Attend this session to learn:
• What data virtualization really is
• How it differs from other enterprise data integration technologies
• Why data virtualization is finding enterprise-wide deployment inside some of the largest organizations
The term Actionable Architecture moves the EA from a static project to a central platform for the capture and dissemination of IT and business process information.
In essence, EA becomes a strategic foundation for knowledgeable decision making and is based on traceable facts in a repository.
Analytics, machine e deep learning, data/event streaming
- Big data streaming: abilitare la macchina del tempo
- Real time event streaming e nuovi paradigmi concettuali: transazioni distribuite, consistenza eventuale, proiezioni materializzate
- Real time event streaming e nuovi paradigmi architetturali: Enterprise service bus, Event store, Database delle proiezioni
- Cenni di Domain Driven Design: una visione strategica della modellazione del proprio dominio di business nell'era dei Big Data
Analytics, machine e deep learning, data/event streaming
Big data streaming: abilitare la macchina del tempo
Real time event streaming e nuovi paradigmi concettuali:
- Transazioni distribuite
- Consistenza eventuale
- Proiezioni materializzate
Real time event streaming e nuovi paradigmi architetturali:
- Enterprise service bus
- Event store
- Database delle proiezioni
Cenni di Domain Driven Design: una visione strategica della modellazione del proprio dominio di business nell'era dei bi Data.
Intro to big data and applications - day 2Parviz Vakili
The document provides an introduction and references for a presentation on big data and applications. It includes sections on data architecture, data governance, data modeling and design, and reference architectures for big data analytics. The presentation template was created by Slidesgo and credits are provided.
Watch full webinar here: https://buff.ly/2XXbNB7
What started to evolve as the most agile and real-time enterprise data fabric, Data Virtualization is proving to go beyond its initial promise and is becoming one of the most important enterprise big data fabrics.
Attend this session to learn:
*What data virtualization really is
*How it differs from other enterprise data integration technologies
*Why data virtualization is finding enterprise wide deployment inside some of the largest organizations
This presentation provides a high-level overview for the practice of IT Architecture in today's enterprise. It is the first in several IT Architecture presentations we will be providing.
Enterprise Systems Architecture Of An OrganizationNicole Jones
Enterprise Systems Architecture Of An Organization discusses enterprise system architecture, which is the overall IT system architecture of an organization. It consists of the architectures of individual systems and their relationships in the perspective of an organization. The paper aims to discuss key questions about enterprise architecture, present guidelines for implementation, and make people aware of possible consequences. Establishing Architecture For Large Enterprise Solutions addresses challenges in establishing architecture and design for large enterprise solutions in agile environments. Iterative architecture and design could result in rework and redundancy, especially for systems with many integration points across applications. The paper proposes a process for architecture development for enterprise applications in agile environments. Implementing Enterprise Architecture For A Private Bank discusses a project to implement enterprise architecture for a private bank to
Similar to Building the Architecture for Analytic Competition (20)
Introduction to AI for Nonprofits with Tapp NetworkTechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
How to Build a Module in Odoo 17 Using the Scaffold MethodCeline George
Odoo provides an option for creating a module by using a single line command. By using this command the user can make a whole structure of a module. It is very easy for a beginner to make a module. There is no need to make each file manually. This slide will show how to create a module using the scaffold method.
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...PECB
Denis is a dynamic and results-driven Chief Information Officer (CIO) with a distinguished career spanning information systems analysis and technical project management. With a proven track record of spearheading the design and delivery of cutting-edge Information Management solutions, he has consistently elevated business operations, streamlined reporting functions, and maximized process efficiency.
Certified as an ISO/IEC 27001: Information Security Management Systems (ISMS) Lead Implementer, Data Protection Officer, and Cyber Risks Analyst, Denis brings a heightened focus on data security, privacy, and cyber resilience to every endeavor.
His expertise extends across a diverse spectrum of reporting, database, and web development applications, underpinned by an exceptional grasp of data storage and virtualization technologies. His proficiency in application testing, database administration, and data cleansing ensures seamless execution of complex projects.
What sets Denis apart is his comprehensive understanding of Business and Systems Analysis technologies, honed through involvement in all phases of the Software Development Lifecycle (SDLC). From meticulous requirements gathering to precise analysis, innovative design, rigorous development, thorough testing, and successful implementation, he has consistently delivered exceptional results.
Throughout his career, he has taken on multifaceted roles, from leading technical project management teams to owning solutions that drive operational excellence. His conscientious and proactive approach is unwavering, whether he is working independently or collaboratively within a team. His ability to connect with colleagues on a personal level underscores his commitment to fostering a harmonious and productive workplace environment.
Date: May 29, 2024
Tags: Information Security, ISO/IEC 27001, ISO/IEC 42001, Artificial Intelligence, GDPR
-------------------------------------------------------------------------------
Find out more about ISO training and certification services
Training: ISO/IEC 27001 Information Security Management System - EN | PECB
ISO/IEC 42001 Artificial Intelligence Management System - EN | PECB
General Data Protection Regulation (GDPR) - Training Courses - EN | PECB
Webinars: https://pecb.com/webinars
Article: https://pecb.com/article
-------------------------------------------------------------------------------
For more information about PECB:
Website: https://pecb.com/
LinkedIn: https://www.linkedin.com/company/pecb/
Facebook: https://www.facebook.com/PECBInternational/
Slideshare: http://www.slideshare.net/PECBCERTIFICATION
Macroeconomics- Movie Location
This will be used as part of your Personal Professional Portfolio once graded.
Objective:
Prepare a presentation or a paper using research, basic comparative analysis, data organization and application of economic information. You will make an informed assessment of an economic climate outside of the United States to accomplish an entertainment industry objective.
हिंदी वर्णमाला पीपीटी, hindi alphabet PPT presentation, hindi varnamala PPT, Hindi Varnamala pdf, हिंदी स्वर, हिंदी व्यंजन, sikhiye hindi varnmala, dr. mulla adam ali, hindi language and literature, hindi alphabet with drawing, hindi alphabet pdf, hindi varnamala for childrens, hindi language, hindi varnamala practice for kids, https://www.drmullaadamali.com
How to Fix the Import Error in the Odoo 17Celine George
An import error occurs when a program fails to import a module or library, disrupting its execution. In languages like Python, this issue arises when the specified module cannot be found or accessed, hindering the program's functionality. Resolving import errors is crucial for maintaining smooth software operation and uninterrupted development processes.
Main Java[All of the Base Concepts}.docxadhitya5119
This is part 1 of my Java Learning Journey. This Contains Custom methods, classes, constructors, packages, multithreading , try- catch block, finally block and more.
A review of the growth of the Israel Genealogy Research Association Database Collection for the last 12 months. Our collection is now passed the 3 million mark and still growing. See which archives have contributed the most. See the different types of records we have, and which years have had records added. You can also see what we have for the future.
A workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
Assessment and Planning in Educational technology.pptxKavitha Krishnan
In an education system, it is understood that assessment is only for the students, but on the other hand, the Assessment of teachers is also an important aspect of the education system that ensures teachers are providing high-quality instruction to students. The assessment process can be used to provide feedback and support for professional development, to inform decisions about teacher retention or promotion, or to evaluate teacher effectiveness for accountability purposes.
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Dr. Vinod Kumar Kanvaria
Exploiting Artificial Intelligence for Empowering Researchers and Faculty,
International FDP on Fundamentals of Research in Social Sciences
at Integral University, Lucknow, 06.06.2024
By Dr. Vinod Kumar Kanvaria
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Building the Architecture for Analytic Competition:
Why the Architecture Foundation is so Critical to Success

Prepared by William McKnight
www.mcknightcg.com

Sponsored by Teradata
EB-7592 > 0513 > PAGE 2 OF 11
Contents

Information Architecture Defined ................................. 2
    Definition of Information Architecture ....................... 2
    An Architecture Framework: Teradata’s Approach ............... 3
    Design Patterns and Implementation Alternatives .............. 4
    Architecture Principles and Advocated Positions .............. 5
    Balancing Acts: Delivery Versus Architecture ................. 6
Architecture Development and Information Management Possibilities  7
    Harnessing Workloads ......................................... 7
    What Determines the Success of a Workload? ................... 7
    Platform Selection Process ................................... 8
    The No-Reference Architecture ................................ 8
    The Analytic Ecosystem ....................................... 8
The Building Blocks of Analytic Competition ..................... 10
    Teradata Analytic Architecture Technology ................... 10
    Teradata Analytic Architecture Solution Model ............... 10
    A Consistent Approach Ensures Delivery ...................... 11

Information Architecture Defined
Definition of Information Architecture
Lost amid the conversation on big data and the accelerating advancement of just about every aspect of enterprise software that manages information are the things that hold it all together. Yet this is critical: information-management components must come together in a meaningful fashion or there will be unneeded redundancy, waste, and missed opportunities. Considering that optimizing the information asset goes directly to the organization’s bottom line, it behooves us to play an exceptional game, not a haphazard one, with our technology building blocks.
The glue that brings information management components together is called “architecture”: the high-level plan for the data stores, the applications that use the data, and everything in-between. The “everything in-between” can be quite extensive, covering data transport, middleware, and transformation. Architecture dictates the level of data redundancy, summarization, and aggregation, since data can be consolidated or distributed across numerous data stores optimized for parochial needs, broad-ranging needs, and innumerable variations in between.
There must be a true north for enterprise information architecture. There needs to be a process to vet the practices and ideas that accumulate in the industry and the enterprise, and to assess their applicability to the architecture. We define this body of possibilities in terms of “Design Patterns,” “Implementation Alternatives,” “Architecture Principles,” and “Advocated Positions.” These concepts will be defined later in this paper, but what is important to understand up front is that analytic success requires focused attention on information architecture.
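One way to picture this vetting process is as a catalog of candidate practices, each filed under one of the four categories named above and marked with the outcome of its applicability assessment. The sketch below is purely illustrative: the category names come from this paper, but the `Practice`/`Catalog` classes and the example entries are hypothetical.

```python
from dataclasses import dataclass, field

# The four categories are the paper's; everything else here is an
# illustrative assumption, not a prescribed implementation.
CATEGORIES = (
    "Design Pattern",
    "Implementation Alternative",
    "Architecture Principle",
    "Advocated Position",
)

@dataclass
class Practice:
    name: str
    category: str             # must be one of CATEGORIES
    applicable: bool = False  # set once vetted against the architecture

    def __post_init__(self):
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")

@dataclass
class Catalog:
    practices: list = field(default_factory=list)

    def vet(self, practice: Practice, applicable: bool) -> None:
        """Record the outcome of assessing a practice's applicability."""
        practice.applicable = applicable
        self.practices.append(practice)

    def applicable_in(self, category: str) -> list:
        """Names of vetted, applicable practices in one category."""
        return [p.name for p in self.practices
                if p.category == category and p.applicable]

catalog = Catalog()
catalog.vet(Practice("Hub-and-spoke integration", "Design Pattern"), True)
catalog.vet(Practice("Single source of truth", "Architecture Principle"), True)
catalog.vet(Practice("Per-team data marts", "Implementation Alternative"), False)

print(catalog.applicable_in("Design Pattern"))  # ['Hub-and-spoke integration']
```

The point is not the data structure itself but the discipline it encodes: every industry idea gets classified and explicitly judged applicable or not before it can shape the architecture.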
Analytics, not reporting, is forming the basis of competition today. Rearview-mirror reporting can be essential in support of operational needs. However, the large payback from information undoubtedly comes in the form of analytics.
An Architecture Framework: Teradata’s Approach
Architecture is immensely important to information success, and thus the recipe for that success begins with a good, well-rounded, and complete architectural approach. You can architect an environment in a way that encourages data use: by making it perform well, by putting up the architecture and data quickly, and by building it well from the beginning so that ongoing maintenance has minimal impact on users and budgets.
Falling short on any of these requirements can quickly send users retreating to the safety of status quo information usage, instead of taking on what might seem like the formidable challenge of progressive usage. But consider that in the small windows of time most users have to engage with available data, they can only reach a certain level of depth with the information. If the data is architected well, that analysis will be deep, insightful, and profitable. That is the power of architecture.
If your service provider’s approach does not reflect this, the result will be less than successful. Conversely, let’s look at Teradata’s approach.
Teradata defines its Architecture Framework using the BIAS approach, which consists of four key components that comprise architecture, plus two components that make it all work together:

1. Business Architecture
2. Information Architecture
3. Application Architecture
4. Systems Architecture
5. Enablement
6. Program Management
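The support relationships among the first four components, as described in the sections that follow, can be sketched as a small dependency graph. A minimal sketch, assuming nothing beyond the paper's own prose; the dict encoding and helper functions are our illustration, not a Teradata artifact:

```python
# Which component directly supports which, per the paper: Information and
# Application Architecture support Business Architecture, and Systems
# Architecture supports the other two.
SUPPORTS = {
    "Information Architecture": ["Business Architecture"],
    "Application Architecture": ["Business Architecture"],
    "Systems Architecture":     ["Information Architecture",
                                 "Application Architecture"],
}

def supported_by(component):
    """Components that directly support `component`."""
    return sorted(c for c, targets in SUPPORTS.items() if component in targets)

def all_supporters(component):
    """Direct and indirect supporters, found by walking the graph."""
    result, stack = set(), [component]
    while stack:
        target = stack.pop()
        for c, targets in SUPPORTS.items():
            if target in targets and c not in result:
                result.add(c)
                stack.append(c)
    return sorted(result)

print(supported_by("Business Architecture"))
# ['Application Architecture', 'Information Architecture']
print(all_supporters("Business Architecture"))
# all three other layers ultimately prop up the Business Architecture
```

Walking the graph makes the layering explicit: Systems Architecture never supports the business directly, but it transitively underpins everything above it.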
Teradata defines the Business Architecture as understanding the business requirements and providing vision to those requirements. It has to do with defining the organizational business model, structures, missions, goals, and processes, and understanding which business fundamentals are vital for organizational success. You can architect for known requirements effectively only by understanding the context of eventual requirements.
The trajectory of systems in an organization is never a linear projection from a near-recent state to a current state through known requirements. It must include contingencies for the unknown and for the forked paths that systems can take in an organization. It must impute vision derived from similar organizations, especially more advanced and progressive ones. You do not invest in architecture to maintain the status quo; you expect business success, supported by architecture. Business Architecture is supported by Information Architecture and Application Architecture.
Teradata’s Information Architecture supports Business Architecture through storing or otherwise processing the data that is required, both internally and externally generated. Information Architecture must take into consideration the numerous avenues for data today.
Data must be put in the best place to succeed, which primarily means it must be enabled quickly, well-performing, and scalable. Information Architecture identifies the data (and the state of the data) needed to support the Business Architecture, includes logical and physical data models, and is supported by Systems Architecture.
Like Information Architecture, Teradata’s Application Architecture
can subdivide applications in many ways. Application Architecture
uses Information Architecture and Systems Architecture to support
Business Architecture. While applications execute the functional
side of the Business Architecture, effective cross-referencing of
applications to the required tools and other applications is an
important component of the Application Architecture challenge.
Where the architecture rubber meets the road in Teradata’s
approach is Systems Architecture. This is the physical mani-
festation of architecture—the base upon which Information
Architecture and Applications Architecture reside and deliver
for the Business Architecture. Like in other areas, Systems
Architecture has the issues of subdivision and optimization.
4. EB-7592 > 0513 > PAGE 4 OF 11
Business, Information, Applications and Systems Architecture
are each disciplines unto themselves and may be optimized
individually. But they must be prioritized through Enablement.
According to Teradata, “Enablement evaluates cultural and orga-
nizational readiness for the architectural advances and prioritizes
resources and work effort accordingly. Enablement adds data
management capabilities with each implementation, such as a
data quality improvement program, a data governance capabil-
ity or one of the ones reviewed below, that support current and
future information initiatives.”
Much of the work building architecture for analytic competition
should include “soft” factors like Enablement, especially early in
the process.
Finally, according to Teradata, it is overall Program Management
that will intelligently bring everything together into meaningful
interim points that deliver analytics to address organizational
goals in an agile fashion. Program Management extends through-
out all implementations and ensures consistency and continuity
among many projects and players.
In summary, Teradata has a comprehensive approach to informa-
tion architecture. It acknowledges the importance of architecture
and skillfully decomposes architecture into layers that can be
discretely worked on in context of a full approach.
Design Patterns and Implementation Alternatives
In daily information management activity, decisions are made
with high frequency and major decisions are never far away.
In order to support those decisions with program context and
unbiased wisdom, it is necessary to make and implement design
choices. To accomplish this, Teradata suggests addressing what it
calls Design Patterns and Implementation Alternatives.
Design Patterns, according to Teradata, are a set of proven
architectural options for meeting an array of requirements. They
are reusable approaches to solve commonly occurring problems,
whether they are affecting a program at present or are those that
should be anticipated. It is important to have alternatives laid out
for different situations that are likely to be encountered, and plan
them out with an appropriate level of nuance and understanding
of the pros and cons of architectural decisions.
While leaving room for personal judgment, which is always
necessary, Teradata’s Design Patterns and its physical side—
Implementation Alternatives—provide a strong basis for
decision-making. This basis can be very beneficial in aligning
people with ultimate decisions. If left to an unsupported process,
decisions would not only take longer, they would be less accepted.
Design Patterns and Implementation Alternatives enable pro-
gram agility and appropriately shift some balance in what consti-
tutes success away from simply decision-making to the execution
of decisions.
Teradata’s Design Patterns and Implementation Alternatives
reduce the chances of failure by enabling a shop with alternatives
thought out in advance, without the pressure of an impending
sprint deadline. So why fail, even if it is “fast”? Well thought-out
Design Patterns and Implementation Alternatives enable speed
and reduce the chances for failure.
Enablement addresses where organizations are weak and
the reasons they may fail.
Architecture Principles and Advocated Positions
While Design Patterns and Implementation Alternatives
are actionable, they are built upon what Teradata refers to
as Architectural Principles and Advocated Positions.
These beliefs about information and how things should be done
will change less frequently and may be advocated from higher
company positions than the Design Patterns and Implementa-
tion Alternatives. Advocated Positions help balance between
short- and long-term tradeoffs. They are the bedrock upon which
everything in the program flows; it is essential to get these right,
then ensure that the Design Patterns and Implementation Alter-
natives are a correct interpretation of the positions.
One of Teradata’s most important Advocated Positions is to
prioritize data access over data loading. Although both areas can
have performance issues, users (customers) of the analytic infra-
structure will always prioritize the time they are interfacing with
the data over the currency of the data. While layers of intake and
distribution may be physically separated in a data warehouse, and
thus able to be optimized for purpose, it is the overall architecture
that should first be optimized for data access. Today, that analytic
architecture extends well beyond the data warehouse, increasing
the need for architecture.
You need a process to make decisions as much as you need
the decisions themselves. With Architectural Principles and
Advocated Positions, Teradata has completely encapsulated the
necessary decision-making side of analytic architecture.
Architectural decision-making during development occurs with
high frequency, but peaks at the beginning of an effort when deci-
sions are made about what will be done in the sprint, and how.
The team then should be able to know what is needed from previ-
ous architecture decisions about their work and be empowered
to deliver. Architecture provides proven, reusable components to
accelerate development time.
Architecture is about facilitating prioritized data access,
not done for its own sake or to satisfy an abstract
standard.
Teradata’s Advocated Positions include:
• Load everything into the core physical data model
• Touch it, take it (extract all columns)
• Reversibility of data errors out of the core physical data model
• Reusability of common components
• Traceability of core data to its originating source system
• Collect metadata, both technical and business
• Abstract the core physical data model from business usage
• Include an acquisition/staging layer in the architecture
• No production reporting from non-production systems
• Integrated logical and physical data models
• Permanently archive everything
• Enforce referential integrity
• Prioritize data access over data loading
• Full copy of source data objects in the acquisition area
• A single route for data to flow into the core physical model
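Two of these positions, enforcing referential integrity in the core physical data model and keeping core data traceable to its originating source system, can be illustrated with a small sketch. This is a hypothetical example using SQLite; the table and column names are illustrative, not taken from Teradata's model.

```python
import sqlite3

# Hypothetical core-model sketch: referential integrity enforced by the
# database, plus a source_system column for traceability to origin.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity

conn.execute("""
    CREATE TABLE customer (
        customer_id   INTEGER PRIMARY KEY,
        name          TEXT NOT NULL,
        source_system TEXT NOT NULL   -- traceability to originating source
    )""")
conn.execute("""
    CREATE TABLE sale (
        sale_id       INTEGER PRIMARY KEY,
        customer_id   INTEGER NOT NULL REFERENCES customer(customer_id),
        amount        REAL NOT NULL,
        source_system TEXT NOT NULL
    )""")

conn.execute("INSERT INTO customer VALUES (1, 'Acme', 'CRM')")
conn.execute("INSERT INTO sale VALUES (10, 1, 250.0, 'POS')")

# A sale referencing a nonexistent customer is rejected by the core
# model itself, rather than by downstream cleanup.
try:
    conn.execute("INSERT INTO sale VALUES (11, 99, 10.0, 'POS')")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)  # True
```

The point of the sketch is that these positions become properties of the data model itself, not conventions each project must remember to follow.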
Balancing Acts: Delivery Versus Architecture
Even business leaders tend to take a tactical approach to the
execution of requirements. However, it does not necessarily
take longer to satisfy information requirements in an architected
fashion. If architecture principles and technology possibilities are
not on the table beforehand, the means to satisfy the last require-
ment may be used to satisfy a new requirement. This may or may
not be appropriate.
This also disconnects the solution from prior solutions that may
lead the way to requirement satisfaction. For example, shops with
countless multidimensional structures—and with more being
built on almost a daily basis—can readily attest to a need for
architecture. By taking a disciplined architectural approach, we
have found that we are in a better position to solve the next busi-
ness problem now.
Teradata Unified Data Architecture™
When organizations put all their data to work, they make
smarter decisions and create a new data-driven approach
to improving their business. Through deeper insights
about customers and operations, the data delivers
competitive advantage for leading organizations that are
able to compete on analytics by leveraging all their data.
Companies should exploit this market opportunity
to compete on analytics by creating a strong analytic
foundation based on a comprehensive data architecture
that leverages existing, new, and emerging technologies.
This architecture should contain three main capabilities:
• Data Warehousing—Integrated and shared data
environments to manage the business, and deliver
strategic and operational analytics to the extended
organization
• Data Discovery—Discovery analytics to rapidly unlock
insights from big data through rapid exploration using
a variety of analytic techniques that are accessible by
mainstream business analysts
• Data Staging—Loading, storing, and refining data in
preparation for analytics
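The three capabilities above can be pictured as stages of a single pipeline: staging refines raw data, warehousing integrates it into a shared environment, and discovery explores the integrated result. The following Python sketch is purely illustrative; the function names and logic are assumptions, not part of the Unified Data Architecture itself.

```python
# Hypothetical pipeline sketch of the three capabilities.
def stage(raw_records):
    """Data Staging: load and refine data in preparation for analytics."""
    return [r.strip().lower() for r in raw_records if r.strip()]

def warehouse(staged, integrated):
    """Data Warehousing: integrate staged data into a shared environment."""
    integrated.update(staged)
    return integrated

def discover(integrated, predicate):
    """Data Discovery: rapidly explore integrated data for insight."""
    return sorted(r for r in integrated if predicate(r))

integrated = set()
warehouse(stage([" Alpha", "beta ", ""]), integrated)
print(discover(integrated, lambda r: r.startswith("a")))  # ['alpha']
```

The design choice the sketch highlights is separation of concerns: each capability can be optimized on its own platform while the data flows through all three.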
Teradata has responded to this market need by developing the
Teradata® Unified Data Architecture™, which allows organizations
to leverage the complementary values of the Teradata® Database,
Teradata Aster SQL-MapReduce®, and open-source Hadoop®
technologies.
This Unified Data Architecture™ helps companies
define and deploy an architecture that makes use of
these best-of-breed technologies in a way that unleashes
the value of their data. Companies can apply the right
technology to the right analytical opportunities
so business users can isolate intelligent signals—
and have an architecture for analytic decisions.
Architecture Development and Information Management Possibilities
There is a need for architecture that falls outside of captive project
timeframes and may seem somewhat removed from user require-
ments—at least to users. However, the architecture requirements
outlined here play a vital role in delivering user requirements.
They are a skillful interpretation of user requirements.
The best way to look at an analytics program is as a series of
architecture sprints. Taking on analytics as architecture means
analytics will be done to internally adjudicated current standards
and built to company priorities.
Architecture requires its own codified efforts. The continuous
activity of information management is architecture. With disci-
pline, Teradata Design Patterns and Implementation Alternatives
as well as Architecture Principles and Advocated Positions will be
continually used over time, providing ongoing value by limiting
risk and not reinventing the wheel.
Without architecture, analytic development is destined for high
levels of wasted effort, restarts, redundancy and, most damaging,
missed opportunity.
Harnessing Workloads
Workloads comprise the functionality that must be achieved with
data, as well as the management of the data itself. Harnessing
workloads for allocation to an architecture component is both an art
and a science. There are user communities with a list of require-
ments upon a set of data. There are other user communities
with their own list of requirements on the same data. Is this one
workload? If ultimately it is best to store the data in one location
and use the same tool(s) to satisfy the requirements, the practical
answer is “yes.”
When does the “set of data” end and become a different workload?
It could, practically speaking, be when a new data store is appro-
priate. Harnessing workloads can be puzzling, but ultimately
workloads need to be ring-fenced for architecture purposes.
What Determines the Success of a Workload?
Many technology types have emerged in recent years to support
the idea that analytic data needs to perform—the primary means
of judging the success of a workload. As previously mentioned, it
is the performance of the data access that constitutes the perfor-
mance of a workload.
Getting to fast performance quickly is the second measure of the
success of an analytic workload. In the end, if the good perfor-
mance goes away quickly because the application is not scaling,
all would be for naught. The third measure of workload success
is scale. Note that this does not mean the initial Systems Archi-
tecture must last forever untouched. It does mean that Systems
Architecture is maintained without user impact. As far as they
are concerned, it hums along. Architecture component selection
is more important than ever because it must scale with exponen-
tially increasing data volumes and user requirements.
Information Management is nothing more than the
continuous activity of architecture.
Platform Selection Process
Many companies are not having success with their workloads
due to a lack of focus on architecture. Specifically, if the analytic
architecture possibilities are not known or considered for a work-
load, it is quite likely that the platform used for the last workload
will be used again for the new workload. The more the platform
possibilities are considered for the workload, the better the chance
for success of that workload.
There are many platform categories (each designed for specific
types of workloads) for storing data in the analytic architecture.
These will be discussed in the next section. There is no “one size
fits all” when it comes to platform selection. There is a best plat-
form for each workload and the odds of workload success go up
tremendously if the correct platform is selected.
The No-Reference Architecture
We are in the post-reference architecture era of information man-
agement. The 1990s were the decade of vendors going in and out
of shops holding up laminated, uncustomized reference architec-
tures and convincing clients to strive to attain that picture.[1]
Once they did, it was assumed, all their problems would be solved.
It was also more palatable to the technology manager to hold out
a technical standard to hit, as opposed to suggesting they must hit
business goals with architecture.
An analytic architecture approach keeps business goals foremost
in mind. This also means that all shops will manifest different
architectures. That “reference” architecture will also continu-
ally change. Leadership must have an agile mindset to keep it
updated. This is the essence of “no-reference” architecture. It is
not definable in laminate. It is empowered with support compo-
nents to meet all foreseeable business goals and it will change to
meet those goals. And it considers all possibilities, knowing that it
controls one of the most important assets that the company
has—information—and one of the most important means of
modern competition—analytics.
The Analytic Ecosystem
Analytics do not solely exist in the post-operational world. As
a matter of fact, the whole notion of a hard boundary between
operational (characterized by the ERP) and the post-operational
(characterized by the data warehouse) is going away. Analytics
certainly can be operational. So can Business Intelligence (BI). So
much of what we’ve learned with post-operational BI is now being
applied to the operational environment in the form of operational
BI like operational dashboards, stream processing, and master
data management.
However, we must distinguish between creating and using analyt-
ics. Analytics are used everywhere and should be generated from
data created everywhere.
We must get beyond making that default data store selection
discussed earlier. We must have knowledge of, and consider, a list
of usual suspects for analytic workloads. It includes:
1. The relational data warehouse, augmented with columnar
capabilities
2. An analytic database management system
3. A data warehouse appliance
Leading examples of these data stores will be examined in the next
section. For now, let us emphasize the interplay of the analytic
components. There are no set rules for how data will flow in the
analytic architecture.
Architecture is important, practical, and holistic, and
drives analytic and organizational success.
Proceeding with analytics without an architecture
approach is like trying to solve a Rubik’s Cube
blindfolded. Sure, some extraordinary people, with
extensive practice, can do it, but why make it so hard?
[1] Some vendors still do this.
It is important to work with a company that understands
the methodology and components of architecture,
and has the experience to help create an analytic
organization.
While directionally the data warehouse will feed data marts, there
will be marts that do the reverse and stand alone. There are appli-
cations that need unadulterated source data—not data that has
gone through the data warehouse first. Even if the data warehouse
certifiably does not alter the data, applications in audit, security,
and the like will prefer the nondependent (on the data warehouse)
data mart.
This is not to say that nondependent data marts do not happen
otherwise. They do. If the architecture is not sound and a focus
of the program, the value-add of data passing through the data
warehouse will not be clear. Architecture, and therefore ulti-
mately business, may take a hit in these environments.
Analytic database management systems such as Teradata Aster’s
(discussed in the next section) may also play a strong role in the
post-operational analytic environment. Though these systems
do not replace the data warehouse, they store the increasingly
important unstructured and semi-structured data of an organi-
zation. This is data that largely has been ignored or force fit into
relational structure over the years, to mixed results.
Obviously all of this big data will not be replicated into the data
warehouse, so interplay between the warehouse and the analytic
database management system is a must. This gets back to the sup-
port components mentioned earlier.
Data warehouse appliances, however, could in some circumstances
play the role of the data warehouse itself: minimally, handling
intake and distribution in the analytic environment and storing
history data.
The other role necessary in the analytic environment is access.
The role of access is perhaps the most complex. Data is distributed
from the data warehouse and other platforms to the best platform
for the data access in an architected environment.
The Building Blocks of Analytic Competition
Understanding the meaning and importance of architecture is not
enough. It is imperative to implement the analytic environment
with an architecture focus. This doesn’t happen by accident.
Likewise, moving forward in an analytic program with agility
means bringing support components to the table. And just as we
need to leverage the support components, we need to leverage our
partner for the analytic architecture. The partner should bring
extensive architectural understanding and experience, and the right
components to bear to create the proper analytic environment.
These components include not only technology, but also a port-
folio of “jump starts” for the use of the technology. In the case of
Teradata, all needed components are already in place, integrated,
and delivering for world-class analytic organizations all over the
world through the BIAS approach.
Teradata Analytic Architecture Technology
Teradata’s offerings undoubtedly stand out for data warehouse
and data mart appliance platforms. Its Active Enterprise Data
Warehouse line, based on the Teradata®
Database, supports more
than 50 percent of large-scale data warehouses today. All database
functions in Teradata systems are always done in parallel, using
multiple server nodes and disks with all units of parallelism par-
ticipating in each database function.
Teradata Optimizer is grounded in the knowledge that every
query will be executing on a massively parallel processing system.
Teradata manages contending requirements for resources through
dynamic resource prioritization that is customizable by the cus-
tomer. The server-nodes interconnect was designed specifically
for a parallel processing multi-node environment. This inter-
connect is a linearly scalable, high-performance, fault-tolerant,
self-configuring, multi-stage network.
In Teradata 14, Teradata added columnar structure to a table,
effectively mixing row, column, and multi-column structures
directly in the DBMS. With intelligent exploitation of Teradata
Columnar, there is no longer the need to go outside the data
warehouse DBMS for the power of performance that columnar
provides, and it is no longer necessary to sacrifice robustness and
support in the DBMS that holds the post-operational data to get
the advantages of columnar.
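The advantage columnar storage offers analytic queries can be illustrated with a small sketch. This is a hypothetical toy example, not Teradata Columnar's implementation: the point is that a column-oriented layout lets a query touch only the columns it needs, while a row-oriented layout walks every field of every record.

```python
# Hypothetical row vs. column layout for the same six order records.
rows = [  # row-oriented: one record per tuple
    {"order_id": i, "region": "EMEA" if i % 2 else "AMER", "amount": float(i)}
    for i in range(1, 7)
]

# Column-oriented: one array per column.
columns = {
    "order_id": [r["order_id"] for r in rows],
    "region":   [r["region"] for r in rows],
    "amount":   [r["amount"] for r in rows],
}

# SELECT SUM(amount): the columnar layout scans a single contiguous array.
total_columnar = sum(columns["amount"])

# The row layout must visit every record to reach the same field.
total_row = sum(r["amount"] for r in rows)

print(total_columnar == total_row)  # True: same answer, less data touched
```

Mixing both layouts in one DBMS, as the paragraph above describes, means this choice can be made per table or per column rather than per platform.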
Teradata has extended its leadership from its EDWs into its
appliance family for midmarket enterprise EDWs, as well as data
marts for large companies.
The Teradata Data Warehouse Appliance supports the EDW
approach to building the data warehouse and is the Teradata appli-
ance family flagship product. It is suitable for an upper-midmarket
true EDW or as the platform for a focused application. The
Teradata Data Mart Appliance is a more limited-capacity equiva-
lent of the Teradata Data Warehouse Appliance and is ideal for the
departmental or midmarket platform. The Teradata Extreme Data
Appliance is also part of the Teradata appliance family and repre-
sents affordability for the management of large quantities of data.
Teradata Aster’s analytic database management system has
patent-pending In-Database MapReduce (MR): a hybrid row/column
store with an MR approach. Its MPP architecture makes it
work for predictable as well as ad-hoc analytic use cases. It blends
the performance of a relational database (i.e., indexes, optimizers,
and more) with the programming flexibility of MapReduce (Java,
Perl, Python, .Net, etc.).
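The MapReduce programming model that such systems expose can be sketched in a few lines. This is a generic, in-memory illustration of the model, not Aster's API: a map phase emits (key, value) pairs, a shuffle groups values by key, and a reduce phase aggregates each group.

```python
from collections import defaultdict

# Minimal MapReduce sketch: total sales per region.
def map_phase(records):
    for region, amount in records:
        yield region, amount  # emit a (key, value) pair per record

def shuffle(pairs):
    groups = defaultdict(list)  # group all values sharing a key
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

sales = [("EMEA", 100.0), ("AMER", 40.0), ("EMEA", 60.0)]
result = reduce_phase(shuffle(map_phase(sales)))
print(result)  # {'EMEA': 160.0, 'AMER': 40.0}
```

Because map and reduce are arbitrary functions, the same skeleton accommodates analytics that are awkward to express in SQL alone, which is the flexibility the hybrid approach trades on.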
Teradata Analytic Architecture Solution Model
A semantic data model is a set of symbols and text describing the
information needed to answer a defined set of business ques-
tions. It is a representation of the access layer whose purpose is to
improve the simplicity, security, and speed of the data warehouse.
Its characteristics are:
• Usually dimensional
• Often implemented through views
• Easy and quick access to data
• Variety of ways to look at the same data
• Primary point of entry for BI tools
The semantic data model is usually dimensional but can also
represent other types such as Analytical Data Sets. There are two
data modeling mindsets: relational and dimensional. A relational
model captures the business rules. A dimensional data model cap-
tures the navigation paths and focuses on evaluating the meaning
of the business being monitored through metrics such as Gross
Sales Amount and Number of Customers. Most semantic data
models are dimensional because such models support business
questions that follow the pattern of:
• What do I want to see?
• What do I want to see it by?
• What constraints are there on the results?
The semantic data model is often implemented through views.
A semantic data model can be shown at conceptual, logical, and
physical levels of detail. At a physical level, it is often implemented
as views over the integrated data layer. The semantic data model
also provides quick and easy access to data—users and BI tools
need to be able to answer business questions quickly and easily.
In addition, the semantic data model must be designed to support
a variety of ways to look at the same data. Although an order
may be depicted just one way in the integrated data layer, it can
be shown in multiple ways across multiple semantic data models
depending on business needs. Also the semantic data model is the
primary point of entry for BI tools.
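A semantic layer implemented as views over the integrated data layer, as described above, can be sketched concretely. This hypothetical SQLite example (table, view, and column names are illustrative) answers the three-part question pattern: what do I want to see (Gross Sales Amount), what do I want to see it by (region), and what constraints apply (completed orders only).

```python
import sqlite3

# Hypothetical integrated layer plus a semantic-layer view for BI tools.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- integrated data layer: one normalized representation of an order
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        region   TEXT,
        status   TEXT,
        amount   REAL
    );
    INSERT INTO orders VALUES
        (1, 'EMEA', 'COMPLETE', 100.0),
        (2, 'EMEA', 'OPEN',      50.0),
        (3, 'AMER', 'COMPLETE',  75.0);

    -- semantic layer: a dimensional view, the entry point for BI tools
    CREATE VIEW v_sales_by_region AS
        SELECT region,                       -- what we see it by
               SUM(amount) AS gross_sales_amount  -- what we want to see
        FROM orders
        WHERE status = 'COMPLETE'            -- constraint on the results
        GROUP BY region;
""")

rows_out = conn.execute(
    "SELECT * FROM v_sales_by_region ORDER BY region").fetchall()
print(rows_out)  # [('AMER', 75.0), ('EMEA', 100.0)]
```

Because the view carries no data of its own, several such views can present the same integrated order data in different ways for different business needs, which is exactly the "variety of ways to look at the same data" characteristic listed above.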
A Consistent Approach Ensures Delivery
Architecture is not easy to come by without focused effort. It can
easily be shortchanged if it is not understood that it is the direct
cause of analytic success. Architecture is a way of life for deliver-
ing analytics and a consistent approach ensures that delivery.
Teradata’s consistent approach features:
• A multi-component approach—BIAS—to architecture
• A sound, repeatable, and successful methodology
• Use of Architectural Principles and Advocated Positions
• Use of Design Patterns and Implementation Alternatives
• World-class technology building blocks
• Use of architecture solution model building blocks
Teradata provides the building blocks for the analytic
architecture solution model.
The Best Decision Possible and Unified Data Architecture are trademarks, and Teradata, the Teradata logo and SQL-MapReduce are registered trademarks of Teradata Corporation and/
or its affiliates in the U.S. and world-wide. Apache and Hadoop are registered trademarks of the Apache Software Foundation.
William McKnight
William is a consultant specializing in information management. His company, McKnight Consulting Group, has served clients such as
Fidelity Investments, Teva Pharmaceuticals, Scotiabank, Samba Bank, Pfizer, France Telecom, and Verizon—in total, 16 of the Global
2000. William is also a very popular speaker worldwide and a prolific writer who has published hundreds of articles and white papers. An
Ernst & Young Entrepreneur of the Year Finalist, William is a former Fortune 50 technology executive and software engineer. He provides
clients with action plans, architectures, strategies, complete programs, and vendor-neutral tool selection to manage information. He can
be reached at 214-514-1444 or through his website at www.mcknightcg.com.