This document introduces the MIKE2.0 methodology for information governance. MIKE2.0 is an open source methodology that provides a comprehensive framework for enterprise information management. It addresses the growing complexity of managing exponential data growth across increasingly federated organizations. The methodology promotes standards and transparency to improve data quality and business insights while increasing efficiency.
MIKE2.0 Information Governance Overview
1. Information Governance Solution Offering Overview
Introducing MIKE2.0 (Method for an Integrated Knowledge Environment)
Sean McClowry, IM Solution Suite Architecture and Delivery Lead, BearingPoint
November 2007
2. Contents
This presentation covers the following:
Information Management
─ Our View
─ Why Governance is Important
Information Governance
─ Where it fits in the Overall Model
─ Guiding Principles
─ Key Activities
Getting Started: Information Maturity (IM) QuickScan
Information Governance Organisational Models
Advanced Techniques: Networked Information Governance
3. Our View: IM is a “Complexity Problem”
Exponential growth of raw data and information
Complexity of data and information is not appreciated; they are in constant flux across the enterprise 24 hours a day
Efforts to increase visibility and access to relevant data and information are expensive, with insufficient ROI
Better standards and transparency are needed to increase confidence and enable opportunities
Federation is a significant factor in complexity
True business insight is still very hard to attain, and quality is a huge problem
4. Our View: The Solution Demands a Standard
Processes and standards for managing and reporting data and information have not kept pace – everyone has “their” way
Many problems are solved through informal networks – we need to link formal structures to these networks
We want organisations to begin to develop a competency for “Information Development”
We aim to re-shape the industry by creating the standard
An open and collaborative approach is the key to delivering a standard for such a complex problem
5. Our View: Time to Act
The problem has been growing for years. Here is why IM is now a mainstream issue:
High Impact: What is a business without its customers, its products and its employees?
Federation: Organisations are becoming increasingly federated, and even minor issues with data cause viral problems when propagated across the enterprise.
Globalisation: Multi-lingual and multi-character-set issues, 24x7 data availability, support for multiple channels.
Compliance initiatives: The War on Terror and corporate scandals in the US have put additional pressure on enterprises.
It’s a big, complex problem: There is much to be gained for vendors of applications, information/integration technology and systems integrators.
Information is an Asset: Organisations increasingly see the importance of Information Development. It’s not just functions and infrastructure.
6. Our View: What is Information Governance?
What is Information Governance?
“‘Governance’ is what information management is mostly all about. Information management is the process by which those who set policy guide those who follow policy. Governance concerns power, and applying an understanding of the distribution and sharing of power to the management of information technologies.” [1]
What is the right way to apply it?
Governance can involve “centralised” power, but traditional push-down models of architecture and standards only provide part of the solution.
Implemented the wrong way, governance can hamper innovation and agility.
Some standards are needed or we cannot be agile or innovative – we’re always fighting fires.
With a foundation of standards, we can distribute power and empower a community to be far more productive.
[1] Strassmann, Paul A. Information, Information Management and Governance (2001).
7. Our Approach: Integrated Solutions
[Slide diagram: a grid mapping individual Information Management solution offerings onto six solution capability areas – Business Intelligence; Information Asset Management; Access, Search and Delivery; Enterprise Data Management; Enterprise Content Management; and Information Architecture, Strategy & Governance. Offerings shown include Corporate Performance Management, Data Warehousing, Information Lifecycle Management, Master Data Management, Metadata & Taxonomy Cataloging, Data Quality Improvement, Enterprise Portals & Information Delivery, Enterprise Search, Customer Data Integration, Data Migration, Document Management, Web Content Management, Digital Asset Management, Information Governance, Information Security, Data Mining & Analytics and Business Activity Monitoring, among others.]
Composite Solution Offerings: Info Mgmt Strategy, Data Driven IT Transformation, Information Sharing, Networked Info Governance, Agile Info Development, Enterprise 2.0
8. Our Approach: An Open Source Methodology
MIKE2.0 (Method for an Integrated Knowledge Environment)
Information Management Framework
─ A comprehensive approach to Enterprise Information Management
─ Much more than a classic methodology: architecture, tools, code
─ Helping to shape new theories on Information Management
─ Core methodology with formal release cycle
─ Governance council
─ Framework for any open method
Web / Enterprise 2.0
─ Developed as part of an open community
─ Can be integrated with internally held and shared content
─ The goal is to develop “the standard” that everyone can map to and help create
Open Source (software and content)
─ All content is freely available under the Creative Commons (Attribution) License
─ MediaWiki based – we have extended MediaWiki and contributed to the community
─ Provides an organizing framework for development of open source IM technologies
www.openmethodology.org
9. Our Approach: Open Source + Internal Assets
[Slide diagram showing the open methodology combined with internal assets:]
─ Enterprise 2.0 Mashups
─ Open Methodology site
─ Assessment Tools
─ Integrated Approach
10. Our Approach: Collaborative Solutions
[Slide diagram: the Information Management Solution Suite, delivered through a collaborative approach and spanning commercial and open source software. Business Solutions and Product Solutions sit above the core solution offerings, organised by solution capability – Business Intelligence; Information Asset Management; Access, Search and Content Delivery; Enterprise Data Management; Enterprise Content Management – with Information Strategy, Architecture and Governance as the foundation.]
Sets the new standard for Information Development through an Open Source Offering
11. Our Approach: Supported through a Foundation
Information Management Solution Suite, Delivered through a Collaborative Approach
Enterprise Information Management: Commercial & Open Source
Solution Capabilities that provide a foundation for Suite Delivery:
• Architecture Framework
• Governance Framework
• Overall Usage Model
• Implementation Guide
• Foundational Solutions
• Supporting Assets
These underpin the Business Solutions, Product Solutions and Solution Capabilities (Business Intelligence; Information Asset Management; Access, Search and Content Delivery; Enterprise Data Management; Enterprise Content Management; Information Strategy, Architecture and Governance).
Sets the new standard for Information Development through an Open Source Offering
12. Information Governance: Guiding Principles
Build an Information-Centric Organisation
1. Accountability. Due to the nature of information capture and how information
flows across the enterprise, everyone has a role to play in how it is governed.
Key roles are filled by senior executives such as the CIO, Information
Architects, and Data and Content Stewards.
2. Efficient Operating Models. Common standards, methods,
architecture and collaborative techniques allow the governance model to
be implemented in a physically central, virtual or offshore model.
3. Senior Leadership. Senior leaders must align and work towards a
common goal of improved information, while appreciating that Information
Management is still immature as a discipline, and be ready for challenges.
13. Information Governance: Guiding Principles
Treat Information as an Asset
4. Historical Quantification. Common architectural models and tools-based
quantitative assessments of data and content are key to establishing a
known baseline from which to move forward.
5. Information Value Assessment. Organizations should provide a
mechanism to assign an economic value to their information assets and
to the resulting impacts of Information Governance practices.
6. A Common Methodology. An Information Governance programme
should include a common set of activities, tasks and deliverables to build
a competency.
7. Standard Models. A common definition of terms, domain values and
their relationships is one of the fundamental building blocks of
Information Governance.
8. Governance Tools. Measuring the effectiveness of an Information
Governance program requires tools to capture assets and performance; a small sketch of domain-value checking follows below.
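To make principles 7 and 8 concrete, here is a minimal Python sketch of checking records against standard domain values. The field names and domains are hypothetical, for illustration only; this is not a MIKE2.0 artifact.

# Minimal sketch: validating records against standard domain values.
STANDARD_DOMAINS = {
    "country_code": {"AU", "GB", "US"},                   # in-scope codes (assumed)
    "customer_status": {"ACTIVE", "DORMANT", "CLOSED"},   # assumed status domain
}

def validate_record(record: dict) -> list[str]:
    """Return a list of violations against the standard domain values."""
    violations = []
    for field, allowed in STANDARD_DOMAINS.items():
        value = record.get(field)
        if value is not None and value not in allowed:
            violations.append(f"{field}: {value!r} not in standard domain")
    return violations

print(validate_record({"country_code": "UK", "customer_status": "ACTIVE"}))
# ["country_code: 'UK' not in standard domain"]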
14. Information Governance: Guiding Principles
Be Pragmatic in a Strategic Context
9. Strategic Approach. Improvements will typically be measured over
months and years, not days, but the model must still allow for tactical
improvements.
10. Comprehensive Scope. An Information Governance approach should
be comprehensive in its scope, covering structured data, unstructured
content and the whole lifecycle of information.
11. Architecture. An Information Management architecture should be
defined for the current state, transition points and target vision.
12. Continuous Improvement. It is not always cost-effective to fix every
issue in a given area; instead, follow the “80/20 rule” and re-factor from
the baseline through audits, monitoring, technology re-factoring
and personnel training.
13. Flexibility for Change. While an Information Governance program
involves putting standards in place, it must have an inbuilt pragmatism
and flexibility for change.
15. Networked Information Governance
Apply Web 2.0/Enterprise 2.0 Principles for Better Governance
14. Collaborative Community. Collaborative technologies can streamline
communications, capturing content in the informal network as well as building the formal one.
15. Organizing the Informal Network. Build a content model that is easily
populated through user-driven categorization; informal collaboration then begins to take
on more formal structures.
16. Aggregation of Ideas. Not all good ideas have to come from the inside. Social
Computing techniques provide an easy way to bring linked content together.
17. Linking the Informal to the Formal. The same principle of applying content
categories can be applied to formal governance processes.
18. Searching the Knowledge Network. Enterprise Search techniques should be
implemented to make this information easily accessible.
19. Collaborative Asset Management. The maturity of your business and technology
assets should be a known quantity, and this information easily shared across the
organization.
20. Global Standards Bodies. An external perspective through a central
authority can help to balance competing interests and align everyone to a similar approach.
16. The 5 Phases of MIKE2.0
Information Development through the 5 Phases of MIKE2.0
[Diagram: the Strategic Programme Blueprint is done once, through Phase 1 (Business Assessment) and Phase 2 (Technology Assessment). Phases 3, 4 and 5 – Roadmap & Foundation Activities, Design, Develop, Deploy and Improve – are Continuous Implementation Phases run as successive increments (Increment 1, 2, 3, ...), each ending by beginning the next increment. The result is an improved governance and operating model.]
17. Key Governance Activities
The MIKE2.0 approach for improving Data Governance goes across all 5 phases of the methodology. The most
critical activities for improving Data Governance are as follows:
Activity 1.4 Organisational QuickScan
Activity 1.6 Information Governance Sponsorship and Scope
Activity 1.7 Initial Information Governance Organisation
Activity 2.7 Information Governance Policies
Activity 2.8 Information Standards
Activity 3.5 Business Scope for Improved Information Governance
Activity 3.6 Enterprise Information Architecture
Activity 3.7 Root Cause Analysis on Information Governance Issues
Activity 3.8 Data Governance Metrics
Activity 3.11 Data Profiling
Activity 3.12 Data Re-Engineering
Activity 5.11 Continuous Improvement - Compliance Auditing
Activity 5.12 Continuous Improvement - Standards, Policies and Processes
Activity 5.13 Continuous Improvement - Data Quality
Activity 5.14 Continuous Improvement - Infrastructure
Activity 5.15 Continuous Improvement - Information Development Organization
Activity 5.16 Continuous Improvement – MIKE2.0 Methodology
Other MIKE2.0 activities are also relevant, but those listed above are particularly focused on Data Governance.
18. Key Governance Activities
Phase 1. Business Assessment and Strategy Definition Blueprint
Organisational QuickScan (Quickly Understand Issues):
• Conduct Information Maturity Assessment
• Build Inventory of Information Assets
• Determine Economic Value of Information
• Assess organizational structure, people and their skills
IG Sponsorship and Scope (Establish Leadership):
• Confirm scope of Data Governance Program
• Confirm in-scope data subject areas
• Assign Data Stewards to each subject area
Initial IG Organisation (Establish Team):
• Establish Data Governance Council
• Assign roles and responsibilities
• Define communications model and tracking mechanism
• Re-align Business and Technology Strategy
An initial gap analysis is developed by assessing the organisation’s current-state issues and its vision for the future state. Data Governance scope is driven by high-level information requirements and complemented by the definition of a strategic conceptual architecture. A sketch of an information-asset inventory entry follows below.
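As an illustration of the “Build Inventory of Information Assets” and “Determine Economic Value of Information” tasks, here is a minimal Python sketch. The valuation formula and all figures are assumptions for illustration, not part of MIKE2.0.

from dataclasses import dataclass

@dataclass
class InformationAsset:
    name: str
    annual_revenue_supported: float   # revenue that depends on this asset (assumed)
    attribution_factor: float         # assumed share attributable to the data
    annual_maintenance_cost: float

    def economic_value(self) -> float:
        # Illustrative valuation: attributed revenue minus upkeep.
        return (self.annual_revenue_supported * self.attribution_factor
                - self.annual_maintenance_cost)

crm = InformationAsset("Customer master", 5_000_000, 0.02, 40_000)
print(f"{crm.name}: {crm.economic_value():,.0f}")   # Customer master: 60,000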
19. Key Governance Activities
Phase 2. Technology Assessment and Selection Blueprint
Info Governance Policies (Deliver Policy Framework):
• Definition of Information Governance Policy Requirements
• Definition of Information Governance Policies
• Approval and Distribution of Information Governance Policies
Info Governance Standards (Standards for Implementation):
• Info Specification Standards
• Info Modelling Standards
• Info Capture Standards
• Info Security Standards
• Info Reporting Standards
Initiate Metadata-Driven Approach (Metadata Management):
• Metadata Management goes across multiple activities in MIKE2.0, through a metadata-driven architecture
• Get some form of repository and base meta-model in place from the outset
• Metadata management for improved DG is more than a data dictionary
• The goal is Active Metadata Integration
Driven by the information management guiding principles, a Policy Framework and a common set of Data Standards are created that will be used throughout the implementation program. MIKE2.0 starts with a reference model for metadata management; a minimal meta-model sketch follows below.
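Here is a minimal Python sketch of what “some form of repository and base meta-model” might look like. The class and field names are hypothetical, not the MIKE2.0 reference model.

from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str
    definition: str
    domain: str                    # reference to a standard domain (assumed)
    steward: str = "tbd"
    sources: list[str] = field(default_factory=list)   # lineage: feeding systems

# A tiny in-memory repository standing in for a real metadata store.
repository: dict[str, DataElement] = {}

def register(element: DataElement) -> None:
    repository[element.name] = element

register(DataElement(
    name="counterparty_rating",
    definition="Internal credit rating of the counterparty",
    domain="rating_scale",
    sources=["risk_system", "data_warehouse"],
))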
20. Key Governance Activities
Phase 3. Roadmap and Foundation Activities
Business Scope for Improved Information Governance (Determine Key Data Elements):
• Define Business Process Scope for the Increment
• Determine KDEs and Prioritize by Business Impact
• Capture Recommended Business Process Changes
• Define Data Definitions and Business Rules
Enterprise Information Architecture (Overall KDE Architecture):
• Overlay System Architecture on the Enterprise Data Model
• Define Master Data Management Architecture
• Define Business Time Model for KDEs
Root Cause Analysis of DG Issues (Determine Process Issues):
• Prevent Issues related to Source System Edits
• Prevent Issues related to Business Process
• Prevent Issues related to Technology Architecture
• Summarize Root Cause Issues and Recommend Changes
The MIKE2.0 governance approach is focused around Key Data Elements (KDEs): the subset of data elements used to make the most critical business decisions. The Enterprise Information Architecture is built out over time using these KDEs to define a framework for Master Data Management. A sketch of a simple KDE registry follows below.
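A minimal Python sketch of a KDE registry with attached business rules and impact-based prioritisation; the names and the impact scale are assumptions for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class KeyDataElement:
    name: str
    definition: str
    business_impact: int                     # assumed scale: 1 (low) .. 5 (critical)
    rules: list[Callable[[object], bool]]    # business rules as predicates

kdes = [
    KeyDataElement(
        "exposure_amount",
        "Current credit exposure per counterparty",
        business_impact=5,
        rules=[lambda v: v is not None,
               lambda v: isinstance(v, (int, float)) and v >= 0],
    ),
]

# Prioritise the increment by business impact, as the activity describes.
for kde in sorted(kdes, key=lambda k: k.business_impact, reverse=True):
    print(kde.name, kde.business_impact)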
21. Key Governance Activities
Phase 3. Roadmap and Foundation Activities (continued)
Data Governance Metrics (Assess issues with KDEs):
• Define Metric Categories and Measurement Techniques
• Gather Current-State Metrics on each KDE
• Define Target Metrics on each KDE
Data Profiling (Quantitatively Understand DQ):
• Prepare for Assessment
• Perform Column Profiling
• Perform Table Profiling
• Perform Multi-Table Profiling
• Finalize Data Quality Report
• Finalize Business Summary of Data Quality Impacts
Data Re-Engineering (Iteratively fix DQ issues):
• Prepare for Re-Engineering
• Perform Data Standardization
• Perform Data Correction
• Perform Data Matching and Consolidation
• Perform Data Enrichment
Metrics are defined for how data will be measured initially, as well as target measures. Data Profiling is used for quantitative estimates, and data is re-engineered in an iterative fashion. Artifacts are stored in a metadata model. A column-profiling sketch follows below.
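A minimal Python sketch of the column profiling step described above. The null markers and the statistics chosen are assumptions; real profiling tools compute far more.

from collections import Counter

def profile_column(values: list) -> dict:
    """Basic column profile: completeness, cardinality and top values."""
    total = len(values)
    non_null = [v for v in values if v not in (None, "", "N/A")]
    return {
        "rows": total,
        "completeness": len(non_null) / total if total else 0.0,
        "distinct": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
    }

print(profile_column(["AU", "AU", "US", None, "", "US", "UK"]))
# {'rows': 7, 'completeness': 0.714..., 'distinct': 3,
#  'top_values': [('AU', 2), ('US', 2), ('UK', 1)]}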
22. Key Governance Activities
Phase 5. Develop, Test, Deploy and Improve
Compliance Auditing (Continuous Improvement):
• Attain Sponsorship of Data Governance Board
• Define Compliance Auditing Processes
• Train Staff on Compliance Standards
• Conduct Auditing Processes
• Present Auditing Results and Recommendations
Standards, Policies and Processes (Continuous Improvement):
• Review and Revise Data Governance Policies
• Review and Revise Data Governance Metrics
• Review and Revise Data Governance Standards
• Review and Revise Data Governance Processes
• Implement Changes as Required
Data Quality (Continuous Improvement):
• Conduct Ongoing Data Quality Monitoring
• Associate Data Quality Issues with Root Causes
• Execute Issue Prevention Process
The MIKE2.0 methodology is based around Continuous Improvement: we are continually re-factoring towards the strategic vision, and there are planned activities to revisit the existing implementation. A monitoring sketch follows below.
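A minimal Python sketch of ongoing data quality monitoring: a measured metric is compared with its target, and misses are logged as issues awaiting root-cause association. The metric, threshold and log structure are assumptions for illustration.

issue_log: list[dict] = []

def monitor(kde_name: str, measured: float, target: float) -> None:
    """Log a data quality issue whenever a metric misses its target."""
    if measured < target:
        issue_log.append({
            "kde": kde_name,
            "metric": "completeness",
            "measured": measured,
            "target": target,
            "root_cause": "unassigned",   # associated with a cause during review
        })

monitor("exposure_amount", measured=0.93, target=0.98)
print(issue_log)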
23. Key Governance Activities
Phase 5. Develop, Test, Deploy and Improve (continued)
Infrastructure (Continuous Improvement):
• Re-factor Integration Infrastructure
• Progressively Automate Processes
• Review and Recommend Physical Infrastructure Changes
• Move to a Metadata-Driven Architecture
Information Development Organization (Continuous Improvement):
• Move to a Central Architecture and Delivery Model
• Develop Staff and their Skills
• Implement Data Governance Incentives
• Review and Revise Communications Model
Contribute to Open MIKE2.0 Methodology (Continuous Improvement) – help improve the overall approach to Data Governance used by our community:
• Help complete wanted assets
• Assist with peer reviews
• Propose new core supporting assets
• Recommend extensions to the overall methodology
• Be an active collaborator
Users of MIKE2.0 are encouraged to be part of an active community. The collaborative environment for MIKE2.0 allows the core method to be improved over time, whilst a release cycle and product roadmap provide stability.
24. Getting Started: QuickScan Assessment
Information Development through the 5 Phases of MIKE2.0
Activity 1.4 Organisational QuickScan for Information Development – tasks (each tracked with a responsible party and status):
1.4.1 Assess Current-State Application Portfolio
1.4.2 Assess Information Maturity
1.4.3 Assess Economic Value of Information
1.4.4 Assess Infrastructure Maturity
1.4.5 Assess Key Current-State Information Processes
1.4.6 Define Current-State Conceptual Architecture
1.4.7 Assess Current-State People Skills
1.4.8 Assess Current-State Organisational Structure
1.4.9 Assemble Findings on People, Organization and its Capabilities
Phase 1 – Business Assessment and Strategy Definition Blueprint activities:
1.1 Strategic Mobilisation
1.2 Enterprise Information Management Awareness
1.3 Overall Business Strategy for Information Development
1.4 Organisational QuickScan for Information Development
1.5 Future State Vision for Information Management
1.6 Data Governance Sponsorship and Scope
1.7 Initial Data Governance Organisation
1.8 Business Blueprint Completion
1.9 Programme Review
25. Getting Started: QuickScan Assessment
Task 1.4.2 is used to conduct an objective Information Governance Assessment.
26. Getting Started: QuickScan Assessment
Information Maturity Model: Measure Your Data Governance Maturity Level
META Group developed a 5-level Information Maturity Model (IMM) to use as an information maturity guideline; we have extended this model as part of MIKE2.0. It is similar to the software Capability Maturity Model (CMM) and focuses initially on data quality. The key criterion for assessing information maturity is being able to measure it.
[Diagram: Information Development maturity plotted against Information Accuracy & Organizational Confidence (low to high), with five levels:]
• Level 1 – Aware. There is awareness that problems exist, but the organization has taken little action regarding how data is managed.
• Level 2 – Reactive. Awareness and action occur in response to issues. Action is either system- or department-specific.
• Level 3 – Proactive. Information Development is part of the IT charter, and enterprise management processes exist.
• Level 4 – Managed. Information is managed as an enterprise asset, and well-developed engineering processes and organization structure exist.
• Level 5 – Optimised. Information Development is a strategic initiative, issues are either prevented or corrected at the source, and best-in-class solution architecture is implemented. Focus is on continuous improvement.
MIKE2.0 uses an objective assessment of your current and desired information maturity levels to construct a program for improving Data Governance; a scoring sketch follows below.
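A minimal Python sketch of turning assessment scores into a maturity level and gap. The dimensions and scoring scheme are assumptions for illustration, not the MIKE2.0 assessment instrument.

LEVELS = {1: "Aware", 2: "Reactive", 3: "Proactive", 4: "Managed", 5: "Optimised"}

def maturity_gap(current_scores: list[int], desired_level: int) -> tuple[str, int]:
    """Average per-dimension scores (1-5) into a level and report the gap."""
    current = round(sum(current_scores) / len(current_scores))
    return LEVELS[current], desired_level - current

# e.g. scores across dimensions such as strategy, people, process, technology
level, gap = maturity_gap([2, 3, 2, 2], desired_level=4)
print(level, gap)   # Reactive 2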
27. Getting Started: QuickScan Assessment
Level 1 Data Governance Organisation – Aware. An Aware Data Governance Organisation knows that it has issues
around Data Governance but is doing little to respond to them. Awareness has typically come as the result of major
Data Governance-related issues that have occurred. An organisation may also be at the Aware state if it is in the early
stages of a programme to move to a state where it can effectively address issues.
Level 2 Data Governance Organisation – Reactive. A Reactive Data Governance Organisation is able to address some of its
issues, but not until some time after they have occurred. The organisation is not able to address root causes or predict when
issues are likely to occur. "Heroes" are often needed to address complex data quality issues, and the impact of fixes done on a
system-by-system level is often poorly understood.
Level 3 Data Governance Organisation – Proactive. A Proactive Data Governance Organisation can stop issues before they
occur, as it is empowered to address root-cause problems. At this level, the organisation also conducts ongoing monitoring of
data quality, so that issues that do occur can be resolved quickly.
Level 4 Data Governance Organisation – Managed. A Managed Data Governance Organisation has a mature set of
information management practices. This organisation is not only able to proactively identify issues and address them, but defines
its strategic technology direction in a manner focused on Information Development.
Level 5 Data Governance Organisation – Optimal. An Optimal Data Governance Organisation is also referred to as the
Information Development Centre of Excellence. In this model, Information Development is treated as a core competency across
strategy, people, process, organisation and technology.
28. Data Governance Maturity
Moving Up the Maturity Model
Formulating, communicating, piloting and deploying a centralised organisation for
Information Development is a significant undertaking. The following artifacts
from MIKE2.0 can be used to assist in this effort:
• A comprehensive Role Inventory across aspects of the organisation, with associated competencies and metrics
• A set of Position Descriptions based upon the Role Inventory
• Organisational Structures populated with these Position Descriptions
• Assessment material to support manager and staff assessment of individual competencies
• A Gap Analysis based on the target Organisational Structure and Role competencies vs. current capabilities
• A pilot script to validate the processes and structures of the organisation
• A Training profile for staff
• A Recruiting profile recommending how to fill typical recruiting needs
• An Organisational Transition Plan across the Data Governance Maturity Model
29. Data Governance Organisational Model
Level 2 Data Governance Team (FS Institution Example)
There is a minimum team structure that should be used for data governance on any project. The
example model shows this data governance structure for a Data Warehouse implementation where
the core focus is risk management.
[Org chart: an Executive Sponsor oversees a Program Manager, who coordinates the Source System Managers, Risk Modeling Team Rep., IT Coordinator, Data Warehouse Delivery Manager and Data Quality Manager; the Data Quality Manager leads the Data Quality Working Group.]
30. Data Governance Organisational Model
Level 2 Data Governance Team – Roles and Responsibilities
Governance Stakeholders:
• Executive Sponsor – Strategic oversight of the program and related data issues; sponsorship of business cases for remediation efforts
• Legacy System Manager – Ownership of legacy system-specific issue resolution; provision of system SMEs for issue remediation
• Program Manager – Management of issue escalations to business executives and source system owners; provision of resources for issue verification and remediation
• IT Coordinator – Overall guidance for technical issue resolution; ensures remediation efforts align with the overall data asset architecture; management of the internal trouble ticket process for source system remediation
Governance Working Group:
• Data Modeling Team Rep – Overall guidance for issue prioritisation and functional resolution; provision of risk modeling SMEs for data issue management
• Data Asset Delivery Manager – Management-level oversight of the data environment, data cleansing activities and deployment; provision of technical data resources; management responsibility for technical deliverables
• Data Quality Manager – Definition of the overall approach for short- and long-term DQ activities; identification and management of critical DQ issues; coordination of DQ resources; oversight of the execution of DQ testing and reporting
31. Data Governance Organisational Model
Level 3 Data Governance Team (FS Institution Example)
Focused on Data Investigation and Re-Engineering
[Org chart: an Executive Sponsor and Executive Steering Committee oversee the Data Governance Council and the Enterprise Data Warehouse Steering Committee. A Data Quality Leader and the Data Strategy & Queue Management (DSQ) Committee provide overall coordination of the DQM strategy and program. Data Stewards take end-to-end responsibility for subject areas across the departments (e.g. Equities, FID, IMD, IBD, MCD, Risk), supported by Technical Analysts, DQ Analysts and Business Analysts.]
The team's working activities include:
• Source Data Collaboration – source analysis; target analysis
• Define Standards – specification; data capture; reporting
• Compliance Auditing – data standards; business rules; data management processes
• Define Business Rules – definition; compliance testing
• Establish Metrics – metric categories; target ratings
• Data Modelling Collaboration – source-to-logical mapping; volume and performance
• Business Process Definition – document and model
• Issue Management – monitor and report
• Physical Design Collaboration – performance characteristics
• Definitions – entities; attributes
• Profile & Measure – track results; facilitate root cause analysis
32. Data Governance Organisational Model
Level 3 Data Governance Team – Roles and Responsibilities
• Executive Sponsor (full time) – Sets initial direction and goals for the program. On an ongoing basis, approves information policy and tracks the progress of quality initiatives against the target plan.
• Data Strategy & Queue Management (DSQ) (full time) – Responsible for developing Data Quality strategy and policies, as well as leadership and supervision for the overall program. Additional responsibilities include approval of identified business process improvements and the communication plan.
• Data Quality Leader (DQL) (full time) – Provides day-to-day leadership over the DQM program. The DQL has significant DQM expertise and is deeply involved in all aspects of the program, while also participating in the DQM Executive Steering Committee (which carries considerable approval responsibility). The DQL is also responsible for managing business process improvement and the communication plan.
• Data Steward (full time) – Data Stewards act as the conduit between IT and the business and accept accountability for data definition, data management process definition, and information quality levels for specific data subject areas. Data stewardship involves taking responsibility for data elements across their end-to-end usage in the enterprise.
• Technical Analyst (full time) – Technical Analysts are members of existing project teams who are assigned to the DQM project when specific activities in their project areas are impacted. They provide the technical expertise required to implement new tools or to improve existing systems.
• Business Analyst (full time) – Business Analysts are members of the existing Business Units who are assigned to the DQM project when specific activities in their business areas are impacted. They provide the business expertise required to define the usage of key data elements and to improve business processes.
• Data Quality Analyst (full time) – Data Quality Analysts are fully dedicated to the DQM project. Their responsibility is to provide expertise on quality improvement best practices and to perform auditing to ensure projects comply with data quality management processes and standards.
• Data Owner (full time) – Data Owners are responsible for the accuracy of the data in their area of responsibility. For credit-related data, the Account Officers are the data owners. Ideally, the data owners would have a single interface into the source systems where key data elements reside.
33. Data Governance Organisational Model
Level 4 Data Governance Team (FS Institution Example)
View focused on Data Stewardship and Ownership; other teams would include technology and the roles from the Level 3 organisation.
[Org chart: a C-level Executive Sponsor and a DG Steering Committee (Finance, Credit, Enterprise Data Architect, Audit, Retail, Wholesale, etc.) sit above the Data Governance Council, supported by a Chief Architect and an XBR Program Manager. Beneath them, business data concept owners and IT data stewards (most positions new or to be assigned) cover the MDM and Enterprise Data Warehouse subject areas – Classification, Product, Involved Party, Hierarchy, Arrangement, Resource Item and Event – alongside the System & Process Owners and System of Record Owners (e.g. PRMS, CRS). A Data Quality Lead draws on pools of Business and Technical Analysts and Data Quality Analysts, with resources to be assigned.]
34. Data Governance Organisational Model
Level 4 Data Governance Team – Roles and Responsibilities
Executive Sponsor (time commitment: <5%):
– Sets the initial direction and goals for the program. On an ongoing basis, approves budgets, establishes highest-level policies, monitors information policy setting and tracks progress of quality initiatives against the target plan.
Data Governance Council (time commitment: quarterly, with ad hoc meetings as needed):
– Develop and monitor an overall strategic plan for data quality improvement encompassing all affected systems, including the linkage and convergence of existing data warehouses and data marts.
– Sponsor and champion data quality initiatives for all systems, LOBs and functions; ensure scheduling and resource allocation across LOBs.
– Provide data quality feedback and progress across all LOBs, systems and functions.
– Provide approval, prioritization and sign-off of major data quality initiatives.
– Communicate with business segments to ensure expectations for data quality initiatives are in line with what can be delivered.
– Oversee the business planning and requirements process to ensure data quality work appropriately addresses the needs of the users.
– Resolve escalated issues.
Data Governance Steering Committee (time commitment: monthly initially, moving to a quarterly basis in the future):
– Responsible for developing Data Governance strategy and policies, as well as leadership and supervision for the overall program.
– Active working committee of the Data Governance Board, accountable for executing Board responsibilities.
– Provide periodic data quality updates to the ITEC and policy committee.
– Definition and sign-off of project scope, requirements and test results.
– Estimates high-level funding needs and requests budget from the Executive Sponsor.
– Approval of identified data quality improvement initiatives.
– Will include members of the Lines of Business (Wholesale, Mortgage, Retail, PCS), Finance, IT, Credit Risk Mgt, Company Quality Mgt, Audit, and the Enterprise Data Architect.
35. Data Governance Organisational Model
Level 4 Data Governance Team – Roles and Responsibilities
Role Responsibilities Time Commitment
Data Quality – Provides day to day leadership over the data quality program. – Full time
Program – Focal point for coordinating System of Record (SOR) owners. – Staff support
Manager will be
– Guide and support requirements and testing of data quality initiatives
needed as
– Owner of scorecard process and execution. Provide scorecard feedback to all involved parties including
data
SOR owners, data concept owners, data stewards and to the Board
governance
– Ensures execution of policies and strategies of the Data Governance Board and Steering Committee.
grows
– Review and prioritizes projects, determine funding needs and requests funding approval from the Data
Quality Steering Committee
– Coordinate the release management program with LOBs and scheduling of data quality and technical
projects.
– Facilitates the development and training of best practice data quality policies, procedures and
methodologies.
– Monitors enterprise data quality milestones and performance measures to ensure enterprise-level data
quality. Provides feedback to ITEC and all LOBs
Enterprise Data – Provides single point of architectural coordination for all Enterprise Data Warehouse related approved – Full time
Architect initiatives
– Focuses on planning for infrastructure efficiencies, and linkage, cleansing and usage of data, ensures
implementation of remediation and the priority of issues
– Ensures the compliance and execution of the data governance program policies, processes and
procedures across data stewards
– Reconciliation, re-creation, metadata design and maintenance
Enterprise Data – Ensures the Enterprise Data Warehouse collectively meets the requirements of the business – Full time
Warehouse – Coordinates the resolution of issues identified by data concept owners and data stewards.
System and – Identifies new funding requirements, assists in prioritizing requests and submits to the data
Process Owner governance board for approval
– Coordinates on-going data integrity and linkage/usage with source system changes
– Coordinates efficient infrastructure investments
36. Data Governance Organisational Model
Level 4 Data Governance Team – Roles and Responsibilities
Data Concept Owner (Business) (full time):
– The data concept owners initially will be senior credit risk management representatives responsible for enforcing common, enterprise-wide business concepts for credit risk data.
– Provide business-side leadership of data quality improvement initiatives.
– Responsible for business concept definition, requirements definition and sign-off, and testing review and sign-off.
– Responsible for prioritizing data quality projects and the appropriate use of data elements.
– Facilitate the coordination required to resolve cross-LOB naming and definition issues.
– Focus on administering data policies, defining business rules and defining procedures for the data processes.
– Responsible for ongoing settlement of the Enterprise Data Warehouse with the SOR data.
Data Steward (IT) (50%):
– Oversight of one or more areas of the organization’s information models, focusing on particular subject areas.
– Provide leadership on the IT side of data quality improvement initiatives by leading combined teams of technical, business and quality analysts.
– Participate in, influence and sign off on data requirements and the design of data quality-related projects and applications.
– Determine how data will be managed.
– Execute the data quality scorecard for data subject areas across affected systems.
– Provide technology direction for DQ improvement initiatives.
– Document and maintain data quality definitions and usage at the concept and data element level on the Enterprise Data Warehouse.
System of Record (SOR) Owners (no changes required to existing commitment levels):
– Accountable to the Data Governance Program Manager for planning and implementing data quality policies, strategies and initiatives at the application level.
– Shape, define, manage and implement initiatives to improve data quality based upon data quality feedback.
– Build data quality projects into the application strategic plan and LOB project funding plans.
– Provide business analysts and technical analysts to support data quality analysis and implementation.
– Coordinate source system changes.
– Responsible for exerting influence over, and overseeing, the input processes that feed the system, to ensure consistent inputs in compliance with standards and policies.
– Partner with the Enterprise Data Warehouse System and Process Owner to perform ongoing reconciliations of their systems with the Enterprise Data Warehouse.
37. Data Governance Organisational Model
Level 4 Data Governance Team – Roles and Responsibilities
CDM Business Owner (full time):
– Responsible for assessment of data quality, remediation requirements and implementation of CDM.
– Provides requirements for extensions of Enterprise Data Warehouse data concepts and additional definitions.
– Identifies data quality issues and interacts with the Data Governance Lead for resolution.
– Assesses the needs of end-users and ensures the data is collected, aggregated and reported accurately.
– Coordinates prioritization of projects for self-assessment gaps with the Basel Steering Committee.
– Responsible for ongoing settlement of the cubes to the Enterprise Data Warehouse.
CDM IT Steward (initially 100%):
– Improve and maintain the quality, accessibility and reusability of data and information.
– Focus on administering data policies, defining business rules and defining procedures for the data processes.
– Participate in, influence and sign off on data requirements and project design for data quality-related projects and applications; execute the data quality scorecard for data subject areas across affected systems.
Data Quality Lead (full time):
– Manages the data quality analysts and coordinates the tasks of the business and technical analysts.
– Point of contact for the data stewards/owners and the system owners; identifies the data quality, business and technical analysts needed to execute the data quality policies and processes.
– Acts as point of contact for the CDM, Enterprise Data Warehouse Stewards and Systems of Record for small, everyday changes; provides expertise on quality improvement best practices and performs auditing to ensure projects comply with data quality management processes and standards.
Business Analysts (as requested):
– Business Analysts are members of the existing LOBs who are assigned to the Governance team when specific activities in their business areas are impacted.
– Articulate the usage of data elements based on the definitions and guidelines set by data concept owners.
– Validate and maintain business rules with the appropriate lines of business.
– Define data field names, definitions and standards; will be assigned to work with the Data Stewards as necessary. Accountable to the concept owners and/or the system owners.
Technical Analysts (as requested):
– Technical Analysts are members of existing project teams who are assigned to the Governance team when specific activities in their project areas are impacted.
– Understand data structure.
– Provide the technical expertise required to implement new tools and improve existing systems.
38. Data Governance Organisational Model
Roles of Data Stewards and Data Owners
[Diagram: the Data Concept Owners (business) and Data Concept Stewards (IT) act as “traffic cops” between the company’s LOBs and the system owners. They take input from, and coordinate with, the LOBs on precise data definitions; maintain close business-IT coordination on data definitions, quality and standards; give feedback to the system owners (SOR owners and the DQ Lead); escalate issues to the DG Steering Committee; and feed DG improvement opportunities into new opportunity definition.]
A minimal sketch of the escalation path follows below.
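A minimal Python sketch of the escalation path in the diagram; the tier names mirror the chart, but the routing logic is an assumption for illustration.

# Hypothetical escalation path, lowest tier first.
ESCALATION_PATH = [
    "Data Concept Stewards (IT)",
    "Data Concept Owners (Business)",
    "DG Steering Committee",
]

def escalate(issue: str, tier: int) -> str:
    """Route an unresolved issue one tier up the governance structure."""
    next_tier = min(tier + 1, len(ESCALATION_PATH) - 1)
    return f"{issue!r} escalated to {ESCALATION_PATH[next_tier]}"

print(escalate("conflicting product definitions", tier=0))
# 'conflicting product definitions' escalated to Data Concept Owners (Business)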
39. Data Governance Organisational Model
Level 5 – Information Development Centre of Excellence
Organisation Framework: Balance of Power
In moving to the centralized model for information and infrastructure development, Leadership, Architecture and Delivery must all be represented on the team. The key team members across these areas must actively collaborate, through formal and informal reporting relationships, to guide a strategic idea to its realization. It is an organizational model that provides a “balance of power” whilst enabling the organisation to:
• Align Business and Technology Strategy
• Align Strategic and Tactical Objectives
• Achieve technology procurement efficiencies
• Justify spend based on the business case
• Balance risk with speed of delivery
• Maintain a common set of technology standards and policies
• Reuse at an enterprise level
This has proven to be a very successful model for contemporary IT organizations and complements a centralized approach for the Technology Backplane. It is a model focused on providing solutions for the Business, driven by the needs of the Business.