This document summarizes a presentation on clinical information governance at GlaxoSmithKline (GSK). GSK is combining data modelling, master data management, enterprise service bus, data stewardship, and enterprise architecture to simplify managing clinical study information. They have established different levels of data stewardship accountability and are implementing a clinical data stewardship framework. Their goal is to transform how clinical trial data is collected, reported, archived and retrieved to make trials more efficient and enhance patient safety.
Glossaries, Dictionaries, and Catalogs Result in Data Governance - DATAVERSITY
Data catalogs, business glossaries, and data dictionaries house metadata that is important to your organization’s governance of data. People in your organization need to be engaged in leveraging the tools, understanding the data that is available, who is responsible for the data, and knowing how to get their hands on the data to perform their job function. The metadata will not govern itself.
Join Bob Seiner for the webinar where he will discuss how glossaries, dictionaries, and catalogs can result in effective Data Governance. People must have confidence in the metadata associated with the data that you need them to trust. Therefore, the metadata in your data catalog, business glossary, and data dictionary must result in governed data. Learn how glossaries, dictionaries, and catalogs can result in Data Governance in this webinar.
Bob will discuss the following subjects in this webinar:
- Successful Data Governance relies on value from very important tools
- What it means to govern your data catalog, business glossary, and data dictionary
- Why governing the metadata in these tools is important
- The roles necessary to govern these tools
- Governance expected from metadata in catalogs, glossaries, and dictionaries
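The point that "the metadata will not govern itself" can be made concrete with a small sketch. The following is a hypothetical, minimal model of a governed business-glossary entry (all names and attributes are invented for illustration, not taken from any particular catalog tool): the metadata itself must carry the governance attributes, such as an agreed definition and an accountable steward, before the underlying data can be trusted.

```python
from dataclasses import dataclass

@dataclass
class GlossaryEntry:
    term: str            # business name, e.g. "Customer Churn Rate"
    definition: str      # the agreed business definition
    steward: str         # the person accountable for the term and its data
    source_system: str   # where authoritative data for the term lives

def is_governed(e: GlossaryEntry) -> bool:
    # A term counts as governed only when every attribute is populated;
    # an entry with a blank steward or definition is just ungoverned metadata.
    return all([e.term, e.definition, e.steward, e.source_system])

entry = GlossaryEntry(
    term="Customer Churn Rate",
    definition="Share of customers lost during a reporting period.",
    steward="jane.doe@example.com",
    source_system="CRM",
)
print(is_governed(entry))  # True: every governance attribute is filled in
```

In practice a catalog tool holds these attributes, but the check is the same: an entry missing its steward or definition should fail review rather than pass silently.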
Gartner: Master Data Management Functionality - Gartner
Gartner will further examine key trends shaping the future MDM market during the Gartner MDM Summit 2011, 2-3 February in London. More information at www.europe.gartner.com/mdm
Enterprise Data Management Framework Overview - John Bao Vuu
A solid data management foundation to support big data analytics and more importantly a data-driven culture is necessary for today’s organizations.
A mature Data Management Program can reduce operational costs and enable rapid business growth and development. Data Management program must evolve to monetize data assets, deliver breakthrough innovation and help drive business strategies in new markets.
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is neither difficult nor time-consuming, and it helps ensure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Reference data is something we often encounter in our projects. In our experience, it is often underestimated and does not get enough attention. In this webinar, we want to make you aware of some interesting aspects of reference data, such as how it relates to MDM, with which it is often confused.
Reference data management:
Three categories of structured data:
- Master data: data associated with core business entities such as customer, product, and asset.
- Transaction data: records of business transactions such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
- Reference data: any data used solely to categorize other data in a database, or to relate data in a database to information beyond the boundaries of the enterprise.
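The three categories can be illustrated with one sales event that touches all of them. This is a minimal sketch; every name and value below is invented for illustration.

```python
# Reference data: small, stable code sets used solely to classify other data.
COUNTRY_CODES = {"GB": "United Kingdom", "US": "United States"}

# Master data: core business entities such as customer and product.
customers = {101: {"name": "Acme Ltd", "country": "GB"}}
products = {"P-9": {"name": "Widget", "unit_price": 25.0}}

# Transaction data: records of business events, keyed to master data
# and classified by reference data.
orders = [{"order_id": 5001, "customer_id": 101, "product_id": "P-9", "qty": 4}]

order = orders[0]
customer = customers[order["customer_id"]]
line_total = products[order["product_id"]]["unit_price"] * order["qty"]
country = COUNTRY_CODES[customer["country"]]
print(f"{customer['name']} ({country}): {line_total}")
```

Note the different shapes: the transaction grows continuously, the master data changes slowly, and the reference data is a near-static lookup table, which is exactly why the two are managed differently even though reference data is often lumped in with MDM.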
Data Governance Best Practices, Assessments, and Roadmaps - DATAVERSITY
When starting or evaluating the present state of your Data Governance program, it is important to focus on best practices so that you don’t take a “ready, fire, aim” approach. Best practices need to be practical and doable to be selected for your organization, and the program may be at risk if a best practice is not achieved.
Join Bob Seiner for an important webinar focused on industry best practice around standing up formal Data Governance. Learn how to assess your organization against the practices and deliver an effective roadmap based on the results of conducting the assessment.
In this webinar, Bob will focus on:
- Criteria to select the appropriate best practices for your organization
- How to define the best practices for ultimate impact
- Assessing against selected best practices
- Focusing the recommendations on program success
- Delivering a roadmap for your Data Governance program
Data Governance and Metadata Management - DATAVERSITY
Metadata is a tool that improves data understanding, builds end-user confidence, and improves the return on investment in every asset associated with becoming a data-centric organization. Metadata’s use has expanded beyond “data about data” to cover every phase of data analytics, protection, and quality improvement. Data Governance and metadata are connected at the hip in every way possible. As the song goes, “You can’t have one without the other.”
In this RWDG webinar, Bob Seiner will provide a way to renew your energy by focusing on the valuable asset that can make or break your Data Governance program’s success. The truth is metadata is already inherent in your data environment, and it can be leveraged by making it available to all levels of the organization. At issue is finding the most appropriate ways to leverage and share metadata to improve data value and protection.
Throughout this webinar, Bob will share information about:
- Delivering an improved definition of metadata
- Communicating the relationship between successful governance and metadata
- Getting your business community to embrace the need for metadata
- Determining the metadata that will provide the most bang for your buck
- The importance of Metadata Management to becoming data-centric
Describes what Enterprise Data Architecture in a Software Development Organization should cover, by listing over 200 data-architecture-related deliverables an Enterprise Data Architect should remember to evangelize.
Data Architecture Best Practices for Today’s Rapidly Changing Data Landscape - DATAVERSITY
With the rise of the data-driven organization, the pace of innovation in data-centric technologies has been tremendous. New tools and techniques are emerging at an exponential rate, and it is difficult to keep track of the array of technological choices available to today’s data management professional.
At the same time, core fundamentals such as data quality and metadata management remain critical in order for organizations to obtain true business value from their data. This webinar will help demystify the options available: from data lake to data warehouse, to graph database, to NoSQL, and more, and how to integrate these new technologies with core architectural fundamentals that will help your organization benefit from the quick wins that are possible from these exciting technologies, while at the same time build a longer-term sustainable architecture that will support the inevitable change that will continue in the industry.
DAS Slides: Data Quality Best Practices - DATAVERSITY
Tackling data quality problems requires more than a series of tactical, one off improvement projects. By their nature, many data quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control data quality issues in your organization.
Implementing the Data Maturity Model (DMM) - DATAVERSITY
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization’s Data Management capabilities. This model—based on the Capability Maturity Model pioneered by the U.S. Department of Defense for improving software development processes—allows an organization to evaluate its current-state Data Management capabilities, discover gaps to remediate, and identify strengths to leverage. In doing so, this assessment method reveals organizational priorities, business needs, and a clear path for rapid process improvements.
In this webinar, we will:
- Describe the DMM model, its purpose and evolution, and how it can be used as a roadmap for assessing and improving organizational Data Management and Data Management Maturity
- Discuss how to get the most out of a DMM assessment, including its dependencies and requirements for use
- Discuss foundational DMM concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK)
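The assessment mechanics described above can be sketched in a few lines. This is a hypothetical simplification, not the DMM's actual scoring method: the capability-area names loosely follow the model's categories, the scores are invented, and the "weakest area caps overall maturity" rule is one common convention for rolling up maturity scores.

```python
# Invented current-state scores on a 1-5 maturity scale, one per capability area.
scores = {
    "Data Management Strategy": 3,
    "Data Governance": 2,
    "Data Quality": 2,
    "Data Operations": 4,
    "Platform and Architecture": 3,
    "Supporting Processes": 3,
}

TARGET = 3  # the desired maturity level for every area

# Overall maturity is capped by the weakest capability area.
overall = min(scores.values())

# Gaps to remediate: areas scoring below target, and by how much.
gaps = {area: TARGET - s for area, s in scores.items() if s < TARGET}

print(f"Overall maturity: level {overall}")
for area, gap in sorted(gaps.items()):
    print(f"Gap to remediate: {area} (+{gap} level)")
```

The value of the exercise is in the gap list: it turns a vague sense of immaturity into a ranked set of remediation targets for the roadmap.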
Metadata is hotter than ever, according to a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
Master Data Management – Aligning Data, Process, and Governance - DATAVERSITY
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as customers, products, vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing and analytic reporting. This webinar will provide practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Data Modelling 101 half day workshop presented by Chris Bradley at the Enterprise Data and Business Intelligence conference London on November 3rd 2014.
Chris Bradley is a leading independent information strategist.
Contact chris.bradley@dmadvisors.co.uk
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
The Business Value of Metadata for Data Governance - Roland Bullivant
In today’s digital economy, data drives the core processes that deliver profitability and growth - from marketing, to finance, to sales, supply chain, and more. It is also likely that for many large organizations much of their key data is retained in application packages from SAP, Oracle, Microsoft, Salesforce and others. In order to ensure that their foundational data infrastructure runs smoothly, most organizations have adopted a data governance initiative. These typically focus on the people and processes around managing data and information. Without an actionable link to the physical systems that run key business processes, however, governance programs can often lack the ‘teeth’ to effectively implement business change.
Metadata management is a process that can link business processes and drivers with the technical applications that support them. This makes data governance actionable and relevant in today’s fast-paced and results-driven business environment. One of the challenges facing data governance teams however, is the variety in format, accessibility and complexity of metadata across the organization’s systems.
Data Quality Management - Data Issue Management & Resolution / Practical App... - Burak S. Arikan
One of the key stepping stones in turning the theoretical concept of Data Governance into reality is the implementation of a data issue management and resolution (IMR) process, which includes tools, processes, governance, and, most importantly, the persistence to get to the bottom of each data quality issue.
This presentation lays out the basic components of the IMR process and aims to guide practitioners. The process was applied using an in-house SharePoint management tool configured with workflows.
Lessons in Data Modeling: Data Modeling & MDM - DATAVERSITY
Master Data Management (MDM) can create a 360-degree view of core business assets such as Customer, Product, Vendor, and more. Data modeling is a core component of MDM, both in creating the technical integration between disparate systems and, perhaps more importantly, in aligning business definitions and rules.
Join this webcast to learn how to effectively apply a data model in your MDM implementation.
Master Data Management's Place in the Data Governance Landscape - CCG
For many organizations, Master Data Management is a necessity to ensure the consistency and accuracy of essential business entities. It also works alongside data architecture, metadata management, data quality, security & privacy, and program management in the Data Governance ecosystem.
Join CCG's data governance subject matter experts as they present an overview of the fundamentals of Master Data Management at our Atlanta-based Data Analytics Meetup. This event will discuss how to enable components of data governance within your organization and review how best to leverage Microsoft's SQL Server Master Data Services.
This introduction to data governance presentation covers the inter-related DM foundational disciplines (Data Integration / DWH, Business Intelligence, and Data Governance), along with some of the pitfalls and success factors for data governance.
• IM Foundational Disciplines
• Cross-functional Workflow Exchange
• Key Objectives of the Data Governance Framework
• Components of a Data Governance Framework
• Key Roles in Data Governance
• Data Governance Committee (DGC)
• 4 Data Governance Policy Areas
• 3 Challenges to Implementing Data Governance
• Data Governance Success Factors
Architect’s Open-Source Guide for a Data Mesh Architecture - Databricks
Data Mesh is an innovative concept addressing many data challenges from an architectural, cultural, and organizational perspective. But is the world ready to implement Data Mesh?
In this session, we will review the importance of core Data Mesh principles, what they can offer, and when it is a good idea to try a Data Mesh architecture. We will discuss common challenges in implementing Data Mesh systems and focus on the role open-source projects can play. Projects like Apache Spark can play a key part in a standardized infrastructure-platform implementation of Data Mesh. We will examine the landscape of useful data engineering open-source projects to utilize in several areas of a Data Mesh system in practice, along with an architectural example. We will touch on what work (culture, tools, mindset) needs to be done to make Data Mesh more accessible to engineers in the industry.
The audience will leave with a good understanding of the benefits of Data Mesh architecture, common challenges, and the role of Apache Spark and other open-source projects for its implementation in real systems.
This session is targeted at architects, decision-makers, data engineers, and system designers.
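The core Data Mesh idea referenced in this session, that each domain owns and serves its data as a product with an explicit, discoverable contract rather than handing it to a central team, can be sketched without any Spark machinery. This is a pure-Python illustration with entirely invented names, not an implementation pattern from the session itself.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    domain: str   # the owning domain team, e.g. "sales"
    name: str     # the product's discoverable name, e.g. "orders.daily"
    schema: dict  # the published contract: column name -> declared type
    rows: list = field(default_factory=list)

    def publish(self, row: dict) -> None:
        # The owning domain enforces its own contract at the boundary,
        # instead of a central team cleaning data downstream.
        if set(row) != set(self.schema):
            raise ValueError(f"row does not match the contract of {self.name}")
        self.rows.append(row)

orders = DataProduct(
    domain="sales",
    name="orders.daily",
    schema={"order_id": "int", "amount": "float"},
)
orders.publish({"order_id": 1, "amount": 99.5})
print(len(orders.rows))  # 1
```

In a real mesh, the "contract" would be a schema registered on a shared self-serve platform (where engines like Apache Spark come in), but the ownership boundary is the same: consumers depend on the published contract, not on the domain's internal tables.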
EMC Perspective: Big Data Transforms the Life Science Commercial Model - EMC
This EMC Perspective discusses EMC Global Services and its use of Big Data to transform the way life sciences companies develop their commercial sales models.
Data Governance and Metadata ManagementDATAVERSITY
Metadata is a tool that improves data understanding, builds end-user confidence, and improves the return on investment in every asset associated with becoming a data-centric organization. Metadata’s use has expanded beyond “data about data” to cover every phase of data analytics, protection, and quality improvement. Data Governance and metadata are connected at the hip in every way possible. As the song goes, “You can’t have one without the other.”
In this RWDG webinar, Bob Seiner will provide a way to renew your energy by focusing on the valuable asset that can make or break your Data Governance program’s success. The truth is metadata is already inherent in your data environment, and it can be leveraged by making it available to all levels of the organization. At issue is finding the most appropriate ways to leverage and share metadata to improve data value and protection.
Throughout this webinar, Bob will share information about:
- Delivering an improved definition of metadata
- Communicating the relationship between successful governance and metadata
- Getting your business community to embrace the need for metadata
- Determining the metadata that will provide the most bang for your bucks
- The importance of Metadata Management to becoming data-centric
Describes what Enterprise Data Architecture in a Software Development Organization should cover and does that by listing over 200 data architecture related deliverables an Enterprise Data Architect should remember to evangelize.
Data Architecture Best Practices for Today’s Rapidly Changing Data LandscapeDATAVERSITY
With the rise of the data-driven organization, the pace of innovation in data-centric technologies has been tremendous. New tools and techniques are emerging at an exponential rate, and it is difficult to keep track of the array of technological choices available to today’s data management professional.
At the same time, core fundamentals such as data quality and metadata management remain critical in order for organizations to obtain true business value from their data. This webinar will help demystify the options available: from data lake to data warehouse, to graph database, to NoSQL, and more, and how to integrate these new technologies with core architectural fundamentals that will help your organization benefit from the quick wins that are possible from these exciting technologies, while at the same time build a longer-term sustainable architecture that will support the inevitable change that will continue in the industry.
DAS Slides: Data Quality Best PracticesDATAVERSITY
Tackling data quality problems requires more than a series of tactical, one off improvement projects. By their nature, many data quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control data quality issues in your organization.
Implementing the Data Maturity Model (DMM)DATAVERSITY
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization’s Data Management capabilities. This model—based on the Capability Maturity Model pioneered by the U.S. Department of Defense for improving software development processes—allows an organization to evaluate its current-state Data Management capabilities, discover gaps to remediate, and identify strengths to leverage. In doing so, this assessment method reveals organizational priorities, business needs, and a clear path for rapid process improvements.
In this webinar, we will:
Describe the DMM model, its purpose and evolution, and how it can be used as a roadmap for assessing and improving organizational Data Management and Data Management Maturity
Discuss how to get the most out of a DMM assessment, including its dependencies and requirements for use
Discuss foundational DMM concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK)
Metadata is hotter than ever, according a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
Master Data Management – Aligning Data, Process, and GovernanceDATAVERSITY
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as customers, products, vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing and analytic reporting. This webinar will provide practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Data Modelling 101 half day workshop presented by Chris Bradley at the Enterprise Data and Business Intelligence conference London on November 3rd 2014.
Chris Bradley is a leading independent information strategist.
Contact chris.bradley@dmadvisors.co.uk
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
The Business Value of Metadata for Data GovernanceRoland Bullivant
In today’s digital economy, data drives the core processes that deliver profitability and growth - from marketing, to finance, to sales, supply chain, and more. It is also likely that for many large organizations much of their key data is retained in application packages from SAP, Oracle, Microsoft, Salesforce and others. In order to ensure that their foundational data infrastructure runs smoothly, most organizations have adopted a data governance initiative. These typically focus on the people and processes around managing data and information. Without an actionable link to the physical systems that run key business processes, however, governance programs can often lack the ‘teeth’ to effectively implement business change.
Metadata management is a process that can link business processes and drivers with the technical applications that support them. This makes data governance actionable and relevant in today’s fast-paced and results-driven business environment. One of the challenges facing data governance teams however, is the variety in format, accessibility and complexity of metadata across the organization’s systems.
Data Quality Management - Data Issue Management & Resolutionn / Practical App...Burak S. Arikan
One of the key stepping stones to turn the theoretical Data Governance concept to reality is the implementation of data issue management and resolution (IMR) process which includes tools, processes, governance and most importantly persistence to get to the bottom of the each data quality issue.
This presentation lays down the basic components of IMR process and tries to guide practitioners. This process was applied along with an in-house configured SharePoint management tool with workflows.
Lessons in Data Modeling: Data Modeling & MDMDATAVERSITY
Master Data Management (MDM) can create a 360 view of core business assets such as Customer, Product, Vendor, and more. Data modeling is a core component of MDM in both creating the technical integration between disparate systems and, perhaps more importantly, aligning business definitions & rules.
Join this webcast to learn how to effectively apply a data model in your MDM implementation.
Master Data Management's Place in the Data Governance Landscape CCG
For many organizations, Master Data Management is a necessity to ensure consistency and accuracy of essential business entities. It further plays alongside data architecture, metadata management, data quality, security & privacy, and program management in the Data Governance ecosystem.
Join CCG's data governance subject matter experts as they overview the fundamentals of Master Data Management at our Atlanta-based Data Analytics Meetup. This event will discuss how to enable components of data governance within your organization and review how to best leverage Microsoft's SQL Server Master Data Services.
This introduction to data governance presentation covers the inter-related DM foundational disciplines (Data Integration / DWH, Business Intelligence and Data Governance). Some of the pitfalls and success factors for data governance.
• IM Foundational Disciplines
• Cross-functional Workflow Exchange
• Key Objectives of the Data Governance Framework
• Components of a Data Governance Framework
• Key Roles in Data Governance
• Data Governance Committee (DGC)
• 4 Data Governance Policy Areas
• 3 Challenges to Implementing Data Governance
• Data Governance Success Factors
Architect’s Open-Source Guide for a Data Mesh ArchitectureDatabricks
Data Mesh is an innovative concept addressing many data challenges from an architectural, cultural, and organizational perspective. But is the world ready to implement Data Mesh?
In this session, we will review the importance of core Data Mesh principles, what they can offer, and when it is a good idea to try a Data Mesh architecture. We will discuss common challenges with implementation of Data Mesh systems and focus on the role of open-source projects for it. Projects like Apache Spark can play a key part in standardized infrastructure platform implementation of Data Mesh. We will examine the landscape of useful data engineering open-source projects to utilize in several areas of a Data Mesh system in practice, along with an architectural example. We will touch on what work (culture, tools, mindset) needs to be done to ensure Data Mesh is more accessible for engineers in the industry.
The audience will leave with a good understanding of the benefits of Data Mesh architecture, common challenges, and the role of Apache Spark and other open-source projects for its implementation in real systems.
This session is targeted for architects, decision-makers, data-engineers, and system designers.
EMC Perspective: Big Data Transforms the Life Science Commercial Model (EMC)
This EMC Perspective discusses EMC Global Services and its use of Big Data to transform the way life sciences companies develop their commercial sales models.
Moving to the Cloud: Modernizing Data Architecture in Healthcare (Perficient, Inc.)
Constant changes in the healthcare industry continue to drive innovation in technology and serve as a catalyst for cloud adoption. This trend will continue to evolve and accelerate in the coming years with the increasing need to store and analyze vast amounts of information for personal and population health initiatives.
We joined guest speaker James Gaston of HIMSS Analytics to discuss the impact of the cloud on data architecture in healthcare. Topics included:
-The benefits and risks of moving data and analytics environments to the cloud
-Main healthcare use cases for cloud migration
-Deep dive into two leading healthcare organizations’ cloud journeys including drivers, challenges, benefits, and lessons learned
IRM Data Governance Conference February 2009, London. Presentation given on the Data Governance challenges being faced by BP and the approaches to address them.
This presentation was held by Professor Christine Legner (HEC Lausanne) at the Swiss Day on November 8, 2017, in Lausanne, Switzerland. It addresses the need for organisations to think about data and its management in new ways, as many corporations engage in the digital and data-driven transformation of their business. It concludes with three recommendations: 1) assess data's business value and impact, 2) measure and improve data quality, and 3) democratize data and support data citizenship.
Will I see you in Philadelphia next week? In case you don’t already know, I’ve been invited to speak at CBI’s Risk-Based Trial Management and Monitoring Conference.
I’m going to be sharing real world, pragmatic guidance that you can implement immediately to effectively influence your clinical trial performance.
My presentation, Practical Usage of KRIs and QTLs in Clinical Trials, will take place next Thursday, November 14th at 9:45am. I’m going to share with you:
• How to identify and close the gaps between risks and KRIs
• What the difference is between KRIs and QTLs, and how to use them effectively
• Useful examples of Centralized Monitoring findings from open data
• How to detect, combat and prevent fraud and sloppiness at an early stage
• How AI and ML advance risk-based approaches
So I can’t wait to see you at this informative and fun-filled industry expert forum,
– Artem Andrianov, CEO Cyntegrity
Enterprise-Level Preparation for Master Data Management.pdf (AmeliaWong21)
Master Data Management (MDM) continues to play a foundational role in the Data Management Architecture of every 21st century enterprise. In a forward-looking organization, MDM is significant in the Enterprise Integration Hub.
Mindtree provides cloud services and believes that digital transformation of healthcare is only possible by embracing and adopting the cloud. Click here to know more.
8 must-haves for modern Clinical Data Integration (CitiusTech)
The shift from a volume-based to a value-based payment model has made the need for more, and more accurate, clinical data all-important for payers. Today, a clinical data integration (CDI) platform that enables payers to acquire, access, and share clinical data to improve patient outcomes, reduce cost, and increase revenue is crucial.
What is Big Data?
Big Data Laws
Why Big Data?
Industries using Big Data
Current process/SW in SCM
Challenges in SCM industry
How can Big Data solve these problems?
Migration to Big data for an SCM industry
Enterprises are faced with information overload. Big data appears as an opportunity, but it has no relevance until enterprises can put it in the context of their activities, processes, and organizations. Applying MDM principles to Big Data is therefore an opportunity that enterprises should target.
This presentation covers the following topics :
- what is MDM and Information Management
- what is Big Data and what are the use cases
- why and how Big Data can take advantage of MDM, and why and how MDM can take advantage of Big Data
apidays LIVE Helsinki & North - Product data ecosystem in the digital dental ... (apidays)
apidays LIVE Helsinki & North 2021 - APIs, Platforms, And Ecosystems - Transforming Industries And Experiences
March 15 & 16, 2021
Product data ecosystem in the digital dental industry
Sujoy Kumar Saha, Data Architect at 3Shape
Similar to Data Governance for Clinical Information (20)
Paper which discusses the notion that Data is NOT the "new Oil". Much is said about Data being an asset: it has to be managed, few people in the business understand it, and so on. The phrase "Data is the new Oil" gets used many times, yet is rarely (if ever) justified. This paper aims to raise the level of debate from a subliminal nod to a conscious examination of the characteristics of different "assets" (particularly Oil), comparing them with those of the "Data asset".
Written by Christopher Bradley, CDMP Fellow, VP Professional Development DAMA International & 38 years Information Management experience, much of it in the Oil & Gas industry.
Information Management Training Courses & Certification approved by DAMA & based upon practical real world application of the DMBoK.
Includes Data Strategy, Data Governance, Master Data Management, Data Quality, Data Integration, Data Modelling & Process Modelling.
Dubai training classes covering:
An Introduction to Information Management,
Data Quality Management,
Master & Reference Data Management, and
Data Governance.
Based on DAMA DMBoK 2.0 and 36 years of practical experience, taught by a DMBoK author and award-winning CDMP Fellow.
Are you ready for Big Data? This assessment review from Data Management Advisors will provide pragmatic recommendations & actionable transition steps to help you achieve your Big Data goals & deliver actionable insights.
info@dmadvisors.co.uk
Information Management Training & Certification from Data Management Advisors.
info@dmadvisors.co.uk
Courses available include:
Information Management Fundamentals,
Data Governance,
Data Quality Management,
Master & Reference Data,
Data Modelling,
Data Warehouse & Business Intelligence,
Metadata Management,
Data Security & Risk,
Data Integration & Interoperability,
DAMA CDMP Certification,
Business Process Discovery
A Data Management Advisors discussion paper comparing the characteristics of different types of "assets" and asking the question "Is the data asset REALLY different"?
A 3 day examination preparation course including live sitting of examinations for students who wish to attain the DAMA Certified Data Management Professional qualification (CDMP)
chris.bradley@dmadvisors.co.uk
DAMA BCS Chris Bradley Information is at the Heart of ALL architectures 18_06... (Christopher Bradley)
Information is at the heart of ALL architectures and the business.
Presentation by Chris Bradley to BCS Data Management Specialist Group (DMSG) and DAMA at the event "Information the vital organisation enabler" June 2015
Information is at the heart of all architecture disciplines (Christopher Bradley)
Information is at the Heart of ALL the business & all architectures.
A white paper by Chris Bradley outlining why Information is the "blood" of an organisation.
Information Management training developed by Chris Bradley.
Education options include an overview of Information Management, DMBoK Overview, Data Governance, Master & Reference Data Management, Data Quality, Data Modelling, Data Integration, Data Management Fundamentals and DAMA CDMP certification.
chris.bradley@dmadvisors.co.uk
Data modelling for the business half day workshop presented at the Enterprise Data & Business Intelligence conference in London on November 3rd 2014
chris.bradley@dmadvisors.co.uk
Information Management Fundamentals DAMA DMBoK training course synopsis (Christopher Bradley)
The fundamentals of Information Management, covering the information functions and disciplines as outlined in the DAMA DMBoK. This course provides an overview of all of the Information Management disciplines and is also a useful starting point for candidates preparing to take the DAMA CDMP professional certification.
Taught by CDMP(Master) examiner and author of components of the DMBoK 2.0
chris.bradley@dmadvisors.co.uk
This is a 3 day advanced course for students with existing data modelling experience to enable them to build quality data models that meet business needs. The course will enable students to:
* Understand and practice different requirements gathering approaches.
* Recognise the relationship between process and data models and practice capturing requirements for both.
* Learn how and when to exploit standard constructs and reference models.
* Understand further dimensional modelling approaches and normalisation techniques.
* Apply advanced patterns including "Bill of Materials" and "Party, Role, Relationship, Role-Relationship"
* Understand and practice the human centric design skills required for effective conceptual model development
* Recognise the different ways of developing models to represent ranges of hierarchies
This is a 3 day introductory course introducing students to data modelling, its purpose, the different types of models and how to construct and read a data model. Students attending this course will be able to:
Explain the fundamental data modelling building blocks. Understand the differences between relational and dimensional models.
Describe the purpose of Enterprise, conceptual, logical, and physical data models
Create a conceptual data model and a logical data model.
Understand different approaches for fact finding.
Apply normalisation techniques.
Data Governance for Clinical Information
2. Presenters: Chris Bradley, Business Consulting Director, chris.bradley@ipl.com, +44 1225 475000; Colin Wood, Enterprise Information Architect, Colin.s.wood@gsk.com, +44 1438 766671
3. Clinical Information Governance ... Keeping the Data Healthy. How GlaxoSmithKline are combining the disciplines of Data Modelling, Master Data Management, Enterprise Service Bus, Data Stewardship, Enterprise Architecture and IT Governance to simplify the management of clinical study information.
17. GSK Business Process (B) – processes that create value by transforming ideas or raw materials into revenue-generating assets and products. Business Processes are the core of the Enterprise.
- B1 Develop Targets & Leads – processes that identify molecular targets and validate their association with relevant disease processes, and find potentially therapeutic compounds with acceptable developability characteristics.
- B2 Develop Drugs and Products – processes that refine the synthesis and delivery of a therapeutic compound and demonstrate safety, efficacy, and manufacturability within regulatory limits and economic feasibility.
- B3 Launch New Product – positions new products for distribution in relevant world markets and secures legal authority to market and sell these products for appropriate levels of reimbursement. External participants of these processes are regulatory authorities, and government or private payers. Internal customers are country sales and marketing groups and global manufacturing.
- B4 Develop Markets and Manage Customers – direct generation of revenue by marketing and selling products that GSK has manufactured. Customers of this process are consumers, managed care organisations, hospitals, and 3rd-party payers.
- B5 Supply New Product – scale-up and transfer of production specifications and technologies to deliver the capability and capacity to manufacture sufficient quantities of new product to meet market demands. Direct customers of this process are the manufacturing sites that will produce the new products.
- B6 Manufacture and Distribute Products – supply chain processes that generate revenue directly by manufacturing and distributing the products that GSK sells. Customers of this process include wholesale distributors and other organisations that supply and use GSK products.
18. B2 Develop Drugs and Products – processes that refine the synthesis and delivery of a therapeutic compound and demonstrate safety, efficacy, and manufacturability within regulatory limits and economic feasibility.
- B2.1 Manage Project Portfolio – optimise trade-offs among risk, value, resources and timing in order to drive decisions relating to prioritisation of high-value assets, project progression, and management of the progression of opportunities from candidate selection through to launch.
- B2.2 Refine Synthetic Route – develop all necessary aspects of the large-scale manufacturing process of an active pharmaceutical ingredient.
- B2.3 Develop Formulation – develop all necessary aspects of the formulation of a drug product.
- B2.4 Deliver Physical Product – develop all aspects of the manufacture of sufficient quantities of drug products for pre-clinical and clinical evaluation.
- B2.5 Perform Preclinical Evaluation – pre-clinical and early clinical studies are performed on each candidate selected for progression to FTIM and PoC studies. These studies help to determine that the product profile is achievable and that there is therapeutic potential.
- B2.6 Test Human Safety & Efficacy – prove the clinical value, efficacy and safety of the product.
- B2.7 Manage Product Lifecycle – monitor the safety of the drug and provide further data to support the market. Evaluate the possibilities for new indications, formulations and presentations.
- B2.8 Manage Regulatory Activities (NPD6) – define, communicate and manage regulatory strategy. Compile, review, submit, and maintain regulatory documentation. Influence regulatory policy by interfacing with external bodies.
20. Simplifying the Clinical Information Environment. Objective: transform the way we collect, report, archive and retrieve clinical trials data. Current challenges: difficulty finding & accessing information; lack of information re-use; multiple, complex interfaces; difficulty integrating information. Intended benefits: more effective use of people’s time; more efficient clinical trials; enhanced patient safety & risk management; double productivity by 2015.
21. Current Clinical Data Stewardship Framework. Roles: Clinical Data Steward; Business Unit Stewards; Asset Stewards; Study Stewards; Team Members; with CDS Governance + Support. Levels of accountability have been established, undertaken by existing roles and based on existing SOPs and guidance wherever possible. Implementation is driven within each Business Unit by its Business Unit Steward, and communities of practice have been established. Everyone who generates, transforms, uses, stores, archives and/or discards data or documents pertaining to GSK clinical trials is a steward of clinical data and must understand their responsibilities and act accordingly. Similar frameworks are in progress elsewhere within R&D.
24. R&D Master Data Management Roadmap (draft). The chart represents cross-organisational mastering of data – it does not reflect on the quality of information in individual solutions.
25. The SCIE Information Blueprint has been crucial to understanding the information landscape: governance process; data dictionary; application vs. data matrix; data structures (logical data model); application information flow.
26. Building a common understanding using layered information models. [Chart: model layers arranged along two axes, from high implementation focus / low communication focus to low implementation focus / high communication focus.]
27. Information Principles:
- Data and data models are a critical business asset in GSK and will be managed as a shared asset.
- All data will be subject to data ownership and governance principles.
- A strong preference for the reuse and elaboration of existing data models should be exercised.
- Key data models should be communicated throughout the organisation.
- The GSK Data Model Repository will be the Source of Record for data models.
- Integrate data modelling with the application life cycle.
- Link models from Enterprise through to Physical.
30. Data Models linking IT and Business Data Governance (from IT focus to business focus):
- BA Training: Data Modelling (complete), Info Blueprints (planned)
- Business Training: use of Info Blueprints (planned)
- IT Project Governance: embed use of data models into architecture reviews
- Data Governance: cross-org teams actively engaged in data management
- Development Process: automation to embed models into the development process
- Data Stewardship: all data with clear accountability for definitions and data quality
- Master Data Roadmap: shared master and reference data, built into the IT project portfolio
- Data Quality Plans: defined for all shared master and reference data
51. Models & SOA. The definition of data, and consequently of the calls to and results from services, is vital in an SOA world. Straight-through processing can exacerbate the issue: what does the data mean? Which definition of X (e.g. “cost of goods”) applies? We need to utilise the logical model and ERP model definitions.
52. XML messages are at the core. [Diagram: systems such as the Data Warehouse (RedBrick), Claims Management (DB2), Billing (SQL Server, Oracle), and Document Management connect through platform adapters (OS/390, Unix, Windows) and message queues to a central Message Broker.]
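As a purely illustrative sketch (not from the original deck), the pattern in this diagram – applications exchanging XML messages through queues managed by a central broker – can be mimicked in a few lines of Python, with a standard-library `queue.Queue` standing in for the message-oriented middleware. The system names come from the diagram; the claim message itself is hypothetical.

```python
import queue

# One in-process queue stands in for the broker's message queues.
broker = queue.Queue()

def publish(source: str, body: str) -> None:
    """An adapter (e.g. the Unix or Windows adapter) puts an XML message on the bus."""
    broker.put({"source": source, "body": body})

def consume() -> dict:
    """A subscribing system (e.g. the Data Warehouse) takes the next message."""
    return broker.get()

# Claims Management publishes a (hypothetical) claim message; a consumer picks it up.
publish("ClaimsManagement", "<claim id='42'><status>open</status></claim>")
msg = consume()
print(msg["source"])  # ClaimsManagement
```

The decoupling shown on the slide is the point: the publisher never addresses a consumer directly, so new systems can subscribe without touching the existing adapters.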
54. Generally XML messages. Example message – “Book details”: Book ISBN code; Amazon URL; Book name; Category; Publication date; Publisher; Recommended price; Authorship.
55. XML messages need models! [Diagram: System A and System B, each with its own DBMS, exchange an XML message across the Enterprise Service Bus.]
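To make the point concrete, here is a hedged sketch using Python's standard `xml.etree.ElementTree`: a “Book details” message along the lines of slide 54, checked against a minimal “model” reduced to the set of element names every such message must carry. The element and attribute names are invented for illustration; a real implementation would validate against an agreed XML Schema from the shared logical model.

```python
import xml.etree.ElementTree as ET

# Hypothetical message covering the "Book details" fields from slide 54.
message = """
<book isbn="978-0-00-000000-0">
  <name>Example Book</name>
  <category>Data Management</category>
  <publicationDate>2011-02-03</publicationDate>
  <publisher>Example Press</publisher>
  <recommendedPrice currency="GBP">29.99</recommendedPrice>
  <author>A. N. Author</author>
  <amazonUrl>https://example.com/book</amazonUrl>
</book>
"""

# A minimal "model": the element names every book message must carry.
REQUIRED = {"name", "category", "publicationDate", "publisher",
            "recommendedPrice", "author", "amazonUrl"}

def validate(xml_text: str) -> bool:
    """Check the message against the shared model before it goes on the bus."""
    root = ET.fromstring(xml_text)
    present = {child.tag for child in root}
    return root.tag == "book" and "isbn" in root.attrib and REQUIRED <= present

print(validate(message))  # True
```

Without an agreed model, every system on the bus would invent its own interpretation of the message, which is exactly the problem the slide warns against.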
62. Establish a Corporate Repository.
- Share models across the Enterprise: Enterprise, Conceptual, Industry Standard, Project.
- Move from data “mine”-ing to data “ours”-ing.
- Extend the data architecture to incorporate Data Governance.
- Training and mentoring.
- Bake data considerations into the SDLC: data models are NOT just for new developments.
63. Establish a Community of Interest.
- Purpose: share best practices inside the company; exchange ideas across projects; represent the company on vendor user forums.
- Charter: ALL internal data management users, plus invited consultants & contractors.
- Subjects: standards & guidelines; training & education; “best practices”.
- Part of OUR job IS marketing!
64. Measure Data Management Maturity. [Chart: a five-level maturity model with As-Is and To-Be positions plotted for each area – Level 1 Initial (undesirable: operating in “fire-fighting” mode), Level 2 Repeatable (obtaining limited benefits), Level 3 Defined, Level 4 Managed (delivering broad quality & re-use), Level 5 Optimised (the ideal: obtaining optimal value from data).]
65. Beware: this is not “fire & forget”. [Chart: data governance visibility follows a typical Gartner hype cycle – Technology Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment, Plateau of Productivity – annotated with the current position and the maturity at your company. Avoid the abyss via investment in “sustain” activities.]
66. Conclusion.
- Understand roles and motivations, and work within the organization: a federated governance model; avoid silo mentality.
- Communicate: obtain buy-in by starting small and documenting success; make it easy to get hold of; market, market, market!
- Follow up with a robust architecture: a common repository; models appropriate for the audience; defined stewardship; unique definitions.
- “Repurpose” data for various audiences: via the web, Excel, DDL, XML, etc. It’s the data that’s important, not the format.
67. Data Governance 2.0 – Conclusion. Data Governance and modelling need to get out of the “old school”: use new technologies to reach users, approach users in their language, and don’t forget the fundamentals.
68. Chris Bradley, Business Consulting Director, Chris.Bradley@ipl.com, +44 7501 224230 – Intelligent Business. My blog: Information Management, Life & Petrol, http://infomanagementlifeandpetrol.blogspot.com. Colin Wood, Enterprise Information Architect, Colin.s.wood@gsk.com, +44 1438 766671.
Editor's Notes
The theme of this presentation is how GSK is combining data modelling with master data management, an Enterprise Service Bus, Business Data Stewardship and Enterprise Architecture to build a simplified environment for the management of clinical studies and related data.
Intro slide to GSK. Turnover of £28.4bn in 2009; R&D spend of £4.1bn.
Intent of this slide is to give some context of what R&D do and where clinical studies fit. Might redraw to make this simpler.
This slide gives a more detailed breakdown of where clinical studies fit and gives a hint of why pharma R&D is complex.

Clinical studies are created to test 3 main parameters: is the product safe in humans, is it effective (i.e. does it cure or alleviate the medical condition it was intended for), and does it provide value. All pharmaceutical products will go through a number of clinical studies before the product can be launched on the market. This is an ongoing process, and we continually run clinical studies on our products to support ongoing licensing of the product. GSK currently runs clinical studies across more than 140 countries. This is a complex activity involving multiple organisations who recruit patients and run the clinical studies on behalf of GSK. There is also a significant amount of complexity in collecting and reporting on the study results themselves.

However, the clinical study process sits within an overall pharmaceutical enterprise with a diverse and complex set of processes. To a large extent, business solutions supporting these different parts of the business have grown up independently because of their highly specialised nature. Each area also has multiple IT solutions, each supporting some part of what is a very complex scientific business. For example:
- Project Planning – plans the development of new products, including budgeting and planning studies. These are very large-scale projects that run over many years.
- Deliver Physical Product – embedded within here is a supply chain that is used first to deliver materials for internal testing, and then out to healthcare organisations conducting clinical studies. An added complexity is the fact that the products are blinded – for example, we may ship both placebos and active products that appear identical.
- Manage Safety of the Product – a large part of clinical studies is to establish that the product is safe for human use. There is continual monitoring of the safety of the product.
- Manage Regulatory Activities – there are many regulatory activities related to the conduct and disclosure of clinical studies. These can be very demanding, and compliance is critical to the ongoing operation of the company.
- Chemistry – there are also links back to earlier discovery operations (not shown here). For example, the identification of products used within GSK clinical studies is derived from early chemistry activities, which are focused on the identification of a molecule.

Key point is that this is a complex business. Again, may re-draw to make this simpler.
The complexity can be seen within the application landscape. GSK R&D has around 3,000 business applications. Whilst we're working hard to reduce this, it does reflect the diverse and complex nature of a pharmaceutical organisation. The figures here are not untypical for a pharmaceutical organisation and represent the diversity and scientific complexity of pharmaceutical operations. Over the years we've acquired lots of specialised applications focused on specific business needs – these are highly optimised for the specific scientific function, but don't integrate well to support the overall operation of the enterprise.

The diagram on the left illustrates the complex point-to-point world in which we currently operate (and remember there are 3,000 of these applications). In reality there are many examples where we have no direct interfaces between systems, and it's common to see some elements of reference or master data manually re-entered from one system into another.

This is something our IT organisation is tackling at a strategic level as part of our Re-Wire R&D programme. We are implementing an infrastructure that supports a full SOA environment across the organisation, and are putting a lot of training focus both on the technology and on the definition of business services. Some of the technologies supporting this are an Enterprise Service Bus (IBM) supporting reliable messaging and web services, and a Master Data Management solution (Siperian). We are also investing in a claims-based security environment. Put all these together and we will have an environment that should support plug-and-play integration. We've even put together an integration centre of excellence to support delivery, so technically all of the pieces would appear to be there.

Those of you with an interest in data governance will know that this isn't sufficient in and of itself, of course.
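A back-of-the-envelope sketch of why the point-to-point world doesn't scale, and why a bus helps. This is purely illustrative arithmetic, not anything from the GSK architecture itself:

```python
# Sketch: why point-to-point integration does not scale.
# With n applications, full point-to-point integration needs n*(n-1)/2
# distinct links, while a hub-and-spoke ESB needs only n adapters.

def point_to_point_links(n: int) -> int:
    """Number of distinct links if every application could talk to every other."""
    return n * (n - 1) // 2

def esb_adapters(n: int) -> int:
    """Number of adapters if every application connects once to the bus."""
    return n

# For an estate the size of GSK R&D's (~3,000 applications) the contrast
# is stark, even if only a fraction of the pairs ever need to talk.
print(point_to_point_links(3000))  # 4498500 potential links
print(esb_adapters(3000))          # 3000 bus adapters
```

Even if only a small fraction of those potential links are ever built, each one is a bespoke interface to maintain; the bus caps the integration cost at one adapter per system.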
If we are going to achieve data consistency and information availability we are also going to have to invest a lot of time and effort in understanding and governing the data.
So why is this particularly relevant to GSK's clinical information environment?

GSK is currently making a major investment in its clinical information environment, with an aim to double productivity by 2015. You can see that some of the issues noted on the previous slide are very prevalent here: we have a complex environment that makes it difficult to find and re-use information across systems.

The implementation is rationalising, simplifying and in many cases replacing the current IT solutions we have in place to support clinical information. The solutions are all being progressed as part of GSK R&D's Re-Wire strategy – so the use of services, ESB, MDM etc. will all be part of the mix.
What about master data management? This is very much part of our strategy for Re-Wire and moves us to a situation where we have one version of the truth on the ESB. The diagram shows the current status of the major subject areas of interest within R&D. Within this model we're using the following definitions:

- Data Custodian – we have agreement at a senior level for a specific organisation to manage the data on behalf of the rest of the organisation. This is critical to ensure that the relevant data stewardship roles are in place.
- Data Stewards – we have individuals assigned to the management of the data. This can include a variety of roles.
- Master Source identified – we can agree on a single source for the master data. Note that there are a lot of instances where we have only partial data. There can be a variety of reasons for this, including that not all of the data entities we'd consider in scope are available, or that the master data only represents a subset of the full organisation's master data for this entity.
- Data Quality Plan defined – we have a formally defined document that describes how the master data will be sourced and managed. The document also defines expected quality characteristics for the data.

As you can see, there's plenty more to do. A look down the subject areas shows a number of things that are unique to our industry – Clinical Study, Compound (which represents a molecule), and Medical Condition (which represents a disease or indication), for example. You can also see some things that are familiar in other industries. Products, for example; we also have our own version of customer data – GSK makes payments to healthcare professionals to conduct clinical studies on its behalf, and there is increasing legislation requiring us to report these payments, particularly in the US. As you can see, we're not yet in a position where we have true global master data in place.
The last 2 items span the entire GSK organisation – spanning multiple solutions including global SAP and several local Siebel implementations.
Not surprisingly, good old data modelling is a key component of our solutions. Perhaps no surprise there – but we are starting from an environment where there was virtually no data modelling within our IT organisation. Any data modelling that was completed was seen as supporting only the physical design of the database. The presence of multiple vendor solutions within our environment made this all the more difficult, with the frequent assumption that we didn't need a data model because the vendor was supplying one.

I won't go into detail here, but the main points are: we are building a full logical data model for this area, using ER/Studio as our data modelling environment. The model is heavily influenced by external industry-standard models such as BRIDG. We are constructing CRUD matrices that allow us to map the data to systems and business processes and to identify master sources. We are building definitions into the models as a data dictionary. Last but not least, we plan to use the models to automatically generate the message definitions that flow between systems.
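To make the CRUD-matrix idea concrete, here is a minimal sketch of how such a matrix can identify candidate master sources. The system and entity names are invented for the example, not GSK's actual systems:

```python
# Illustrative CRUD matrix mapping data entities to systems.
# A "C" in a cell marks a system that creates the entity, which makes it
# a candidate master source. All names here are hypothetical.

CRUD = {
    # entity:         {system: operations}
    "Clinical Study": {"StudyPlanner": "CRU", "SafetyDB": "R", "ReportingHub": "R"},
    "Investigator":   {"SiteCRM": "CRUD", "StudyPlanner": "R"},
}

def master_source_candidates(entity: str) -> list:
    """Systems that create the entity - candidates for its master source."""
    return [system for system, ops in CRUD[entity].items() if "C" in ops]

print(master_source_candidates("Clinical Study"))  # ['StudyPlanner']
```

Where more than one system comes back as a creator, the matrix has surfaced exactly the governance conversation the stewards need to have.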
What’s different about our approach to data modelling compared to traditional approaches? Firstly, we are attempting to approach the modelling within the context of overall R&D and Enterprise data models – the aim being to ensure that we really do understand the common data that needs to be shared and flow across the organisation. We already have an R&D-level conceptual data model that describes all of the data entities of relevance to R&D. We are now seeking to embed the use of these models into our logical data modelling efforts. This will allow us to link common concepts into our data models. We are also planning further automation for ER/Studio, so that we can automate the import of common entity definitions from a master data model. For example, any data model that references clinical study would be able to import a master definition.

So why is this such an issue for our organisation? Think about the highly specialised nature of the pharma R&D organisation and the fragmented solutions we currently have in place. We really are seeking to integrate across a set of organisations that each have their own perspective and terminology relating to master data. Take one concept that is used right across pharmaceutical R&D – compound. Everyone thinks they know what they mean by this, but when you span the organisation you find all sorts of slightly different interpretations and meanings. Many different terms are also used to describe the same thing – Active Ingredient, API, and Investigational New Drug could all mean the same thing. I recently found more than 30 terms that could be used to describe this same entity. What we’re seeing are specialised functions representing the role of the data entity, rather than a common definition. As we’re generally dealing with scientific communities, all with PhDs and their own specialised terminology, it’s very hard to introduce generic terminology like party or material.
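The terminology problem above can be handled mechanically once the stewards have agreed the mapping. A minimal sketch, using a handful of the variant terms mentioned (the full 30+ list would be curated by the business, not hard-coded):

```python
# Sketch: resolving the many local terms for one entity to a single
# canonical name. The synonyms listed are a small, invented subset;
# a real mapping would be curated and owned by the data stewards.

SYNONYMS = {
    "compound": "Compound",
    "active ingredient": "Compound",
    "api": "Compound",
    "investigational new drug": "Compound",
    "molecule": "Compound",
}

def canonical_entity(term: str) -> str:
    """Return the agreed canonical entity name for a local term, if known."""
    return SYNONYMS.get(term.strip().lower(), term)

print(canonical_entity("API"))                # Compound
print(canonical_entity("Active Ingredient"))  # Compound
print(canonical_entity("Biomarker"))          # Biomarker (unmapped, passed through)
```

The hard part is not the lookup but agreeing the right-hand side of that dictionary across communities that each have their own role-based view of the entity.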
A second area that we’re giving a lot of focus to is the generation of XML schemas for messaging directly from the models. We don’t have this fully worked out yet, but it is our aim to be able to step straight from the logical data model to a defined XML message schema. By linking from an enterprise level through to a physical level, we anticipate much higher consistency for interfaces and, hopefully, higher quality of information moving across the organisation.
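As a rough sketch of the model-to-schema step, the fragment below generates an XML Schema element from a logical-model entity described as plain data. The entity and attribute names are illustrative; a production version would read the model from the ER/Studio repository rather than an inline dictionary:

```python
# Sketch: generating an XML Schema fragment from a logical model entity.
# ENTITY is a hypothetical, hand-written stand-in for a model export.

ENTITY = {
    "name": "ClinicalStudy",
    "attributes": [("studyId", "xs:string"),
                   ("title", "xs:string"),
                   ("phase", "xs:string")],
}

def entity_to_xsd(entity: dict) -> str:
    """Render one entity as an xs:element with a sequence of child elements."""
    lines = ['<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">',
             '  <xs:element name="%s">' % entity["name"],
             '    <xs:complexType>',
             '      <xs:sequence>']
    for attr, xsd_type in entity["attributes"]:
        lines.append('        <xs:element name="%s" type="%s"/>' % (attr, xsd_type))
    lines += ['      </xs:sequence>',
              '    </xs:complexType>',
              '  </xs:element>',
              '</xs:schema>']
    return "\n".join(lines)

print(entity_to_xsd(ENTITY))
```

Because the schema is derived rather than hand-written, a definition change in the model propagates to every interface that uses the entity, which is where the consistency gain comes from.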
As part of our strategy we’ve defined a set of information principles that are being used to direct our approach to Information Architecture and associated governance. You’ll note that there’s a strong emphasis on the use and re-use of data models. Each of these has an assigned set of actions. For example, “Key data models should be communicated throughout the organisation” tells us that we need to set up a communication programme to ensure that the role of the R&D-level model is understood and, where appropriate, is used within the definition of Information Architectures.
The diagram shows some of the steps we are taking to link business and IT data governance. A core part is the fact that we have data models – each linked to an R&D-level model – as a key component of the environment. The models are managed within an ER/Studio repository. We also see MDM – managing both master data and reference data, like lists of values – as a key component of our strategy.

Shown on the diagram are a set of activities that we are planning or implementing to enable our vision. These are colour coded: items in green are already underway; items in yellow are planned or progressing in a limited part of the organisation; items in grey are under discussion, but not yet addressed or possibly even agreed to. The items in the top right-hand side show that we still have a lot of challenges in bringing together a common view of data governance across the organisation.

A few things to point out. We are currently training all of our Business Analysts and Tech Leads in the use of ER/Studio; around 80 people have now been trained within R&D. As we develop the R&D models and Information Blueprints more fully, we’ll also conduct focused communication and training on the use of these models. We are also beginning to plug the use of the data models into our IT project governance processes – over time it will become an expectation that IT projects demonstrate that they understand where their data fits and that they are aligned with any existing domain-wide Information Blueprints. We of course do this using data models. Further activities are being planned, including linking into the software development process; I’ll say a little bit about this on the next slide.

On the business side things are more patchy, but this reflects the diversity of our business. We have some strong pockets of success and we are seeking to broaden this across the organisation.
Once again we are looking to make full use of the data models to establish common definitions, ownership and responsible data stewards. One other gap (not shown) is the implementation of a data quality toolset that supports business data profiling. We do have a data quality service in house, but as yet we’ve not been able to fully explore business data profiling. I personally believe that data profiling tools will be a key enabler for business data stewardship.
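To show what business data profiling means at its simplest, here is a minimal sketch: for each column, count rows, missing values, and distinct values. The sample records are invented; a real profiling tool would run this at scale against production sources:

```python
# Minimal sketch of data profiling: per-column row, null, and distinct
# counts over a list of records. Sample data below is hypothetical.

def profile(records: list) -> dict:
    """Per-column null and distinct counts over a list of dict records."""
    columns = {key for row in records for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in records]
        non_null = [v for v in values if v not in (None, "")]
        report[col] = {
            "rows": len(values),
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

sample = [
    {"study_id": "S001", "country": "UK"},
    {"study_id": "S002", "country": ""},
    {"study_id": "S003", "country": "UK"},
]
print(profile(sample))
```

Even counts this crude give a steward something concrete to act on: a column that is 30% null, or a "single-valued" code field with a dozen distinct spellings, is an immediate quality conversation.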
Key point here is to make the message appropriate to the audience in question.