Data-Ed Webinar: Data Quality Engineering


Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar will illustrate how organizations with chronic business challenges often can trace the root of the problem to poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. This in turn allows organizations to more quickly identify business problems as well as data problems caused by structural issues versus practice-oriented defects and prevent these from re-occurring.

Takeaways:

Understanding foundational data quality concepts based on the DAMA DMBOK
Utilizing data quality engineering in support of business strategy
Data Quality guiding principles & best practices
Steps for improving data quality at your organization


  1. 1. Copyright 2013 by Data Blueprint 1 Unlock Business Value through Data Quality Engineering Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar focuses on obtaining business value from data quality initiatives. I will illustrate how organizations with chronic business challenges often can trace the root of the problem to poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. This in turn allows organizations to more quickly identify business problems as well as data problems caused by structural issues versus practice-oriented defects and prevent these from re-occurring. Date: April 8, 2014 Time: 2:00 PM ET/11:00 AM PT Presenter: Peter Aiken, Ph.D. Time: • timeliness • currency • frequency • time period Form: • clarity • detail • order • presentation • media Content: • accuracy • relevance • completeness • conciseness • scope • performance
  2. 2. Copyright 2013 by Data Blueprint Get Social With Us! Live Twitter Feed Join the conversation! Follow us: @datablueprint @paiken Ask questions and submit your comments: #dataed 2 Like Us on Facebook www.facebook.com/datablueprint Post questions and comments Find industry news, insightful content and event updates. Join the Group Data Management & Business Intelligence Ask questions, gain insights and collaborate with fellow data management professionals
  3. 3. Copyright 2013 by Data Blueprint 3 Peter Aiken, PhD • 25+ years of experience in data management • Multiple international awards & recognition • Founder, Data Blueprint (datablueprint.com) • Associate Professor of IS, VCU (vcu.edu) • President, DAMA International (dama.org) • 8 books and dozens of articles • Experienced w/ 500+ data management practices in 20 countries • Multi-year immersions with organizations as diverse as the US DoD, Nokia, Deutsche Bank, Wells Fargo, and the Commonwealth of Virginia 2
  4. 4. Unlock Business Value through Data Quality Engineering Presented by Peter Aiken, Ph.D.
  5. 5. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 5 Tweeting now: #dataed
  6. 6. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 6 Tweeting now: #dataed
  7. 7. Data Program Coordination Feedback Data Development Copyright 2013 by Data Blueprint Standard Data Organizational DM Practices and their Inter-relationships Organizational Strategies Goals Business Data Business Value Application Models & Designs Implementation Direction Guidance 7 Organizational Data Integration Data Stewardship Data Support Operations Data Asset Use Integrated Models
  8. 8. Data Program Coordination Feedback Data Development Copyright 2013 by Data Blueprint Standard Data Organizational DM Practices and their Inter-relationships Organizational Strategies Goals Business Data Business Value Application Models & Designs Implementation Direction Guidance Identifying, modeling, coordinating, organizing, distributing, and architecting data shared across business areas or organizational boundaries. Ensuring that specific individuals are assigned the responsibility for the maintenance of specific data as organizational assets, and that those individuals are provided the requisite knowledge, skills, and abilities to accomplish these goals in conjunction with other data stewards in the organization. Initiation, operation, tuning, maintenance, backup/recovery, archiving and disposal of data assets in support of organizational activities. 8 Specifying and designing appropriately architected data assets that are engineered to be capable of supporting organizational needs. Organizational Data Integration Data Stewardship Data Support Operations Data Asset Use Integrated Models Defining, coordinating, resourcing, implementing, and monitoring organizational data program strategies, policies, plans, etc. as coherent set of activities.
  9. 9. Data Program Coordination Feedback Data Development Copyright 2013 by Data Blueprint Standard Data Five Integrated DM Practice Areas Organizational Strategies Goals Business Data Business Value Application Models & Designs Implementation Direction Guidance 9 Organizational Data Integration Data Stewardship Data Support Operations Data Asset Use Integrated Models Leverage data in organizational activities Data management processes and infrastructure Combining multiple assets to produce extra value Organizational-entity subject area data integration Provide reliable data access Achieve sharing of data within a business area
  10. 10. Copyright 2013 by Data Blueprint Five Integrated DM Practice Areas 10 Manage data coherently. Share data across boundaries. Assign responsibilities for data. Engineer data delivery systems. Maintain data availability. Data Program Coordination Organizational Data Integration Data Stewardship Data Development Data Support Operations
  11. 11. Copyright 2013 by Data Blueprint • 5 Data Management Practice Areas / Data Management Basics • Are necessary but insufficient prerequisites to organizational data leveraging applications (that is, Self-Actualizing Data or Advanced Data Practices) Basic Data Management Practices – Data Program Management – Organizational Data Integration – Data Stewardship – Data Development – Data Support Operations http://3.bp.blogspot.com/-ptl-9mAieuQ/T-idBt1YFmI/AAAAAAAABgw/Ib-nVkMmMEQ/s1600/maslows_hierarchy_of_needs.png Advanced Data Practices • Cloud • MDM • Mining • Analytics • Warehousing • Big Data Management Practices Hierarchy (after Maslow)
  12. 12. Copyright 2013 by Data Blueprint Data Management Body of Knowledge 12 Data Management Functions
  13. 13. • Published by DAMA International – The professional association for Data Managers (40 chapters worldwide) – DMBoK organized around • Primary data management functions focused around data delivery to the organization (dama.org) • Organized around several environmental elements • CDMP – Certified Data Management Professional – DAMA International and ICCP – Membership in a distinct group made up of your fellow professionals – Recognition for your specialized knowledge in a choice of 17 specialty areas – Series of 3 exams – For more information, please visit: • http://www.dama.org/i4a/pages/index.cfm?pageid=3399 • http://iccp.org/certification/designations/cdmp Copyright 2013 by Data Blueprint DAMA DM BoK & CDMP 13
  14. 14. Copyright 2013 by Data Blueprint Overview: Data Quality Engineering 14
  15. 15. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 15 Tweeting now: #dataed
  16. 16. Copyright 2013 by Data Blueprint Data Data Data Information Fact Meaning Request A Model Specifying Relationships Among Important Terms [Built on definition by Dan Appleton 1983] Intelligence Use 1. Each FACT combines with one or more MEANINGS. 2. Each specific FACT and MEANING combination is referred to as a DATUM. 3. An INFORMATION is one or more DATA that are returned in response to a specific REQUEST 4. INFORMATION REUSE is enabled when one FACT is combined with more than one MEANING. 5. INTELLIGENCE is INFORMATION associated with its USES. Wisdom & knowledge are often used synonymously Data Data Data Data 16
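The fact/meaning/datum relationships on this slide can be restated as a tiny data structure. A minimal sketch, not part of the original deck; the example facts, meanings, and uses are invented:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Datum:
    fact: str        # e.g. the recorded value "193" (invented example)
    meaning: str     # e.g. "patient weight in pounds" (invented example)

# INFORMATION: one or more data returned in response to a specific REQUEST.
request = "What does the 193 on this chart represent?"
information = [Datum("193", "patient weight in pounds")]

# INFORMATION REUSE: the same FACT combined with more than one MEANING.
reuse = [Datum("193", "patient weight in pounds"),
         Datum("193", "ward room number")]

# INTELLIGENCE: INFORMATION associated with its USES.
intelligence = {"information": information, "use": "adjust medication dosage"}
```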
  17. 17. Copyright 2013 by Data Blueprint Definitions • Quality Data – Fit for use: meets the requirements of its authors, users, and administrators (adapted from Martin Eppler) – Synonymous with information quality, since poor data quality results in inaccurate information and poor business performance • Data Quality Management – Planning, implementation and control activities that apply quality management techniques to measure, assess, improve, and ensure data quality – Entails the "establishment and deployment of roles, responsibilities concerning the acquisition, maintenance, dissemination, and disposition of data" http://www2.sas.com/proceedings/sugi29/098-29.pdf ✓ Critical supporting process from change management ✓ Continuous process for defining acceptable levels of data quality to meet business needs and for ensuring that data quality meets these levels • Data Quality Engineering – Recognition that data quality solutions cannot simply be managed but must be engineered – Engineering is the application of scientific, economic, social, and practical knowledge in order to design, build, and maintain solutions to data quality challenges – Engineering concepts are generally not known and understood within IT or business! 17 Spinach/Popeye story from http://it.toolbox.com/blogs/infosphere/spinach-how-a-data-quality-mistake-created-a-myth-and-a-cartoon-character-10166
  18. 18. Copyright 2013 by Data Blueprint Improving Data Quality during System Migration 18 • Challenge – Millions of NSN/SKUs maintained in a catalog – Key and other data stored in clear text/comment fields – Original suggestion was manual approach to text extraction – Left the data structuring problem unsolved • Solution – Proprietary, improvable text extraction process – Converted non-tabular data into tabular data – Saved a minimum of $5 million – Literally person centuries of work
  19. 19. Copyright 2013 by Data Blueprint Determining Diminishing Returns 19

    Week #   Unmatched Items (% Total)   Ignorable Items (% Total)   Matched Items (% Total)
    1        31.47%                      1.34%                       N/A
    2        21.22%                      6.97%                       N/A
    3        20.66%                      7.49%                       N/A
    4        32.48%                      11.99%                      55.53%
    …        …                           …                           …
    14       9.02%                       22.62%                      68.36%
    15       9.06%                       22.62%                      68.33%
    16       9.53%                       22.62%                      67.85%
    17       9.50%                       22.62%                      67.88%
    18       7.46%                       22.62%                      69.92%
  20. 20. Copyright 2013 by Data Blueprint Quantitative Benefits 20

    Time needed to review all NSNs once over the life of the project:
      NSNs                                         2,000,000
      Average time to review & cleanse (minutes)   5
      Total time (minutes)                         10,000,000

    Time available per resource over a one-year period of time:
      Work weeks in a year                         48
      Work days in a week                          5
      Work hours in a day                          7.5
      Work minutes in a day                        450
      Total work minutes/year                      108,000

    Person-years required to cleanse each NSN once prior to migration:
      Minutes needed                               10,000,000
      Minutes available per person/year            108,000
      Total person-years                           92.6

    Resource cost to cleanse NSNs prior to migration:
      Avg salary for SME/year (not including overhead)                      $60,000.00
      Projected years required to cleanse / total DLA person-years saved    93
      Total cost to cleanse / total DLA savings to cleanse NSNs             $5.5 million
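The figures above reduce to simple arithmetic; a short worked version (all numbers taken from the slide, and the salary excludes overhead):

```python
# Worked version of the slide's cleansing-cost estimate; all figures are from the slide.
nsn_count = 2_000_000                                  # NSN/SKUs to review once
minutes_per_nsn = 5                                    # average review & cleanse time
total_minutes = nsn_count * minutes_per_nsn            # 10,000,000 minutes

work_minutes_per_year = 48 * 5 * 7.5 * 60              # 48 wks x 5 days x 7.5 hrs = 108,000 min

person_years = total_minutes / work_minutes_per_year   # ~92.6 person-years
annual_salary = 60_000                                 # SME salary, excluding overhead
labor_cost = person_years * annual_salary              # ~$5.56M, the slide's "$5.5 million"

print(f"{person_years:.1f} person-years, ${labor_cost:,.0f} of manual effort avoided")
```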
  21. 21. Copyright 2013 by Data Blueprint Data Quality Misconceptions 1. You can fix the data 2. Data quality is an IT problem 3. The problem is in the data sources or data entry 4. The data warehouse will provide a single version of the truth 5. The new system will provide a single version of the truth 6. Standardization will eliminate the problem of different "truths" represented in the reports or analysis Source: Business Intelligence solutions, Athena Systems 21
  22. 22. The Blind Men and the Elephant • It was six men of Indostan, To learning much inclined, Who went to see the Elephant (Though all of them were blind), That each by observation Might satisfy his mind. • The First approached the Elephant, And happening to fall Against his broad and sturdy side, At once began to bawl: "God bless me! but the Elephant Is very like a wall!" • The Second, feeling of the tusk Cried, "Ho! what have we here, So very round and smooth and sharp? To me `tis mighty clear This wonder of an Elephant Is very like a spear!" • The Third approached the animal, And happening to take The squirming trunk within his hands, Thus boldly up he spake: "I see," quoth he, "the Elephant Is very like a snake!" • The Fourth reached out an eager hand, And felt about the knee: "What most this wondrous beast is like Is mighty plain," quoth he; "'Tis clear enough the Elephant Is very like a tree!" • The Fifth, who chanced to touch the ear, Said: "E'en the blindest man Can tell what this resembles most; Deny the fact who can, This marvel of an Elephant Is very like a fan!" • The Sixth no sooner had begun About the beast to grope, Than, seizing on the swinging tail That fell within his scope. "I see," quoth he, "the Elephant Is very like a rope!" • And so these men of Indostan Disputed loud and long, Each in his own opinion Exceeding stiff and strong, Though each was partly in the right, And all were in the wrong! (Source: John Godfrey Saxe's ( 1816-1887) version of the famous Indian legend ) 22 Copyright 2013 by Data Blueprint
  23. 23. Copyright 2013 by Data Blueprint No universal conception of data quality exists; instead, many differing perspectives compete. • Problem: – Most organizations approach data quality problems in the same way that the blind men approached the elephant: people tend to see only the data that is in front of them – Little cooperation occurs across boundaries, just as the blind men were unable to share their impressions of the elephant and so recognize the entire entity – Leads to confusion, disputes and narrow views • Solution: – Data quality engineering can help achieve a more complete picture and facilitate cross-boundary communications 23
  24. 24. Copyright 2013 by Data Blueprint Structured Data Quality Engineering 1. Allow the form of the Problem to guide the form of the solution 2. Provide a means of decomposing the problem 3. Feature a variety of tools simplifying system understanding 4. Offer a set of strategies for evolving a design solution 5. Provide criteria for evaluating the quality of the various solutions 6. Facilitate development of a framework for developing organizational knowledge. 24
  25. 25. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 25 Tweeting now: #dataed
  26. 26. Copyright 2013 by Data Blueprint Mizuho Securities • Wanted to sell 1 share for 600,000 yen • Sold 600,000 shares for 1 yen • $347 million loss • In-house system did not have limit checking • Tokyo stock exchange system did not have limit checking ... • … and doesn't allow order cancellations CLUMSY typing cost a Japanese bank at least £128 million and staff their Christmas bonuses yesterday, after a trader mistakenly sold 600,000 more shares than he should have. The trader at Mizuho Securities, who has not been named, fell foul of what is known in financial circles as “fat finger syndrome” where a dealer types incorrect details into his computer. He wanted to sell one share in a new telecoms company called J Com, for 600,000 yen (about £3,000). Infamous Data Quality Example 26
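The slide attributes the loss in part to missing limit checks in both the in-house and exchange systems. Below is a minimal sketch of the kind of pre-trade sanity check that was absent; it is illustrative only, not Mizuho's or the exchange's actual logic, and the 20% band and share count are assumptions:

```python
def order_problems(price_yen: float, quantity: int,
                   reference_price_yen: float, shares_outstanding: int) -> list[str]:
    """Return reasons to block an order; an empty list means it may proceed.

    Illustrative limit checks only; the thresholds are invented.
    """
    problems = []
    # A price far from the reference price suggests price and quantity were swapped.
    if abs(price_yen - reference_price_yen) / reference_price_yen > 0.20:
        problems.append("price deviates more than 20% from the reference price")
    # An order larger than the entire float is impossible and almost certainly a typo.
    if quantity > shares_outstanding:
        problems.append("quantity exceeds shares outstanding")
    return problems

# The J-Com order: 600,000 shares at 1 yen instead of 1 share at 600,000 yen.
print(order_problems(price_yen=1, quantity=600_000,
                     reference_price_yen=600_000, shares_outstanding=15_000))
```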
  27. 27. Copyright 2013 by Data Blueprint Four ways to make your data sparkle! 1.Prioritize the task – Cleaning data is costly and time consuming – Identify mission critical/non-mission critical data 2.Involve the data owners – Seek input of business units on what constitutes "dirty" data 3.Keep future data clean – Incorporate processes and technologies that check every zip code and area code 4.Align your staff with business – Align IT staff with business units (Source: CIO JULY 1 2004) 27
  28. 28. Copyright 2013 by Data Blueprint • Deming cycle • "Plan-do-study-act" or "plan-do-check-act" 1. Identifying data issues that are critical to the achievement of business objectives 2. Defining business requirements for data quality 3. Identifying key data quality dimensions 4. Defining business rules critical to ensuring high quality data 28 The DQE Cycle
  29. 29. Copyright 2013 by Data Blueprint The DQE Cycle: (1) Plan • Plan for the assessment of the current state and identification of key metrics for measuring quality • The data quality engineering team assesses the scope of known issues – Determining cost and impact – Evaluating alternatives for addressing them 29
  30. 30. Copyright 2013 by Data Blueprint The DQE Cycle: (2) Deploy 30 • Deploy processes for measuring and improving the quality of data: • Data profiling – Institute inspections and monitors to identify data issues when they occur – Fix flawed processes that are the root cause of data errors or correct errors downstream – When it is not possible to correct errors at their source, correct them at their earliest point in the data flow
  31. 31. Copyright 2013 by Data Blueprint The DQE Cycle: (3) Monitor • Monitor the quality of data as measured against the defined business rules • If data quality meets defined thresholds for acceptability, the processes are in control and the level of data quality meets the business requirements • If data quality falls below acceptability thresholds, notify data stewards so they can take action during the next stage 31
  32. 32. Copyright 2013 by Data Blueprint The DQE Cycle: (4) Act • Act to resolve any identified issues to improve data quality and better meet business expectations • New cycles begin as new data sets come under investigation or as new data quality requirements are identified for existing data sets 32
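Read together, the four stages above form a control loop. A minimal, runnable rendering of that plan-do-check-act idea for a single invented business rule; the rule, threshold, and "fix" are all illustrative:

```python
# Schematic DQE cycle for one invented rule: "zip must be a 5-digit string".
records = [{"zip": "23060"}, {"zip": ""}, {"zip": "2306"}, {"zip": "90210"}]

def zip_is_valid(record):                  # business rule defined during Plan
    return len(record["zip"]) == 5 and record["zip"].isdigit()

ACCEPTABILITY = 0.95                       # threshold agreed with the business

for cycle in range(3):                     # Deploy: measurement runs repeatedly
    conformance = sum(zip_is_valid(r) for r in records) / len(records)
    print(f"cycle {cycle}: conformance {conformance:.0%}")
    if conformance >= ACCEPTABILITY:       # Monitor: process is in control
        break
    # Act: here we simply drop flawed records; in practice a steward would correct
    # them at the source or at the earliest possible point in the data flow.
    records = [r for r in records if zip_is_valid(r)]
```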
  33. 33. Copyright 2013 by Data Blueprint DQE Context & Engineering Concepts • Can rules be implemented stating that no data can be corrected unless the source of the error has been discovered and addressed? • All data must be 100% perfect? • Pareto – 80/20 rule – Not all data is of equal Importance • Scientific, economic, social, and practical knowledge 33
  34. 34. Copyright 2013 by Data Blueprint Data quality is now acknowledged as a major source of organizational risk by certified risk professionals! 34
  35. 35. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 35 Tweeting now: #dataed
  36. 36. Copyright 2013 by Data Blueprint Two Distinct Activities Support Quality Data 36 • Data quality best practices depend on both – Practice-oriented activities – Structure-oriented activities Practice-oriented activities focus on the capture and manipulation of data Structure-oriented activities focus on the data implementation Quality Data
  37. 37. Copyright 2013 by Data Blueprint Practice-Oriented Activities 37 • Stem from a failure to apply rigor when capturing/manipulating data, such as: – Edit masking – Range checking of input data – CRC-checking of transmitted data • Affect the Data Value Quality and Data Representation Quality • Examples of improper practice-oriented activities: – Allowing imprecise or incorrect data to be collected when requirements specify otherwise – Presenting data out of sequence • Typically diagnosed in bottom-up manner: find and fix the resulting problem • Addressed by imposing more rigorous data-handling governance Quality of Data Representation Quality of Data Values Practice-oriented activities
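Edit masking and range checking at the point of capture are the kind of rigor this slide has in mind. A small illustrative validator; the field names, mask, and ranges are invented:

```python
import re

ZIP_MASK = re.compile(r"^\d{5}(-\d{4})?$")       # edit mask for US ZIP / ZIP+4 (assumed format)

def capture_defects(record: dict) -> list[str]:
    """Return the practice-oriented defects found in one input record."""
    defects = []
    if not ZIP_MASK.match(record.get("zip", "")):
        defects.append("zip fails edit mask")
    if not 0 < record.get("age", -1) <= 120:     # range check on input data
        defects.append("age out of range")
    return defects

print(capture_defects({"zip": "23060", "age": 47}))   # [] -> record accepted
print(capture_defects({"zip": "2306", "age": 430}))   # both checks fail
```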
  38. 38. Copyright 2013 by Data Blueprint Structure-Oriented Activities 38 • Occur because of data and metadata that has been arranged imperfectly. For example: – When the data is in the system but we just can't access it; – When a correct data value is provided as the wrong response to a query; or – When data is not provided because it is unavailable or inaccessible to the customer • Developer focus within system boundaries instead of within organization boundaries • Affect the Data Model Quality and Data Architecture Quality • Examples of improper structure-oriented activities: – Providing a correct response but incomplete data to a query because the user did not comprehend the system data structure – Costly maintenance of inconsistent data used by redundant systems • Typically diagnosed in top-down manner: root cause fixes • Addressed through fundamental data structure governance Quality of Data Architecture Quality of Data Models Structure-oriented activities
  39. 39. Copyright 2013 by Data Blueprint Quality Dimensions 39
  40. 40. Copyright 2013 by Data Blueprint A congratulations letter from another bank Problems • Bank did not know it made an error • Tools alone could not have prevented this error • Lost confidence in the ability of the bank to manage customer funds 40
  41. 41. Copyright 2013 by Data Blueprint 4 Dimensions of Data Quality 41 An organization’s overall data quality is a function of four distinct components, each with its own attributes: • Data Value: the quality of data as stored & maintained in the system • Data Representation – the quality of representation for stored values; perfect data values stored in a system that are inappropriately represented can be harmful • Data Model – the quality of data logically representing user requirements related to data entities, associated attributes, and their relationships; essential for effective communication among data suppliers and consumers • Data Architecture – the coordination of data management activities in cross-functional system development and operations Practice- oriented Structure- oriented
  42. 42. Copyright 2013 by Data Blueprint Effective Data Quality Engineering 42 Data Representation Quality As presented to the user Data Value Quality As maintained in the system Data Model Quality As understood by developers Data Architecture Quality As an organizational asset (closer to the architect)(closer to the user) • Data quality engineering has been focused on operational problem correction – Directing attention to practice-oriented data imperfections • Data quality engineering is more effective when also focused on structure-oriented causes – Ensuring the quality of shared data across system boundaries
  43. 43. Copyright 2013 by Data Blueprint Full Set of Data Quality Attributes 43
  44. 44. Copyright 2013 by Data Blueprint Difficult to obtain leverage at the bottom of the falls 44
  45. 45. Copyright 2013 by Data Blueprint Frozen Falls 45
  46. 46. Copyright 2013 by Data Blueprint New York Turns to Big Data to Solve Big Tree Problem • NYC – 2,500,000 trees • 11-months from 2009 to 2010 – 4 people were killed or seriously injured by falling tree limbs in Central Park alone • Belief – Arborists believe that pruning and otherwise maintaining trees can keep them healthier and make them more likely to withstand a storm, decreasing the likelihood of property damage, injuries and deaths • Until recently – No research or data to back it up 46 http://www.computerworld.com/s/article/9239793/New_York_Turns_to_Big_Data_to_Solve_Big_Tree_Problem?source=CTWNLE_nlt_datamgmt_2013-06-05
  47. 47. Copyright 2013 by Data Blueprint NYC's Big Tree Problem • Question – Does pruning trees in one year reduce the number of hazardous tree conditions in the following year? • Lots of data but granularity challenges – Pruning data recorded block by block – Cleanup data recorded at the address level – Trees have no unique identifiers • After downloading, cleaning, merging, analyzing and intensive modeling – Pruning trees for certain types of hazards caused a 22 percent reduction in the number of times the department had to send a crew for emergency cleanups • The best data analysis – Generates further questions • NYC cannot prune each block every year – Building block risk profiles: number of trees, types of trees, whether the block is in a flood zone or storm zone 47 http://www.computerworld.com/s/article/9239793/New_York_Turns_to_Big_Data_to_Solve_Big_Tree_Problem?source=CTWNLE_nlt_datamgmt_2013-06-05
  48. 48. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 48 Tweeting now: #dataed
  49. 49. Copyright 2013 by Data Blueprint Letter from the Bank … so please continue to open your mail from either Chase or Bank One P.S. Please be on the lookout for any upcoming communications from either Chase or Bank One regarding your Bank One credit card and any other Bank One product you may have. Problems • I initially discarded the letter! • I became upset after reading it • It proclaimed that Chase has data quality challenges 49
  50. 50. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 50 Tweeting now: #dataed
  51. 51. Copyright 2013 by Data Blueprint Traditional Quality Life Cycle 51: data acquisition activities → data storage → data usage activities
  52. 52. restored data Metadata Creation Metadata Refinement Metadata Structuring Data Utilization Copyright 2013 by Data Blueprint Data Manipulation Data Creation Data Storage Data Assessment Data Refinement 52 data architecture & models populated data models and storage locations data values data values data values value defects structure defects architecture refinements model refinements Data Life Cycle Model Products data
  53. 53. restored data Metadata Refinement Metadata Structuring Data Utilization Copyright 2013 by Data Blueprint Data Manipulation Data Creation Data Storage Data Assessment Data Refinement 53 populated data models and storage locations data values Data Life Cycle Model: Quality Focus data architecture & model quality model quality value quality value quality value quality representation quality Metadata Creation architecture quality
  54. 54. Copyright 2013 by Data Blueprint Starting point for new system development data performance metadata data architecture data architecture and data models shared data updated data corrected data architecture refinements facts & meanings Metadata & Data Storage Starting point for existing systems Metadata Refinement • Correct Structural Defects • Update Implementation Metadata Creation • Define Data Architecture • Define Data Model Structures Metadata Structuring • Implement Data Model Views • Populate Data Model Views Data Refinement • Correct Data Value Defects • Re-store Data Values Data Manipulation • Manipulate Data • Update Data Data Utilization • Inspect Data • Present Data Data Creation • Create Data • Verify Data Values Data Assessment • Assess Data Values • Assess Metadata Extended data life cycle model with metadata sources and uses 54
  55. 55. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 55 Tweeting now: #dataed
  56. 56. Copyright 2013 by Data Blueprint Profile, Analyze and Assess DQ • Data assessment using 2 different approaches: – Bottom-up – Top-down • Bottom-up assessment: – Inspection and evaluation of the data sets – Highlight potential issues based on the results of automated processes • Top-down assessment: – Engage business users to document their business processes and the corresponding critical data dependencies – Understand how their processes consume data and which data elements are critical to the success of the business applications 56
  57. 57. Copyright 2013 by Data Blueprint Define DQ Measures • Measures development occurs as part of the strategy/ design/plan step • Process for defining data quality measures: 1. Select one of the identified critical business impacts 2. Evaluate the dependent data elements, create and update processes associate with that business impact 3. List any associated data requirements 4. Specify the associated dimension of data quality and one or more business rules to use to determine conformance of the data to expectations 5. Describe the process for measuring conformance 6. Specify an acceptability threshold 57
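The six steps above essentially populate a small specification for each measure. A sketch of what one recorded measure might look like; the dataclass, the example rule, and the threshold are assumptions, not taken from the DMBOK or the deck:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DQMeasure:
    business_impact: str              # step 1: the critical business impact selected
    data_element: str                 # step 2: dependent data element evaluated
    requirement: str                  # step 3: associated data requirement
    dimension: str                    # step 4: data quality dimension
    rule: Callable[[dict], bool]      # step 4: business rule used to judge conformance
    measurement: str                  # step 5: how conformance is measured
    acceptability_threshold: float    # step 6: acceptable conformance level

# Illustrative instance; every value below is invented.
email_completeness = DQMeasure(
    business_impact="statements returned as undeliverable",
    data_element="customer.email",
    requirement="every active customer record carries a contact email",
    dimension="completeness",
    rule=lambda record: bool(record.get("email")),
    measurement="share of active customer records passing the rule, measured daily",
    acceptability_threshold=0.98,
)
```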
  58. 58. Copyright 2013 by Data Blueprint Set and Evaluate DQ Service Levels • Data quality inspection and monitoring are used to measure and monitor compliance with defined data quality rules • Data quality SLAs specify the organization’s expectations for response and remediation • Operational data quality control defined in data quality SLAs includes: – Data elements covered by the agreement – Business impacts associated with data flaws – Data quality dimensions associated with each data element – Quality expectations for each data element of the identified dimensions in each application for system in the value chain – Methods for measuring against those expectations – (…) 58
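An operational data quality SLA with the elements listed above can be captured as plain configuration and evaluated mechanically. An illustrative sketch; the data element, targets, and measurements are all invented:

```python
# Invented data quality SLA entry and an invented day's measurements.
dq_sla = {
    "data_element": "order.ship_to_postal_code",
    "business_impact": "mis-routed shipments and re-delivery cost",
    "dimensions": ["completeness", "validity"],
    "expectation": {"completeness": 0.995, "validity": 0.99},
    "measurement_method": "daily batch conformance check against the postal reference set",
    "response": {"notify": ["data steward", "order management process owner"],
                 "remediation_window_hours": 24},
}

measured = {"completeness": 0.997, "validity": 0.981}
breaches = [d for d in dq_sla["dimensions"] if measured[d] < dq_sla["expectation"][d]]
print(breaches)   # ['validity'] -> trigger the notification/remediation response
```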
  59. 59. Measure, Monitor & Manage DQ Copyright 2013 by Data Blueprint • DQM procedures depend on available data quality measuring and monitoring services • 2 contexts for control/measurement of conformance to data quality business rules exist: – In-stream: collect in-stream measurements while creating data – In batch: perform batch activities on collections of data instances assembled in a data set • Apply measurements at 3 levels of granularity: – Data element value – Data instance or record – Data set 59
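A batch-style sketch of applying measurement at the three levels of granularity named above; the rule set and records are invented:

```python
# Invented records and rules; conformance measured at three granularities.
data_set = [
    {"id": 1, "email": "a@example.com", "zip": "23060"},
    {"id": 2, "email": "",              "zip": "90210"},
    {"id": 3, "email": "c@example.com", "zip": ""},
]
rules = {"email": lambda v: "@" in v,
         "zip":   lambda v: len(v) == 5 and v.isdigit()}

# Data element value: does one value conform to its rule?
element_ok = rules["email"](data_set[1]["email"])                              # False

# Data instance/record: do all elements of one record conform?
record_ok = all(rule(data_set[0][field]) for field, rule in rules.items())     # True

# Data set: what share of records conform in aggregate?
set_conformance = sum(all(rule(rec[field]) for field, rule in rules.items())
                      for rec in data_set) / len(data_set)
print(element_ok, record_ok, f"{set_conformance:.0%}")                         # False True 33%
```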
  60. 60. Copyright 2013 by Data Blueprint Overview: Data Quality Tools 4 categories of activities: 1) Analysis 2) Cleansing 3) Enhancement 4) Monitoring 60 Principal tools: 1) Data Profiling 2) Parsing and Standardization 3) Data Transformation 4) Identity Resolution and Matching 5) Enhancement 6) Reporting
  61. 61. Copyright 2013 by Data Blueprint DQ Tool #1: Data Profiling • Data profiling is the assessment of value distribution and clustering of values into domains • Need to be able to distinguish between good and bad data before making any improvements • Data profiling is a set of algorithms for 2 purposes: – Statistical analysis and assessment of the data quality values within a data set – Exploring relationships that exist between value collections within and across data sets • At its most advanced, data profiling takes a series of prescribed rules from data quality engines. It then assesses the data, annotates and tracks violations to determine if they comprise new or inferred data quality rules 61
  62. 62. Copyright 2013 by Data Blueprint DQ Tool #1: Data Profiling, cont’d • Data profiling vs. data quality: business context and semantic/logical layers – Data quality is concerned with proscriptive rules – Data profiling looks for patterns both when rules are adhered to and when rules are violated; able to provide input into the business context layer • It is incumbent on data profiling services to notify all concerned parties of whatever is discovered • Profiling can be used to… – …notify the help desk that valid changes in the data are about to cause an avalanche of “skeptical user” calls – …notify business analysts of precisely where they should be working today in terms of shifts in the data 62
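A minimal column-profiling sketch of the statistical assessment described on the two slides above; the toy data is invented, and real profiling tools also infer relationships across columns and data sets:

```python
from collections import Counter

def profile_column(values):
    """Tiny value-distribution profile for one column of string data."""
    non_null = [v for v in values if v not in ("", None)]
    # Generalize each value to a pattern: digits -> 9, letters -> A.
    patterns = Counter(
        "".join("9" if ch.isdigit() else ("A" if ch.isalpha() else ch) for ch in v)
        for v in non_null
    )
    return {
        "count": len(values),
        "null_or_blank": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
        "top_patterns": patterns.most_common(3),   # candidate format/domain rules
    }

print(profile_column(["23060", "90210", "2306A", "", "23060"]))
```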
  63. 63. Copyright 2013 by Data Blueprint Courtesy GlobalID.com 63
  64. 64. Copyright 2013 by Data Blueprint DQ Tool #2: Parsing & Standardization • Data parsing tools enable the definition of patterns that feed into a rules engine used to distinguish between valid and invalid data values • Actions are triggered upon matching a specific pattern • When an invalid pattern is recognized, the application may attempt to transform the invalid value into one that meets expectations • Data standardization is the process of conforming to a set of business rules and formats that are set up by data stewards and administrators • Data standardization example: – Bringing all the different formats of “street” (e.g. “STR”, “ST.”, “STRT”, “STREET”) into a single standard format 64
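A sketch of pattern-driven standardization for the "street" example above. The suffix table, the chosen standard forms, and the parsing rule are invented, not a vendor tool's rule set:

```python
import re

# Invented rule set: recognized suffix variants -> chosen standard abbreviation.
STREET_SUFFIXES = {"STREET": "ST", "STRT": "ST", "STR": "ST", "ST.": "ST",
                   "AVENUE": "AVE", "AVE.": "AVE", "AV": "AVE"}

def standardize_street(address: str) -> str:
    """Parse the trailing suffix token and map it to the standard abbreviation."""
    body, suffix = re.match(r"^(.*?)(\S+)\s*$", address.strip()).groups()
    return (body + STREET_SUFFIXES.get(suffix.upper(), suffix)).strip()

for a in ["10124 W. Broad Street", "10124 W. Broad Strt", "10124 W. Broad St."]:
    print(standardize_street(a))        # each becomes "10124 W. Broad ST"
```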
  65. 65. Copyright 2013 by Data Blueprint DQ Tool #3: Data Transformation • Upon identification of data errors, trigger data rules to transform the flawed data • Perform standardization and guide rule-based transformations by mapping data values in their original formats and patterns into a target representation • Parsed components of a pattern are subjected to rearrangement, corrections, or any changes as directed by the rules in the knowledge base 65
  66. 66. Copyright 2013 by Data Blueprint DQ Tool #4: Identity Resolution & Matching • Data matching enables analysts to identify relationships between records for de-duplication or group-based processing • Matching is central to maintaining data consistency and integrity throughout the enterprise • The matching process should be used in the initial migration of data into a single repository • 2 basic approaches to matching: • Deterministic – Relies on defined patterns/rules for assigning weights and scores to determine similarity – Predictable – Dependent on the rule developers' anticipations • Probabilistic – Relies on statistical techniques for assessing the probability that any pair of records represents the same entity – Not reliant on rules – Probabilities can be refined based on experience -> matchers can improve precision as more data is analyzed 66
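A toy deterministic matcher in the spirit of the weighted-scoring approach described above; the fields, weights, and threshold are invented, and a probabilistic matcher would estimate them from the data instead:

```python
# Toy deterministic matcher: fixed rules, weights, and threshold (all invented).
WEIGHTS = {"last_name": 4, "zip": 3, "birth_year": 3}
MATCH_THRESHOLD = 7

def similarity(a: dict, b: dict) -> int:
    """Weighted exact-agreement score between two candidate records."""
    return sum(w for field, w in WEIGHTS.items() if a.get(field) == b.get(field))

def same_entity(a: dict, b: dict) -> bool:
    return similarity(a, b) >= MATCH_THRESHOLD

r1 = {"last_name": "Smith", "zip": "23060", "birth_year": 1960}
r2 = {"last_name": "Smith", "zip": "23060", "birth_year": 1961}   # likely duplicate
print(similarity(r1, r2), same_entity(r1, r2))                    # 7 True
```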
  67. 67. Copyright 2013 by Data Blueprint DQ Tool #5: Enhancement • Definition: – A method for adding value to information by accumulating additional information about a base set of entities and then merging all the sets of information to provide a focused view. Improves master data. • Benefits: – Enables use of third party data sources – Allows you to take advantage of the information and research carried out by external data vendors to make data more meaningful and useful • Examples of data enhancements: – Time/date stamps – Auditing information – Contextual information – Geographic information – Demographic information – Psychographic information 67
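A sketch of enhancement as defined above: merging externally sourced attributes onto a base record. The third-party feed, keys, and values are invented:

```python
# Invented base record and invented third-party geographic/demographic feed keyed by ZIP.
base = {"customer_id": "C-1001", "zip": "23060"}

third_party_by_zip = {
    "23060": {"county": "Henrico", "income_band": "C", "timezone": "America/New_York"},
}

def enhance(record: dict, reference: dict, key: str) -> dict:
    """Return the record enriched with reference attributes looked up on `key`."""
    enriched = dict(record)
    enriched.update(reference.get(record.get(key), {}))
    enriched["enhanced_at"] = "2014-04-08T14:00:00Z"   # time/date stamp enhancement
    return enriched

print(enhance(base, third_party_by_zip, key="zip"))
```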
  68. 68. Copyright 2013 by Data Blueprint DQ Tool #6: Reporting • Good reporting supports: – Inspection and monitoring of conformance to data quality expectations – Monitoring performance of data stewards conforming to data quality SLAs – Workflow processing for data quality incidents – Manual oversight of data cleansing and correction • Data quality tools provide dynamic reporting and monitoring capabilities • Enables analysts and data stewards to support and drive the methodology for ongoing DQM and improvement with a single, easy-to-use solution • Associate report results with: – Data quality measurement – Metrics – Activity 68
  69. 69. Copyright 2013 by Data Blueprint 1. Data Management Overview 2. DQE Definitions (w/ example) 3. DQE Cycle & Contextual Complications 4. DQ Causes and Dimensions 5. Quality and the Data Life Cycle 6. DDE Tools 7. Takeaways and Q&A Outline 69 Tweeting now: #dataed
  70. 70. • Develop and promote data quality awareness • Define data quality requirements • Profile, analyze and assess data quality • Define data quality metrics • Define data quality business rules • Test and validate data quality requirements • Set and evaluate data quality service levels • Measure and monitor data quality • Manage data quality issues • Clean and correct data quality defects • Design and implement operational DQM procedures • Monitor operational DQM procedures and performance Copyright 2013 by Data Blueprint Overview: DQE Concepts and Activities 70
  71. 71. Copyright 2013 by Data Blueprint Concepts and Activities • Data quality expectations provide the inputs necessary to define the data quality framework: – Requirements – Inspection policies – Measures, and monitors that reflect changes in data quality and performance • The data quality framework requirements reflect 3 aspects of business data expectations 1. A manner to record the expectation in business rules 2. A way to measure the quality of data within that dimension 3. An acceptability threshold 71 from The DAMA Guide to the Data Management Body of Knowledge © 2009 by DAMA International
  72. 72. Copyright 2013 by Data Blueprint Summary: Data Quality Engineering 72 1/26/2010 © Copyright this and previous years by Data Blueprint - all rights reserved!
  73. 73. 10124 W. Broad Street, Suite C Glen Allen, Virginia 23060 804.521.4056
  74. 74. Copyright 2013 by Data Blueprint Questions? 74 + = It’s your turn! Use the chat feature or Twitter (#dataed) to submit your questions to Peter now.
  75. 75. Developing a Data-centric Strategy & Roadmap Enterprise Data World April 28, 2014 @ 8:30 AM CT Data Architecture Requirements May 13, 2014 @ 2:00 PM ET/11:00 AM PT Monetizing Data Management June 10, 2014 @ 2:00 PM ET/11:00 AM PT Sign up here: www.datablueprint.com/webinar-schedule or www.dataversity.net Copyright 2013 by Data Blueprint Upcoming Events 75
  76. 76. Copyright 2013 by Data Blueprint References & Recommended Reading 76 • The DAMA Guide to the Data Management Body of Knowledge © 2009 by DAMA International • http://www2.sas.com/proceedings/sugi29/098-29.pdf
  77. 77. Copyright 2013 by Data Blueprint Data Quality Dimensions 77
  78. 78. Copyright 2013 by Data Blueprint Data Value Quality 78
  79. 79. Copyright 2013 by Data Blueprint Data Representation Quality 79
  80. 80. Copyright 2013 by Data Blueprint Data Model Quality 80
  81. 81. Copyright 2013 by Data Blueprint Data Architecture Quality 81
  82. 82. Copyright 2013 by Data Blueprint Guiding Principles • Manage data as a core organizational asset. • Identify a gold record for all data elements • All data elements will have a standardized data definition, data type, and acceptable value domain • Leverage data governance for the control and performance of DQM • Use industry and international data standards whenever possible • Downstream data consumers specify data quality expectations • Define business rules to assert conformance to data quality expectations • Validate data instances and data sets against defined business rules • Business process owners will agree to and abide by data quality SLAs • Apply data corrections at the original source if possible • If it is not possible to correct data at the source, forward data corrections to the owner of the original source. Influence on data brokers to conform to local requirements may be limited • Report measured levels of data quality to appropriate data stewards, business process owners, and SLA managers 82
  83. 83. Copyright 2013 by Data Blueprint Goals and Principles 83 • To measurably improve the quality of data in relation to defined business expectations • To define requirements and specifications for integrating data quality control into the system development life cycle • To provide defined processes for measuring, monitoring, and reporting conformance to acceptable levels of data quality
  84. 84. Copyright 2013 by Data Blueprint Primary Deliverables • Improved Quality Data • Data Management Operational Analysis • Data profiles • Data Quality Certification Reports • Data Quality Service Level Agreements 84
  85. 85. Copyright 2013 by Data Blueprint Roles and Responsibilities 85 1/26/2010 © Copyright this and previous years by Data Blueprint - all rights reserved! Suppliers: • External Sources • Regulatory Bodies • Business Subject Matter Experts • Information Consumers • Data Producers • Data Architects • Data Modelers • Data Stewards Participants: • Data Quality Analysts • Data Analysts • Database Administrators • Data Stewards • Other Data Professionals • DRM Director • Data Stewardship Council Consumers: • Data Stewards • Data Professionals • Other IT Professionals • Knowledge Workers • Managers and Executives • Customers