Data Management
Best Practices
© Copyright 2022 by Peter Aiken Slide # 1
peter.aiken@anythingawesome.com +1.804.382.5957 Peter Aiken, PhD
Practicing Data Management Better
1 + 1 = 11
Peter Aiken, Ph.D.
• I've been doing this a long time
• My work is recognized as useful
• Associate Professor of IS (vcu.edu)
• Institute for Defense Analyses (ida.org)
• DAMA International (dama.org)
• MIT CDO Society (iscdo.org)
• Anything Awesome (anythingawesome.com)
• Experienced w/ 500+ data
management practices worldwide
• Multi-year immersions
– US DoD (DISA/Army/Marines/DLA)
– Nokia
– Deutsche Bank
– Wells Fargo
– Walmart
– HUD …
• 12 books and
dozens of articles
https://anythingawesome.com
• DAMA International President 2009-2013/2018/2020
• DAMA International Achievement Award 2001 (with Dr. E. F. "Ted" Codd)
• DAMA International Community Award 2005
$1,500,000,000.00 USD
Turn data into action
Cloud Platforms
Martech
SaaS Apps
Mobility & Devices
Customer / Partners /
Suppliers
Packaged Apps
Databases
Today, data is highly siloed and fragmented:
• Jane Smith: opted in for marketing
• Jane Smith: renewed policy
• John Smith: same household
• Jane E Smith: called support
• Jane E Smith: filed auto claim
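The fragmentation illustrated above can be made concrete with a naive identity-resolution pass over the siloed interaction records. This is an illustrative sketch only; the records, the `customer_key` helper, and the matching rule are assumptions for demonstration, not anything shown on the slide.

```python
# Naive identity-resolution sketch: group siloed interaction records
# by a normalized customer key so one profile emerges per customer.
from collections import defaultdict

records = [
    {"name": "Jane Smith",   "event": "opted in for marketing"},
    {"name": "Jane Smith",   "event": "renewed policy"},
    {"name": "John Smith",   "event": "same household"},
    {"name": "Jane E Smith", "event": "called support"},
    {"name": "Jane E Smith", "event": "filed auto claim"},
]

def customer_key(name):
    """Crude normalization: first and last name tokens, lowercased.
    Real matching would also use fuzzy rules, addresses, and
    household links, which this sketch ignores."""
    parts = name.lower().split()
    return (parts[0], parts[-1])

profiles = defaultdict(list)
for r in records:
    profiles[customer_key(r["name"])].append(r["event"])

for key, events in profiles.items():
    print(key, events)
```

With this rule, "Jane Smith" and "Jane E Smith" merge into a single profile while "John Smith" stays separate, which is exactly the kind of consolidation the siloed systems above fail to do.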
Modern needs require modern solutions
Measuring Data Management Practice Maturity: A Community's Self-Assessment

Peter Aiken, Virginia Commonwealth University/Institute for Data Research
M. David Allen, Data Blueprint
Burt Parker, Independent consultant
Angela Mattia, J. Sergeant Reynolds Community College

Computer, April 2007. 0018-9162/07/$25.00 © 2007 IEEE. Published by the IEEE Computer Society.

Increasing data management practice maturity levels can positively impact the coordination of data flow among organizations, individuals, and systems. Results from a self-assessment provide a roadmap for improving organizational data management practices.

As increasing amounts of data flow within and between organizations, the problems that can result from poor data management practices are becoming more apparent. Studies have shown that such poor practices are widespread. For example,

• PricewaterhouseCoopers reported that in 2004, only one in three organizations were highly confident in their own data, and only 18 percent were very confident in data received from other organizations. Further, just two in five companies have a documented board-approved data strategy (www.pwc.com/extweb/pwcpublications.nsf/docid/15383D6E748A727DCA2571B6002F6EE9).
• Michael Blaha [1] and others in the research community have cited past organizational data management education and practices as the cause for poor database design being the norm.
• According to industry pioneer John Zachman [2], organizations typically spend between 20 and 40 percent of their information technology budgets evolving their data via migration (changing data locations), conversion (changing data into other forms, states, or products), or scrubbing (inspecting and manipulating, recoding, or rekeying data to prepare it for subsequent use).
• Approximately two-thirds of organizational data managers have formal data management training; slightly more than two-thirds of organizations use or plan to apply formal metadata management techniques; and slightly fewer than one-half manage their metadata using computer-aided software engineering tools and repository technologies [3].

When combined with our personal observations, these results suggest that most organizations can benefit from the application of organization-wide data management practices. Failure to manage data as an enterprise-, corporate-, or organization-wide asset is costly in terms of market share, profit, strategic opportunity, stock price, and so on. To the extent that world-class organizations have shown that opportunities can be created through the effective use of data, investing in data as the only organizational asset that can't be depleted should be of great interest.
DATA MANAGEMENT DEFINITION AND EVOLUTION

As Table 1 shows, data management consists of six interrelated and coordinated processes, primarily derived by Burt Parker from sponsored research he led for the US Department of Defense at the MITRE Corporation [4].
Figure 1 supports the similarly standardized definition: "Enterprise-wide management of data is understanding the current and future data needs of an enterprise and making that data effective and efficient in supporting business activities." [4] The figure illustrates how organizational strategies guide other data management processes. Two of these processes—data program coordination and organizational data integration—provide direction to the implementation processes—data development, data support operations, and data asset use. The data stewardship process straddles the line between direction and implementation. All processes exchange feedback designed to improve and fine-tune overall data management practices.

Data management has existed in some form since the 1950s and has been recognized as a discipline since the 1970s. Data management is thus a young discipline compared to, for example, the relatively mature accounting practices that have been practiced for thousands of years. As Figure 2 shows, data management's scope has expanded over time, and this expansion continues today.
Ideally, organizations derive their data management requirements from enterprise-wide information and functional user requirements. Some of these requirements come from legacy systems and off-the-shelf software packages. An organization derives its future data requirements from an analysis of what it will deliver, as well as future capabilities it will need to implement organizational strategies. Data management guides the transformation of strategic organizational information needs into specific data requirements associated with particular technology system development projects.

Figure 1. Interrelationships among data management processes (adapted from Burt Parker's earlier work [4]). Blue lines indicate guidance, red lines indicate feedback, and green lines indicate data.

Table 1. Data management processes [4].
• Data program coordination (focus: direction): provide appropriate data management process and technological infrastructure. Data type: program data, the descriptive propositions or observations needed to establish, document, sustain, control, and improve organizational data-oriented activities (such as vision, goals, policies, and metrics).
• Organizational data integration (focus: direction): achieve organizational sharing of appropriate data. Data type: development data, the descriptive facts, propositions, or observations used to develop and document the structures and interrelationships of data (for example, data models, database designs, and specifications).
• Data stewardship (focus: direction and implementation): achieve business-entity subject area data integration. Data type: stewardship data, the descriptive facts about data documenting semantics and syntax (such as name, definition, and format).
• Data development (focus: implementation): achieve data sharing within a business area. Data type: business data, the facts and their constructs used to accomplish enterprise business activities (such as data elements, records, and files).
• Data support operations (focus: implementation): provide reliable access to data.
• Data asset use (focus: implementation): leverage data in business activities.
All organizations have data architectures, whether explicitly documented or implicitly assumed. An important data management process is to document the architecture's capabilities, making it more useful to the organization.

In addition, data management

• must be viewed as a means to an end, not the end itself. Organizations must not practice data management as an abstract discipline, but as a process supporting specific enterprise objectives—in particular, to provide a shared-resource basis on which to build additional services.
• involves both process and policy. Data management tasks range from strategic data planning to the creation of data element standards to database design, implementation, and maintenance.
• has a technical component: interfacing with and facilitating interaction between software and hardware.
• has a specific focus: creating and maintaining data to provide useful information.
• includes management of metadata artifacts that address the data's form as well as its content.
Although data management serves the organization, the organization often doesn't appreciate the value it provides. Some data management staffs keep ahead of the layoff curve by demonstrating positive business value. Management's short-term focus has often made it difficult to secure funding for medium- and long-term data management investments. Tracing the discipline's efforts to direct and indirect organizational benefits has been difficult, so it hasn't been easy to present an articulate business case to management that justifies subsequent strategic investments in data management.

Viewing data management as a collection of processes, each with a role that provides value to the organization through data, makes it easier to trace value through those processes and point not only to a methodological "why" of data management practice improvement but also to a specific, concrete "how."
RESEARCH BASIS

Mark Gillenson has published three papers that serve as an excellent background to this research [5-7]. Like earlier works, Gillenson focuses on the implementation half of Figure 1, adopting a narrower definition of data administration. Over time, his work paints a picture of an industry attempting to catch up with technological implementation. Our work here updates and confirms his basic conclusions while changing the focus from whether a process is performed to the maturity with which it is performed.
Three other works also influenced our research: Ralph Keeney's value-focused thinking [8], Richard Nolan's six-stage theory of data processing [9], and the Capability Maturity Model Integration (CMMI) [10,11].

Keeney's value-focused thinking provides a methodological approach to analyzing and evaluating the various aspects of data management and their associated key process areas. We wove the concepts behind means and fundamental objectives into our assessment's construction to connect how we measure data management with what customers require from it.

In Stage VI of his six-stage theory of data processing, Nolan defined maturity as data resource management. Although Nolan's theory predates and is similar to the CMMI, it contains several ideas that we adapted and reused in the larger data management context. However, CMMI refinement remains our primary influence.

Most technologists are familiar with the CMM (and its upgrade to the CMMI), developed at Carnegie Mellon's Software Engineering Institute with assistance from the MITRE Corporation [10,11]. The CMMI itself was derived from work that Ron Radice and Watts Humphrey performed while at IBM. Dennis Goldenson and Diane Gibson presented results pointing to a link between CMMI process maturity and organizational success [12]. In addition, Cyndy Billings and Jeanie Clifton demonstrated the long-term effects for organizations that successfully sustain process improvement for more than a decade [13].
CMMI-based maturity models exist for human resources, security, training, and several other areas of the software-related development process. Our colleague Brett Champlin contributed a list of dozens of maturity measurements derived from or influenced by the CMMI. This list includes maturity measurement frameworks for data warehousing, metadata management, and software systems deployment. The CMMI's successful adoption in other areas encouraged us to use it as the basis for our data management practice assessment.

Figure 2. Data management's growth over time. The discipline has expanded from an initial focus on database development and operation in the 1950s to 1970s to include additional responsibilities in the periods 1970-1990, 1990-2000, and from 2000 to the present. Scope by era: 1950-1970, database development and operation; 1970-1990, data requirements analysis and data modeling; 1990-2000, enterprise data management coordination, integration, stewardship, and use; 2000 to present, an explicit focus on data quality throughout, security, compliance, and other responsibilities.
Whereas the core ideas behind the CMMI present a reasonable base for data management practice maturity measurement, we can avoid some potential pitfalls by learning from the revisions and later work done with the CMMI. Examples of such improvements include general changes to how the CMMI makes interrelationships between process areas more explicit and how it presents results to a target organization.

Work by Cynthia Hauer [14] and Walter Schnider and Klaus Schwinn [15] also influenced our general approach to a data management maturity model. Hauer nicely articulated some examples of the value determination factors and results criteria that we have adopted. Schnider and Schwinn presented a rough but inspirational outline of what mature data management practices might look like and the accompanying motivations.
RESEARCH OBJECTIVES

Our research had six specific objectives, which we grouped into two types: community descriptive goals and self-improvement goals.

Community descriptive research goals help clarify our understanding of the data management community and associated practices. Specifically, we want to understand

• the range of practices within the data management community;
• the distribution of data management practices, specifically the various stages of organizational data management maturity; and
• the current state of data management practices—in what areas are the community data management practices weak, average, and strong?

Self-improvement research goals help the community as a whole improve its collective data management practices. Here, we desire to

• better understand what defines current data management practices;
• determine how the assessment informs our standing as a technical community (specifically, how does data management compare to software development?); and
• gain information useful for developing a roadmap for improving current practice.

The CMMI's stated goals are almost identical to ours: "[The CMMI] was designed to help developers select process-improvement strategies by determining their current process maturity and identifying the most critical issues to improving their software quality and process." [10] Similarly, our goal was to aid data management practice improvement by presenting a scale for measuring data management accomplishments. Our assessment results can help data managers identify and implement process improvement strategies by recognizing their data management challenges.
DATA COLLECTION PROCESS AND RESEARCH TARGETS

Between 2000 and 2006, we assessed the data management practices of 175 organizations. Table 2 provides a breakdown of organization types.

Students from some of our graduate and advanced undergraduate classes largely conducted the assessments. We provided detailed assessment instruction as part of the course work. Assessors used structured telephone and in-person interviews to assess specific organizational data management practices by soliciting evidence of processes, products, and common features. Key concepts sought included the presence of commitments, abilities, measurements, verification, and governance.

Assessors conducted the interviews with the person identified as having the best, firsthand knowledge of organizational data management practices. Tracking down these individuals required much legwork; identifying these individuals was often more difficult than securing the interview commitment.

The assessors attempted to locate evidence in the organization indicating the existence of key process areas within specific data management practices. During the evaluation, assessors observed strict confidentiality—they reported only compiled results, with no mention of specific organizations, individuals, groups, programs, or projects. Assessors and participants kept all information to themselves and observed proprietary rights, including several nondisclosure agreements.

All organizations implement their data management practice in ways that can be classified as one of five maturity model levels, detailed in Table 3. Specific evidence, organized by maturity level, helped identify the level of data management practiced.
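Because the five levels form an ordered scale, they map naturally onto a small ordered type. A minimal Python sketch (the class and member names are illustrative assumptions, not part of the assessment instrument):

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    """The five data management practice assessment levels
    (detailed in Table 3). IntEnum keeps the levels ordered
    and numerically comparable."""
    INITIAL = 1
    REPEATABLE = 2
    DEFINED = 3
    MANAGED = 4
    OPTIMIZING = 5

# Levels compare numerically, so "at least Defined" checks are direct:
print(MaturityLevel.DEFINED >= MaturityLevel.REPEATABLE)  # -> True
```

An ordered type like this makes threshold checks (for example, whether a practice area has reached a defined level) explicit rather than relying on bare integers.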
Table 2. Organizations included in data management analysis, by type.
• Local government: 4 percent
• State government: 17 percent
• Federal government: 11 percent
• International organization: 10 percent
• Commercial organization: 58 percent
For each data management process, the assessment used between four and six objective criteria to probe for evidence. Assessed outside the data collection process, the presence or absence of this evidence indicated organizational performance at a corresponding maturity level.

Table 3. Data management practice assessment levels.
• Level 1, Initial. Practice: the organization lacks the necessary processes for sustaining data management practices; data management is characterized as ad hoc or chaotic. Quality and results predictability: the organization depends entirely on individuals, with little or no corporate visibility into cost or performance, or even awareness of data management practices. There is variable quality, low results predictability, and little to no repeatability.
• Level 2, Repeatable. Practice: the organization might know where data management expertise exists internally and has some ability to duplicate good practices and successes. Quality and results predictability: the organization exhibits variable quality with some predictability; the best individuals are assigned to critical projects to reduce risk and improve results.
• Level 3, Defined. Practice: the organization uses a set of defined processes, which are published for recommended use. Quality and results predictability: good quality results within expected tolerances most of the time; the poorest individual performers improve toward the best performers, and the best performers achieve more leverage.
• Level 4, Managed. Practice: the organization statistically forecasts and directs data management, based on defined processes, selected cost, schedule, and customer satisfaction levels; the use of defined data management processes within the organization is required and monitored. Quality and results predictability: reliability and predictability of results, such as the ability to determine progress or six sigma versus three sigma measurability, is significantly improved.
• Level 5, Optimizing. Practice: the organization analyzes existing data management processes to determine whether they can be improved, makes changes in a controlled fashion, and reduces operating costs by improving current process performance or by introducing innovative services to maintain its competitive edge. Quality and results predictability: the organization achieves high levels of results certainty.

ASSESSMENT RESULTS

The assessment results reported for the various practice areas show that overall scores are repeatable (level 2) in all data management practice areas.

Figure 3 shows assessment averages of the individual response scores. We used a composite chart to group the averages by practice area. Such groupings facilitate numerous comparisons, which organizations can use to plan improvements to their data management practices. We present sample results (blue) for an assessed organization (disguised as "Mystery Airline"), whose management was interested in not only how the organization scored but also how it compared to other assessed airlines (red) and other organizations (white).

We grouped 19 individual responses according to the five data management maturity levels in the horizontal bar charts. Most numbers are averages. That is, for an individual organization, we surveyed multiple data management operations, combined the individual assessment results, and presented them as averages. We reported assessments of organizations with only one data management function as integers.

For example, the data program coordination practice area results include:

• Mystery Airline achieved level 1 on responses 1, 2, and 5, and level 2 on responses 3 and 4.
• The airline industry performed above both Mystery Airline and all respondents on responses 1 through 3.
• The airline industry performed below both Mystery Airline and all respondents on response 4, and Mystery Airline performed well below all respondents and just those in the airline industry on response 5.

Figure 3f illustrates the range of results for all organizations surveyed for each data management process—for example, the assessment results for data program coordination ranged from 2.06 to 3.31.

The maturity measurement framework dictates that a data program can achieve no greater rating than the lowest rating achieved—hence the translation to the scores for Mystery Airline of 1, 2, 2, 2, and 2 combining for an overall rating of 1. This is congruent with CMMI application.

Although this might seem a tough standard, the rating reflects the adage that a chain is only as strong as its weakest link. Mature data management programs can't rely on immature or ad hoc processes in related areas. The lowest rating received becomes the highest possible overall rating. This also explains why many organizations are at level 1 with regard to their software development practices. While the CMMI process results in a single overall rating for the organization, data management requires a more fine-grained feedback mechanism. Knowing that some data management processes perform better than others can help an organization develop incentives as well as a roadmap for improving individual ratings.
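The weakest-link rule is simple enough to state directly in code. The function name and sample scores below are illustrative; only the rule itself (overall rating equals the minimum response-level rating) comes from the framework as described.

```python
def overall_rating(response_ratings):
    """A practice area's overall rating is capped at the lowest
    response-level rating achieved (the weakest-link rule)."""
    return min(response_ratings)

# Mystery Airline's data program coordination response scores,
# as given in the text:
mystery_airline = [1, 2, 2, 2, 2]
print(overall_rating(mystery_airline))  # -> 1
```

Even four level-2 responses cannot lift the overall rating above the single level-1 response, which is why mature programs must raise every related process, not just most of them.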
Taken as a whole, these numbers show that no data management process or subprocess measured on average higher than the data program coordination process, at 3.31. It's also the only data management process that performed on average at a defined level (greater than 3). The results show a community that is approaching the ability to repeat its processes across all of data management.
Results analysis

Perhaps the most important general fact represented in Figure 3 is that organizations gave themselves relatively low scores. The assessment results are based on self-reporting and, although our 15-percent validation sample is adequate to verify accurate industry-wide assessment results, 85 percent of the assessment is based on facts that were described but not observed. Although direct observables for all survey respondents would have provided valuable confirming evidence, the cost of such a survey and the required organizational access would have been prohibitive.

We held in-person, follow-up assessment validation sessions with about 15 percent of the assessed organizations. These sessions helped us validate the collection method and refine the technique. They also let us gauge the assessments' accuracy.
Figure 3. Assessment results useful to Mystery Airline: (a) data program coordination, (b) enterprise data integration, (c) data stewardship, (d) data development, (e) data support operations, and (f) assessments range. Horizontal bar charts compare Mystery Airline, the airline industry, and all respondents across responses 1 through 19.
Although the assessors strove to accurately measure each subprocess's maturity level, some interviews inevitably were skewed toward the positive end of the scale. This occurred most often because interviewees reported on milestones that they wanted to or would soon achieve as opposed to what they had achieved. We suspected, and confirmed during the validation sessions, that responses were typically exaggerated by one point on the five-point scale.

When we factor in the one-point inflation, the numbers in Table 4 become important. Knowing that the bar is so low will hopefully inspire some organizations to invest in data management. Doing so might give them a strategic advantage if the competition is unlikely to be making a similar investment.
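The one-point correction is a straightforward shift of the raw averages. A sketch, in which the helper name and the sample raw averages are illustrative assumptions (the paper reports only the adjusted figures, in Table 4):

```python
def adjust_for_inflation(raw_averages, inflation=1.0):
    """Subtract the estimated self-reporting inflation (about one
    point on the five-point scale), flooring at zero so no adjusted
    score goes negative."""
    return [max(avg - inflation, 0.0) for avg in raw_averages]

# Hypothetical raw averages; subtracting one point yields figures of
# the same magnitude as the adjusted scores reported in Table 4.
print(adjust_for_inflation([2.72, 2.57, 2.06]))
```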
The relatively low scores reinforce the need for this data management assessment. Based on the overall scores in the data management practice areas, the community receives five Ds. These areas provide immediate targets for future data management investment.
WHERE ARE WE NOW?
We address our original research objectives according to our two goal categories.

Community descriptive research goals

First, we wanted to determine the range of practices within the data management community. A wide range of such practices exists. Some organizations are strong in some data management practices and weak in others (the range of practice is consistently inconsistent). The wide divergence of practices both within and between organizations can dilute results from otherwise strong data management programs. The assessment's applicability to longitudinal studies remains to be seen; this is an area for follow-up research. Although researchers might undertake formal studies of such trends in the future, evidence from ongoing assessments suggests that results are converging. Consequently, we feel that our sample constitutes a representation of community-wide data management practices.
Next, we wanted to know whether the distribution of practices informs us specifically about the various stages of organizational data management maturity. The assessment results confirm the framework's utility, as do the postassessment validation sessions. Building on the framework, we were able to specify target characteristics and objective measurements. We now have better information as to what comprises the various stages of organizational data management practice maturity. Organizations do clump together into the various maturity stages that Nolan originally described. We can now determine the investments required to predictably move organizations from one data management maturity level to another.
Finally, we wanted to determine in what areas the community data management practices are weak, average, and strong. Figure 4 shows an average of unadjusted rates summarizing the assessment results. As the figure shows, the data management community reports itself relatively and perhaps surprisingly strong in all five major data management processes when compared to the industry averages for software development. The range and averages indicate that the data management community has more mature data program coordination processes, followed by organizational data integration, support operations, stewardship, and then data development. The relatively lower data development scores might suggest data program coordination implementation difficulties.
Self-improvement research goals

Our first objective was to produce results that would help the community better understand current best practices. Organizations can use the assessment results to compare their specific performance against others in their industry and against the community results as a whole. Quantities and groupings indicate the relative state and robustness of the best practices within each process. Future research can use this information to identify specific practices that can be shared with the community. Further study of these areas will provide leverageable benefits.

Table 4. Assessment scores adjusted for self-reporting inflation.
• Response 1: 1.72388
• Response 2: 1.57463
• Response 3: 1.0597
• Response 4: 1.8806
• Response 5: 2.31343
• Response 6: 1.66418
• Response 7: 1.33582
• Response 8: 1.57463
• Response 9: 1.1791
• Response 10-a: 1.40299
• Response 10-b: 1.14925
• Response 10-c: 0.97761
• Response 10-d: 1.20896
• Response 10-e: 1.23134
• Response 10-f: 1.12687
• Response 11: 1.32836
• Response 12: 0.57463
• Response 13: 1.00746
• Response 14: 1.46269
• Response 15: 1.24627
• Response 16: 1.65672
• Response 17: 1.66418
• Response 18: 1.04478
• Response 19: 1.17164
Next, we wanted to determine how the assessment informs our standing as a technical community. Our research gives some indication of the claimed current state of data management practices. However, given the validation session results, we believe that it's best to caution readers that the numbers presented probably more accurately describe the intended state of the data management community.

As it turns out, the relative number of organizations above level 1 for both software and data management are approximately the same, but a more detailed analysis would be helpful. Given the belief that investment in software development practices will result in significant improvements, it's appropriate to anticipate similar benefits from investments in data management practices.
Finally, we hoped to gain information useful for developing a roadmap for improving current practice. Organizations can use the survey assessment information to develop roadmaps to improve their individual data management practices. Mystery Airline, for example, could develop a roadmap for achieving data management improvement by focusing on enterprise data integration, data stewardship, and data development practices.
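Picking roadmap targets amounts to ranking practice areas from weakest to strongest. A sketch of that step, using the community lows from Figure 4 purely as sample input (an organization would substitute its own assessed scores, as Mystery Airline would):

```python
# Rank practice areas by assessed score to pick roadmap targets.
# Scores here are the low end of Figure 4's community ranges,
# used only as illustrative input.
scores = {
    "Data program coordination": 2.06,
    "Enterprise data integration": 2.18,
    "Data stewardship": 1.98,
    "Data development": 1.57,
    "Data support operations": 2.04,
}

roadmap = sorted(scores, key=scores.get)  # weakest areas first
print(roadmap[:3])  # the three areas most in need of investment
```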
SUGGESTIONS FOR FUTURE RESEARCH

Additional research must include a look at relationships between data management practice areas, which could indicate an efficient path to higher maturity levels. Research should also explore the success or failure of previous attempts to raise the maturity levels of organizational data management practices.

One of our goals was to determine why so many organizational data management practices are below expectations. Several current theses could spur investigation of the root causes of poor data management practices. For example,

• Are poor data management practices a result of the organization's lack of understanding?
• Does data management have a poor reputation or track record in the organization?
• Are the executive sponsors capable of understanding the subject?
• How have personnel and project changes affected the organization's efforts?

Our assessment results suggest a need for a more formalized feedback loop that organizations can use to improve their data management practices. Organizations can use this data as a baseline from which to look for, describe, and measure improvements in the state of the practice. Such information can enhance their understanding of the relative development of organizational data management. Other investigations should probe further to see if patterns exist for specific industry or business focus types.
Building an effective business case for achieving a certain level of data management is now easier. The failure to adequately address enterprise-level data needs has hobbled past efforts [4]. Data management has, at best, a business-area focus rather than an enterprise outlook. Likewise, applications development focuses almost exclusively on line-of-business needs, with little attention to cross-business-line data integration or enterprise-wide planning, analysis, and decision needs (other than within personnel, finance, and facilities management). In addition, data management staff is inexperienced in modern data management needs, focusing on data management rather than metadata management and on syntaxes instead of semantics and data usage.
Few organizations manage data as an asset. Instead, most consider data management a maintenance cost. A small shift in perception (from viewing data as a cost to regarding it as an asset) can dramatically change how an organization manages data. Properly managed data is an organizational asset that can’t be exhausted. Although data can be polluted, retired, destroyed, or become obsolete, it’s the one organizational resource that can be repeatedly reused without deterioration, provided that the appropriate safeguards are in place. Further, all organizational activities depend on data.
To illustrate the potential payoff of the work presented here, consider what 300 software professionals applying software process improvement over an 18-year period achieved:16
• They predicted costs within 10 percent.
• They missed only one deadline in 15 years.
• The relative cost to fix a defect is 1X during inspection, 13X during system testing, and 92X during operation.
Figure 4. Average of unadjusted rates for the assessment results, by process.

Process | Initial | Repeatable | Defined
Data program coordination | 2.06 | 2.71 | 3.31
Enterprise data integration | 2.18 | 2.44 | 2.66
Data stewardship | 1.98 | 2.18 | 2.40
Data development | 1.57 | 2.12 | 2.46
Data support operations | 2.04 | 2.38 | 2.66
• Early error detection rose from 45 to 95 percent
between 1982 and 1993.
• Product error rate (measured as defects per 1,000
lines of code) dropped from 2.0 to 0.01 between
1982 and 1993.
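As an illustration (not from the article: the defect counts below are invented), the leverage in the 1X/13X/92X fix-cost multipliers can be sketched:

```python
# Hypothetical illustration of the 1X/13X/92X relative defect-fix costs
# cited above; the defect counts are invented for this example.

COST_MULTIPLIER = {"inspection": 1, "system_testing": 13, "operation": 92}

def total_fix_cost(defects_found):
    """Relative cost of fixing defects, weighted by the phase in which each is found."""
    return sum(COST_MULTIPLIER[phase] * count
               for phase, count in defects_found.items())

# The same 100 defects, caught mostly late vs. mostly early:
late = total_fix_cost({"inspection": 20, "system_testing": 30, "operation": 50})
early = total_fix_cost({"inspection": 80, "system_testing": 15, "operation": 5})
print(late, early)  # → 5010 735
```

Even with these invented counts, shifting detection earlier cuts the relative fix cost by nearly a factor of seven, which is the intuition behind the early-detection gains listed above.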
If improvements in data management can produce
similar results, organizations should increase their matu-
rity efforts. ■
Acknowledgments
We thank Graham Blevins, David Rafner, and Santa Susarapu for their assistance in preparing some of the reported data. We are greatly indebted to many of Peter Aiken’s classes in data reengineering and related topics at Virginia Commonwealth University for the careful work and excellent results obtained as a result of their various contributions to this research. This article also benefited from the suggestions of several anonymous reviewers. We also acknowledge the helpful, continuing work of Brett Chaplin at Allstate in collecting, applying, and assessing CMMI-related efforts.
References
1. M. Blaha, “A Retrospective on Industrial Database Reverse Engineering Projects—Parts 1 & 2,” Proc. 8th Working Conf. Reverse Eng., IEEE Press, 2001, pp. 147-164.
2. J. Zachman, “A Framework for Information Systems Architecture,” IBM Systems J., vol. 26, 1987, pp. 276-292.
3. P.H. Aiken, “Keynote Address to the 2002 DAMA International Conference: Trends in Metadata,” Proc. 2002 DAMA Int’l/Metadata Conf., CD-ROM, Wilshire Conf., 2002, pp. 1-32.
4. B. Parker, “Enterprise Data Management Process Maturity,” Handbook of Data Management, S. Purba, ed., Auerbach Publications, CRC Press, 1999, pp. 824-843.
5. M. Gillenson, “The State of Practice of Data Administration—1981,” Comm. ACM, vol. 25, no. 10, 1982, pp. 699-706.
6. M. Gillenson, “Trends in Data Administration,” MIS Quarterly, Dec. 1985, pp. 317-325.
7. M. Gillenson, “Database Administration at the Crossroads: The Era of End-User-Oriented, Decentralized Data Processing,” J. Database Administration, Fall 1991, pp. 1-11.
8. R.L. Keeney, Value-Focused Thinking—A Path to Creative Decisionmaking, Harvard Univ. Press, 1992.
9. R. Nolan, “Managing the Crisis in Data Processing,” Harvard Business Rev., Mar./Apr. 1979, pp. 115-126.
10. Carnegie Mellon Univ. Software Eng. Inst., Capability Maturity Model: Guidelines for Improving the Software Process, 1st ed., Addison-Wesley Professional, 1995.
11. M.C. Paulk and B. Curtis, “Capability Maturity Model, Version 1.1,” IEEE Software, vol. 10, 1993, pp. 18-28.
12. D.R. Goldenson and D.L. Gibson, “Demonstrating the Impact and Benefits of CMM: An Update and Preliminary Results,” special report CMU/SEI-2003-SR-009, Carnegie Mellon Univ. Software Eng. Inst., 2003, pp. 1-55.
13. C. Billings and J. Clifton, “Journey to a Mature Software Process,” IBM Systems J., vol. 33, 1994, pp. 46-62.
14. C.C. Hauer, “Data Management and the CMM/CMMI: Translating Capability Maturity Models to Organizational Functions,” presented at National Defense Industrial Assoc. Technical Information Division Symp., 2003; www.dtic.mil/ndia/2003technical/hauer1.ppt.
15. W. Schnider and K. Schwinn, “Der Reifegrad des Datenmanagements” [The Data Management Maturity Model], KPP Consulting; www.kpp-consulting.ch/downloadbereich/DM%20Maturity%20Model.pdf, 2004 (in German).
16. H. Krasner, J. Pyles, and H. Wohlwend, “A Case History of the Space Shuttle Onboard Systems Project,” Technology Transfer 94092551A-TR, Sematech, 31 Oct. 1994.
Peter Aiken is an associate professor of information systems at Virginia Commonwealth University and founding director of Data Blueprint. His research interests include data and systems reengineering. Aiken received a PhD in information technology from George Mason University. He is a senior member of the IEEE, the ACM, and the Data Management Association (DAMA) International. Contact him at peter@datablueprint.com.
M. David Allen is chief operating officer of Data Blueprint. His research interests include data and systems reengineering. Allen received an MS in information systems from Virginia Commonwealth University. He is a member of DAMA. Contact him at mda@datablueprint.com.
Burt Parker is an independent consultant based in Washington, DC. His technical interests include enterprise data management program development. Parker received an MBA in operations research/systems analysis (general systems theory) from the University of Michigan. He is a member of DAMA. Contact him at parkerbg@comcast.net.
Angela Mattia is a professor of information systems at J. Sergeant Reynolds Community College. Her research interests include data and systems reengineering and maturity models. Mattia received an MS in information systems from Virginia Commonwealth University. She is a member of DAMA. Contact her at amattia@jsr.vccs.edu.
Four Current Data Truths
1. Data volume is still increasing faster than we are able to process it,
2. Data interchange overhead and other costs of poor data practices are measurably sapping organization and individual resources–and therefore productivity,
3. Reliance on existing technology-based approaches and education methods has not materially addressed this gap between creation and processing or reduced bottom line costs, &
4. There exists an industry-type whose sole purpose is to extract data from citizens and then use it to make money.
© Copyright 2022 by Peter Aiken Slide # 3
https://anythingawesome.com
Doing Data Better means that you
• Understand that the vastness and quality of data plays an increasing role in everyone’s life
• Are motivated to increase your individual data skills because you now know that poor data skills:
– Cost you more
– Steal increasing amounts of your time
– Deliver less
– Present greater risk
• Recognize the critical importance of data management in modern life and its positive and negative applications
• Develop defensive skills to differentiate between good and bad data (understanding that most data is of unknown quality)
• Can assign values to some of your personal data and its use
• Are able to take advantage of decreases in the general workload needed to effectively manage data in your professional and personal life
© Copyright 2022 by Peter Aiken Slide # 4
https://anythingawesome.com
© Copyright 2022 by Peter Aiken Slide # 5
https://anythingawesome.com
© Copyright 2022 by Peter Aiken Slide #
https://anythingawesome.com
Program
6
• Motivation
- Frustration–we are unsatisfied with current state
- Are we making progress? (No)
• How did we get here? (Building on proven research)
- DoD ➜ SEI ➜ MITRE ➜ CMMI
- Industry push for best practices
• Ingredients
- What is the Data Maturity Model? (DMM)
- Body of Knowledge (DM BOK)
• Understanding and applying them together
- Weakest link in the chain architecture
- Just a bit on strategy
- Three legged stool
- How does one get to Carnegie Hall?
• Where to next?
• Q & A?
1 + 1 = 11
Practicing
Data Management
Better
Measures of Unproductivity
Knowledge Worker Stress
• 33% of time spent reworking/recreating knowledge that already exists!
• 10% of time spent creating new knowledge and content
• 53% would rather do household chores
• 52% would rather pay bills than use content management/repositories
• 74% report feeling overwhelmed or unhappy when working with data
• 33% of overwhelmed employees spend at least one hour a week procrastinating over data-related tasks
© Copyright 2022 by Peter Aiken Slide # 7
https://anythingawesome.com
https://medium.com/interoperable/knowledge-workers-information-life-cycles-and-content-silos-oh-my-a4263eed427
Measurements
Everyone
• 14% have a good understanding of how to use business data
• 21% aged 16-24 classified themselves as data literate
Conclusion: future employees are underprepared for data-driven workplaces
Business decision makers
• 24% of business decision makers feel fully confident in their ability to read, work with, analyze, and argue with data
• 33% are able to create measurable value from data
• 27% say their analytics projects produce actionable insights
• 78% are willing to invest time/energy improving data skillsets
© Copyright 2022 by Peter Aiken Slide # 8
https://anythingawesome.com
http://TheDataLiteracyProject.org
When asked to incorporate data
• Data appreciation isn't translating into employee adoption
– 48% frequently make gut decisions
– 66% for C-suite executives
• Lack of data skills is limiting workplace productivity
– 36% said they would find an alternative method to complete the task without using data
– 14% avoid the task entirely
© Copyright 2022 by Peter Aiken Slide # 9
https://anythingawesome.com
http://TheDataLiteracyProject.org
[Chart: Incorporate Data 50% | Find Alt. Method 36% | Avoid Task Entirely 14%]
Too many organizations have simply put data in the hands of employees and expected them to make a success of it
[Charts: all employees – Incorporate Data 52% vs. Defer to Gut 48%; C-suite – Incorporate Data 34% vs. Defer to Gut 66%]
Why weren't my data problems solved when we moved to the cloud?
© Copyright 2022 by Peter Aiken Slide # 10
https://anythingawesome.com
Data in the cloud should have three attributes that data outside the cloud/warehouse does not have. It should be:
• Smaller
• Cleaner
• Sharable-er
© Copyright 2022 by Peter Aiken Slide # 11
https://anythingawesome.com
Transform
Problems with forklifting
1. no basis for decisions made
2. no inclusion of architecture/
engineering concepts
3. no idea that these
concepts are missing
from the process
4. 80% of
organizational
data is ROT
© Copyright 2022 by Peter Aiken Slide # 12
https://anythingawesome.com
Less / Cleaner / More shareable ... data
Making Warehousing Successful —Cloud— Data Branding
Fixing Data in the Cloud is Like Using A Glovebox
© Copyright 2022 by Peter Aiken Slide # 13
https://anythingawesome.com
Success Requires a 3-Legged Stool
© Copyright 2022 by Peter Aiken Slide # 14
https://anythingawesome.com
People, Process, Technology
Current approaches are not and have not been working
© Copyright 2022 by Peter Aiken Slide # 15
https://anythingawesome.com
Driving Innovation with Data – 49% yes
Competing on data and analytics – 41% yes
Managing data as a business asset – 39% yes
Created a data-driven organization – 24% yes
Forged a data culture – 24% yes
Source: Big Data and AI Executive Survey 2021 by Randy Bean and Thomas Davenport www.newvantage.com
2021
[Chart: people/process 92.2% vs. technology 7.8%]
80% of data challenges are people/process based!
Data Leadership (Data Management & Data Governance Program) is the only resource to address these challenges
External Comprehension
© Copyright 2022 by Peter Aiken Slide # 16
https://anythingawesome.com
Sample from: https://artist.com/kathy-linden/on-outside-looking-in/?artid=4385
Everything Else Data
Data (blah blah blah)
Data Program
Most do not appreciate the
difference between Data
Governance and the other data
stuff that needs to be done
© Copyright 2022 by Peter Aiken Slide #
https://anythingawesome.com
Program
17
• Motivation
- Frustration–we are unsatisfied with current state
- Are we making progress? (No)
• How did we get here? (Building on proven research)
- DoD ➜ SEI ➜ MITRE ➜ CMMI
- Industry push for best practices
• Ingredients
- What is the Data Maturity Model? (DMM)
- Body of Knowledge (DM BOK)
• Understanding and applying them together
- Weakest link in the chain architecture
- Just a bit on strategy
- Three legged stool
- How does one get to Carnegie Hall?
• Where to next?
• Q & A?
1 + 1 = 11
Practicing
Data Management
Better
Motivation
© Copyright 2022 by Peter Aiken Slide #
"One day Alice came to a fork in the road and
saw a Cheshire cat in a tree. Which road do I
take? she asked. Where do you want to go? was
his response. I don't know, Alice answered.
Then, said the cat, it doesn't matter."
Lewis Carroll from Alice in Wonderland
18
https://anythingawesome.com
• "We want to move our data management
program to the next level"
– Question: What level are you at now?
• You are currently managing your data,
– But, if you can't measure it,
– How can you manage it effectively?
• How do you know where to put time,
money, and energy so that data
management best supports the mission?
DoD Origins
• US DoD Reverse Engineering Program Manager
• We sponsored research at the CMM/SEI asking
– “How can we measure the performance of DoD and our partners?”
– “Go check out what the Navy is up to!”
• SEI responded with an integrated process/data improvement
approach
– DoD required SEI to remove the data portion of the approach
– It grew into CMMI/DM BoK, etc.
© Copyright 2022 by Peter Aiken Slide # 19
https://anythingawesome.com
MITRE Corporation: Data Management Maturity Model
• Internal research project: Oct ‘94-Sept ‘95
• Based on Software Engineering Institute Capability Maturity
Model (SEI CMMSM) for Software Development Projects
• Key Process Areas (KPAs) parallel SEI CMMSM KPAs, but
with data management focus and key practices
• Normative model for data management required; need to:
– Understand scope of data management
– Organize data management key practices
• Reported as not-done-well by those who do it
Acknowledgements
© Copyright 2022 by Peter Aiken Slide #
Measuring Data Management Practice Maturity: A Community’s Self-Assessment
Increasing data management practice maturity levels can positively impact the coordination of data flow among organizations, individuals, and systems. Results from a self-assessment provide a roadmap for improving organizational data management practices.
Peter Aiken, Virginia Commonwealth University/Institute for Data Research
M. David Allen, Data Blueprint
Burt Parker, Independent consultant
Angela Mattia, J. Sergeant Reynolds Community College
20
https://anythingawesome.com
CMMI Institute Background
• Evolved from Carnegie Mellon’s Software Engineering Institute
(SEI) - a federally funded research and development center
(FFRDC)
• Continues to support and provide all CMMI offerings and services
delivered over its 20+ year history at the SEI
– Industry leading reference models - benchmarks and guidelines for improvement
➜ Development
➜ Acquisition
➜ Services
➜ People
➜ Data Management
– Training and Certification program, Partner program
• Dedicated training, partner and certification teams to support
organizations and professionals
• Now owned by ISACA (CISO/M, COBIT, IT Governance,
Cybersecurity) and joint product offerings are planned
© Copyright 2022 by Peter Aiken Slide # 21
https://anythingawesome.com
Melanie Mecca
• Former CMMI Institute/Director of Data Management Products and
Services ➜ datawise.inc
• 30+ years designing and
implementing strategies and
solutions for private/public sectors
• Architecture/Design experience in:
– Data Management Programs
– Enterprise Data Architecture
– Enterprise Architecture
• DMM's Managing Author
Certified Partner, CMMI Institute
– melanie@datawise-inc.com
© Copyright 2022 by Peter Aiken Slide # 22
https://anythingawesome.com
Data Management Maturity (DMM)SM Model
• DMM 1.0 released August 2014
– 3.5 years in development
– Sponsors – Microsoft, Lockheed Martin,
Booz Allen Hamilton
– 50+ contributing authors, 70+ peer
reviewers, 80+ orgs
• Reference model framework of
fundamental best practices
– 414 specific practice statements
– 596 functional work products
– Maturity practices
• Measurement instrument for
organizations to evaluate
capabilities and maturity, identify
gaps, and incorporate guidelines
for improvements.
© Copyright 2022 by Peter Aiken Slide # 23
https://anythingawesome.com
DMM Structure
© Copyright 2022 by Peter Aiken Slide #
Core Category
Process Area
Purpose
Introductory Notes
Goal(s) of the Process Area
Core Questions for the Process Area
Functional Practices (Levels 1-5)
Related Process Areas
Example Work Products
Infrastructure Support Practices
Explanatory Model Components (Required for Model Compliance)
24
https://anythingawesome.com
“You Are What You DO”
• Model emphasizes behavior
– Proactive positive behavioral
changes
– Creating and carrying out
effective, repeatable processes
– Leveraging and extending across
the organization
• Activities result in work
products
– Processes, standards, guidelines,
templates, policies, etc.
– Reuse and extension = maximum
value, lower costs, happier staff
• Practical focus reflects real-
world organizations –
enterprise program evolving
to all hands on deck.
© Copyright 2022 by Peter Aiken Slide # 25
https://anythingawesome.com
Key Finding: Process Frameworks are not Created Equal
With the exception of CMM and ITIL, use of process-efficiency frameworks does not predict higher on-budget project delivery…
[Chart: Percentage of Projects on Budget, by Process Framework Adoption]
…while the same pattern generally holds true for on-time performance
[Chart: Percentage of Projects on Time, by Process Framework Adoption]
Source: Applications Executive Council, Applications Budget, Spend, and Performance Benchmarks: 2005 Member Survey Results, Washington D.C.: Corporate Executive Board 2006, p. 23.
© Copyright 2022 by Peter Aiken Slide # 26
https://anythingawesome.com
"While all improvement efforts begin
with the obligatory 'assessment'
phase, Carnegie Mellon’s CMMI and
DMM are the only proven
frameworks that have the added
benefit of literally decades of practice
and benchmarking data.
Organizations not using the DMM
risk an inability to meaningfully
compare results against other
organizations and, as a result, adopt
unproven methods."
© Copyright 2022 by Peter Aiken Slide # 27
https://anythingawesome.com
© Copyright 2022 by Peter Aiken Slide # 28
https://anythingawesome.com
Data Management Body of Knowledge (DM BoK V2) Practice Areas
from The DAMA Guide to the Data Management Body of Knowledge 2E © 2017 by DAMA International
Missing two important concepts:
Optionality
Dependency
© Copyright 2022 by Peter Aiken Slide #
https://anythingawesome.com
Program
29
• Motivation
- Frustration–we are unsatisfied with current state
- Are we making progress? (No)
• How did we get here? (Building on proven research)
- DoD ➜ SEI ➜ MITRE ➜ CMMI
- Industry push for best practices
• Ingredients
- What is the Data Maturity Model? (DMM)
- Body of Knowledge (DM BOK)
• Understanding and applying them together
- Weakest link in the chain architecture
- Just a bit on strategy
- Three legged stool
- How does one get to Carnegie Hall?
• Where to next?
• Q & A?
1 + 1 = 11
Practicing
Data Management
Better
Our barn had to pass a foundation inspection
• Before further construction could proceed
• It makes good business sense
• No IT equivalent
© Copyright 2022 by Peter Aiken Slide # 30
https://anythingawesome.com
You can accomplish Advanced Data Practices without becoming proficient in the Foundational Data Practices; however, this will:
• Take longer
• Cost more
• Deliver less
• Present greater risk
(with thanks to Tom DeMarco)
Data Management Practices Hierarchy
© Copyright 2022 by Peter Aiken Slide #
Advanced
Data
Practices
• MDM
• Mining
• Big Data
• Analytics
• Warehousing
• SOA
Foundational Data Practices
Data Platform/Architecture
Data Governance Data Quality
Data Operations
Data Management Strategy
Technologies / Capabilities
31
https://anythingawesome.com
• AI/ML
• Ops
• Mesh
• Crypto
© Copyright 2022 by Peter Aiken Slide #
Component Process Areas
Data Management Strategy: Data Management Goals; Corporate Culture; Data Management Funding; Data Requirements Lifecycle
Data Governance: Governance Management; Business Glossary; Metadata Management
Data Quality: Data Quality Framework; Data Quality Assurance
Data Operations: Standards and Procedures; Data Sourcing
Platform & Architecture: Architectural Framework; Platforms & Integration
Supporting Processes: Measurement & Analysis; Process Management; Process Quality Assurance; Risk Management; Configuration Management
[Diagram: Data Management Strategy – manage data coherently; Data Governance – manage data assets professionally; Data Quality – maintain fit-for-purpose data, efficiently and effectively; Data Operations and Platform Architecture – data life cycle management; Supporting Processes – organizational support]
32
https://anythingawesome.com
Data architecture
implementation
DMM℠ Structure of
5 Integrated
DM Practice Areas
© Copyright 2022 by Peter Aiken Slide #
[Diagram repeated: the five integrated DM practice areas, their goals, and data architecture implementation]
33
https://anythingawesome.com
[Chart: maturity ratings by practice area on the five-level scale (Initial, Managed, Defined, Measured, Optimized) – e.g., four areas at level 3 and one at level 1]
Your data foundation can only be as strong as its weakest link!
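The weakest-link rule can be sketched as a small computation (a hypothetical illustration: the practice areas come from the DMM, but the ratings below are invented):

```python
# Hypothetical sketch: overall data management maturity is limited by the
# weakest practice area. Ratings are invented for illustration only.

MATURITY_LEVELS = {1: "Initial", 2: "Managed", 3: "Defined",
                   4: "Measured", 5: "Optimized"}

ratings = {
    "Data Management Strategy": 3,
    "Data Governance": 3,
    "Data Quality": 3,
    "Data Operations": 3,
    "Platform & Architecture": 1,   # the weakest link
}

# The foundation is only as strong as its weakest link:
weakest_area = min(ratings, key=ratings.get)
foundation_level = ratings[weakest_area]
print(weakest_area, MATURITY_LEVELS[foundation_level])
```

Here four areas at level 3 do not produce a "level 3" foundation; the single level-1 area caps the whole program at Initial.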
© Copyright 2022 by Peter Aiken Slide #
Data Management Practice Areas
Data Management Strategy – DM is practiced as a coherent and coordinated set of activities
Data Quality – Delivery of data in support of organizational objectives; the currency of DM
Data Governance – Designating specific individuals as caretakers for certain data
Data Platform/Architecture – Efficient delivery of data via appropriate channels
Data Operations – Ensuring reliable access to data
Capability Maturity Model Levels (examples of practice maturity)
1 – Performed: Our DM practices are ad hoc and dependent upon "heroes" and heroic efforts
2 – Managed: We have DM experience and have the ability to implement disciplined processes
3 – Defined: We have standardized DM practices so that all in the organization can perform it with uniform quality
4 – Measured: We manage our DM processes so that the whole organization can follow our standard DM guidance
5 – Optimized: We have a process for improving our DM capabilities
https://anythingawesome.com
Assessment Components
Sample Assessment Summary
© Copyright 2022 by Peter Aiken Slide # 35
https://anythingawesome.com
Cumulative Benchmark – Multiple organizations
© Copyright 2022 by Peter Aiken Slide # 36
https://anythingawesome.com
Assessments
© Copyright 2022 by Peter Aiken Slide # 37
https://anythingawesome.com
• It is generally not worth a lot of investment to discover that you are at the very beginning of your journey
• Use it to uncover previously unknown pockets of excellence
• The first plan should be to examine the feasibility of expanding these to other parts of the organization
Industry Focused Results
© Copyright 2022 by Peter Aiken Slide # 38
https://anythingawesome.com
[Chart: Data Management Strategy, Data Governance, Platform & Architecture, Data Quality, and Data Operations rated Initial (I) through Optimized (V); Focus: Guidance and Facilitation vs. Implementation and Access]
• CMU's Software Engineering Institute (SEI) Collaboration
• Results from hundreds of organizations in various industries including:
✓ Public Companies
✓ State Government Agencies
✓ Federal Government
✓ International Organizations
• Defined industry standard
• Steps toward defining data management "state of the practice"
[Chart: Development guidance, Data Administration, Support systems, Asset recovery capability, and Development training rated 0-5 for Client, Industry Competition, and All Respondents]
Data Management Practices Assessment
© Copyright 2022 by Peter Aiken Slide #
[Chart: Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, and Data Support Operations, with several areas marked as challenges]
39
https://anythingawesome.com
High Marks for IFC's Audit
© Copyright 2022 by Peter Aiken Slide # 40
https://anythingawesome.com
[Chart: Leadership & Guidance, Asset Creation, Metadata Management, Quality Assurance, Change Management, and Data Quality rated 0-5 for TRE, ISG, IFC, Industry Benchmarks, and Overall Benchmarks]
[Chart: Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, and Data Support Operations rated 1-5; 2007 Maturity Levels vs. 2019 Maturity Levels]
Comparison of DM Maturity 2007-2019
© Copyright 2022 by Peter Aiken Slide # 41
https://anythingawesome.com
How Literate are we?
What is NAAL?
• a Nationally representative Assessment of English Literacy among American Adults age 16 and older
• NAAL ➜ PIAAC (Program for the International Assessment of Adult Competencies)
• PIAAC assesses three key competencies for 21st-century society and the global economy:
– Literacy: the ability to understand, use, and respond appropriately to written texts
– Numeracy: the ability to use basic mathematical and computational skills
– Digital Problem Solving: the ability to access/interpret information in digital environments to perform practical tasks
• No statistically significant differences from 2012/14 to 2017!
© Copyright 2022 by Peter Aiken Slide # 42
https://anythingawesome.com
https://nces.ed.gov/surveys/piaac/current_results.asp
Strategy Guides Workgroup Activities
© Copyright 2022 by Peter Aiken Slide #
A pattern
in a stream
of decisions
43
https://anythingawesome.com
Mintzberg
Theory of Constraints - Generic
© Copyright 2022 by Peter Aiken Slide # 44
https://anythingawesome.com
1. Identify the current constraints, the components of the system limiting goal realization
2. Alleviate: make quick improvements to the constraint using existing resources
3. Review other activities in the process to facilitate proper alignment and support of the constraint
4. If the constraint persists, identify other actions to eliminate the constraint
5. Repeat until the constraint is eliminated
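The generic cycle above can be sketched as a toy loop (hypothetical: the pipeline stages, capacities, and the `find_constraint`/`alleviate` helpers are invented for illustration):

```python
# Toy sketch of the Theory of Constraints cycle described above.
# Pipeline stages, capacities, and the improvement step are invented.

def find_constraint(capacities):
    """Identify the stage currently limiting throughput (the bottleneck)."""
    return min(capacities, key=capacities.get)

def alleviate(capacities, stage, boost=10):
    """Make a quick improvement to the constraint using existing resources."""
    capacities[stage] += boost

capacities = {"ingest": 50, "cleanse": 20, "integrate": 35, "deliver": 60}
target_throughput = 40

# Repeat until the constraint no longer limits goal realization:
while capacities[find_constraint(capacities)] < target_throughput:
    bottleneck = find_constraint(capacities)
    alleviate(capacities, bottleneck)  # other stages align to support it

print(min(capacities.values()) >= target_throughput)  # → True
```

Note the key property of the cycle: once one constraint is alleviated, another stage may become the new bottleneck, so the loop re-identifies the constraint on every pass rather than fixing a single stage once.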
Strategy Example 1
© Copyright 2022 by Peter Aiken Slide #
Good Guys
(Us)
Bad Guys
(Them)
45
https://anythingawesome.com
Strategy Example 2
© Copyright 2022 by Peter Aiken Slide #
Good Guys
(Us)
Bad Guys
(Them)
46
https://anythingawesome.com
Strategy Example 3
© Copyright 2022 by Peter Aiken Slide #
Good Guys
(Us)
Bad Guys
(Them)
47
https://anythingawesome.com
General Dwight D. Eisenhower
© Copyright 2022 by Peter Aiken Slide # 48
https://anythingawesome.com
“In preparing for battle I have always found that plans
are useless, but planning is indispensable …”
https://quoteinvestigator.com/2017/11/18/planning/
© Copyright 2022 by Peter Aiken Slide # 49
https://anythingawesome.com
Data
Sandwich
Data supply
Data literacy
Standard data
Standard data
Leverage point - high performance automation
Data supply
Data literacy
Standard data
This cannot happen without investments in data engineering and architecture!
Quality data engineering/architecture work products do not happen accidentally!
Version 1
Data Strategy
Data Governance
BI/Warehouse
Perfecting operations in 3 data management practice areas
1X
1X
1X
Metadata
Data Quality
Version 2
Data Strategy
Data Governance
BI/Warehouse
Perfecting operations in 3 data management practice areas
Metadata
2X
2X
1X
Version 3
Data Strategy
Data Governance
BI/Warehouse
Reference & Master Data
Perfecting operations in 3 data management practice areas
1X
3X
3X
(Things that further) Organizational Strategy
Lighthouse Projects Provide Focus
(Occasions to Practice) Needed Data Skills
(Opportunities to improve) Data use by the business
A Musical Analogy
https://www.youtube.com/watch?v=4n1GT-VjjVs&frags=pl%2Cwn
Program
• Motivation
- Frustration–we are dissatisfied with the current state
- Are we making progress? (No)
• How did we get here? (Building on proven research)
- DoD ➜ SEI ➜ MITRE ➜ CMMI
- Industry push for best practices
• Ingredients
- What is the Data Maturity Model? (DMM)
- Body of Knowledge (DM BOK)
• Understanding and applying them together
- Weakest link in the chain architecture
- Just a bit on strategy
- Three legged stool
- How does one get to Carnegie Hall?
• Where to next?
• Q & A?
1 + 1 = 11
Practicing Data Management Better
Change Management & Leadership
Diagnosing Organizational Readiness
adapted from the Managing Complex Change model by Lippitt, 1987
Culture is the biggest impediment to a shift in organizational thinking about data!
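The Managing Complex Change model referenced on the readiness slide is commonly summarized as a checklist: complex change requires vision, skills, incentives, resources, and an action plan, and whichever element is missing predicts the failure mode observed. A minimal sketch of that diagnosis, assuming the commonly circulated element-to-outcome mapping (this mapping comes from the model as usually cited, not from this deck):

```python
# Commonly cited mapping from the Managing Complex Change model:
# the element that is missing predicts the observed failure mode.
FAILURE_MODES = {
    "vision": "confusion",
    "skills": "anxiety",
    "incentives": "resistance",
    "resources": "frustration",
    "action plan": "false starts",
}

def diagnose(present):
    """Given the set of elements an organization has in place,
    return the predicted failure modes for the missing ones."""
    return [mode for element, mode in FAILURE_MODES.items()
            if element not in present]
```

Used as a readiness check, an empty result suggests the change effort has all five elements in place; any returned modes point at the element to shore up first.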
No cost, no registration case study download
• Download
– http://dl.acm.org/citation.cfm?doid=2888577.2893482
EXPERIENCE: Succeeding at Data Management—BigCo Attempts to
Leverage Data
PETER AIKEN, Virginia Commonwealth University/Data Blueprint
In a manner similar to most organizations, BigCompany (BigCo) was determined to benefit strategically from
its widely recognized and vast quantities of data. (U.S. government agencies make regular visits to BigCo to
learn from its experiences in this area.) When faced with an explosion in data volume, increases in complexity,
and a need to respond to changing conditions, BigCo struggled to respond using a traditional, information
technology (IT) project-based approach to address these challenges. As BigCo was not data knowledgeable,
it did not realize that traditional approaches could not work. Two full years into the initiative, BigCo was
far from achieving its initial goals. How much more time, money, and effort would be required before results
were achieved? Moreover, could the results be achieved in time to support a larger, critical, technology-driven
challenge that also depended on solving the data challenges? While these questions remain unaddressed,
these considerations increase our collective understanding of data assets as separate from IT projects.
Only by reconceiving data as a strategic asset can organizations begin to address these new challenges.
Transformation to a data-driven culture requires far more than technology, which remains just one of three
required “stool legs” (people and process being the other two). Seven prerequisites to effectively leveraging
data are necessary, but insufficient awareness exists in most organizations—hence, the widespread misfires
in these areas, especially when attempting to implement the so-called big data initiatives. Refocusing on
foundational data management practices is required for all organizations, regardless of their organizational
or data strategies.
Categories and Subject Descriptors: H.2.0 [Information Systems]: Database Management—General; E.0
[Data]: General
General Terms: Management, Performance, Design
Additional Key Words and Phrases: Data management, data governance, data stewardship, organizational
design, CDO, CIO, chief data officer, chief information officer, data, data architecture, enterprise data exec-
utive, IT management, strategy, policy, enterprise architecture, information systems, conceptual modeling,
data integration, data warehousing, analytics, and business intelligence, BigCo
ACM Reference Format:
Peter Aiken. 2016. Experience: Succeeding at data management—BigCo attempts to leverage data. J. Data
and Information Quality 7, 1–2, Article 8 (May 2016), 35 pages.
DOI: http://dx.doi.org/10.1145/2893482
1. CASE INTRODUCTION
Good technology in the hands of an inexperienced user rarely produces positive
results.
Everyone wants to “leverage” data. Today, this is most often interpreted as invest-
ments in warehousing, analytics, business intelligence (BI), and so on. After all, that
is what you do with an asset—you leverage it—so the asset can help you to attain
strategic objectives; see Redman [2008] and Ladley [2010]. Widespread and pervasive
1. Data volume is still increasing faster than we are able to process it,
2. Data interchange overhead and other costs of poor data practices are measurably sapping organization and individual resources–and therefore productivity,
3. Reliance on existing technology-based approaches and education methods has not materially addressed this gap between creation and processing or reduced bottom-line costs, &
4. There exists an industry whose sole purpose is to extract data from citizens and then use it to make money.
Big changes
1. Process is more important than results at first
2. Failure is itself a lesson
3. People and process aspects are not receiving enough attention
4. Best practices do exist
• Motivation
- Frustration–we are dissatisfied with the current state
- Are we making progress? (No)
• How did we get here? (Building on proven research)
- DoD ➜ SEI ➜ MITRE ➜ CMMI
- Industry push for best practices
• Ingredients
- What is the Data Maturity Model? (DMM)
- Body of Knowledge (DM BOK)
• Understanding and applying them together
- Weakest link in the chain architecture
- Just a bit on strategy
- Three legged stool
- How does one get to Carnegie Hall?
• Where to next?
• Q & A?
Program
Practicing Data Management Better
1 + 1 = 11
Upcoming Events
Data Strategy Best Practices
10 January 2023
Data Modeling Fundamentals
14 February 2023
Data Stewards - Defining and Assigning
14 March 2023
Time: 19:00 UTC (2:00 PM NYC) | Presented by: Peter Aiken, PhD
Note: In this .pdf, clicking any webinar title opens the registration link
Event Pricing
• 20% off directly from the publisher on select titles
• My Book Store @ http://plusanythingawesome.com
• Enter the code "anythingawesome" at the Technics bookstore checkout where it says to "Apply Coupon"
Peter.Aiken@AnythingAwesome.com +1.804.382.5957
Thank You!
Book a call with Peter to discuss anything - https://anythingawesome.com/OfficeHours.html
Critical Design Review?
Hiring Assistance?
Reverse Engineering Expertise?
Executive Data Literacy Training?
Mentoring?
Tool/automation evaluation?
Use your data more strategically?
Independent Verification & Validation?
Collaboration?
Generative AI for Social Good at Open Data Science East 2024Colleen Farrelly
 
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...Thomas Poetter
 
Identifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanIdentifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanMYRABACSAFRA2
 
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBoston Institute of Analytics
 
RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.natarajan8993
 
Top 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In QueensTop 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In Queensdataanalyticsqueen03
 
ASML's Taxonomy Adventure by Daniel Canter
ASML's Taxonomy Adventure by Daniel CanterASML's Taxonomy Adventure by Daniel Canter
ASML's Taxonomy Adventure by Daniel Cantervoginip
 
Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...
Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...
Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...ssuserf63bd7
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 217djon017
 
Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Seán Kennedy
 
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degreeyuu sss
 
Decoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectDecoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectBoston Institute of Analytics
 
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...Boston Institute of Analytics
 
Defining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryDefining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryJeremy Anderson
 
Real-Time AI Streaming - AI Max Princeton
Real-Time AI  Streaming - AI Max PrincetonReal-Time AI  Streaming - AI Max Princeton
Real-Time AI Streaming - AI Max PrincetonTimothy Spann
 
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhThiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhYasamin16
 
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxmodul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxaleedritatuxx
 
Digital Marketing Plan, how digital marketing works
Digital Marketing Plan, how digital marketing worksDigital Marketing Plan, how digital marketing works
Digital Marketing Plan, how digital marketing worksdeepakthakur548787
 
NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...
NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...
NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...Amil Baba Dawood bangali
 

Recently uploaded (20)

Generative AI for Social Good at Open Data Science East 2024
Generative AI for Social Good at Open Data Science East 2024Generative AI for Social Good at Open Data Science East 2024
Generative AI for Social Good at Open Data Science East 2024
 
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
 
Identifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanIdentifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population Mean
 
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis ProjectBank Loan Approval Analysis: A Comprehensive Data Analysis Project
Bank Loan Approval Analysis: A Comprehensive Data Analysis Project
 
RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.
 
Data Analysis Project: Stroke Prediction
Data Analysis Project: Stroke PredictionData Analysis Project: Stroke Prediction
Data Analysis Project: Stroke Prediction
 
Top 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In QueensTop 5 Best Data Analytics Courses In Queens
Top 5 Best Data Analytics Courses In Queens
 
ASML's Taxonomy Adventure by Daniel Canter
ASML's Taxonomy Adventure by Daniel CanterASML's Taxonomy Adventure by Daniel Canter
ASML's Taxonomy Adventure by Daniel Canter
 
Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...
Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...
Statistics, Data Analysis, and Decision Modeling, 5th edition by James R. Eva...
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2
 
Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...
 
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
 
Decoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis ProjectDecoding Patterns: Customer Churn Prediction Data Analysis Project
Decoding Patterns: Customer Churn Prediction Data Analysis Project
 
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
NLP Data Science Project Presentation:Predicting Heart Disease with NLP Data ...
 
Defining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryDefining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data Story
 
Real-Time AI Streaming - AI Max Princeton
Real-Time AI  Streaming - AI Max PrincetonReal-Time AI  Streaming - AI Max Princeton
Real-Time AI Streaming - AI Max Princeton
 
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhThiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
 
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxmodul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
 
Digital Marketing Plan, how digital marketing works
Digital Marketing Plan, how digital marketing worksDigital Marketing Plan, how digital marketing works
Digital Marketing Plan, how digital marketing works
 
NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...
NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...
NO1 Certified Black Magic Specialist Expert Amil baba in Lahore Islamabad Raw...
 

Data Management Best Practices Self-Assessment

  • 1. Data Management Best Practices © Copyright 2022 by Peter Aiken Slide # 1 peter.aiken@anythingawesome.com +1.804.382.5957 Peter Aiken, PhD Practicing Data Management Better 1 + 1 = 11 Peter Aiken, Ph.D. • I've been doing this a long time • My work is recognized as useful • Associate Professor of IS (vcu.edu) • Institute for Defense Analyses (ida.org) • DAMA International (dama.org) • MIT CDO Society (iscdo.org) • Anything Awesome (anythingawesome.com) • Experienced w/ 500+ data management practices worldwide • Multi-year immersions – US DoD (DISA/Army/Marines/DLA) – Nokia – Deutsche Bank – Wells Fargo – Walmart – HUD … • 12 books and dozens of articles © Copyright 2022 by Peter Aiken Slide # 2 https://anythingawesome.com + • DAMA International President 2009-2013/2018/2020 • DAMA International Achievement Award 2001 (with Dr. E. F. "Ted" Codd) • DAMA International Community Award 2005 $1,500,000,000.00 USD
  • 2. Turn data into action
  • 3. 2 Cloud Platforms Martech SaaS Apps Mobility & Devices Customer / Partners / Suppliers Packaged Apps Databases Today, data is highly siloed and fragmented Jane Smith Opted in for marketing Jane Smith Renewed policy John Smith Same household Jane E Smith Called support Jane E Smith Filed auto claim
  • 4. 3 Modern needs require modern solutions
  • 5. 0018-9162/07/$25.00 © 2007 IEEE 42 Computer Published by the IEEE Computer Society
Measuring Data Management Practice Maturity: A Community's Self-Assessment
Peter Aiken, Virginia Commonwealth University/Institute for Data Research; M. David Allen, Data Blueprint; Burt Parker, Independent consultant; Angela Mattia, J. Sergeant Reynolds Community College
Increasing data management practice maturity levels can positively impact the coordination of data flow among organizations, individuals, and systems. Results from a self-assessment provide a roadmap for improving organizational data management practices.
As increasing amounts of data flow within and between organizations, the problems that can result from poor data management practices are becoming more apparent. Studies have shown that such poor practices are widespread. For example:
• PricewaterhouseCoopers reported that in 2004, only one in three organizations were highly confident in their own data, and only 18 percent were very confident in data received from other organizations. Further, just two in five companies have a documented board-approved data strategy (www.pwc.com/extweb/pwcpublications.nsf/docid/15383D6E748A727DCA2571B6002F6EE9).
• Michael Blaha [1] and others in the research community have cited past organizational data management education and practices as the cause for poor database design being the norm.
• According to industry pioneer John Zachman [2], organizations typically spend between 20 and 40 percent of their information technology budgets evolving their data via migration (changing data locations), conversion (changing data into other forms, states, or products), or scrubbing (inspecting and manipulating, recoding, or rekeying data to prepare it for subsequent use).
• Approximately two-thirds of organizational data managers have formal data management training; slightly more than two-thirds of organizations use or plan to apply formal metadata management techniques; and slightly fewer than one-half manage their metadata using computer-aided software engineering tools and repository technologies [3].
When combined with our personal observations, these results suggest that most organizations can benefit from the application of organization-wide data management practices. Failure to manage data as an enterprise-, corporate-, or organization-wide asset is costly in terms of market share, profit, strategic opportunity, stock price, and so on. To the extent that world-class organizations have shown that opportunities can be created through the effective use of data, investing in data as the only organizational asset that can't be depleted should be of great interest.
  • 6. April 2007 43
DATA MANAGEMENT DEFINITION AND EVOLUTION
As Table 1 shows, data management consists of six interrelated and coordinated processes, primarily derived by Burt Parker from sponsored research he led for the US Department of Defense at the MITRE Corporation [4].
Figure 1 supports the similarly standardized definition: "Enterprise-wide management of data is understanding the current and future data needs of an enterprise and making that data effective and efficient in supporting business activities." [4] The figure illustrates how organizational strategies guide other data management processes. Two of these processes—data program coordination and organizational data integration—provide direction to the implementation processes—data development, data support operations, and data asset use. The data stewardship process straddles the line between direction and implementation. All processes exchange feedback designed to improve and fine-tune overall data management practices.
Data management has existed in some form since the 1950s and has been recognized as a discipline since the 1970s. Data management is thus a young discipline compared to, for example, the relatively mature accounting practices that have been practiced for thousands of years. As Figure 2 shows, data management's scope has expanded over time, and this expansion continues today.
Ideally, organizations derive their data management requirements from enterprise-wide information and functional user requirements. Some of these requirements come from legacy systems and off-the-shelf software packages. An organization derives its future data requirements from an analysis of what it will deliver, as well as future capabilities it will need to implement organizational strategies.
Figure 1. Interrelationships among data management processes (adapted from Burt Parker's earlier work [4]). Blue lines indicate guidance, red lines indicate feedback, and green lines indicate data.
Table 1. Data management processes [4]:
– Data program coordination: Provide appropriate data management process and technological infrastructure (Direction). Program data: descriptive propositions or observations needed to establish, document, sustain, control, and improve organizational data-oriented activities (such as vision, goals, policies, and metrics).
– Organizational data integration: Achieve organizational sharing of appropriate data (Direction). Development data: descriptive facts, propositions, or observations used to develop and document the structures and interrelationships of data (for example, data models, database designs, and specifications).
– Data stewardship: Achieve business-entity subject area data integration (Direction and implementation). Stewardship data: descriptive facts about data documenting semantics and syntax (such as name, definition, and format).
– Data development: Achieve data sharing within a business area (Implementation). Business data: facts and their constructs used to accomplish enterprise business activities (such as data elements, records, and files).
– Data support operations: Provide reliable access to data (Implementation).
– Data asset use: Leverage data in business activities (Implementation).
  • 7. 44 Computer
Data management guides the transformation of strategic organizational information needs into specific data requirements associated with particular technology system development projects.
All organizations have data architectures, whether explicitly documented or implicitly assumed. An important data management process is to document the architecture's capabilities, making it more useful to the organization. In addition, data management
• must be viewed as a means to an end, not the end itself. Organizations must not practice data management as an abstract discipline, but as a process supporting specific enterprise objectives—in particular, to provide a shared-resource basis on which to build additional services.
• involves both process and policy. Data management tasks range from strategic data planning to the creation of data element standards to database design, implementation, and maintenance.
• has a technical component: interfacing with and facilitating interaction between software and hardware.
• has a specific focus: creating and maintaining data to provide useful information.
• includes management of metadata artifacts that address the data's form as well as its content.
Although data management serves the organization, the organization often doesn't appreciate the value it provides. Some data management staffs keep ahead of the layoff curve by demonstrating positive business value. Management's short-term focus has often made it difficult to secure funding for medium- and long-term data management investments. Tracing the discipline's efforts to direct and indirect organizational benefits has been difficult, so it hasn't been easy to present an articulate business case to management that justifies subsequent strategic investments in data management.
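The six processes of Table 1, and whether each provides direction, implementation, or both, can be modeled as a minimal sketch. This is purely illustrative — the paper defines no code, and Python and all identifier names here are assumptions:

```python
# Hypothetical model of Table 1's six data management processes and the
# focus each one has (direction, implementation, or both).
# Illustrative only; the paper specifies no code or API.

DATA_MANAGEMENT_PROCESSES = {
    "data program coordination":       {"direction"},
    "organizational data integration": {"direction"},
    "data stewardship":                {"direction", "implementation"},
    "data development":                {"implementation"},
    "data support operations":         {"implementation"},
    "data asset use":                  {"implementation"},
}

# Direction processes guide the implementation processes; data
# stewardship straddles both, mirroring Figure 1.
direction_processes = sorted(
    name for name, focus in DATA_MANAGEMENT_PROCESSES.items()
    if "direction" in focus
)
print(len(DATA_MANAGEMENT_PROCESSES))  # six interrelated processes
print(direction_processes)
```

Encoding the focus as a set rather than a single label is one way to capture the paper's point that stewardship sits on the line between direction and implementation.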
Viewing data management as a collection of processes, each with a role that provides value to the organization through data, makes it easier to trace value through those processes and point not only to a methodological "why" of data management practice improvement but also to a specific, concrete "how."
RESEARCH BASIS
Mark Gillenson has published three papers that serve as an excellent background to this research [5-7]. Like earlier works, Gillenson focuses on the implementation half of Figure 1, adopting a more narrow definition of data administration. Over time, his work paints a picture of an industry attempting to catch up with technological implementation. Our work here updates and confirms his basic conclusions while changing the focus from whether a process is performed to the maturity with which it is performed.
Three other works also influenced our research: Ralph Keeney's value-focused thinking [8], Richard Nolan's six-stage theory of data processing [9], and the Capability Maturity Model Integration (CMMI) [10,11].
Keeney's value-focused thinking provides a methodological approach to analyzing and evaluating the various aspects of data management and their associated key process areas. We wove the concepts behind means and fundamental objectives into our assessment's construction to connect how we measure data management with what customers require from it.
In Stage VI of his six-stage theory of data processing, Nolan defined maturity as data resource management. Although Nolan's theory predates and is similar to the CMMI, it contains several ideas that we adapted and reused in the larger data management context.
However, CMMI refinement remains our primary influence. Most technologists are familiar with the CMM (and its upgrade to the CMMI), developed at Carnegie Mellon's Software Engineering Institute with assistance from the MITRE Corporation [10,11]. The CMMI itself was derived from work that Ron Radice and Watts Humphrey performed while at IBM.
Dennis Goldenson and Diane Gibson presented results pointing to a link between CMMI process maturity and organizational success [12]. In addition, Cyndy Billings and Jeanie Clifton demonstrated the long-term effects for organizations that successfully sustain process improvement for more than a decade [13].
CMMI-based maturity models exist for human resources, security, training, and several other areas of the software-related development process.
Figure 2. Data management's growth over time. The discipline has expanded from an initial focus on database development and operation in the 1950s to 1970s to include additional responsibilities in the periods 1970-1990, 1990-2000, and from 2000 to the present.
Our colleague,
  • 8. Brett Champlin, contributed a list of dozens of maturity measurements derived from or influenced by the CMMI. This list includes maturity measurement frameworks for data warehousing, metadata management, and software systems deployment. The CMMI's successful adoption in other areas encouraged us to use it as the basis for our data management practice assessment.
Whereas the core ideas behind the CMMI present a reasonable base for data management practice maturity measurement, we can avoid some potential pitfalls by learning from the revisions and later work done with the CMMI. Examples of such improvements include general changes to how the CMMI makes interrelationships between process areas more explicit and how it presents results to a target organization.
Work by Cynthia Hauer [14] and Walter Schnider and Klaus Schwinn [15] also influenced our general approach to a data management maturity model. Hauer nicely articulated some examples of the value determination factors and results criteria that we have adopted. Schnider and Schwinn presented a rough but inspirational outline of what mature data management practices might look like and the accompanying motivations.
RESEARCH OBJECTIVES
Our research had six specific objectives, which we grouped into two types: community descriptive goals and self-improvement goals. Community descriptive research goals help clarify our understanding of the data management community and associated practices. Specifically, we want to understand
• the range of practices within the data management community;
• the distribution of data management practices, specifically the various stages of organizational data management maturity; and
• the current state of data management practices—in what areas are the community data management practices weak, average, and strong?
Self-improvement research goals help the community as a whole improve its collective data management practices. Here, we desire to
• better understand what defines current data management practices;
• determine how the assessment informs our standing as a technical community (specifically, how does data management compare to software development?); and
• gain information useful for developing a roadmap for improving current practice.
The CMMI's stated goals are almost identical to ours: "[The CMMI] was designed to help developers select process-improvement strategies by determining their current process maturity and identifying the most critical issues to improving their software quality and process." [10] Similarly, our goal was to aid data management practice improvement by presenting a scale for measuring data management accomplishments. Our assessment results can help data managers identify and implement process improvement strategies by recognizing their data management challenges.
DATA COLLECTION PROCESS AND RESEARCH TARGETS
Between 2000 and 2006, we assessed the data management practices of 175 organizations. Table 2 provides a breakdown of organization types. Students from some of our graduate and advanced undergraduate classes largely conducted the assessments. We provided detailed assessment instruction as part of the course work.
Assessors used structured telephone and in-person interviews to assess specific organizational data management practices by soliciting evidence of processes, products, and common features. Key concepts sought included the presence of commitments, abilities, measurements, verification, and governance. Assessors conducted the interviews with the person identified as having the best, firsthand knowledge of organizational data management practices. Tracking down these individuals required much legwork; identifying these individuals was often more difficult than securing the interview commitment.
The assessors attempted to locate evidence in the organization indicating the existence of key process areas within specific data management practices. During the evaluation, assessors observed strict confidentiality—they reported only compiled results, with no mention of specific organizations, individuals, groups, programs, or projects. Assessors and participants kept all information to themselves and observed proprietary rights, including several nondisclosure agreements.
All organizations implement their data management practice in ways that can be classified as one of five maturity model levels, detailed in Table 3 on the next page. Specific evidence, organized by maturity level, helped identify the level of data management practiced.
Table 2. Organizations included in data management analysis, by type:
– Local government: 4 percent
– State government: 17 percent
– Federal government: 11 percent
– International organization: 10 percent
– Commercial organization: 58 percent
  • 9. 46 Computer For example, the data program coordination practice area results include: • Mystery Airline achieved level 1 on responses 1, 2, and 5, and level 2 on responses 3 and 4. • The airline industry performed above both Mystery Airline and all respondents on responses 1 through 3. • The airline industry performed below both Mystery Airline and all respondents on response 4, and Mystery Airline performed well below all respon- dents and just those in the airline industry on response 5. Figure 3f illustrates the range of results for all orga- nizations surveyed for each data management process— for example, the assessment results for data program coordination ranged from 2.06 to 3.31. The maturity measurement framework dictates that a data program can achieve no greater rating than the lowest rating achieved—hence the translation to the scores for Mystery Airline of 1, 2, 2, 2, and 2 combin- ing for an overall rating of 1. This is congruent with CMMI application. Although this might seem a tough standard, the rat- ing reflects the adage that a chain is only as strong as its weakest link. Mature data management programs can’t rely on immature or ad hoc processes in related areas. The lowest rating received becomes the highest possible For each data management process, the assessment used between four and six objective criteria to probe for evidence. Assessed outside the data collection process, the presence or absence of this evidence indi- cated organizational performance at a corresponding maturity level. ASSESSMENT RESULTS The assessment results reported for the various prac- tice areas show that overall scores are repeatable (level 2) in all data management practice areas. Figure 3 shows assessment averages of the individual response scores. We used a composite chart to group the averages by practice area. Such groupings facilitate numerous comparisons, which organizations can use to plan improvements to their data management practices. 
We present sample results (blue) for an assessed orga- nization (disguised as “Mystery Airline”), whose man- agement was interested in not only how the organization scored but also how it compared to other assessed air- lines (red) and other organizations (white). We grouped 19 individual responses according to the five data management maturity levels in the horizontal bar charts. Most numbers are averages. That is, for an individual organization, we surveyed multiple data man- agement operations, combined the individual assessment results, and presented them as averages. We reported assessments of organizations with only one data man- agement function as integers. Table 3. Data management practice assessment levels. Level Name Practice Quality and results predictability 1 Initial The organization lacks the necessary processes for The organization depends entirely on individuals, with little or no sustaining data management practices. Data corporate visibility into cost or performance, or even awareness management is characterized as ad hoc or chaotic. of data management practices. There is variable quality, low results predictability, and little to no repeatability. 2 Repeatable The organization might know where data management The organization exhibits variable quality with some expertise exists internally and has some ability to predictability. The best individuals are assigned to critical duplicate good practices and successes. projects to reduce risk and improve results. 3 Defined The organization uses a set of defined processes, Good quality results within expected tolerances most of the time. which are published for recommended use. The poorest individual performers improve toward the best performers, and the best performers achieve more leverage. 
Level 4 (Managed). Practice: The organization statistically forecasts and directs data management, based on defined processes, selected cost, schedule, and customer satisfaction levels. The use of defined data management processes within the organization is required and monitored. Quality and results predictability: Reliability and predictability of results, such as the ability to determine progress or six sigma versus three sigma measurability, is significantly improved.
Level 5 (Optimizing). Practice: The organization analyzes existing data management processes to determine whether they can be improved, makes changes in a controlled fashion, and reduces operating costs by improving current process performance or by introducing innovative services to maintain their competitive edge. Quality and results predictability: The organization achieves high levels of results certainty.
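The five levels of Table 3 can be captured as a small enumeration; a minimal sketch (the class name is mine, not from the article):

```python
from enum import IntEnum

# The five maturity levels of Table 3 as an enumeration; comparing
# levels numerically mirrors the article's "greater than 3" usage.
class DataManagementMaturity(IntEnum):
    INITIAL = 1
    REPEATABLE = 2
    DEFINED = 3
    MANAGED = 4
    OPTIMIZING = 5

print(DataManagementMaturity(3).name)  # -> DEFINED
```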
  • 10. overall rating. This also explains why many organizations are at level 1 with regard to their software development practices. While the CMMI process results in a single overall rating for the organization, data management requires a more fine-grained feedback mechanism. Knowing that some data management processes perform better than others can help an organization develop incentives as well as a roadmap for improving individual ratings. Taken as a whole, these numbers show that no data management process or subprocess measured on average higher than the data program coordination process, at 3.31. It's also the only data management process that performed on average at a defined level (greater than 3). The results show a community that is approaching the ability to repeat its processes across all of data management. Results analysis Perhaps the most important general fact represented in Figure 3 is that organizations gave themselves relatively low scores. The assessment results are based on self-reporting and, although our 15-percent validation sample is adequate to verify accurate industry-wide assessment results, 85 percent of the assessment is based on facts that were described but not observed. Although direct observables for all survey respondents would have provided valuable confirming evidence, the cost of such a survey and the required organizational access would have been prohibitive. We held in-person, follow-up assessment validation sessions with about 15 percent of the assessed organizations. These sessions helped us validate the collection method and refine the technique. They also let us gauge the assessments' accuracy.
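The weakest-link rule described above, where a practice area can rate no higher than its lowest response score, reduces to taking a minimum; a minimal sketch using Mystery Airline's data program coordination scores from the text:

```python
def overall_rating(response_ratings):
    """Weakest-link rule: overall maturity is the minimum response rating."""
    if not response_ratings:
        raise ValueError("at least one response rating is required")
    return min(response_ratings)

# Mystery Airline's data program coordination responses were 1, 2, 2, 2, 2.
print(overall_rating([1, 2, 2, 2, 2]))  # -> 1, matching the article
```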
April 2007 [Figure 3 chart panels omitted; only the caption is recoverable.] Figure 3. Assessment results useful to Mystery Airline: (a) data program coordination, (b) enterprise data integration, (c) data stewardship, (d) data development, (e) data support organizations, and (f) assessments range.
  • 11. Although the assessors strove to accurately measure each subprocess's maturity level, some interviews inevitably were skewed toward the positive end of the scale. This occurred most often because interviewees reported on milestones that they wanted to or would soon achieve as opposed to what they had achieved. We suspected, and confirmed during the validation sessions, that responses were typically exaggerated by one point on the five-point scale. When we factor in the one-point inflation, the numbers in Table 4 become important. Knowing that the bar is so low will hopefully inspire some organizations to invest in data management. Doing so might give them a strategic advantage if the competition is unlikely to be making a similar investment. The relatively low scores reinforce the need for this data management assessment. Based on the overall scores in the data management practice areas, the community receives five Ds. These areas provide immediate targets for future data management investment. WHERE ARE WE NOW? We address our original research objectives according to our two goal categories. Community descriptive research goals First, we wanted to determine the range of practices within the data management community. A wide range of such practices exists. Some organizations are strong in some data management practices and weak in others (the range of practice is consistently inconsistent). The wide divergence of practices both within and between organizations can dilute results from otherwise strong data management programs. The assessment's applicability to longitudinal studies remains to be seen; this is an area for follow-up research. Although researchers might undertake formal studies of such trends in the future, evidence from ongoing assessments suggests that results are converging. Consequently, we feel that our sample constitutes a representation of community-wide data management practices.
Next, we wanted to know whether the distribution of practices informs us specifically about the various stages of organizational data management maturity. The assessment results confirm the framework's utility, as do the postassessment validation sessions. Building on the framework, we were able to specify target characteristics and objective measurements. We now have better information as to what comprises the various stages of organizational data management practice maturity. Organizations do clump together into the various maturity stages that Nolan originally described. We can now determine the investments required to predictably move organizations from one data management maturity level to another. Finally, we wanted to determine in what areas the community data management practices are weak, average, and strong. Figure 4 shows an average of unadjusted rates summarizing the assessment results. As the figure shows, the data management community reports itself relatively and perhaps surprisingly strong in all five major data management processes when compared to the industry averages for software development. The range and averages indicate that the data management community has more mature data program coordination processes, followed by organizational data integration, support operations, stewardship, and then data development. The relatively lower data development scores might suggest data program coordination implementation difficulties. Self-improvement research goals Our first objective was to produce results that would help the community better understand current best practices. Organizations can use the assessment results to compare their specific performance against others in their industry and against the community results as a whole. Quantities and groupings indicate the relative state and robustness of the best practices within each process.
Future research can use this information to identify specific practices that can be shared with the
Table 4. Assessment scores adjusted for self-reporting inflation.
Response   Adjusted average
1          1.72388
2          1.57463
3          1.0597
4          1.8806
5          2.31343
6          1.66418
7          1.33582
8          1.57463
9          1.1791
10a        1.40299
10b        1.14925
10c        0.97761
10d        1.20896
10e        1.23134
10f        1.12687
11         1.32836
12         0.57463
13         1.00746
14         1.46269
15         1.24627
16         1.65672
17         1.66418
18         1.04478
19         1.17164
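The one-point self-reporting inflation the authors describe can be modeled as a simple clamped subtraction; a hypothetical sketch (the raw averages below are illustrative, not values taken from the article):

```python
def adjust_for_inflation(raw_scores, inflation=1.0):
    """Subtract the observed self-reporting inflation, clamping at zero
    so adjusted scores stay on the assessment scale."""
    return [max(0.0, score - inflation) for score in raw_scores]

# Illustrative raw averages only; Table 4 reports the adjusted values.
raw = [2.72, 2.57, 2.06]
print([round(s, 2) for s in adjust_for_inflation(raw)])  # -> [1.72, 1.57, 1.06]
```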
  • 12. community. Further study of these areas will provide leverageable benefits. Next, we wanted to determine how the assessment informs our standing as a technical community. Our research gives some indication of the claimed current state of data management practices. However, given the validation session results, we believe that it's best to caution readers that the numbers presented probably more accurately describe the intended state of the data management community. As it turns out, the relative number of organizations above level 1 for both software and data management are approximately the same, but a more detailed analysis would be helpful. Given the belief that investment in software development practices will result in significant improvements, it's appropriate to anticipate similar benefits from investments in data management practices. Finally, we hoped to gain information useful for developing a roadmap for improving current practice. Organizations can use the survey assessment information to develop roadmaps to improve their individual data management practices. Mystery Airline, for example, could develop a roadmap for achieving data management improvement by focusing on enterprise data integration, data stewardship, and data development practices. SUGGESTIONS FOR FUTURE RESEARCH Additional research must include a look at relationships between data management practice areas, which could indicate an efficient path to higher maturity levels. Research should also explore the success or failure of previous attempts to raise the maturity levels of organizational data management practices. One of our goals was to determine why so many organizational data management practices are below expectations. Several current theses could spur investigation of the root causes of poor data management practices. For example, • Are poor data management practices a result of the organization's lack of understanding?
• Does data management have a poor reputation or track record in the organization? • Are the executive sponsors capable of understanding the subject? • How have personnel and project changes affected the organization's efforts? Our assessment results suggest a need for a more formalized feedback loop that organizations can use to improve their data management practices. Organizations can use this data as a baseline from which to look for, describe, and measure improvements in the state of the practice. Such information can enhance their understanding of the relative development of organizational data management. Other investigations should probe further to see if patterns exist for specific industry or business focus types. Building an effective business case for achieving a certain level of data management is now easier. The failure to adequately address enterprise-level data needs has hobbled past efforts.4 Data management has, at best, a business-area focus rather than an enterprise outlook. Likewise, applications development focuses almost exclusively on line-of-business needs, with little attention to cross-business-line data integration or enterprise-wide planning, analysis, and decision needs (other than within personnel, finance, and facilities management). In addition, data management staff is inexperienced in modern data management needs, focusing on data management rather than metadata management and on syntaxes instead of semantics and data usage. Few organizations manage data as an asset. Instead, most consider data management a maintenance cost. A small shift in perception (from viewing data as a cost to regarding it as an asset) can dramatically change how an organization manages data. Properly managed data is an organizational asset that can't be exhausted.
Although data can be polluted, retired, destroyed, or become obsolete, it's the one organizational resource that can be repeatedly reused without deterioration, provided that the appropriate safeguards are in place. Further, all organizational activities depend on data. To illustrate the potential payoff of the work presented here, consider what 300 software professionals applying software process improvement over an 18-year period achieved:16 • They predicted costs within 10 percent. • They missed only one deadline in 15 years. • The relative cost to fix a defect is 1X during inspection, 13X during system testing, and 92X during operation.
Figure 4. Average of unadjusted rates for the assessment results, by process.
Process                       Initial  Repeatable  Defined
Data program coordination     2.06     2.71        3.31
Enterprise data integration   2.18     2.44        2.66
Data stewardship              1.98     2.18        2.40
Data development              1.57     2.12        2.46
Data support operations       2.04     2.38        2.66
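Figure 4's per-process averages make the article's observation checkable in a few lines; a minimal sketch (values transcribed from the figure's final column, which I assume corresponds to the highest reported averages):

```python
# Final-column averages transcribed from Figure 4.
process_averages = {
    "Data program coordination": 3.31,
    "Enterprise data integration": 2.66,
    "Data stewardship": 2.40,
    "Data development": 2.46,
    "Data support operations": 2.66,
}

# Per the article, only processes averaging above 3 perform at the
# Defined level, which holds solely for data program coordination.
defined_level = [p for p, avg in process_averages.items() if avg > 3]
print(defined_level)  # -> ['Data program coordination']
```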
  • 13. • Early error detection rose from 45 to 95 percent between 1982 and 1993. • Product error rate (measured as defects per 1,000 lines of code) dropped from 2.0 to 0.01 between 1982 and 1993. If improvements in data management can produce similar results, organizations should increase their maturity efforts. ■ Acknowledgments We thank Graham Blevins, David Rafner, and Santa Susarapu for their assistance in preparing some of the reported data. We are greatly indebted to many of Peter Aiken's classes in data reengineering and related topics at Virginia Commonwealth University for the careful work and excellent results obtained as a result of their various contributions to this research. This article also benefited from the suggestions of several anonymous reviewers. We also acknowledge the helpful, continuing work of Brett Chaplin at Allstate in collecting, applying, and assessing CMMI-related efforts. References 1. M. Blaha, "A Retrospective on Industrial Database Reverse Engineering Projects—Parts 1 & 2," Proc. 8th Working Conf. Reverse Eng., IEEE Press, 2001, pp. 147-164. 2. J. Zachman, "A Framework for Information Systems Architecture," IBM Systems J., vol. 26, 1987, pp. 276-292. 3. P.H. Aiken, "Keynote Address to the 2002 DAMA International Conference: Trends in Metadata," Proc. 2002 DAMA Int'l/Metadata Conf., CD-ROM, Wilshire Conf., 2002, pp. 1-32. 4. B. Parker, "Enterprise Data Management Process Maturity," Handbook of Data Management, S. Purba, ed., Auerbach Publications, CRC Press, 1999, pp. 824-843. 5. M. Gillenson, "The State of Practice of Data Administration—1981," Comm. ACM, vol. 25, no. 10, 1982, pp. 699-706. 6. M. Gillenson, "Trends in Data Administration," MIS Quarterly, Dec. 1985, pp. 317-325. 7. M. Gillenson, "Database Administration at the Crossroads: The Era of End-User-Oriented, Decentralized Data Processing," J. Database Administration, Fall 1991, pp. 1-11. 8. R.L.
Keeney, Value-Focused Thinking—A Path to Creative Decisionmaking, Harvard Univ. Press, 1992. 9. R. Nolan, "Managing the Crisis in Data Processing," Harvard Business Rev., Mar./Apr. 1979, pp. 115-126. 10. Carnegie Mellon Univ. Software Eng. Inst., Capability Maturity Model: Guidelines for Improving the Software Process, 1st ed., Addison-Wesley Professional, 1995. 11. M.C. Paulk and B. Curtis, "Capability Maturity Model, Version 1.1," IEEE Software, vol. 10, 1993, pp. 18-28. 12. D.R. Goldenson and D.L. Gibson, "Demonstrating the Impact and Benefits of CMM: An Update and Preliminary Results," special report CMU/SEI-2003-SR-009, Carnegie Mellon Univ. Software Eng. Inst., 2003, pp. 1-55. 13. C. Billings and J. Clifton, "Journey to a Mature Software Process," IBM Systems J., vol. 33, 1994, pp. 46-62. 14. C.C. Hauer, "Data Management and the CMM/CMMI: Translating Capability Maturity Models to Organizational Functions," presented at National Defense Industrial Assoc. Technical Information Division Symp., 2003; www.dtic.mil/ndia/2003technical/hauer1.ppt. 15. W. Schnider and K. Schwinn, "Der Reifegrad des Datenmanagements" [The Data Management Maturity Model], KPP Consulting; www.kpp-consulting.ch/downloadbereich/DM%20Maturity%20Model.pdf, 2004 (in German). 16. H. Krasner, J. Pyles, and H. Wohlwend, "A Case History of the Space Shuttle Onboard Systems Project," Technology Transfer 94092551A-TR, Sematech, 31 Oct. 1994. Peter Aiken is an associate professor of information systems at Virginia Commonwealth University and founding director of Data Blueprint. His research interests include data and systems reengineering. Aiken received a PhD in information technology from George Mason University. He is a senior member of the IEEE, the ACM, and the Data Management Association (DAMA) International. Contact him at peter@datablueprint.com. M. David Allen is chief operating officer of Data Blueprint. His research interests include data and systems reengineering.
Allen received an MS in information systems from Virginia Commonwealth University. He is a member of DAMA. Contact him at mda@datablueprint.com. Burt Parker is an independent consultant based in Washington, DC. His technical interests include enterprise data management program development. Parker received an MBA in operations research/systems analysis (general systems theory) from the University of Michigan. He is a member of DAMA. Contact him at parkerbg@comcast.net. Angela Mattia is a professor of information systems at J. Sergeant Reynolds Community College. Her research interests include data and systems reengineering and maturity models. Mattia received an MS in information systems from Virginia Commonwealth University. She is a member of DAMA. Contact her at amattia@jsr.vccs.edu.
  • 14. Four Current Data Truths 1. Data volume is still increasing faster than we are able to process it, 2. Data interchange overhead and other costs of poor data practices are measurably sapping organization and individual resources–and therefore productivity, 3. Reliance on existing technology-based approaches and education methods has not materially addressed this gap between creation and processing or reduced bottom line costs, & 4. There exists an industry type whose sole purpose is to extract data from citizens and then use it to make money. © Copyright 2022 by Peter Aiken Slide # 3 https://anythingawesome.com Doing Data Better means that you • Understand that the vastness and quality of data plays an increasing role in everyone's life • Are motivated to increase your individual data skills because you now know that poor data skills: – Cost you more – Steal increasing amounts of your time – Deliver less – Present greater risk • Recognize the critical importance of data management in modern life and its positive and negative applications • Develop defensive skills to differentiate between good and bad data (understanding that most data is of unknown quality) • Can assign values to some of your personal data and its use • Are able to take advantage of decreases in the general workload needed to effectively manage data in your professional and personal life © Copyright 2022 by Peter Aiken Slide # 4 https://anythingawesome.com
  • 15. © Copyright 2022 by Peter Aiken Slide # 5 https://anythingawesome.com © Copyright 2022 by Peter Aiken Slide # https://anythingawesome.com Program 6 • Motivation - Frustration–we are unsatisfied with current state - Are we making progress? (No) • How did we get here? (Building on proven research) - DoD ➜ SEI ➜ MITRE ➜ CMMI - Industry push for best practices • Ingredients - What is the Data Maturity Model? (DMM) - Body of Knowledge (DM BOK) • Understanding and applying them together - Weakest link in the chain architecture - Just a bit on strategy - Three legged stool - How does one get to Carnegie Hall? • Where to next? • Q & A? 1 + 1 = 11 Practicing Data Management Better
  • 16. Measures of Unproductivity Knowledge Worker Stress • 33% of time spent reworking/recreating knowledge that already exists! • 10% of time spent creating new knowledge and content • 53% would rather do household chores • 52% would rather pay bills than use content management/repositories • 74% report feeling overwhelmed or unhappy when working with data • 33% of overwhelmed employees spend at least one hour a week procrastinating over data-related tasks © Copyright 2022 by Peter Aiken Slide # 7 https://anythingawesome.com https://medium.com/interoperable/knowledge-workers-information-life-cycles-and-content-silos-oh-my-a4263eed427 Measurements Everyone • 14% have a good understanding of how to use business data • 21% aged 16-24 classified themselves as data literate Conclusion: future employees are underprepared for data-driven workplaces Business decision makers • 24% of business decision makers feel fully confident in their ability to read, work with, analyze & argue with data • 33% are able to create measurable value from data • 27% say their analytics projects produce actionable insights • 78% willing to invest time/energy improving data skillsets © Copyright 2022 by Peter Aiken Slide # 8 https://anythingawesome.com http://TheDataLiteracyProject.org
  • 17. When asked to incorporate data • Data appreciation isn't translating into employee adoption – 48% frequently make gut decisions – 66% for C-suite executives • Lack of data skills is limiting workplace productivity – 36% said they would find an alternative method to complete the task without using data – 14% avoid the task entirely © Copyright 2022 by Peter Aiken Slide # 9 https://anythingawesome.com http://TheDataLiteracyProject.org [Charts: Avoid Task Entirely 14%, Find Alt. Method 36%, Incorporate Data 50%; all employees: Incorporate Data 52%, Defer to Gut 48%; C-suite: Incorporate Data 34%, Defer to Gut 66%] Too many organizations have simply put data in the hands of employees and expected them to make a success of it http://TheDataLiteracyProject.org Why weren't my data problems solved when we moved to the cloud? © Copyright 2022 by Peter Aiken Slide # 10 https://anythingawesome.com
  • 18. Data in the cloud should have three attributes that data outside the cloud/warehouse should not have. It should be: Smaller, Cleaner, Sharable-er © Copyright 2022 by Peter Aiken Slide # 11 https://anythingawesome.com Transform Problems with forklifting 1. no basis for decisions made 2. no inclusion of architecture/engineering concepts 3. no idea that these concepts are missing from the process 4. 80% of organizational data is ROT © Copyright 2022 by Peter Aiken Slide # 12 https://anythingawesome.com Less, Cleaner, More shareable ... data Making Warehousing Successful (Cloud) Data Branding
  • 19. Fixing Data in the Cloud is Like Using A Glovebox © Copyright 2022 by Peter Aiken Slide # 13 https://anythingawesome.com Success Requires a 3-Legged Stool © Copyright 2022 by Peter Aiken Slide # 14 https://anythingawesome.com People • Process • Technology
  • 20. Current approaches are not and have not been working © Copyright 2022 by Peter Aiken Slide # 15 https://anythingawesome.com [Chart, percent answering Yes: Driving Innovation with Data 49%; Competing on data and analytics 41%; Managing data as a business asset 39%; Created a data-driven organization 24%; Forged a data culture 24%] Source: Big Data and AI Executive Survey 2021 by Randy Bean and Thomas Davenport www.newvantage.com 2021 [Chart: people/process 92.2% vs. technology 7.8%] 80% of data challenges are people/process based! & Data Leadership is the only resource to address these challenges Data Management Data Governance Program External Comprehension © Copyright 2022 by Peter Aiken Slide # 16 https://anythingawesome.com Sample from: https://artist.com/kathy-linden/on-outside-looking-in/?artid=4385 Everything Else Data Data (blah blah blah) Data Program Most do not appreciate the difference between Data Governance and the other data stuff that needs to be done
  • 21. © Copyright 2022 by Peter Aiken Slide # https://anythingawesome.com Program 17 • Motivation - Frustration–we are unsatisfied with current state - Are we making progress? (No) • How did we get here? (Building on proven research) - DoD ➜ SEI ➜ MITRE ➜ CMMI - Industry push for best practices • Ingredients - What is the Data Maturity Model? (DMM) - Body of Knowledge (DM BOK) • Understanding and applying them together - Weakest link in the chain architecture - Just a bit on strategy - Three legged stool - How does one get to Carnegie Hall? • Where to next? • Q & A? 1 + 1 = 11 Practicing Data Management Better Motivation © Copyright 2022 by Peter Aiken Slide # "One day Alice came to a fork in the road and saw a Cheshire cat in a tree. Which road do I take? she asked. Where do you want to go? was his response. I don't know, Alice answered. Then, said the cat, it doesn't matter." Lewis Carroll from Alice in Wonderland 18 https://anythingawesome.com • "We want to move our data management program to the next level" – Question: What level are you at now? • You are currently managing your data, – But, if you can't measure it, – How can you manage it effectively? • How do you know where to put time, money, and energy so that data management best supports the mission?
  • 22. DoD Origins • US DoD Reverse Engineering Program Manager • We sponsored research at the CMM/SEI asking – "How can we measure the performance of DoD and our partners?" – "Go check out what the Navy is up to!" • SEI responded with an integrated process/data improvement approach – DoD required SEI to remove the data portion of the approach – It grew into CMMI/DM BoK, etc. © Copyright 2022 by Peter Aiken Slide # 19 https://anythingawesome.com MITRE Corporation: Data Management Maturity Model • Internal research project: Oct '94-Sept '95 • Based on Software Engineering Institute Capability Maturity Model (SEI CMM) for Software Development Projects • Key Process Areas (KPAs) parallel SEI CMM KPAs, but with data management focus and key practices • Normative model for data management required; need to: Understand scope of data management Organize data management key practices • Reported as not-done-well by those who do it Acknowledgements © Copyright 2022 by Peter Aiken Slide # version (changing data into other forms, states, or products), or scrubbing (inspecting and manipulating, recoding, or rekeying data to prepare it for subsequent use). Increasing data management practice maturity levels can positively impact the coordination of data flow among organizations, individuals, and systems. Results from a self-assessment provide a roadmap for improving organizational data management practices. Peter Aiken, Virginia Commonwealth University/Institute for Data Research M. David Allen, Data Blueprint Burt Parker, Independent consultant Angela Mattia, J.
Sergeant Reynolds Community College. As increasing amounts of data flow within and between organizations, the problems that can result from poor data management practices... Measuring Data Management Practice Maturity: A Community's Self-Assessment 20 https://anythingawesome.com
  • 23. CMMI Institute Background • Evolved from Carnegie Mellon's Software Engineering Institute (SEI) - a federally funded research and development center (FFRDC) • Continues to support and provide all CMMI offerings and services delivered over its 20+ year history at the SEI – Industry leading reference models - benchmarks and guidelines for improvement ➜ Development ➜ Acquisition ➜ Services ➜ People ➜ Data Management – Training and Certification program, Partner program • Dedicated training, partner and certification teams to support organizations and professionals • Now owned by ISACA (CISA/CISM, COBIT, IT Governance, Cybersecurity) and joint product offerings are planned © Copyright 2022 by Peter Aiken Slide # 21 https://anythingawesome.com Melanie Mecca • Former CMMI Institute/Director of Data Management Products and Services ➜ datawise.inc • 30+ years designing and implementing strategies and solutions for private/public sectors • Architecture/Design experience in: – Data Management Programs – Enterprise Data Architecture – Enterprise Architecture • DMM's Managing Author • Certified Partner, CMMI Institute – melanie@datawise-inc.com
  • 24. Data Management Maturity (DMM)SM Model • DMM 1.0 released August 2014 – 3.5 years in development – Sponsors – Microsoft, Lockheed Martin, Booz Allen Hamilton – 50+ contributing authors, 70+ peer reviewers, 80+ orgs • Reference model framework of fundamental best practices – 414 specific practice statements – 596 functional work products – Maturity practices • Measurement instrument for organizations to evaluate capabilities and maturity, identify gaps, and incorporate guidelines for improvements. © Copyright 2022 by Peter Aiken Slide # 23 https://anythingawesome.com DMM Structure © Copyright 2022 by Peter Aiken Slide # 24 https://anythingawesome.com Core Category ➜ Process Area: Purpose; Introductory Notes; Goal(s) of the Process Area; Core Questions for the Process Area; Functional Practices (Levels 1-5); Related Process Areas; Example Work Products; Infrastructure Support Practices (legend: Explanatory Model Components vs. Required for Model Compliance)
  • 25. "You Are What You DO" • Model emphasizes behavior – Proactive positive behavioral changes – Creating and carrying out effective, repeatable processes – Leveraging and extending across the organization • Activities result in work products – Processes, standards, guidelines, templates, policies, etc. – Reuse and extension = maximum value, lower costs, happier staff • Practical focus reflects real-world organizations – enterprise program evolving to all hands on deck. © Copyright 2022 by Peter Aiken Slide # 25 https://anythingawesome.com Source: Applications Executive Council, Applications Budget, Spend, and Performance Benchmarks: 2005 Member Survey Results, Washington D.C.: Corporate Executive Board 2006, p. 23. Percentage of Projects on Budget By Process Framework Adoption …while the same pattern generally holds true for on-time performance Percentage of Projects on Time By Process Framework Adoption Key Finding: Process Frameworks are not Created Equal © Copyright 2022 by Peter Aiken Slide # 26 https://anythingawesome.com With the exception of CMM and ITIL, use of process-efficiency frameworks does not predict higher on-budget project delivery…
  • 26. "While all improvement efforts begin with the obligatory 'assessment' phase, Carnegie Mellon’s CMMI and DMM are the only proven frameworks that have the added benefit of literally decades of practice and benchmarking data. Organizations not using the DMM risk an inability to meaningfully compare results against other organizations and, as a result, adopt unproven methods." © Copyright 2022 by Peter Aiken Slide # 27 https://anythingawesome.com © Copyright 2022 by Peter Aiken Slide # 28 https://anythingawesome.com Data Management Body of Knowledge (DM BoK V2) Practice Areas from The DAMA Guide to the Data Management Body of Knowledge 2E © 2017 by DAMA International Missing two important concepts: Optionality Dependency
  • 27. © Copyright 2022 by Peter Aiken Slide # https://anythingawesome.com Program 29 • Motivation - Frustration–we are unsatisfied with current state - Are we making progress? (No) • How did we get here? (Building on proven research) - DoD ➜ SEI ➜ MITRE ➜ CMMI - Industry push for best practices • Ingredients - What is the Data Maturity Model? (DMM) - Body of Knowledge (DM BOK) • Understanding and applying them together - Weakest link in the chain architecture - Just a bit on strategy - Three legged stool - How does one get to Carnegie Hall? • Where to next? • Q & A? 1 + 1 = 11 Practicing Data Management Better Our barn had to pass a foundation inspection • Before further construction could proceed • It makes good business sense • No IT equivalent © Copyright 2022 by Peter Aiken Slide # 30 https://anythingawesome.com
  • 28. You can accomplish Advanced Data Practices without becoming proficient in the Foundational Data Practices; however, this will: • Take longer • Cost more • Deliver less • Present greater risk (with thanks to Tom DeMarco) Data Management Practices Hierarchy © Copyright 2022 by Peter Aiken Slide # 31 https://anythingawesome.com Advanced Data Practices (Technologies): MDM, Mining, Big Data, Analytics, Warehousing, SOA, AI/ML, Ops, Mesh, Crypto. Foundational Data Practices (Capabilities): Data Platform/Architecture, Data Governance, Data Quality, Data Operations, Data Management Strategy. DMM℠ Structure of 5 Integrated DM Practice Areas © Copyright 2022 by Peter Aiken Slide # 32 https://anythingawesome.com Component Process Areas:
- Data Management Strategy (manage data coherently): Data Management Goals, Corporate Culture, Data Management Funding, Data Requirements Lifecycle
- Data Governance (manage data assets professionally): Governance Management, Business Glossary, Metadata Management
- Data Quality (maintain fit-for-purpose data, efficiently and effectively): Data Quality Framework, Data Quality Assurance
- Data Operations (data life cycle management): Standards and Procedures, Data Sourcing
- Platform & Architecture (data architecture implementation): Architectural Framework, Platforms & Integration
- Supporting Processes (organizational support): Measurement & Analysis, Process Management, Process Quality Assurance, Risk Management, Configuration Management
  • 29. © Copyright 2022 by Peter Aiken Slide # 33 https://anythingawesome.com Your data foundation can only be as strong as its weakest link! Sample profile, each practice area rated on the five maturity levels (Initial, Managed, Defined, Measured, Optimized): Data Governance 3, Data Quality 3, Platform & Architecture 3, Data Operations 3, Data Management Strategy 1. Assessment Components © Copyright 2022 by Peter Aiken Slide # 34 https://anythingawesome.com Data Management Practice Areas:
- Data Management Strategy: DM is practiced as a coherent and coordinated set of activities
- Data Quality: Delivery of data in support of organizational objectives – the currency of DM
- Data Governance: Designating specific individuals as caretakers for certain data
- Data Platform/Architecture: Efficient delivery of data via appropriate channels
- Data Operations: Ensuring reliable access to data
Capability Maturity Model Levels (examples of practice maturity):
1 – Performed: Our DM practices are ad hoc and dependent upon "heroes" and heroic efforts
2 – Managed: We have DM experience and the ability to implement disciplined processes
3 – Defined: We have standardized DM practices so that all in the organization can perform them with uniform quality
4 – Measured: We manage our DM processes so that the whole organization can follow our standard DM guidance
5 – Optimized: We have a process for improving our DM capabilities
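The "weakest link" reading of a DMM-style assessment can be sketched in a few lines of Python. The `foundation_strength` helper is an illustrative assumption, not part of the DMM itself; the sample scores follow the 3/3/3/3/1 profile shown on the slide.

```python
# Hedged sketch: the "weakest link" view of a DMM-style assessment.
# The helper and the sample scores are illustrative, not part of the DMM.

MATURITY_LEVELS = {1: "Initial", 2: "Managed", 3: "Defined", 4: "Measured", 5: "Optimized"}

def foundation_strength(scores: dict[str, int]) -> tuple[str, int]:
    """Return the weakest practice area and its level: the foundation is only as strong as this link."""
    area = min(scores, key=scores.get)
    return area, scores[area]

# Sample ratings from the slide: four areas at Defined (3), strategy at Initial (1)
scores = {
    "Data Governance": 3,
    "Data Quality": 3,
    "Platform & Architecture": 3,
    "Data Operations": 3,
    "Data Management Strategy": 1,
}
area, level = foundation_strength(scores)
print(f"Weakest link: {area} at level {level} ({MATURITY_LEVELS[level]})")
```

The point the sketch makes is the slide's: averaging the five areas would report a healthy 2.6, but the foundation behaves like the minimum, not the mean.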
  • 30. Sample Assessment Summary © Copyright 2022 by Peter Aiken Slide # 35 https://anythingawesome.com Cumulative Benchmark – Multiple organizations © Copyright 2022 by Peter Aiken Slide # 36 https://anythingawesome.com
  • 31. Assessments © Copyright 2022 by Peter Aiken Slide # 37 https://anythingawesome.com • It is generally not worth a lot of investment to discover that you are at the very beginning of your journey • Use it to uncover previously unknown pockets of excellence • The first plan should be to examine the feasibility of expanding these to other parts of the organization Industry Focused Results © Copyright 2022 by Peter Aiken Slide # 38 https://anythingawesome.com Data Management Strategy Data Governance Platform & Architecture Data Quality Data Operations Focus: Implementation and Access Focus: Guidance and Facilitation Optimized (V) Measured (IV) Defined (III) Managed (II) Initial (I) • CMU's Software Engineering Institute (SEI) Collaboration • Results from hundreds of organizations in various industries including: ✓ Public Companies ✓ State Government Agencies ✓ Federal Government ✓ International Organizations • Defined industry standard • Steps toward defining data management "state of the practice"
  • 32. Data Management Practices Assessment © Copyright 2022 by Peter Aiken Slide # 39 https://anythingawesome.com Dimensions rated 0–5 for Client, Industry Competition, and All Respondents: Development guidance, Data Administration, Support systems, Asset recovery capability, Development training. Assessment areas: Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, Data Support Operations (three flagged as challenges). High Marks for IFC's Audit © Copyright 2022 by Peter Aiken Slide # 40 https://anythingawesome.com Areas rated 0–5 for TRE, ISG, IFC, Industry Benchmarks, and Overall Benchmarks: Leadership & Guidance, Asset Creation, Metadata Management, Quality Assurance, Change Management, Data Quality.
  • 33. Comparison of DM Maturity 2007-2019 © Copyright 2022 by Peter Aiken Slide # 41 https://anythingawesome.com 2007 vs. 2019 maturity levels (scale 1–5) across: Data Program Coordination, Organizational Data Integration, Data Stewardship, Data Development, Data Support Operations. How Literate Are We? What is NAAL? • A nationally representative assessment of English literacy among American adults age 16 and older. NAAL ➜ PIAAC (Program for the International Assessment of Adult Competencies) • PIAAC assesses three key competencies for 21st-century society and the global economy • No statistically significant differences from 2012/14 to 2017! © Copyright 2022 by Peter Aiken Slide # 42 https://anythingawesome.com https://nces.ed.gov/surveys/piaac/current_results.asp • Literacy: the ability to understand, use, and respond appropriately to written texts. • Numeracy: the ability to use basic mathematical and computational skills. • Digital Problem Solving: the ability to access and interpret information in digital environments to perform practical tasks.
  • 34. Strategy Guides Workgroup Activities © Copyright 2022 by Peter Aiken Slide # A pattern in a stream of decisions (Mintzberg) 43 https://anythingawesome.com Theory of Constraints - Generic © Copyright 2022 by Peter Aiken Slide # 44 https://anythingawesome.com 1. Identify the current constraint, the component of the system limiting goal realization 2. Alleviate: make quick improvements to the constraint using existing resources 3. Review other activities in the process to facilitate proper alignment with and support of the constraint 4. If the constraint persists, identify other actions to eliminate the constraint 5. Repeat until the constraint is eliminated
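The focusing steps above can be sketched as a simple loop. The `identify_constraint` and `improve` helpers, the throughput figures, and the 25% improvement factor are all assumptions for illustration; they are not part of the Theory of Constraints as published.

```python
# Illustrative sketch of the Theory of Constraints loop described above.
# Data shapes and the 25% gain per round are assumptions for the example.

def identify_constraint(throughput_by_step: dict[str, float]) -> str:
    """Step 1: the component limiting goal realization is the slowest step."""
    return min(throughput_by_step, key=throughput_by_step.get)

def improve(throughput_by_step: dict[str, float], goal: float, max_rounds: int = 10) -> dict[str, float]:
    """Steps 2-5: alleviate the constraint, re-check, and repeat until the goal is met."""
    for _ in range(max_rounds):
        if min(throughput_by_step.values()) >= goal:
            break  # no step constrains the goal any longer
        constraint = identify_constraint(throughput_by_step)
        # Quick improvement using existing resources (modelled here as a 25% gain)
        throughput_by_step[constraint] *= 1.25
    return throughput_by_step

steps = {"ingest": 100.0, "cleanse": 40.0, "publish": 90.0}
print(improve(steps, goal=80.0))  # "cleanse" is repeatedly improved until every step clears 80
```

Note the loop's structure mirrors the slide: the constraint is re-identified each round, so once one bottleneck is relieved, attention shifts to whatever limits the system next.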
  • 35. Strategy Example 1 © Copyright 2022 by Peter Aiken Slide # Good Guys (Us) Bad Guys (Them) 45 https://anythingawesome.com Strategy Example 2 © Copyright 2022 by Peter Aiken Slide # Good Guys (Us) Bad Guys (Them) 46 https://anythingawesome.com
  • 36. Strategy Example 3 © Copyright 2022 by Peter Aiken Slide # Good Guys (Us) Bad Guys (Them) 47 https://anythingawesome.com General Dwight D. Eisenhower © Copyright 2022 by Peter Aiken Slide # 48 https://anythingawesome.com “In preparing for battle I have always found that plans are useless, but planning is indispensable …” https://quoteinvestigator.com/2017/11/18/planning/
  • 37. © Copyright 2022 by Peter Aiken Slide # 49 https://anythingawesome.com Data Sandwich Data supply Data literacy Standard data Standard data Leverage point - high performance automation © Copyright 2022 by Peter Aiken Slide # Data literacy 50 https://anythingawesome.com Data supply
  • 38. Leverage point - high performance automation © Copyright 2022 by Peter Aiken Slide # Standard data Data supply Data literacy 51 https://anythingawesome.com Leverage point - high performance automation © Copyright 2022 by Peter Aiken Slide # This cannot happen without investments in engineering and architecture! 52 https://anythingawesome.com Data supply Data literacy Standard data Quality engineering/architecture work products do not happen accidentally!
  • 39. Quality data engineering/architecture work products do not happen accidentally! Leverage point - high performance automation © Copyright 2022 by Peter Aiken Slide # This cannot happen without investments in data engineering and architecture! 53 https://anythingawesome.com Data supply Data literacy Standard data Version 1 © Copyright 2022 by Peter Aiken Slide # 54 https://anythingawesome.com Data Strategy Data Governance BI/Warehouse Perfecting operations in 3 data management practice areas 1X 1X 1X Metadata Data Quality
  • 40. Version 2 © Copyright 2022 by Peter Aiken Slide # 55 https://anythingawesome.com Data Strategy Data Governance BI/Warehouse Perfecting operations in 3 data management practice areas Metadata 2X 2X 1X Version 3 © Copyright 2022 by Peter Aiken Slide # 56 https://anythingawesome.com Data Strategy Data Governance BI/Warehouse Reference & Master Data Perfecting operations in 3 data management practice areas 1X 3X 3X
  • 41. (Things that further) Organizational Strategy Lighthouse Projects Provide Focus © Copyright 2022 by Peter Aiken Slide # (Occasions to Practice) Needed Data Skills (Opportunities to improve) Data use by the business 57 https://anythingawesome.com A Musical Analogy © Copyright 2022 by Peter Aiken Slide # 58 https://anythingawesome.com + = https://www.youtube.com/watch?v=4n1GT-VjjVs&frags=pl%2Cwn
  • 42. © Copyright 2022 by Peter Aiken Slide # https://anythingawesome.com Program 59 • Motivation - Frustration–we are unsatisfied with current state - Are we making progress? (No) • How did we get here? (Building on proven research) - DoD ➜ SEI ➜ MITRE ➜ CMMI - Industry push for best practices • Ingredients - What is the Data Maturity Model? (DMM) - Body of Knowledge (DM BOK) • Understanding and applying them together - Weakest link in the chain architecture - Just a bit on strategy - Three legged stool - How does one get to Carnegie Hall? • Where to next? • Q & A? 1 + 1 = 11 Practicing Data Management Better Change Management & Leadership © Copyright 2022 by Peter Aiken Slide # 60 https://anythingawesome.com
  • 43. Diagnosing Organizational Readiness © Copyright 2022 by Peter Aiken Slide # adapted from the Managing Complex Change model by Lippitt, 1987 Culture is the biggest impediment to a shift in organizational thinking about data! 61 https://anythingawesome.com No cost, no registration case study download • Download – http://dl.acm.org/citation.cfm?doid=2888577.2893482 © Copyright 2022 by Peter Aiken Slide # 62 https://anythingawesome.com 8 EXPERIENCE: Succeeding at Data Management—BigCo Attempts to Leverage Data PETER AIKEN, Virginia Commonwealth University/Data Blueprint In a manner similar to most organizations, BigCompany (BigCo) was determined to benefit strategically from its widely recognized and vast quantities of data. (U.S. government agencies make regular visits to BigCo to learn from its experiences in this area.) When faced with an explosion in data volume, increases in complexity, and a need to respond to changing conditions, BigCo struggled to respond using a traditional, information technology (IT) project-based approach to address these challenges. As BigCo was not data knowledgeable, it did not realize that traditional approaches could not work. Two full years into the initiative, BigCo was far from achieving its initial goals. How much more time, money, and effort would be required before results were achieved? Moreover, could the results be achieved in time to support a larger, critical, technology-driven challenge that also depended on solving the data challenges? While these questions remain unaddressed, these considerations increase our collective understanding of data assets as separate from IT projects. Only by reconceiving data as a strategic asset can organizations begin to address these new challenges. Transformation to a data-driven culture requires far more than technology, which remains just one of three required “stool legs” (people and process being the other two). 
Seven prerequisites to effectively leveraging data are necessary, but insufficient awareness exists in most organizations—hence, the widespread misfires in these areas, especially when attempting to implement the so-called big data initiatives. Refocusing on foundational data management practices is required for all organizations, regardless of their organizational or data strategies. Categories and Subject Descriptors: H.2.0 [Information Systems]: Database Management—General; E.0 [Data]: General General Terms: Management, Performance, Design Additional Key Words and Phrases: Data management, data governance, data stewardship, organizational design, CDO, CIO, chief data officer, chief information officer, data, data architecture, enterprise data exec- utive, IT management, strategy, policy, enterprise architecture, information systems, conceptual modeling, data integration, data warehousing, analytics, and business intelligence, BigCo ACM Reference Format: Peter Aiken. 2016. Experience: Succeeding at data management—BigCo attempts to leverage data. J. Data and Information Quality 7, 1–2, Article 8 (May 2016), 35 pages. DOI: http://dx.doi.org/10.1145/2893482 1. CASE INTRODUCTION Good technology in the hands of an inexperienced user rarely produces positive results. Everyone wants to “leverage” data. Today, this is most often interpreted as invest- ments in warehousing, analytics, business intelligence (BI), and so on. After all, that is what you do with an asset—you leverage it—so the asset can help you to attain strategic objectives; see Redman [2008] and Ladley [2010]. Widespread and pervasive Author’s address: P. Aiken, 10124C West Broad Street, Glen Allen, VA 23060; email: peter.aiken@vcu.edu. 
  • 44. 1. Data volume is still increasing faster than we are able to process it, 2. Data interchange overhead and other costs of poor data practices are measurably sapping organizational and individual resources–and therefore productivity, 3. Reliance on existing technology-based approaches and education methods has not materially addressed this gap between creation and processing or reduced bottom-line costs, & 4. There exists an industry type whose sole purpose is to extract data from citizens and then use it to make money. Big changes 1. Process is more important than results at first 2. Failure is itself a lesson 3. People and process aspects are not receiving enough attention 4. Best practices do exist © Copyright 2022 by Peter Aiken Slide # 63 https://anythingawesome.com © Copyright 2022 by Peter Aiken Slide # 64 https://anythingawesome.com • Motivation - Frustration–we are unsatisfied with current state - Are we making progress? (No) • How did we get here? (Building on proven research) - DoD ➜ SEI ➜ MITRE ➜ CMMI - Industry push for best practices • Ingredients - What is the Data Maturity Model? (DMM) - Body of Knowledge (DM BOK) • Understanding and applying them together - Weakest link in the chain architecture - Just a bit on strategy - Three legged stool - How does one get to Carnegie Hall? • Where to next? • Q & A? Program Practicing Data Management Better 1 + 1 = 11
  • 45. Upcoming Events Data Strategy Best Practices 10 January 2023 Data Modeling Fundamentals 14 February 2023 Data Stewards - Defining and Assigning 14 March 2023 © Copyright 2022 by Peter Aiken Slide # 65 https://anythingawesome.com Brought to you by: Time: 19:00 UTC (2:00 PM NYC) | Presented by: Peter Aiken, PhD Note: In this .pdf, clicking any webinar title opens the registration link Event Pricing © Copyright 2022 by Peter Aiken Slide # 66 https://anythingawesome.com • 20% off directly from the publisher on select titles • My Book Store @ http://plusanythingawesome.com • Enter the code "anythingawesome" at the Technics bookstore checkout where it says to "Apply Coupon"
  • 46. Peter.Aiken@AnythingAwesome.com +1.804.382.5957 Thank You! © Copyright 2022 by Peter Aiken Slide # 67 Book a call with Peter to discuss anything - https://anythingawesome.com/OfficeHours.html Critical Design Review? Hiring Assistance? Reverse Engineering Expertise? Executive Data Literacy Training? Mentoring? Tool/automation evaluation? Use your data more strategically? Independent Verification & Validation Collaboration?