The document discusses the Department of Defense's (DoD) Data Reference Model (DRM) and Net-Centric Data Strategy. The DRM provides a framework to enable information sharing across agencies by abstracting data sources and decoupling applications. It shifts data architecture from entity relationship modeling to a business context approach. The Net-Centric Data Strategy aims to break down barriers to information sharing by tagging data with metadata, organizing data with taxonomies through Communities of Interest, and defining data models and structures. An example is provided of how the Blue Force Tracking Community of Interest implements the strategy by tagging its data assets and organizing them to be discoverable, interoperable, accessible, and understandable across the DoD enterprise.
1. Policy Awareness & Data Reference Model (DRM)
Amit K. Maitra
AF CIO-Architecture
Inter-Agency DRM Working Group
March 21, 2005
2. CONTEXT
Global Environment
Changing Technologies
Revolutionary Moments: The Mandate
The Current Situation
The Solution: The DRM
The Architecture
The Structure
The Tools
Federated Data Management Approach
The Result
Paradigm Shift
Concerns
Leadership at DoD
Decisions: Net-Centric Data Strategy & Communities of Interest
Processes: NCDS & COI
Example: Blue Force Tracking
3. Underlying Theme
Fully integrated information systems for a shared data environment
4. Focus
Information, Access, Authorization, Emerging Technologies
Data Accessibility, Commonality, and Compatibility Design
Data Dictionary
Data Locale
Security & Privacy Assurance
5. Global Environment
Characteristics:
Geographically distributed, dissimilar elements of varying capabilities and responsibilities
Data distributed to and redistributed among system facilities, interconnected by both private and shared public communications networks
6. Changing Technologies
A Gentle Transition from XML to the Resource Description Framework (RDF)
The purpose of RDF is to give a standard way of specifying data "about" something
Advantages of using RDF:
If widely used, RDF will help make XML more interoperable
Promotes the use of standardized vocabularies: standardized types (classes) and standardized properties
Provides a structured approach to designing XML documents
The RDF format is a regular, recurring pattern
Quickly identifies weaknesses and inconsistencies of non-RDF-compliant XML designs
Helps us better understand our data!
Positions data for the Semantic Web!
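The "data about something" idea can be made concrete with a small sketch. Every RDF statement is a (subject, predicate, object) triple, and predicates come from standardized, URI-named vocabularies; the Dublin Core namespace below is a real standard vocabulary, while the asset namespace and asset names are hypothetical examples.

```python
# Minimal sketch of the RDF model: statements are (subject, predicate,
# object) triples, and predicates come from standardized vocabularies.
DC = "http://purl.org/dc/elements/1.1/"   # standard Dublin Core namespace
EX = "http://example.mil/assets/"          # hypothetical asset namespace

triples = [
    (EX + "track-feed", DC + "title",   "Blue Force Track Feed"),
    (EX + "track-feed", DC + "creator", "Blue Force Tracking COI"),
    (EX + "track-feed", DC + "date",    "2005-03-21"),
]

def describe(subject: str) -> dict:
    """Collect everything asserted 'about' one subject."""
    return {pred: obj for subj, pred, obj in triples if subj == subject}

print(describe(EX + "track-feed")[DC + "title"])   # Blue Force Track Feed
```

Because every statement has the same regular shape, any consumer that understands the vocabulary can interpret data from any producer, which is what makes RDF-style tagging more interoperable than ad hoc XML designs.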
7. Changing Technologies: Web Ontology Language (OWL)
RDF has limited expressive capability -- mostly limited to taxonomic descriptions
The things we model have complex relationships, so we need to capture many different facets, or restrictions, on class and property descriptions
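One way to see what OWL adds beyond RDF's taxonomies is a restriction check. The sketch below is not OWL itself; it only illustrates, over plain triples, the kind of constraint OWL can state declaratively (e.g., "every Unit has exactly one location"). All names (the namespace, `Unit`, `location`, the unit identifiers) are hypothetical.

```python
# Hedged sketch: OWL-style restrictions go beyond RDF taxonomies by
# constraining class and property descriptions, e.g. a cardinality
# restriction "every Unit has exactly one location".
from collections import Counter

EX = "http://example.mil/ns/"   # hypothetical namespace

triples = [
    (EX + "alpha", "rdf:type",      EX + "Unit"),
    (EX + "alpha", EX + "location", "34.05N 118.24W"),
    (EX + "bravo", "rdf:type",      EX + "Unit"),   # bravo has no location
]

def violates_exactly_one(prop: str, cls: str) -> list:
    """Return members of `cls` that do not have exactly one value for `prop`."""
    members = [s for s, p, o in triples if p == "rdf:type" and o == cls]
    counts = Counter(s for s, p, o in triples if p == prop)
    return [m for m in members if counts[m] != 1]

print(violates_exactly_one(EX + "location", EX + "Unit"))  # bravo violates
```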
8. Revolutionary Moments: The Mandate
"Our success depends on agencies working as a team across traditional boundaries to serve the American people, focusing on citizens rather than individual agency needs." ~ President George W. Bush
9. The Current Situation
The Federal Government is less than efficient in performing its business and meeting customer needs due to data sharing inefficiencies caused by stove-piped data boundaries.
Primary Issues and Information Sharing Barriers:
No common framework or methodology to describe the data and information that supports the processes, activities, and functions of the business
No definition of the handshake or partnering aspects of information exchange
Existing systems offer diffused content that is difficult to manage, coordinate, and evolve
Information is inconsistent and/or classified inappropriately
Without a common reference, data is easier to duplicate than integrate
No common method to share data with external partners
Limited insight into the data needs of agencies outside the immediate domain
Data and information context is rarely defined
Stove-piped boundaries, no central registry
Lack of funding and incentive to share
Data sensitivity and security of data
New laws/issues result in the continuous adding of databases that cannot share data
[Illustrative "As Is State" diagram: stove-piped data and information sets within agencies such as HHS, CDC, DHS, TSA, USDA, DOI, Energy, Labor, FDA, INS, and industry]
10. The Solution: The Data Reference Model (DRM)
Structure: Subject Area, Data Object, Data Property, Data Representation, Data Classification
The DRM provides:
A framework to enable horizontal and vertical information sharing that is independent of agencies and supporting systems
A framework to enable agencies to build and integrate systems that leverage data from within or outside the agency domain
A framework that facilitates opportunities for sharing with citizens, external partners, and stakeholders
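The DRM's structural elements (Subject Area, Data Object, Data Property, Data Representation, Data Classification) could be modeled roughly as the sketch below; the specific subject area, object, and property values shown are hypothetical examples, not part of the DRM itself.

```python
# Rough sketch of the DRM's structural elements. The example content
# (Logistics, Shipment, shipDate, ...) is hypothetical.
from dataclasses import dataclass, field

@dataclass
class DataProperty:
    name: str
    representation: str      # Data Representation, e.g. "ISO 8601 date"

@dataclass
class DataObject:
    name: str
    classification: str      # Data Classification, e.g. "Unclassified"
    properties: list = field(default_factory=list)

@dataclass
class SubjectArea:
    name: str
    objects: list = field(default_factory=list)

logistics = SubjectArea("Logistics", objects=[
    DataObject("Shipment", "Unclassified", properties=[
        DataProperty("shipDate", "ISO 8601 date"),
        DataProperty("origin", "ISO 3166 country code"),
    ]),
])

# Because the structure is uniform, any agency's catalog can be walked
# the same way, independent of the system that produced it.
for obj in logistics.objects:
    print(obj.name, [p.name for p in obj.properties])
```

This uniformity is what lets the framework stay independent of agencies and supporting systems: sharing partners agree on the shape of the description, not on each other's internal schemas.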
11. The Architecture
MODEL DRIVEN ARCHITECTURE
A virtual representation of all physical data sources:
- Applications are to be decoupled from data sources
- Details of data storage and retrieval are to be abstracted
- The representation is to be easily extended to new information sources
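The three architecture bullets can be sketched as a small virtualization layer: applications query one logical interface, storage details hide behind it, and new sources plug in without touching the applications. The source classes and sample records below are hypothetical.

```python
# Sketch of the virtualization idea: applications couple to one logical
# catalog; concrete sources (all hypothetical here) plug in behind it.
from abc import ABC, abstractmethod

class DataSource(ABC):
    @abstractmethod
    def query(self, key: str):
        """Storage and retrieval details stay hidden behind this method."""

class AgencyDatabase(DataSource):
    def __init__(self, rows): self.rows = rows
    def query(self, key): return self.rows.get(key)

class PartnerFeed(DataSource):
    def __init__(self, feed): self.feed = feed
    def query(self, key): return self.feed.get(key)

class VirtualCatalog:
    """The single virtual representation applications query."""
    def __init__(self): self.sources = []
    def register(self, source: DataSource):   # new sources extend the view
        self.sources.append(source)
    def query(self, key: str):
        for source in self.sources:
            result = source.query(key)
            if result is not None:
                return result
        return None

catalog = VirtualCatalog()
catalog.register(AgencyDatabase({"shipment-42": "in transit"}))
catalog.register(PartnerFeed({"flight-7": "delayed"}))
print(catalog.query("flight-7"))   # delayed
```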
12. The Structure
META OBJECT FACILITY
14. Department of Homeland Security and Federated Data Management Approach
15. The Result: Interagency Information Federation
16. Paradigm Shift
MDA is a fundamental change
MDA rests on MOF
It is the best architecture for integration
It shifts data architecture from Entity Relationship Diagramming (ERD) to a Business Context (Interoperability/Information Sharing)
Business & Performance Driven Approach
17. Concerns
To what extent are government agencies, customers, and partners willing to participate along the Lines of Business (LOB), thereby underscoring the importance of working toward a common goal: collective action in accordance with (IAW) national security/national interest criteria?
These need to be tested and validated against uniquely tailored performance indicators: Inputs, Outputs, and Outcomes
Leadership at DoD
• Decisions
• Processes
Decisions
“Net-Centric Data Strategy
& Communities of Interest
(COI)”
Processes:
The DoD Net-Centric Data Strategy aims at breaking down barriers to information sharing…
[Figure: barriers separate end-user consumers from end-user producers across Organizations "A", "B", and "C".
Consumers ask: "What data exists?" "How do I access the data?" "How do I know this data is what I need?" "How can I tell someone what data I need?"
Producers ask: "How do I share my data with others?" "How do I describe my data so others can understand it?"
A user may be unaware that some data exists; may know data exists but be unable to access it because of organizational and/or technical barriers; or may know data exists and be able to access it, yet not know how to make use of it due to a lack of understanding of what the data represents.]
The Net-Centric Data Strategy is a key enabler of the Department’s transformation...
The Strategy describes key goals for achieving net-centric data management…
• The Strategy (signed May 9, 2003) provides the foundation for managing the Department’s data in a net-centric environment, including:
- Ensuring data are visible, accessible, and understandable when needed and where needed to accelerate decision making
- “Tagging” of all data (intelligence, non-intelligence, raw, and processed) with metadata to enable discovery by known and unanticipated users in the Enterprise
- Posting of all data to shared spaces for users to access except when limited by security, policy, or regulations
- Organizing around Communities of Interest (COIs) that are supported by Warfighting, Business, Enterprise Information Environment, and Intelligence Mission Areas and their respective Domains.
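The "tag and post to shared spaces" goals above can be sketched as follows. This is a hedged illustration only (the function names and metadata fields are hypothetical, not the DoD Discovery Metadata Specification): an asset posted to a shared space carries metadata tags, and an unanticipated user discovers it by searching the metadata rather than knowing its producer.

```python
# Shared space: assets become discoverable through their metadata tags
shared_space = []

def post_asset(content: str, metadata: dict) -> None:
    """Tag an asset with metadata and post it to the shared space."""
    shared_space.append({"content": content, "metadata": metadata})

def discover(**criteria) -> list:
    """Find assets whose metadata matches all given criteria."""
    return [a["content"] for a in shared_space
            if all(a["metadata"].get(k) == v for k, v in criteria.items())]

post_asset("track-feed-001",
           {"coi": "BFT", "type": "position", "classification": "S"})
post_asset("log-feed-042",
           {"coi": "Logistics", "type": "inventory", "classification": "U"})

# An unanticipated user finds the asset without knowing who produced it
print(discover(coi="BFT", type="position"))  # -> ['track-feed-001']
```

The point of the sketch is that visibility is a property of the metadata, not of the application that produced the data: any consumer who can express the COI-defined tags can discover the asset.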
COIs are a key ‘implementer’ of data strategy goals…
Key Goals:
- Make Data Visible
- Make Data Accessible
- Enable Data to be Understandable
- Enable Data Interoperability
- Enable Data to be Trusted
Key COI Actions:
- Tag data assets with COI-defined metadata that enables them to be searched (visible)
- Organize data assets using taxonomies developed by experts within the COI
- Define the structure and business rules for operating with data and information (e.g., define data models, schema, interfaces)
- Identify, define, specify, model, and expose data assets to be reused by the Enterprise as services
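The COI actions above amount to publishing a shared vocabulary. As a sketch under stated assumptions (the taxonomy nodes, schema fields, and business rule below are all hypothetical), a COI's work product can be reduced to a taxonomy for organizing assets plus a schema that fixes structure and enforces a business rule:

```python
# Hypothetical COI taxonomy: categories defined by experts in the community
TAXONOMY = {
    "Track": ["Ground", "Air", "Maritime"],
    "Report": ["Status", "Position"],
}

# Hypothetical COI schema: the structure every conforming asset must have
SCHEMA = {"required": ["track_id", "category", "subcategory", "lat", "lon"]}

def validate(asset: dict) -> bool:
    """Enforce the COI-defined structure and vocabulary."""
    if any(f not in asset for f in SCHEMA["required"]):
        return False
    # Business rule: the subcategory must belong to the chosen taxonomy branch
    return asset["subcategory"] in TAXONOMY.get(asset["category"], [])

good = {"track_id": "T1", "category": "Track", "subcategory": "Air",
        "lat": 38.9, "lon": -77.0}
print(validate(good))                                  # -> True
print(validate({**good, "subcategory": "Submarine"}))  # -> False
```

Data that passes such validation is understandable and interoperable across the COI because every member interprets the same fields against the same taxonomy.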
Blue Force Tracking (BFT) COI Example
Implementation of the Data Strategy…
[Figure: BFT content providers — FBCB2/EPLRS Tactical Internet (FBCB2, JVMF, IP/MCG), an air feed via ADSI (TADIL-J / L-16), an FBCB2/MTS/L-Band ground station (JVMF, MTS, RM), an MMC ground station (RM), and MDACT/USMC (EPLRS/CNR, IOW, VDX, IP) — each expose a BFT service (PI/CI) publishing XML over SOAP onto a Web Services Info Grid. NCES integration supplies service discovery, security, messaging, and ESM. The enterprise BFT service (www.bft.smil) provides ad/sub propagation, query, info delivery, filtering, and QoS consolidation to BFT service consumers: an efficient “on-demand” info service.]
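The pattern in the figure — each provider wrapping its native feed (JVMF, TADIL-J, etc.) in a common XML payload carried over SOAP — can be sketched as below. The element names are hypothetical, not the actual BFT COI schema; the point is that disparate native formats converge on one enterprise shape.

```python
import xml.etree.ElementTree as ET

def to_bft_xml(track_id: str, lat: float, lon: float, source: str) -> str:
    """Translate a native track report into a common XML payload."""
    track = ET.Element("bftTrack", {"source": source})
    ET.SubElement(track, "id").text = track_id
    pos = ET.SubElement(track, "position")
    ET.SubElement(pos, "lat").text = str(lat)
    ET.SubElement(pos, "lon").text = str(lon)
    return ET.tostring(track, encoding="unicode")

# A JVMF-fed provider and a Link-16-fed provider emit the same shape
msg = to_bft_xml("T-100", 38.87, -77.05, "FBCB2/EPLRS")
print(msg)
```

In the real architecture this payload would travel inside a SOAP envelope to the info grid; the sketch stops at the shared XML shape, which is what makes the feeds interchangeable to consumers.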
Editor's Notes
The DRM provides a common, consistent way of categorizing and describing data to facilitate data sharing and integration.
A model contains data defining the characteristics of a system. This data is used as a representation of that system for the purposes of:
- conceptual understanding of the system
- controlling the exchange of information with the system
- controlling the presentation of system information to end users
The ‘data’ is typically called ‘metadata’ in this context.
MOF is hard to teach: it is too abstract to understand easily, but it is the underlying architecture for MDA. It is a “secret weapon”: an ideal modeling technology and the best integration architecture available. It will be incorporated into most IT infrastructure over the next 10 years, after 20 years of disparate platforms.
MOF is a language used to define metamodels. Metamodels define the language/constructs used to build models:
- Relational for information sources
- BPEL, BPMI for business processes
- XML Schema for XML documents
- UML for modeling applications
MOF metamodels are defined in terms of a common set of constructs (Package, Classes, Attributes, Associations, References, etc.), so all MOF metamodels can be related.
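The layering described above can be sketched informally (the class and function names are hypothetical, and a real MOF implementation is far richer): a stand-in for MOF's common constructs defines two metamodels, and because both are built from the same constructs they can be mechanically related.

```python
# M3 layer: a tiny stand-in for MOF's common constructs
class MofClass:
    def __init__(self, name: str, attributes: list):
        self.name = name
        self.attributes = attributes  # the shared "Attributes" construct

# M2 layer: two metamodels defined in terms of the same MOF constructs
relational_table = MofClass("Table", ["name", "columns"])
uml_class = MofClass("Class", ["name", "attributes"])

def relate(a: MofClass, b: MofClass) -> dict:
    """Relate two metamodels by pairing their shared constructs."""
    n = min(len(a.attributes), len(b.attributes))
    return {a.attributes[i]: b.attributes[i] for i in range(n)}

# Because both metamodels use the same constructs, a mapping exists
print(relate(relational_table, uml_class))
```

The design point is the one the slide makes: integration falls out of the common M3 layer, not out of pairwise adapters between every metamodel.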
MOF BENEFITS
- One modeling environment for information (data), logic, and process
- Models are relatable: common constructs in disparate models can be related
- The best integration architecture for model-driven execution engines