Digital Intelligence
Unlocking the hidden potential in your data
Common Challenges
• Islands of data; disparate sources of information
• Outdated reporting and limited analytics
• Incoherent data management strategy
• Ever-expanding data footprint with associated costs
Benefits & Outcomes
• Ability to analyse previously unrelated data sets
• Single access point to cross-enterprise data with appropriate governance
• Firm-wide and transparent metadata structures and data journeys
• Forecast future trends based on historic data; predictive analytics
Data is everywhere. Data is powerful. Unprocessed data remains just that – data. Turn data into a
valuable source of information and intelligence.
Our Solutions
[Diagram: a data journey (Identify, Collate, Consolidate, Interpret) that takes a varied data landscape of legacy databases, documents, communications, GPS, and customer/sales data through Big Data tools and technology to yield information, analytics and visualisations, and ultimately knowledge.]
Key Facets to consider
• Good Data Attributes
• Systems
• Processes
• Discipline / Principles
• Project Considerations
• Platform Capabilities
Data Quality: What is ‘Good’ data?
• Relevance: Is the information relevant for its intended use?
• Validity: Does the information meet the requirements of the business?
• Accuracy: Is the information correct, with a traceable data lineage; can this be validated?
• Timeliness: Is the information up to date, and is it available as and when needed?
• Integrity: Does the information have a coherent, logical structure?
• Accessibility: Can the information be easily accessed and used in ways that align to business processes?
• Consistency: Can the same information be reproduced reliably each time it is requested?
• Completeness: Does the information answer all the questions that are being asked?
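By way of illustration, the sketch below shows how a few of these facets (completeness, validity, consistency, timeliness) might be expressed as automated checks. The dataset, column names, and business rules are all hypothetical, and pandas is assumed purely for convenience.

```python
import pandas as pd

# Hypothetical extract: order records pulled from a source system.
orders = pd.DataFrame({
    "order_id": [1001, 1002, 1003, 1003],
    "customer_id": ["C01", "C02", None, "C03"],
    "amount": [250.0, -15.0, 99.9, 42.0],
    "updated_at": pd.to_datetime(
        ["2024-01-05", "2024-01-06", "2023-06-01", "2024-01-06"]),
})

checks = {
    # Completeness: every order must carry a customer reference.
    "completeness": orders["customer_id"].notna().all(),
    # Validity: amounts must meet the business rule "non-negative".
    "validity": (orders["amount"] >= 0).all(),
    # Consistency: the same order_id should not appear twice.
    "consistency": not orders["order_id"].duplicated().any(),
    # Timeliness: records must have been refreshed in the last 90 days.
    "timeliness": (pd.Timestamp("2024-01-31") - orders["updated_at"]
                   <= pd.Timedelta(days=90)).all(),
}

for facet, passed in checks.items():
    print(f"{facet:13s}: {'PASS' if passed else 'FAIL'}")
```

In practice such checks would run against each inbound feed, with failures routed into the cleansing, reconciliation, and escalation processes described later.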
Good Data Systems
• Relevant (data and the ability to correctly answer the query)
• Secure
• Durable
• Robust (supports non-standard events, like loss of network, bad input, etc.)
• Scalable (can be updated easily to support high volumes)
• Evolvable (new functionality can be added at acceptable cost)
• Distribution patterns should be fit for purpose
• Data should be easy to create / distribute / reproduce
• Present coherently across multiple channels
• Search through data
• Build the capability to discover relations between disparate datasets (sketched below)
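As a toy illustration of that last point, the sketch below joins two hypothetical, previously unrelated extracts (a CRM table and a support-ticket table) on a shared customer key so that a cross-dataset question can be asked. All system, table, and column names are invented.

```python
import pandas as pd

# Hypothetical extracts from two previously unrelated systems.
crm = pd.DataFrame({
    "customer_id": ["C01", "C02", "C03"],
    "segment": ["retail", "corporate", "retail"],
})
support_tickets = pd.DataFrame({
    "ticket_id": [9001, 9002, 9003],
    "customer_id": ["C02", "C02", "C03"],
    "severity": ["high", "low", "high"],
})

# Relate the datasets on the shared customer key, then ask a
# cross-dataset question: high-severity tickets by customer segment.
joined = support_tickets.merge(crm, on="customer_id", how="left")
summary = (joined[joined["severity"] == "high"]
           .groupby("segment")["ticket_id"].count())
print(summary)
```

The hard part in real estates is establishing the shared key in the first place; that is where common metadata and golden sources (next slides) earn their keep.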
Good Data Processes
Data creation / update processes should:
• be consistent & repeatable
• be responsive to changing requirements
• be secure & accessible via multiple channels
• be compliant with regulations
• be traceable back to the source of the data (a minimal lineage sketch follows)
• reflect an understanding of the data lifecycle
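One lightweight way to make traceability concrete is to have every record carry an audit trail of the systems that touched it. The sketch below is a minimal, hypothetical illustration of that idea, not a substitute for a proper lineage or catalogue tool; all names are invented.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Record:
    """A data record that carries its own lineage trail."""
    payload: dict
    lineage: list = field(default_factory=list)

    def stamp(self, system: str, action: str) -> "Record":
        # Append an audit entry each time the record is touched.
        self.lineage.append({
            "system": system,
            "action": action,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return self

# A record created in one system and enriched in another keeps
# a traceable trail back to its origin.
rec = Record({"customer_id": "C01"}).stamp("crm", "created")
rec.payload["segment"] = "retail"
rec.stamp("enrichment-service", "segment added")
print(rec.lineage)
```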
Principles to adopt
• Data is a shared asset: share data across lines of business
• Publish & use golden sources; prevent redundant copies
• Provide scalable interfaces to share/access the data (virtualization, integration, data as a service; a minimal sketch follows)
• Enforce secure & audited access
• Create metadata & documentation (one vocabulary: product catalogs, provider hierarchies, and definitions all need to be common)
• Understand / document the relationship between data stores (logical / conceptual / physical models)
• Robust cleansing, reconciliation & escalation (think about non-functional requirements while planning data projects)
• Business buy-in: Process + Data + Technology stewards
“put in the extra discipline to maintain a healthy system”
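To make “publish a golden source behind a scalable interface” tangible, here is a deliberately minimal data-as-a-service sketch using Flask. The product catalogue, endpoint path, and field names are all hypothetical; a real service would add authentication, auditing, caching, and versioning.

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Hypothetical golden source: the single published copy of
# product reference data that every line of business reads.
GOLDEN_PRODUCTS = {
    "P100": {"name": "Savings Account", "category": "deposits"},
    "P200": {"name": "FX Forward", "category": "derivatives"},
}

@app.route("/products/<product_id>")
def get_product(product_id):
    # Consumers read via the service rather than copying the data,
    # which prevents redundant copies drifting out of sync.
    product = GOLDEN_PRODUCTS.get(product_id)
    if product is None:
        abort(404)
    return jsonify(product)

if __name__ == "__main__":
    app.run(port=8080)
```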
Project considerations
• Legacy technical debt / risk log / manual processes to be replaced?
• Do existing systems meet the above DQ requirements? Deltas / upgrades / consolidation?
• New business / regulatory problems to be targeted?
• Well-defined / tangible business benefit
• Priorities & budget constraints; costs measured against risk of failure
• Change people & processes to enforce governance
• Take measure of in-house talent / product vendor talent / independent partners
• Agile methodology (quick wins where possible to build confidence & support for the project)
• Agile / flexible to changing priorities, requirements, and budgets (business reality)
• Target data future state is a moving target: put a process in place to allow for this evolution
Platform Capabilities
• Visualization
• Analytics & reporting (big data, warehouse, marts, cubes, relational, NoSQL, graph)
• Streaming datasets (Spark / Storm / Lambda architecture, etc.; a streaming sketch follows)
• Learning & discovery (fraud detection, financial crime detection, AI, machine learning, etc.)
• Virtualization
• Integration patterns (API-driven, shared DB, real-time, microservices, REST/SOAP, webhooks, batch reports, CQRS, near caches, etc.)
• In-house infrastructure: production / DR / backup / development
• Cloud offerings for existing vendor product lines (best of breed, public, private, hybrid)
“discuss each one in more depth as needed”
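To ground the streaming bullet, below is a minimal Spark Structured Streaming sketch, assuming a local PySpark installation. Spark's built-in rate source stands in for a real feed such as Kafka, and the window size and app name are arbitrary choices for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# A minimal structured-streaming sketch: the built-in "rate" source
# generates timestamped rows in place of a real feed.
spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Count events in 10-second tumbling windows; the "speed layer" of a
# Lambda-style architecture serves low-latency aggregates like this.
counts = events.groupBy(F.window("timestamp", "10 seconds")).count()

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination(30)  # run briefly for the demo
query.stop()
```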
Virtualization
• Abstraction: Abstract the technical aspects of stored data, such as location, storage structure, API, access language, and storage technology.
• Virtualized Data Access: Connect to different data sources and make them accessible from a common logical data access point.
• Transformation: Transform, cleanse, reformat, and aggregate source data for consumer use.
• Data Federation: Combine result sets from across multiple source systems (a minimal sketch follows).
• Data virtualization software may include functions for development, operation, and/or management.
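As a toy illustration of federation, the sketch below exposes two in-memory SQLite databases (stand-ins for separate source systems) through one logical query function that merges their result sets. Real data virtualization platforms do this declaratively and at scale; the system and table names here are invented.

```python
import sqlite3

def load_source(rows: list[tuple]) -> sqlite3.Connection:
    """Stand up an in-memory database playing the role of one source system."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE trades (trade_id TEXT, notional REAL)")
    conn.executemany("INSERT INTO trades VALUES (?, ?)", rows)
    return conn

# Two hypothetical source systems holding the same logical entity.
sources = {
    "equities_db": load_source([("EQ-1", 100.0), ("EQ-2", 250.0)]),
    "fx_db": load_source([("FX-1", 1000.0)]),
}

def federated_trades() -> list[tuple]:
    """A single logical access point: query every source, merge the results."""
    results = []
    for system, conn in sources.items():
        for row in conn.execute("SELECT trade_id, notional FROM trades"):
            results.append((system, *row))
    return results

for row in federated_trades():
    print(row)
```

Note that the data stays in its source systems; the consumer sees one result set, which is the property the benefits on the next slide build on.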
Virtualization benefits
• Reduce risk of data errors
• Reduce systems workload by not moving data around
• Increase speed of access to data on a real-time basis
• Significantly reduce development and support time
• Increase governance and reduce risk through the use of policies
• Reduce data storage required
Discuss more details as needed
• Requirements
• Case Studies
• Customer recommendations
• Platform vendors
• Partner relationships & capabilities

Editor's Notes

#2: We can help you move from data to information, information to knowledge, and knowledge to wisdom. Whether you're just getting started or want to optimise your existing data, our Digital Intelligence service ensures your data is organised properly (data strategy / optimisation), implements tools that can analyse vast amounts of data (data analytics), and recommends tools for providing intuitive and informative data visualisation. Service lines: data strategy, big data optimisation, data analytics, data visualisation.