Performance management capability

Performance management business architecture, describing the process, data, organisation and data warehouse architecture required to deliver this capability.


Presentation Transcript

• Performance Measurement Capability: A Data Warehouse Business Architecture
• Performance Measurement Approaches: Balanced Scorecard; Activity Based Management. Reference: Robert S. Kaplan & David P. Norton, "Mastering the Management System", HBR, Jan 2008.
• Performance Management Capability: the performance management domain defines the set of capabilities supporting the extraction, aggregation, and presentation of information to facilitate decision analysis and business evaluation (FEA Consolidated Reference Model Document v2.3).
  • Analysis & Statistics: defines the mathematical and predictive modelling and simulation capabilities that support the examination of business issues, problems and their solutions.
  • Business Intelligence: defines the forecasting, performance monitoring, decision support and data mining capabilities that support information pertaining to the history, current status or future projections of an organization.
  • Visualization: defines the presentation capabilities that support the conversion of data into graphical or pictorial form.
  • Reporting: defines the ad hoc, standardised and multidimensional reporting capabilities that support the organization of data into useful information.
  • Data Management: defines the set of capabilities that support the usage, processing and general administration of structured and unstructured information.
• Business Measures, Customer Perspective: increase key account / high margin clients.
  • Measures: % Revenue by market segment; % Revenue by top 20 clients; % Revenue by client relationship; £ Sales revenue by market segment; number of new projects by top 20 clients; revenue by top 20 clients (client value).
  • Dimensions: Product, Time Period, Region, Employee, Customer.
  • Fact: £ Sales Income / Revenue; Calc. = quantity × price; Target; Alert Threshold.
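
To make the measure specification concrete, here is a minimal Python sketch of a business measure with its calculation, target, alert threshold and dimensions. The class, names and figures are illustrative assumptions, not part of the original deck.

```python
from dataclasses import dataclass

@dataclass
class BusinessMeasure:
    """A business measure (fact) with its formula, target and alert threshold."""
    name: str
    dimensions: list          # dimensions the measure can be analysed by
    target: float             # planned value for the reporting period
    alert_threshold: float    # value below which an alert is raised

    def calculate(self, quantity: float, price: float) -> float:
        # Calc. = quantity x price, as on the slide
        return quantity * price

    def status(self, actual: float) -> str:
        if actual < self.alert_threshold:
            return "alert"
        return "on target" if actual >= self.target else "below target"

# Hypothetical example: £ Sales Income / Revenue sliced by the slide's dimensions
sales_revenue = BusinessMeasure(
    name="Sales Income / Revenue (GBP)",
    dimensions=["Product", "Time Period", "Region", "Employee", "Customer"],
    target=1_000_000.0,
    alert_threshold=800_000.0,
)

actual = sales_revenue.calculate(quantity=4_200, price=250.0)
print(actual, sales_revenue.status(actual))
```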
• Data Warehouse Architecture: layered view.
  • 1. BI Presentation: presentation layer delivering standard reports and analytics (e.g. % Revenue by market segment, % Revenue by top 20 clients, % Revenue by client relationship).
  • 2. Metadata Repository.
  • 3. Data Warehouse: ODS and data marts.
  • 4. Reconciliation Process: extract, transform and load (ETL).
  • 5. Operational Systems.
• Data Warehouse Architecture: BI presentation layer detail. The presentation layer provides ad hoc query, metadata, standard reports and analytics over the same layered architecture (metadata repository, data warehouse ODS and data marts, reconciliation process, operational systems).
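
A minimal, hypothetical sketch of how data might move through the numbered layers, with the metadata repository capturing lineage along the way. The source systems, field names and figures are invented for illustration.

```python
# 5. Operational Systems -> 4. Reconciliation (ETL) -> 3. Data Warehouse
# (ODS / data mart) -> 1. BI Presentation, with 2. Metadata Repository
# recording what happened at each step.
operational_systems = {
    "crm":     [{"client": "acme ltd", "segment": "Retail", "revenue": 120.0}],
    "billing": [{"client": "ACME LTD", "segment": "retail", "revenue": 80.0}],
}
metadata_repository = []   # lineage notes captured as the data moves through

def reconcile(records):
    """Resolve inconsistencies (case, spelling) into a common data model."""
    cleaned = [{**r, "client": r["client"].title(), "segment": r["segment"].title()}
               for r in records]
    metadata_repository.append(f"reconciled {len(cleaned)} records")
    return cleaned

# Reconciliation process: extract from each source, transform, load into the ODS
ods = [row for source in operational_systems.values() for row in reconcile(source)]

# Data mart: summarise the ODS by a reporting dimension (market segment)
revenue_by_segment = {}
for row in ods:
    revenue_by_segment[row["segment"]] = revenue_by_segment.get(row["segment"], 0) + row["revenue"]

# BI presentation layer: a standard report over the data mart
total = sum(revenue_by_segment.values())
for segment, revenue in revenue_by_segment.items():
    print(f"% Revenue for {segment}: {100 * revenue / total:.1f}%")
```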
• Reference Architecture Components (after Bill Inmon and Gartner):
  • Business Intelligence Presentation Layer: responsible for providing tools for delivering ad hoc, standard and analytical reporting. The reporting tools available fall under the business intelligence (BI) umbrella; they support access to and analysis of information to improve and optimize decisions and performance, i.e. data mining, analytical processing, reporting and querying data.
  • Information Catalogue: the information catalogue (data dictionary) component is responsible for maintaining the definition of data and its lineage from the source systems through to the data warehouse. This includes data definitions, data mappings and transformations conducted on the data.
  • Data Warehouse Data Mart: responsible for delivering line of business, departmental and individual information needs and key performance indicators. These information needs are reported as facts, allowing the data to be reported against standard dimensions such as customer segment, product, organisation structure, location and time.
  • Data Warehouse Operational Data Store: the operational data store (ODS) component is responsible for holding historic atomic data extracted from operational systems. This data is held in non-redundant third normal form arranged by subject area. It contains static, near-current data which is refreshed on a regular basis from the source operational systems, e.g. daily, weekly or monthly, and is used to support all decision support reporting needs.
  • Data Acquisition Extract, Transform & Load: the data reconciliation component is responsible for data acquisition and for resolving inconsistencies and discrepancies between common data elements stored across the source systems, e.g. reference codes, spelling and field lengths. The reconciliation process is conducted in a separate staging area where the extracted data is reformatted, transformed and integrated into an agreed common data model.
  • Operational Systems: the transactional processing systems used to support the business operations of the enterprise. These operational systems provide the primary data used for decision support and reporting; this data is dynamic and constantly changing with each business transaction.
• BI: Data Quality Scorecard Specification Approach.
  • Business measure (information need), data quality types: 1. Actual; 2. Target ± tolerance.
  • Dimensions: Agency, Data Item, Location, Channel, Attribute, Post Code, Segment, Entity, Statistical Area, Organisation, Data Collection, Outlet.
  • Calculations: % master data duplication; % collection submission data completeness; % data item accuracy; % consistency across data sets; statutory timeline aging of collection receipts.
  • Time dimension: weekly, monthly, year to date.
  • Atomic data: Agency, Agent, Collection, Data Item, Attribute, Entity, Reporting Period, Data Submission, Validation Result, Rule.
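
As an illustration of the scorecard calculations, the sketch below computes two of the listed measures, % master data duplication and % data item accuracy, and compares one against a target ± tolerance. The records, validation rule and thresholds are invented for the example.

```python
import re

def pct_master_data_duplication(records, key):
    """% of master data records that are duplicates on the given key."""
    seen, duplicates = set(), 0
    for record in records:
        value = record[key]
        duplicates += value in seen
        seen.add(value)
    return 100.0 * duplicates / len(records)

def pct_data_item_accuracy(records, key, is_valid):
    """% of records whose data item passes the supplied validation rule."""
    return 100.0 * sum(is_valid(r[key]) for r in records) / len(records)

def within_target(actual, target, tolerance):
    """Actual vs target ± tolerance, per the scorecard's data quality types."""
    return abs(actual - target) <= tolerance

submissions = [
    {"entity_id": "E001", "post_code": "SW1A 1AA"},
    {"entity_id": "E002", "post_code": "INVALID"},
    {"entity_id": "E001", "post_code": "EC2M 7PY"},   # duplicate entity
]

dup = pct_master_data_duplication(submissions, "entity_id")
acc = pct_data_item_accuracy(
    submissions, "post_code",
    lambda v: re.fullmatch(r"[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}", v) is not None)
print(f"duplication={dup:.1f}%  accuracy={acc:.1f}%  "
      f"on_target={within_target(dup, target=0.0, tolerance=5.0)}")
```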
• Summarised Data Store: Modelling Approach (after Ralph Kimball).
  • Business measure data model: identify the business measure (fact); define the measure formulae; identify the measure dimensions; identify the measure source data (entity, attributes); maintain the measure/dimension affinity matrix.
  • Business measure database design: design the summarised database (star schema or snowflake schema); prepare the use case specification.
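
A minimal sketch of what the star schema design step could produce for the earlier sales revenue measure, using SQLite from the Python standard library. The table and column names are illustrative assumptions rather than the deck's design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, segment TEXT);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_period   (period_key   INTEGER PRIMARY KEY, year INTEGER, month INTEGER);

-- Fact table: one row per grain (customer x product x period), measures only
CREATE TABLE fact_sales_revenue (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    period_key   INTEGER REFERENCES dim_period(period_key),
    quantity     REAL,
    revenue      REAL          -- derived as quantity * price at load time
);
""")

# Measure / dimension affinity matrix: which dimensions apply to which measure
affinity = {"sales_revenue": {"Customer", "Product", "Time Period"}}

print([r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
print(affinity)
```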
• Operational Data Store: Modelling Approach.
  • High level data model: list in-scope entities (object, place, resource or event); all entities at the same level of abstraction; entity relational model structured by subject areas; defines the scope of integration.
  • Mid level data model (DIS): third normal form ERD; remove repeating groups; all attributes are dependent on the primary key; resolve all M:M relationships; add sub-types where relevant; includes all data elements (data item set); primitive data elements only, no derived data.
  • Low level physical model: derived from the DIS; identify primary keys; add alternate keys; define physical fields (description, field type and size, default values, value constraints, null value support); identification of the system of record for all fields (data mapping); definition of access method (sequential or random); process data mapping (frequency and fields used).
  • Reference: Bill Inmon, "Information Engineering for the Practitioner", Yourdon Press, Englewood Cliffs, N.J., 1988.
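
To illustrate one of the mid level (DIS) rules, the sketch below resolves a many-to-many relationship between Agency and Collection with an associative entity holding only primitive attributes. The entity and attribute names are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Agency:
    agency_id: str            # primary key
    name: str

@dataclass
class Collection:
    collection_id: str        # primary key
    subject_area: str

@dataclass
class AgencyCollection:
    # Associative entity: the composite key resolves the M:M relationship
    # between Agency and Collection; only primitive data, no derived fields.
    agency_id: str
    collection_id: str
    submission_due_date: str

assignments = [AgencyCollection("A01", "C10", "2024-06-30"),
               AgencyCollection("A02", "C10", "2024-06-30")]
print([(a.agency_id, a.collection_id) for a in assignments])
```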
• Reconciliation Process: Data Acquisition Approach.
  • Data mapping: identify source system fields; map source fields to the target data model; define data transformation rules; determine interface services; prepare the use case specification.
  • Data quality: determine a quality grading scheme (e.g. platinum, gold, silver); define data quality measures; define quality measure formulae; identify quality measure dimensions; identify quality measure source data (entity, attribute).
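
A small, hypothetical sketch of the data mapping step: source fields are mapped to the target data model with a transformation rule each, and the transformed record is graded with the platinum/gold/silver scheme. Field names and grading criteria are illustrative assumptions.

```python
field_mappings = [
    # (source field, target field, transformation rule)
    ("cust_nm", "customer_name",  lambda v: v.strip().title()),
    ("pstcd",   "post_code",      lambda v: v.replace(" ", "").upper()),
    ("seg_cd",  "market_segment", lambda v: {"R": "Retail", "C": "Corporate"}.get(v, "Unknown")),
]

def transform(source_record):
    """Apply the mapping rules to one extracted record."""
    return {target: rule(source_record[source])
            for source, target, rule in field_mappings}

def quality_grade(record):
    """Grade a transformed record: platinum = fully populated and recognised."""
    populated = all(record.values())
    recognised = record["market_segment"] != "Unknown"
    if populated and recognised:
        return "platinum"
    return "gold" if populated else "silver"

row = transform({"cust_nm": "  acme ltd ", "pstcd": "sw1a 1aa", "seg_cd": "R"})
print(row, quality_grade(row))
```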
• The Solution: Data Validation ETL Use Cases. Actors: Data Collection Custodian, Agency, Help Desk. Use cases: Monitor Data Quality KPIs; Maintain Reference Data; Assign Agency Collection; Maintain Agency; Map Entity Collection Data; Define Validation Rule; Load Data Submission; Validate Data Submission; Notify Late Collection Submission (against the agency submission due date); Assign Data Item Rules; Turn Off Agency Rule; Record Submission Exemptions.
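
The sketch below illustrates three of these use cases in miniature (Define Validation Rule, Assign Data Item Rules, Validate Data Submission), including turning a rule off for a particular agency. The rule names and data items are invented for the example.

```python
validation_rules = {}      # rule name -> predicate        (Define Validation Rule)
data_item_rules = {}       # data item -> [rule names]     (Assign Data Item Rules)
disabled_rules = set()     # (agency, rule name) pairs     (Turn Off Agency Rule)

def define_rule(name, predicate):
    validation_rules[name] = predicate

def assign_rule(data_item, rule_name):
    data_item_rules.setdefault(data_item, []).append(rule_name)

def validate_submission(agency, submission):
    """Validate each data item of a submission and return the failures."""
    failures = []
    for data_item, value in submission.items():
        for rule_name in data_item_rules.get(data_item, []):
            if (agency, rule_name) in disabled_rules:
                continue
            if not validation_rules[rule_name](value):
                failures.append((data_item, rule_name))
    return failures

define_rule("not_blank", lambda v: v not in (None, ""))
define_rule("non_negative", lambda v: isinstance(v, (int, float)) and v >= 0)
assign_rule("entity_count", "non_negative")
assign_rule("entity_count", "not_blank")

print(validate_submission("Agency A", {"entity_count": -5}))
```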
• Data Acquisition ETL Specification Model: use cases grouped by operation and configuration.
  • 1. Operation: Load Data Submission; Validate Data Submission; Report Data Quality KPIs.
  • 2. Configuration: Maintain Reference Data; Maintain Agency; Assign Agency Collection; Map Collection to Entity Model; Define Validation Rule; Assign Data Item Rules; Turn Off Agency Rule; Record Submission Exemptions.
• Contact: technology architecture and solutions are justified at a strategic and financial level by preparing a business case.