Transcript

  • 1. The Adoption of METIS GSBPM in Statistics Denmark
  • 2. Agenda
    • Background and context
    • Working with business processes
    • An example of documentation
    • Results of process analysis
    • Metadata coverage
    • Lessons learned
  • 3. Agenda
    • Background and context
    • Working with business processes
    • An example of documentation
    • Results of process analysis
    • Metadata coverage
    • Lessons learned
  • 4. Working group on standardisation
    • Multi-annual corporate strategy as basis ("Strategy 2015")
    • Working group that reports to the Board of Directors
    • METIS GSBPM adopted as common frame
    • Dual focus
      • Process analysis and documentation
      • Coverage of metadata systems
  • 5. The GSBPM model: phases and sub-processes
    • 1 Specify Needs: 1.1 Determine need for information; 1.2 Consult & confirm need; 1.3 Establish output objectives; 1.4 Identify concepts & variables; 1.5 Check data availability; 1.6 Prepare business case
    • 2 Design: 2.1 Design outputs; 2.2 Design variable descriptions; 2.3 Design data collection methodology; 2.4 Design frame & sample methodology; 2.5 Design statistical processing methodology; 2.6 Design production systems / workflows
    • 3 Build: 3.1 Build data collection instrument; 3.2 Build or enhance process components; 3.3 Configure workflows; 3.4 Test production systems; 3.5 Test statistical business process; 3.6 Finalize production system
    • 4 Collect: 4.1 Select sample; 4.2 Set up collection; 4.3 Run collection; 4.4 Finalize collection
    • 5 Process: 5.1 Integrate data; 5.2 Classify & code; 5.3 Validate & edit; 5.4 Impute; 5.5 Derive new variables & statistical units; 5.6 Calculate weights; 5.7 Calculate aggregates; 5.8 Finalize data files
    • 6 Analyse: 6.1 Prepare draft outputs; 6.2 Validate outputs; 6.3 Scrutinize & explain; 6.4 Apply disclosure control; 6.5 Finalize outputs
    • 7 Disseminate: 7.1 Update output systems; 7.2 Produce dissemination products; 7.3 Manage release of dissemination products; 7.4 Promote dissemination products; 7.5 Manage user support
    • 8 Archive: 8.1 Define archive rules; 8.2 Manage archive repository; 8.3 Preserve data & associated metadata; 8.4 Dispose of data & associated metadata
    • 9 Evaluate: 9.1 Gather evaluation inputs; 9.2 Conduct evaluation; 9.3 Agree action plan
    • Overarching processes: Quality management / Metadata management
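
    As a rough illustration (not taken from the slides), the hierarchy above can be held in a plain data structure, e.g. for tagging documentation or metadata with the GSBPM sub-process it belongs to. The Python sketch below is hypothetical and spells out only phases 4-7, the initial focus mentioned on slide 6.

        # Hypothetical representation of the GSBPM hierarchy (phases 4-7 only).
        GSBPM = {
            "4 Collect": ["4.1 Select sample", "4.2 Set up collection",
                          "4.3 Run collection", "4.4 Finalize collection"],
            "5 Process": ["5.1 Integrate data", "5.2 Classify & code",
                          "5.3 Validate & edit", "5.4 Impute",
                          "5.5 Derive new variables & statistical units",
                          "5.6 Calculate weights", "5.7 Calculate aggregates",
                          "5.8 Finalize data files"],
            "6 Analyse": ["6.1 Prepare draft outputs", "6.2 Validate outputs",
                          "6.3 Scrutinize & explain", "6.4 Apply disclosure control",
                          "6.5 Finalize outputs"],
            "7 Disseminate": ["7.1 Update output systems",
                              "7.2 Produce dissemination products",
                              "7.3 Manage release of dissemination products",
                              "7.4 Promote dissemination products",
                              "7.5 Manage user support"],
        }

        def phase_of(sub_process_code):
            """Return the phase a sub-process code such as '5.3' belongs to."""
            for phase, sub_processes in GSBPM.items():
                if any(s.startswith(sub_process_code + " ") for s in sub_processes):
                    return phase
            raise KeyError(sub_process_code)

        print(phase_of("5.3"))  # -> "5 Process"
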
  • 6. Reference document – "SD's METIS"
      • METIS: confirmed standard for official statistical production
      • Adopted by some of our peers
      • Translation of document
      • Approach for SD version
      • Testing the extent to which the model applies to SD
      • An "SD METIS" would be a milestone for business-process and architectural maturity
      • Necessary to move ahead according to our corporate objective of increasing standardisation
      • Initial focus on phases 4-7
  • 7. Agenda
    • Background and context
    • Working with business processes
    • An example of documentation
    • Results of process analysis
    • Metadata coverage
    • Lessons learned
  • 8. Model/template for statistical business processes
      • METIS level ("which phases do we open?")
      • Control-flow level (phases, input, output, time)
      • Functional level ("who does what, and in what order?")
      • "AS-IS" and/or "TO-BE"
      • BPMN: Standardized notation
      • Collect ideas and convert them into action (standardisation, efficiency and quality)
      • Form
        • Workshop
        • Facilitated by working group
        • Ownership of results to the statistical team
        • Needs a mandate!
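
    To make the three documentation levels concrete, here is a minimal Python sketch of what such a template could look like as a data structure. All class and field names are hypothetical, and the Retail Trade Index values are illustrative only, not drawn from the actual pilot documentation.

        from __future__ import annotations
        from dataclasses import dataclass, field

        @dataclass
        class FunctionalStep:
            # Functional level: who does what, and in what order.
            order: int
            actor: str
            activity: str

        @dataclass
        class ProcessDocumentation:
            name: str
            gsbpm_phases: list[str]    # METIS level: "which phases do we open?"
            trigger: str               # control-flow level
            inputs: list[str]          # regulations, data, etc.
            outputs: list[str]         # intermediate and final
            timing: str
            steps: list[FunctionalStep] = field(default_factory=list)
            state: str = "AS-IS"       # "AS-IS" or "TO-BE"

        # Illustrative values only.
        doc = ProcessDocumentation(
            name="Retail Trade Index",
            gsbpm_phases=["4 Collect", "5 Process", "6 Analyse", "7 Disseminate"],
            trigger="Start of monthly reference period",
            inputs=["Regulations", "Questionnaire data"],
            outputs=["Validated micro data", "Published index"],
            timing="Monthly",
            steps=[FunctionalStep(1, "Statistical team", "Validate & edit incoming data")],
        )
        print(doc.state, doc.gsbpm_phases)
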
  • 9. Selection of pilot cases
    • Social Statistics:
      • Population register
      • Student register (register updates)
    • Business Statistics
      • General account statistics (SBS)
      • Employment in construction industries
      • Retail Trade Index
      • Industrial commodity statistics
      • Farm Structure Survey
      • Car register and associated statistics
      • Use of ICT in enterprises
    • Economic Statistics
      • Consumer price index
      • Foreign trade in services
    • Sales and Marketing
      • Interview task: Yearly survey on safety
      • Key figures in housing (standardised product from SD's Customer Services Centre)
    • User Services
      • Data collection processes/systems (XIS, CEMOS)
  • 10. Selection of cases in Business Statistics (dimensions, values, and cases)
    • Type: Primary statistic vs. Derived statistic. Cases: ICS, C-Reg
    • Maturity: Well-established statistic in SD vs. New statistic in SD. Cases: SBS, (RII)
    • Stability: Few changes by each iteration vs. Many changes by each iteration. Cases: ECS, UIE
    • Cost: Statistics with high cost vs. Statistics with low cost. Cases: SBS, RTI
    • Confidentiality scheme: Positive confidentiality vs. Negative confidentiality. Cases: SBS, ICS
    • Coverage: Sample vs. Cut-off vs. Population. Cases: ECS, ICS, FSS
    • Method for error detection: Micro-based error detection vs. Macro-based error detection. Cases: SBS, ECS
    • Type of statistical unit: Statistics based on SBR vs. Statistics with other units. Cases: SBS, C-Reg
    • Complexity: Simple vs. Complex. Cases: RTI, SBS
    • Standardised system (if any): Statistics in standardised systems vs. Statistics in stand-alone systems. Cases: ECS, SBS
    • Frequency: Short-term vs. Structural statistics. Cases: ECS, SBS
  • 11. Agenda
    • Background and context
    • Working with business processes
    • An example of documentation
    • Results of process analysis
    • Metadata coverage
    • Lessons learned
  • 12. Example: METIS level
  • 13. Example: Control flow level
    • Trigger
    • Phases
    • Input
      • Regulations
      • Data
      • etc.
    • Output
      • Intermediate
      • Final
    • Time
  • 14. Example: Functional level
    • Who does what
    • Start condition
    • End condition
    • Note that…
  • 15. Agenda
    • Background and context
    • Working with business processes
    • An example of documentation
    • Results of process analysis
    • Metadata coverage
    • Lessons learned
  • 16. Results of process analysis (an overview)
    • Focus on processes is useful and has an immediate effect in some cases
    • Improvements for statistical teams
      • Quality (documentation, new quality measures, etc.)
      • Standardisation (Use of standardised systems)
      • Efficiency (Eliminate manual processes)
    • Improvements in communication
      • Many project managers working on digitalisation
      • Coordinator function
    • Improvements in efficiency for data collection
      • Focus on areas of responsibility
    • Huge difference in degree of standardisation
      • Dissemination
      • Data collection
      • Data processing
  • 17. Agenda
    • Background and context
    • Working with business processes
    • An example of documentation
    • Results of process analysis
    • Metadata coverage
    • Lessons learned
  • 18. Metadata coverage
  • 19. Metadata coverage
    • The dissemination phase is very well covered
    • Although the dissemination phase is covered by four different applications, the overlap is very limited
    • The vision for the future is to create a single metadata system
    • The data model should be based on three data stages (raw data, micro data, macro data)
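
    A minimal Python sketch of the three data stages as they might appear in such a data model; the DataStage enum and the example record are hypothetical, since the slides do not describe the actual design.

        from enum import Enum

        class DataStage(Enum):
            # The three data stages of the envisaged data model.
            RAW = "raw data"
            MICRO = "micro data"
            MACRO = "macro data"

        # A hypothetical metadata record tied to one data stage and one GSBPM sub-process.
        record = {
            "dataset": "Retail Trade Index, reference month",
            "stage": DataStage.MICRO,
            "gsbpm_sub_process": "5.3 Validate & edit",
            "variables": ["turnover", "industry_code"],
        }
        print(record["stage"].value)  # -> "micro data"
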
  • 20. Metadata coverage
  • 21. Agenda
    • Background and context
    • Working with business processes
    • An example of documentation
    • Results of process analysis
    • Metadata coverage
    • Lessons learned
  • 22. Lessons learned
    • Planning a strategy for further development is easier using GSBPM
    • Identify areas of interest for improvement initiatives
    • Major challenges remain in the steps where data is processed
    • Further standardisation of methods is necessary
    • A clearer view of the different needs for metadata and documentation
    • A better overview of the strong and weak areas of our metadata applications