CobiT, Val IT & Balanced Scorecards

In this paper, the development and implementation of an IT BSC within the Information Services Division (ISD) of a Canadian tri-company financial group consisting of Great-West Life, London Life and Investors Group (hereafter named The Group) is described and discussed. We use an IT BSC maturity model (adapted from the capability maturity model developed by the Software Engineering Institute) to determine the maturity level of the IT BSC under review. An important conclusion of the paper is that an IT BSC must go beyond the operational level and must be integrated across the enterprise in order to generate business value. This can be realized through establishing a linkage between the business balanced scorecard and different levels of IT balanced scorecards and through the definition of clear cause-and-effect relationships between outcome measures and performance drivers throughout the whole scorecard.

Balanced Scorecards as a Management Toolkit

  Sourced from ITGI's Introductory CobiT Presentation; ISACA's Implementing CobiT for IT Management & Governance, CobiT IT Governance Implementation, IT Assurance and Val IT guides and frameworks; and rooted in "Linking the IT Balanced Scorecard to the Business Objectives at a Major Canadian Financial Group" by:
  • Wim Van Grembergen - University of Antwerp (UFSIA)
  • Ronald Saull - Information Services Division of Great-West Life, London Life and Investors Group
  • Steven De Haes - University of Antwerp Management School (UAMS)

  With thanks to R. Basham and M. Impey, there for me from the start.

What Are Balanced Scorecards?

  • The BSC concept recognizes that, over the last decades, the means of value creation has shifted from tangible to intangible assets, and intangible assets generally are not measurable through traditional financial means. In the intangible economy, intangible assets require several interdependent ingredients for value creation: strategic investments such as leadership, process, IT, climate and skills must be bundled together.
  • The BSC was created to provide an understanding of how intangible assets are used to create value. It measures performance from three leading perspectives - customers, internal processes, and learning & growth - in addition to the traditional financial measures that describe value creation after it happens.
  • It was introduced by Kaplan and Norton in 1992, in response to their belief that the execution of strategy is the corporate challenge of our time.

IT: Service Provider or Strategic Partner?

  As Is                                     To Be
  ----------------------------------------  ----------------------------------------
  Service provider                          Strategic partner
  IT for efficiency                         IT for business growth
  Budgets driven by external benchmarks     Budgets driven by business strategy
  IT separable from the business            IT inseparable from the business
  IT seen as an expense to control          IT seen as an investment to manage
  IT managers as technical experts          IT managers as business problem solvers

Scorecard Perspectives & Missions

  Customer Orientation
    Perspective question: How should IT appear to business unit executives to be considered effective in delivering services?
    Mission: Supplier of choice for all information services, directly and indirectly through supplier relations.

  Corporate Contribution
    Perspective question: How should IT appear to company executives and its corporate functions to be considered a significant contributor to company success?
    Mission: Business objectives enabler and contributor via effective delivery of value-added information services.

  Operational Excellence
    Perspective question: Which services and processes must IT excel at to satisfy stakeholders and clients?
    Mission: Deliver timely and effective IT services at targeted service levels and costs.

  Future Orientation
    Perspective question: How will IT develop the ability to deliver services effectively and to enhance performance continuously?
    Mission: Continuous performance improvement through innovation, learning and growth.

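The four perspectives above lend themselves to a simple data model. Below is a minimal, illustrative Python sketch, not taken from the deck; the class name, field names and the abbreviated wording are assumptions:

    # Minimal sketch of the four-perspective IT BSC described above.
    # Class and field names are illustrative assumptions, not CobiT artifacts.
    from dataclasses import dataclass, field

    @dataclass
    class Perspective:
        name: str
        question: str                                # the "perspective question"
        mission: str
        metrics: dict = field(default_factory=dict)  # metric name -> value

    it_bsc = [
        Perspective("Customer Orientation",
                    "How should IT appear to business unit executives?",
                    "Supplier of choice for all information services"),
        Perspective("Corporate Contribution",
                    "How should IT appear to company executives?",
                    "Business objectives enabler via value-added services"),
        Perspective("Operational Excellence",
                    "Which services and processes must IT excel at?",
                    "Timely, effective IT services at targeted levels and costs"),
        Perspective("Future Orientation",
                    "How will IT keep improving delivery and performance?",
                    "Continuous improvement through innovation, learning and growth"),
    ]

Each perspective then carries its own metrics, which is what the scorecard cascade on the following slides exploits.
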
IT Strategic Scorecard Universe
  [Figure]

Scorecard Cascade
  [Figure]

Sample IT Service Desk Metrics Cascade

  • A unit scorecard, situated in the operational services scorecard group, tracks metrics such as:
      - average speed of answer,
      - overall resolution rate at initial call, and
      - call abandonment rate (all three customer orientation metrics).
    These are rolled up into service level performance metrics in the IT strategic balanced scorecard.
  • Other metrics of this unit scorecard, such as:
      - expense management (corporate contribution perspective),
      - client satisfaction (customer orientation perspective),
      - process maturity of incident management (operational excellence perspective), and
      - staff turnover (future orientation perspective),
    aggregate as part of the IT strategic scorecard.

  The overall view of the IT strategic balanced scorecard is then fed into, and evaluated against, the defined business objectives. A sketch of this roll-up follows.

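The cascade just described is, at its core, a weighted roll-up of unit-level metrics into a strategic-level measure. A minimal Python sketch; the metric names, normalized scores and weights are invented for illustration, not the Group's figures:

    # Illustrative roll-up of service desk unit metrics into one strategic
    # service level score. Scores are pre-normalized to 0-100 (higher is
    # better); names and weights are assumptions.
    unit_scores = {
        "average_speed_of_answer": 82.0,
        "resolution_rate_at_initial_call": 74.0,
        "call_abandonment_rate": 91.0,
    }
    weights = {
        "average_speed_of_answer": 0.3,
        "resolution_rate_at_initial_call": 0.5,
        "call_abandonment_rate": 0.2,
    }

    service_level_score = sum(unit_scores[m] * weights[m] for m in unit_scores)
    print(f"Service level performance (strategic scorecard): {service_level_score:.1f}")
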
Cause and Effect Scorecard Relationships
  [Figure]

Generic SLM Cause & Effect Scorecard
  [Figure]

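Behind both figures sits the same idea: performance drivers (lead indicators) are linked to the outcome measures (lag indicators) they are expected to move, across perspectives and scorecard levels. A minimal sketch of such a chain; every link here is an invented example, not a relationship asserted by the deck:

    # Hypothetical cause-and-effect chain: driver -> outcome it should move.
    cause_effect = {
        "staff_training_days": "resolution_rate_at_initial_call",  # future orientation -> operational excellence
        "resolution_rate_at_initial_call": "client_satisfaction",  # operational excellence -> customer orientation
        "client_satisfaction": "business_value_of_it",             # customer orientation -> corporate contribution
    }

    def outcomes_of(driver: str) -> list:
        """Walk from a performance driver to its downstream outcome measures."""
        chain = []
        while driver in cause_effect:
            driver = cause_effect[driver]
            chain.append(driver)
        return chain

    print(outcomes_of("staff_training_days"))
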
CobiT Alongside Other Management Frameworks

  Organisations will consider and use a variety of IT models, standards and best practices. These must be understood in order to consider how they can be used together, with CobiT acting as the consolidator ('umbrella').

  [Figure: CobiT as an umbrella over ISO 9000, ISO 17799, ITIL and COSO, positioned by WHAT vs. HOW and by scope of coverage]

IT Governance, COSO and CobiT Focus Areas

  [Figure: drivers flowing from enterprise governance and IT governance down through best practice standards (e.g., ITIL) to QA procedures, and processes and procedures]

Where Do BSCs Fit in Governance?

  [Figure: performance drivers (business goals) and conformance drivers (Basel II, the Sarbanes-Oxley Act, etc.) flowing through enterprise and IT governance; CobiT, COSO and security principles at the governance layer; ISO 9001:2000, ISO 17799, ISO 20000, ITIL and the Balanced Scorecard among the best practice standards, QA procedures, and processes and procedures]

Tracks to a Balanced Scorecard

  Working closely with Val IT, CobiT proposes two methods:
  • A generic business goals, to IT goals, to IT processes cascade (see CobiT 4.1, Appendix 1, pages 169 onward)
  • The input/output tables found in each of the 34 CobiT 4.1 processes

  Both result in the set of metric indicators that are the object of this presentation.

Track 1 Example: Business Goals to IT Goals to IT Processes

  AI7 Install and Accredit Solutions and Changes (Appendix 1, p. 169)
  [Figure]

Mapping to CobiT 4.1, from Appendix 1, p. 169

  AI7 Install and Accredit Solutions and Changes
  [Figure]

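Track 1 is essentially a lookup exercise: a business goal resolves to IT goals, which resolve to CobiT processes. A minimal sketch of that cascade as data; the sample entries are placeholders, and the authoritative tables are those in CobiT 4.1, Appendix 1:

    # Illustrative Track 1 cascade. The entries are placeholders; the real
    # business goal -> IT goal -> process tables live in CobiT 4.1, Appendix 1.
    business_to_it_goals = {
        "Improve customer orientation and service":
            ["Ensure IT services are available as required"],
    }
    it_goals_to_processes = {
        "Ensure IT services are available as required": ["AI7", "DS8", "DS13"],
    }

    def processes_for(business_goal: str) -> set:
        procs = set()
        for it_goal in business_to_it_goals.get(business_goal, []):
            procs.update(it_goals_to_processes.get(it_goal, []))
        return procs

    print(sorted(processes_for("Improve customer orientation and service")))
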
Track 2 Example: CobiT 4.0 Compared to CobiT 4.1, as Seen from AI7 Install and Accredit Solutions and Changes

  CobiT 4.1 replaces KGIs and KPIs with:
  • Outcome measures, or lag indicators (the former KGIs): have the goals / results been met?
  • Performance indicators, or lead indicators (the former KPIs): will the goals be met?

  CobiT 4.0 proposes key indicators for:
  • IT goals
  • Process goals
  • Performance

Process Input/Output and Goals & Metrics Tables
  [Figure]

The RACI Chart: CobiT's Who Does What

  As applied to AI7 Install and Accredit Solutions and Changes. In this instance, the Head of Development is both Accountable and Responsible for system conversion and integration tests in the test environment. A sketch follows.

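A RACI chart is a role-by-activity matrix, so it can be queried mechanically. A minimal sketch; the activity wording and the supporting roles are abbreviated illustrations of the AI7 chart, not a reproduction of it:

    # Illustrative RACI lookup. R = Responsible, A = Accountable,
    # C = Consulted, I = Informed. Entries are abbreviated examples.
    ACTIVITY = "Perform system conversion and integration tests"
    raci = {
        (ACTIVITY, "Head of Development"): "A/R",
        (ACTIVITY, "Head of IT Operations"): "C",
        (ACTIVITY, "Business Executive"): "I",
    }

    def accountable_for(activity: str) -> list:
        return [role for (act, role), codes in raci.items()
                if act == activity and "A" in codes]

    print(accountable_for(ACTIVITY))  # ['Head of Development']
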
Metrics Driving AI7 Install and Accredit Solutions and Changes

  IT metrics (lead indicators for business objectives / lag indicators for process metrics):
  • Percent of stakeholders satisfied with data integrity of new systems
  • Percent of systems that met expected benefits, as measured by the post-implementation process

  Process metrics (lead indicators for IT metrics / lag indicators for activity metrics):
  • Number of errors found during internal audit of the installation and accreditation process
  • Amount of rework after implementation due to inadequate acceptance testing
  • Number of service desk calls from users due to inadequate training
  • Application downtime or data fixes caused by inadequate testing

  Activity metrics (lead indicators for process metrics):
  • Degree of stakeholder involvement in the installation and accreditation process
  • Percent of projects with a documented and approved testing plan
  • Number of lessons learnt from post-implementation review
  • Percent of errors found during QA review of installation and accreditation functions
  • Number of changes without required management sign-off before implementation

  A sketch of the chain follows.

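These three tiers chain together: activity metrics lead process metrics, process metrics lead IT metrics, and IT metrics lead business objectives. A minimal sketch of checking one such chain against targets; the values, targets and the per-metric direction flags are assumptions:

    # Illustrative three-tier metric chain for AI7, each tier a lead
    # indicator for the tier above it. Values and targets are invented,
    # and each metric carries a flag for whether higher is better.
    tiers = {
        "activity": [("pct_projects_with_approved_test_plan", 88.0, 95.0, True)],
        "process":  [("service_desk_calls_due_to_poor_training", 40, 25, False)],
        "it":       [("pct_stakeholders_satisfied_with_data_integrity", 79.0, 90.0, True)],
    }

    for tier, metrics in tiers.items():
        for name, actual, target, higher_is_better in metrics:
            met = actual >= target if higher_is_better else actual <= target
            print(f"{tier:>8} | {name}: {'on' if met else 'off'} target")
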
CobiT 4.0: Sample Interlinked Key Goal Statements, to Be Translated into Measurable Metrics

  AI6 Manage Changes - change standards and procedures:
  • An agreed-upon and standardized approach for managing changes in an efficient and effective manner
  • Changes reviewed and approved in a consistent and coordinated way
  • Formally defined expectations and performance measurement

  DS1 Service Level Management framework:
  • Clarified IT service responsibilities and IT objectives aligned with business objectives
  • Improved communication and understanding between business customers and IT service providers
  • Consistency promoted in service levels, service definitions, and service delivery and support

  DS8 Service Desk - reporting and trend analysis:
  • Decreased service downtime
  • Increased customer satisfaction
  • Confidence in the offered services
  • Help desk performance measured and optimized

  DS13 Manage Operations - operations procedures and instructions:
  • Demonstration that IT operations are meeting SLAs
  • Promotion of continuity of operational support by documenting staff experience and retaining it in a knowledge base
  • Structured, standardized and clearly documented IT operations procedures and support staff instructions
  • Reduced time to transfer knowledge between skilled operations support staff and new recruits

CobiT 4.0: Sample Interlinked Key Performance Indicators

  AI6 Manage Changes:
  • Percent of changes recorded and tracked with automated tools
  • Percent of changes that follow formal change control processes
  • Ratio of accepted to refused change requests
  • Number of different versions of each business application or infrastructure being maintained
  • Number and type of emergency changes to the infrastructure components
  • Number and type of patches to the infrastructure components

  DS1 Service Level & Operational Level Agreements:
  • Number of formal SLA review meetings with the business per year
  • Percent of service levels reported (automated or not)
  • Number of elapsed working days to adjust a service level after agreement with the customer

  DS8 Service Desk:
  • Percent of incidents and service requests reported and logged using automated tools
  • Number of days of training per service desk staff member per year
  • Number of calls handled per service desk staff member per hour
  • Percent of incidents that require local support (field support, personal visit)
  • Number of unresolved queries

  DS13 Manage Operations:
  • Number of training days per operations staff member per year
  • Percent of hardware assets included in preventive maintenance schedules
  • Percent of work schedules that are automated
  • Frequency of updates to operational procedures

  An example of how such an indicator is computed follows this list.

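Most of these indicators reduce to counting events in operational records. As a minimal sketch, here is the AI6 indicator "percent of changes that follow formal change control processes" computed over a hypothetical change log; the log structure is an assumption:

    # Illustrative KPI computation over a hypothetical change log.
    changes = [
        {"id": "CHG-001", "formal_control": True},
        {"id": "CHG-002", "formal_control": True},
        {"id": "CHG-003", "formal_control": False},  # e.g., an emergency change
    ]

    pct_formal = 100 * sum(c["formal_control"] for c in changes) / len(changes)
    print(f"Percent of changes following formal change control: {pct_formal:.0f}%")
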
CobiT 4.0: Sample Interlinked Activity-Based Critical Success Factors

  AI6 Manage Changes:
  • Defining and communicating change procedures, including emergency changes and patches
  • Assessing, prioritizing and authorizing changes
  • Scheduling changes
  • Tracking status and reporting on changes

  DS1 Service Level & Operational Level Agreements:
  • Defining services
  • Formalizing internal and external agreements in line with requirements and delivery capabilities
  • Reporting on service level achievements (reports and meetings)
  • Ensuring that reports are tailored to the recipient audience
  • Feeding back new and updated services

  DS8 Service Desk:
  • Installing and operating a service desk
  • Monitoring and reporting trends
  • Aligning incident resolution priorities with business imperatives
  • Defining clear escalation criteria and procedures

  DS13 Manage Operations:
  • Operating the IT environment in line with agreed-upon service levels, with defined instructions and close supervision
  • Preventive maintenance and monitoring of the IT infrastructure

CobiT 4.0: Sample Interlinked Monitor & Evaluate CSFs, KGIs & KPIs

  ME1 Monitor and Evaluate IT Performance - critical success factors:
  • Capturing, collating and translating process performance reports into management reports
  • Reviewing performance against agreed-upon targets and initiating necessary remedial action

  ME1 Performance Assessment:
  • Periodically review performance against targets, analyze the cause of any deviations, and initiate remedial action to address the underlying causes. At appropriate times, perform root cause analysis across deviations.

  ME1 Monitor and Evaluate IT Performance - metrics:
  • Time lag between the reporting of a deficiency and the initiation of action
  • Delay in updating measurements to reflect actual performance objectives, measures, targets and benchmarks
  • Number of metrics (per process)
  • Number of cause-and-effect relationships identified and incorporated in monitoring
  • Amount of effort required to gather measurement data
  • Number of problems not identified by the measurement process
  • Percent of metrics that can be benchmarked to industry standards and set targets

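In practice, ME1 comes down to comparing actuals against agreed targets and initiating remedial action on deviation. A minimal sketch; the metrics, values and the 5% tolerance are assumptions:

    # Illustrative ME1-style check: flag metrics deviating from target by
    # more than a tolerance as candidates for root cause analysis.
    TOLERANCE = 0.05  # 5% permitted relative deviation (an assumption)

    targets = {"sla_attainment": 0.98, "incident_reopen_rate": 0.05}
    actuals = {"sla_attainment": 0.91, "incident_reopen_rate": 0.05}

    for metric, target in targets.items():
        deviation = abs(actuals[metric] - target) / target
        if deviation > TOLERANCE:
            print(f"{metric}: deviation {deviation:.0%} - initiate remedial action")
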
Maturity Model: From As-Is to To-Be

  A 'rising star chart' for documenting the as-is and to-be positions.
  [Figure]

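The rising star chart plots two points per process on the CobiT 0-5 maturity scale: where the process is and where it should be. A minimal sketch of the underlying data; the processes and levels are invented examples:

    # Illustrative as-is / to-be maturity positions (CobiT 0-5 scale).
    maturity = {
        "AI6 Manage Changes": {"as_is": 2, "to_be": 4},
        "DS1 Service Levels": {"as_is": 3, "to_be": 4},
        "DS8 Service Desk":   {"as_is": 2, "to_be": 3},
    }

    for process, pos in maturity.items():
        print(f"{process}: close a gap of {pos['to_be'] - pos['as_is']} level(s)")
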
Scorecard Template
  [Figure]

Opportunity Template

  Fields to complete per opportunity:
  • Management area affected
  • Detailed description
  • Quantifiable benefit
  • Strategic benefit
  • Risks of implementing
  • Risk of not implementing
  • Groups impacted

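The template's fields map directly onto a simple record, which makes opportunities easy to collect and compare. A minimal sketch; the field types are assumptions:

    # Illustrative record mirroring the opportunity template's fields.
    from dataclasses import dataclass, field

    @dataclass
    class Opportunity:
        management_area_affected: str
        detailed_description: str
        quantifiable_benefit: str
        strategic_benefit: str
        risks_of_implementing: str
        risk_of_not_implementing: str
        groups_impacted: list = field(default_factory=list)
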
Val IT Framework & Its Toolset
  [Figure]

Val IT Framework and Tool Set

  Val IT focuses on the financial side of IT investments. Key terms are:
  • Value: financial, non-financial or both
  • Portfolio: a group of managed and monitored programs, projects and services
  • Investment program: the primary unit of investment; a structured group of interdependent projects
  • Project: a structured set of activities delivering a defined capability
  • Implement: includes the full economic life cycle of the investment
  • Processes: Value Governance (VG), Portfolio Management (PM) and Investment Management (IM)

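These terms nest: a portfolio contains investment programs, and a program contains projects. A minimal sketch of that containment; the names are invented, and the comments note which Val IT process operates at each level:

    # Illustrative Val IT containment: portfolio -> program -> project.
    portfolio = {
        "name": "IT investment portfolio",        # Portfolio Management (PM)
        "programs": [
            {
                "name": "Customer self-service",  # Investment Management (IM)
                "projects": ["Web portal build", "CRM integration"],
            },
        ],
    }                                             # Value Governance (VG) spans all levels

    for program in portfolio["programs"]:
        print(program["name"], "->", ", ".join(program["projects"]))
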
Val IT Case Processes & Practices
  [Figure]

Val IT & CobiT
  [Figure]

Val IT Case Fact Sheet Template
  [Figure]

Val IT Metrics Template
  [Figure]

Val IT: Assessing Risk
  [Figure]

Any Questions?
