Webinar: Metrics Management - Be Careful What You Wish For!

Metric management seems to be one of the hottest topics in project management today. Unfortunately, rushing into metrics management without understanding what can go wrong can lead to detrimental results.

Speaker notes:
  • If you still have questions, now would be a good time to ask them.
  • We really appreciate your attendance and participation in this course.
    If you found this to be a valuable experience, please recommend the course to your friends and coworkers!
    The instructor will now solicit your feedback by handing out a course evaluation sheet.
    Please complete the evaluation and place it on the instructor’s table as you leave the classroom.
    If you are taking this class in IIL’s virtual eLearning environment, complete the evaluation form online.

    1. Project Management Metrics, Key Performance Indicators and Dashboards, by Harold Kerzner, Ph.D. (all slides © 2013 International Institute for Learning, Inc.)
    2. Copyright: Material in this presentation has been taken from the following book: Harold Kerzner, Project Management Metrics, KPIs, and Dashboards, John Wiley & Sons and IIL co-publishers, 2011.
    3. Part 1: The Driving Forces for Better Metrics
    4. Types of Metrics and Their Intent
       • Traditional metrics: primarily focus on where we are today
       • Key performance indicators: extrapolate the present into the future to tell us where we will end up
       • Value-reflective metrics (also called value-based metrics): a combination of metrics and KPIs that tell us the growth of value as the project progresses
    5. The Importance of Value. Timeline of initiatives: 1980s, TQM and quality conformance; 1990s, customer relations management (CRM); 2000s, customers' perception of quality and value; 2010s, customer value management programs. The emphasis shifts from traditional to value-based metrics.
    6. The APQC Study. According to a study ("Customer Value Management: Gaining Strategic Advantage", The American Productivity and Quality Center [APQC], © 1998, p. 8): "Although customer satisfaction is still measured and used in decision-making, the majority of partner organizations [used in this study] have shifted their focus from customer satisfaction to customer value."
    7. Part 2: Understanding Metrics
    8. A New Job Responsibility. Part of the project manager's new role is to understand which key metrics need to be identified and managed for the project to be viewed as a success by all of the stakeholders.
    9. New Developments in Project Management: new success criteria, dashboard design, governance, measurement, metrics and KPIs.
    10. Comparing Business and Project Metrics
       • Focus: financial measurement (business/financial) versus project performance (project)
       • Intent: meeting strategic goals versus meeting project objectives, milestones and deliverables
       • Reporting: monthly or quarterly versus real-time data
       • Items to be looked at: profitability, market share, repeat business, number of new customers, etc., versus adherence to competing constraints, validation and verification of performance
       • Length of use: decades or even longer versus the life of the project
       • Use of the data: information flow and changes to the strategy versus corrective action to maintain baselines
       • Target audience: executive management versus stakeholders and working levels
    11. Success Through Competing Constraints. Previously, success was measured against time, cost and scope; today the competing constraints also include quality, risk, image/reputation and value.
    12. Disney's Prioritization of Constraints: safety, aesthetic value, quality, time, cost, scope.
    13. The PMBOK® Guide and Metrics (overview figure)
    14. The PMBOK® Guide and Metrics: Scope Management
    15. The PMBOK® Guide and Metrics. Metrics can be defined for each knowledge area: integration, scope, time, cost, quality, human resource, communications, risk, procurement and stakeholder management.
    16. Types of Metrics (in Each Category)
       • Quantitative indicators (planning dollars or hours as a percentage of total labor)
       • Practical indicators (improved efficiencies)
       • Directional indicators (risk ratings getting better or worse)
       • Actionable indicators (indicators we can act on to effect change, such as the number of unstaffed hours)
       • Financial indicators (profit margins, ROI, etc.)
       • Milestone indicators (number of work packages on time)
       • End result or success indicators (customer satisfaction)
    17. The PMBOK® Guide and Metrics. Metrics can also extend beyond the knowledge areas to politics, culture and religion, business and strategy, and project value management.
    18. Issue: How do we establish a set of core metrics given the possible uniqueness of each project?
    19. Project Management Health Metrics: time, cost, resources, scope, quality, actions.
    20. Additional Core Metrics
       • Deliverables (in progress): late versus on time
       • Deliverables (completed): accepted versus rejected
       • Management reserve: amount available versus used
       • Risks: number of risks in each core metric category
       • Action items: number of action items in each core category
       • Action items aging: how many action items are over 1 month, 2 months, or 3 or more months late
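Slide 20's action-item aging metric is a simple bucketing exercise. The sketch below is a minimal illustration, not part of the original presentation; the data layout and function names are hypothetical.

```python
from datetime import date

def age_in_months(due: date, today: date) -> int:
    """Whole months an action item is past its due date (0 if not late)."""
    months = (today.year - due.year) * 12 + (today.month - due.month)
    return max(months, 0)

def aging_buckets(due_dates, today):
    """Count late action items in the age buckets used on slide 20."""
    buckets = {"1 month": 0, "2 months": 0, "3+ months": 0}
    for due in due_dates:
        months = age_in_months(due, today)
        if months == 1:
            buckets["1 month"] += 1
        elif months == 2:
            buckets["2 months"] += 1
        elif months >= 3:
            buckets["3+ months"] += 1
    return buckets

# Hypothetical example: three overdue items as of 1 June
print(aging_buckets([date(2013, 5, 1), date(2013, 3, 15), date(2013, 1, 2)],
                    today=date(2013, 6, 1)))
```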
    21. Metric Requirements. Metrics require:
       • A need or purpose
       • A target
       • A means of measurement
       • A reporting structure
    22. Typical Questions Concerning Metrics
       Measurements:
       • What should be measured?
       • When should it be measured?
       • How should it be measured?
       • Who will perform the measurement?
       Collecting information and reporting:
       • Who will collect the information?
       • When will the information be collected?
       • When and how will the information be reported?
       • How will the stakeholders react to the information?
    23. Issue: How do we know the correct number of metrics to use?
    24. How Many Metrics Are Needed?
       Too many:
       • Metric management steals time from other work
       • Too much information is provided, so stakeholders cannot determine what is critical
       • Information with limited or no value is provided
       Too few:
       • Falling short of the right information
       • Inability to make informed decisions
    25. Metrics Support Information Systems. There can be three information systems on a project: one for the project manager, one for the project manager's parent company, and one for the stakeholders and the client. There can be a different set of metrics and KPIs in each of these information systems.
    26. Part 3: Understanding Key Performance Indicators
    27. Understanding the KPI. Although most companies use metrics for measurement, they seem to have a poor understanding of what constitutes a KPI, especially for projects.
    28. Metrics Versus KPIs. Metrics generally focus on the accomplishment of performance objectives; KPIs focus on future outcomes.
    29. Dissecting the KPI
       • Key = a major contributor to success or failure
       • Performance = measurable, quantifiable, adjustable and controllable elements
       • Indicator = a reasonable representation of present and future performance
    30. Another Way to Define a KPI. KPIs can be selected using the following criteria:
       • Predictive: able to predict the future of this trend
       • Measurable: can be expressed quantitatively
       • Actionable: triggers changes that may be necessary
       • Relevant: the KPI is directly related to the success or failure of the project
       • Automated: reporting minimizes the chance of human error
       • Few in number: only what is necessary
    31. Part 4: Graphical Representation of Metrics and KPIs for Use on Dashboards
    32. Mahindra Satyam's Customer Delight Index (CDI). A monthly (Jan through Apr) symbol chart in which each symbol means data not entered, dissatisfied, satisfied or delighted. (Adapted from Mahindra Satyam's Customer Delight Index [CDI], © 2010 Mahindra Satyam. All Rights Reserved.)
    33. Metric: Best Practices Promised. Chart of the number of best practices used, to be used, and not to be used.
    34. Metric: Cumulative Month-End CPI and SPI Data. Charts of average CPI and average SPI on a 0 to 1.5 scale.
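For readers who want the arithmetic behind slide 34, this is a minimal sketch (not from the presentation) of how cumulative month-end CPI and SPI are conventionally derived from earned value, actual cost and planned value; the monthly figures are invented.

```python
def cumulative_cpi_spi(monthly_ev, monthly_ac, monthly_pv):
    """Return month-end cumulative (CPI, SPI) pairs, with CPI = EV/AC and SPI = EV/PV."""
    ev = ac = pv = 0.0
    results = []
    for e, a, p in zip(monthly_ev, monthly_ac, monthly_pv):
        ev, ac, pv = ev + e, ac + a, pv + p
        results.append((round(ev / ac, 2), round(ev / pv, 2)))
    return results

# Invented monthly earned value (EV), actual cost (AC) and planned value (PV)
print(cumulative_cpi_spi([100, 120, 110], [110, 115, 130], [100, 130, 120]))
# [(0.91, 1.0), (0.98, 0.96), (0.93, 0.94)]
```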
    35. Metric: Management Reserve. Chart of the management reserve (× $1,000) remaining over time.
    36. Metric: Assigned Versus Planned Resources. Chart of the number of people assigned versus planned.
    37. Metric: Quality of Assigned Labor. Chart of the number of people by labor grade.
    38. Metric: Total Project Manpower. Chart of the number of employees per month.
    39. Metric: Regular Time, Overtime and Unstaffed Hours. Chart of hours.
    40. Metric: Deliverables On Time or Late. Chart of the number of deliverables.
    41. Metric: Deliverables Accepted or Rejected. Chart of the number of deliverables.
    42. Metric: Number of Constraints. Chart of the number of constraints over time.
    43. Metric: Number of Critical Assumptions and Those That Have Been Revised or Added. Chart of the number of critical assumptions; the accompanying table records new assumptions by month (Jan 0, Feb 1, Mar 2, Apr 5) along with those revised.
    44. Metric: Scope Changes Approved, Denied and Pending. Chart of the number of changes.
    45. Metric: Number of Baseline Revisions. Chart of the number of revisions.
    46. Metric: Open Action Items. Chart of the number of action items.
    47. Metric: Project Complexity (Risk) Factor. April's results: technical 3, business 2, delivery 2 (legend: 5 = very high, 4 = high, 3 = moderate, 2 = low, 1 = very low).
    48. Metric Library: Project Complexity (Risk) Factor
       • Description: shows changes in project complexity over time
       • Metric owner: Ellen Stanford
       • Advantages: directly related to downstream risks
       • Disadvantages: highly subjective
       • Metric or KPI: metric
       • Value attribute: not applicable
       • Type of image: stacked column
       • Measurement: human judgment
       • PMBOK® area of knowledge: risk management
       • PMBOK® domain: execution
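Slide 48 describes the complexity factor as a stacked column built from human judgment. One plausible reading is that the component ratings simply stack into an overall factor; the sketch below assumes that interpretation, and the function and data names are illustrative only.

```python
RATING_LEGEND = {5: "Very High", 4: "High", 3: "Moderate", 2: "Low", 1: "Very Low"}

def complexity_factor(ratings: dict) -> int:
    """Stack the 1-5 judgment ratings into a single complexity (risk) factor (assumed roll-up)."""
    for area, score in ratings.items():
        if score not in RATING_LEGEND:
            raise ValueError(f"{area}: rating must be between 1 and 5")
    return sum(ratings.values())

# April's results from slide 47: technical 3, business 2, delivery 2, stacking to 7
print(complexity_factor({"Technical": 3, "Business": 2, "Delivery": 2}))
```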
    49. Part 5: Understanding Targets and Measurements
    50. General Rule: If it cannot be measured, then it cannot be managed.
    51. Types of Targets. Typical targets for a KPI might be:
       • Simple quantitative targets
       • Time-based targets: measured monthly or during a certain time interval
       • At-completion targets: measured at work package or project completion
       • Stretch targets: exceeding customer requirements or becoming best in class
       • Visionary targets: well into the future, such as more repeat business from this client
    52. Metric Measurement Intervals
       • Full project duration metric: measurement exists for the duration of the project and may be real time, weekly or monthly (e.g., CV and SV)
       • Life-cycle phase metric: measurement exists for a particular life-cycle phase (e.g., percent of labor dollars spent on project planning)
       • Limited life metric: measurement exists for the life of a work package or element of work (e.g., February's deliverables or manpower per month)
       • Rolling wave or moving window metric: measurement exists for a flexible start and end date (e.g., SPI and CPI using the six most recent monthly data points)
       • Alert metric: measurement exists for the duration of an out-of-tolerance condition but may reappear
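To make the rolling wave or moving window interval concrete, here is a small sketch, assumed rather than taken from the book, that averages a metric such as CPI or SPI over the six most recent monthly data points.

```python
def moving_window_average(monthly_values, window=6):
    """Average each month's metric value over the most recent `window` months."""
    averages = []
    for i in range(len(monthly_values)):
        recent = monthly_values[max(0, i - window + 1): i + 1]
        averages.append(round(sum(recent) / len(recent), 2))
    return averages

# Invented monthly CPI readings; each point averages up to the last six months
print(moving_window_average([1.00, 0.95, 0.92, 0.97, 1.01, 0.94, 0.90, 0.88]))
```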
    53. Part 6: The Understanding of Value
    54. Metric Measurement Complexity
       • Profitability: easy
       • Customer satisfaction: hard
       • Goodwill: hard
       • Penetrate new markets: easy
       • Develop new technology: medium
       • Technology transfer: medium
       • Reputation: hard
       • Stabilize work force: easy
       • Efficiency, effectiveness and productivity: hard
       • Utilize unused capacity: easy
    55. Some Value Measurement Techniques
       • Observations
       • Ordinal (e.g., four or five stars) and nominal (e.g., male or female) data tables
       • Ranges/sets of value
       • Simulation
       • Statistics
       • Calibration estimates and confidence limits
       • Decision models (EV, EVPI, etc.)
       • Sampling techniques
       • Decomposition techniques
       • Human judgment
    56. Part 7: Understanding Value-Based or Value-Reflective Metrics
    57. The Metric/KPI Target Boundary Box. Performance against the target is graded into bands:
       • Above target + 20%: superior (very favorably exceeding target)
       • Target + 10% to target + 20%: good (exceeding target)
       • Within 10% of the target: normal (performance target)
       • Target - 10% to target - 20%: caution (unfavorable expectation)
       • Below target - 20%: urgent attention (risk of project failure)
    58. Value Points for the Target Boundary Box. Each performance characteristic is assigned value points: superior = 4, good = 3, normal = 2, caution = 1, urgent attention = 0.
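Slides 57 and 58 together define a mapping from performance against target to value points. The sketch below encodes one reasonable reading of those 10% and 20% bands for a metric where higher is better; the band edges and point values mirror the slides, but the function itself is only an illustration.

```python
def value_points(actual: float, target: float) -> int:
    """Map performance versus target to the 0-4 value points of slide 58 (higher is better)."""
    ratio = actual / target
    if ratio >= 1.20:
        return 4   # Superior: very favorably exceeding target
    if ratio >= 1.10:
        return 3   # Good: exceeding target
    if ratio >= 0.90:
        return 2   # Normal: around the performance target
    if ratio >= 0.80:
        return 1   # Caution: unfavorable expectation
    return 0       # Urgent attention: risk of project failure

print(value_points(actual=115, target=100))  # 3, i.e. exceeding target
```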
    59. Value Metric Measurement (weighting factor, value measurement, value contribution per component)
       • Quality: 10%, 3, 0.3
       • Cost: 20%, 2, 0.4
       • Safety: 20%, 4, 0.8
       • Features: 30%, 2, 0.6
       • Schedule: 20%, 3, 0.6
       • Total = 2.7
    60. Value Metric Measurement (weighting factor, value measurement, value contribution per component)
       • Quality: 10%, 3, 0.3
       • Cost: 20%, 2, 0.4
       • Safety: 20%, 4, 0.8
       • Features: 30%, 1, 0.3
       • Schedule: 20%, 3, 0.6
       • Total = 2.4
    61. Value Metric Measurement (weighting factor, value measurement, value contribution per component)
       • Quality: 10%, 3, 0.3
       • Cost: 20%, 1, 0.2
       • Safety: 20%, 4, 0.8
       • Features: 30%, 4, 1.2
       • Schedule: 20%, 1, 0.2
       • Total = 2.7
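The totals on slides 59 through 61 are weighted sums of the per-component value measurements. A minimal sketch of that calculation, reproducing slide 59's total of 2.7, follows; the function and variable names are mine, not the book's.

```python
def value_metric(weights: dict, measurements: dict) -> float:
    """Weighted sum of component value measurements, as in slides 59 to 61."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should total 100%"
    return round(sum(weights[c] * measurements[c] for c in weights), 2)

weights = {"Quality": 0.10, "Cost": 0.20, "Safety": 0.20,
           "Features": 0.30, "Schedule": 0.20}
april = {"Quality": 3, "Cost": 2, "Safety": 4, "Features": 2, "Schedule": 3}
print(value_metric(weights, april))  # 2.7, matching slide 59
```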
    62. Changing the Weighting Factors (normal; with a significant schedule slippage; with a significant cost overrun)
       • Quality: 10%; 10%; 10%
       • Cost: 20%; 20%; 40%
       • Safety: 20%; 10%; 10%
       • Features: 30%; 20%; 20%
       • Schedule: 20%; 40%; 20%
    63. Weighting Factor Ranges (minimum; maximum; nominal)
       • Quality: 10%; 40%; 20%
       • Cost: 10%; 50%; 20%
       • Safety: 10%; 40%; 20%
       • Features: 20%; 40%; 30%
       • Schedule: 10%; 50%; 20%
    64. Identifying Measurement Techniques (weighting factor; measurement technique; value measurement; value contribution)
       • Quality: 10%; sampling techniques; 3; 0.3
       • Cost: 20%; direct measurement; 2; 0.4
       • Safety: 20%; simulation; 4; 0.8
       • Features: 30%; observation; 2; 0.6
       • Schedule: 20%; direct measurement; 3; 0.6
    65. Metric: Project Value Attributes. Chart of the size of the value metric, Jan through Apr. April measurements: schedule 4, features 3, safety 3, cost 2, quality 4 (rating legend: 4 = superior, 3 = good, 2 = normal, 1 = caution, 0 = attention).
    66. Value Attributes Important to Clients (value metric attribute: possible competitive advantage)
       • Deliverables produced: efficiency
       • Product functionality: innovation
       • Product functionality: product differentiation
       • Support response time: service differentiation
       • Staffing and employee pay grades: people differentiation
       • Quality: quality differentiation
       • Action items in the system and how long they remain open: speed of problem resolution and decision making
       • Cycle time: speed to market
       • Failure rates: quality differentiation and innovation differentiation
    67. Part 8: Understanding Dashboards
    68. Purpose of a Dashboard. The purpose of a dashboard is to convert raw data into meaningful information that can be easily understood and used for informed decision making. The dashboard provides the viewer with "situational awareness" of what the information means now and what it might mean in the future.
    69. Issue: How do we know that the viewers are interpreting the metric information correctly?
    70. When Stakeholders First Use the Dashboards…
       • Verify that the information is understood
       • Verify that the right conclusions are drawn
       • Verify that the number and size of the images fit their comfort zone
    71. Dashboard Design and Layout. Some rules exist for dashboard design and layout:
       • Rules for selecting the right artwork
       • Rules for screen real estate
       • Rules for artwork placement
       • Rules for color selection
       • Rules for accuracy of information (2D vs. 3D)
       • Rules for aesthetics
    72. Multi-Color Status Reporting. Each status category is given its own color:
       • Status not addressed or recorded
       • Meeting expectations; on course
       • Some improvements needed now
       • Some improvements needed in the future
       • Not meeting expectations; critical issues
       • Problem exists and no action taken
       • Exceeding expectations
       • Completed
       • Still active and completion date has passed
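Slide 72 lists the status categories of a multi-color reporting scheme, but the colors themselves live only in the original graphic. The mapping below is therefore an assumed example rather than the presentation's actual palette.

```python
# Hypothetical color assignments; only the status wording comes from slide 72.
STATUS_COLORS = {
    "Status not addressed or recorded": "gray",
    "Meeting expectations; on course": "green",
    "Some improvements needed now": "yellow",
    "Some improvements needed in the future": "light yellow",
    "Not meeting expectations; critical issues": "red",
    "Problem exists and no action taken": "dark red",
    "Exceeding expectations": "blue",
    "Completed": "dark blue",
    "Still active and completion date has passed": "purple",
}

def status_color(status: str) -> str:
    """Return the dashboard color for a reported status (gray if unknown)."""
    return STATUS_COLORS.get(status, "gray")

print(status_color("Meeting expectations; on course"))  # green
```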
    73. Part 9: Metric Management Concerns
    74. Metric Naysayers. Typical objections:
       • Metrics are an expensive and useless measurement
       • Metrics are costly to maintain, and the benefits do not justify the cost
       • Metric measurements are a waste of productive time
    75. How Employees View Metrics. Employees will not support a metrics management effort that looks like a spying machine. Some people are very touchy about their performance being measured.
    76. Ways for Metrics Management to Fail
       • Having too many KPIs
       • Having too many KPIs in one area, such as financial
       • Having no drill-down capability when it is needed
       • Having too much complexity and collecting more data than needed
       • Using metrics that nobody understands
       • Having a poorly designed dashboard
       • Using distracting dashboard images
    77. Best Practices in Metrics Management
       • Companies that support metrics management generally outperform those that do not
       • Confidence in metrics management can be built using success stories
       • Senior management support is essential
       • People must not overreact if the wrong metrics are occasionally chosen
       • Specialized metrics generally provide more meaningful results than generic metrics
       • Try to minimize the bias in metrics measurement
    78. Questions?
    79. (Closing slide)
