Driving Innovative IT Metrics (Project Management Institute Presentation)
Speaker Notes

  • Source: McKinsey & Company in conjunction with the University of Oxford. Type of survey: study of large-scale IT projects. Date: 2012.
    A study of 5,400 large-scale IT projects (projects with initial budgets greater than $15M) finds that the well-known problems with IT project management are persisting. Among the key findings quoted from the report:
    17 percent of large IT projects go so badly that they can threaten the very existence of the company.
    On average, large IT projects run 45 percent over budget and 7 percent over time, while delivering 56 percent less value than predicted.
    Source material: Delivering large-scale IT projects on time, on budget, and on value.

    Source: IBM. Type of survey: survey of 1,500 change management executives. Date: Oct 2008.
    IBM's survey on the success/failure rates of “change” projects finds:
    Only 40% of projects met schedule, budget and quality goals.
    The best organizations are 10 times more successful than the worst organizations.
    The biggest barriers to success listed are people factors: changing mindsets and attitudes (58%), corporate culture (49%), lack of senior management support (32%).
    Underestimation of complexity is listed as a factor in 35% of projects.
    For source data read: Making change work.

    Source: KPMG (New Zealand). Type of survey: survey of 100 businesses across a broad cross-section of industries. Date: Dec 2010.
    KPMG's survey of project management practices in New Zealand finds some truly startling results:
    An incredible 70% of organizations have suffered at least one project failure in the prior 12 months.
    50% of respondents also indicated that their projects failed to consistently achieve what they set out to achieve.
    Reference article: Most Businesses Experience Project Failure.
    Source material: KPMG Project Management Survey 2010.
  • Even if we say that the 39% of projects that succeed don't need us, and the 18% that fail were doomed from the start, that still leaves 43% that are “challenged”. If just that is the market, it's a huge market.
    http://versionone.com/assets/img/files/ChaosManifesto2013.pdf
  • Let's not dwell on the negative (failure). Let's focus on the positive: becoming more competitive.
    Dramatic opportunity: organizations that HAVE developed mature project management offices have become significantly more consistently reliable, giving them the ability to out-compete their rivals.
    http://www.pmsolutions.com/audio/State_of_the_PMO_2012_Research_Report.pdf
  • Then there is the huge cost-savings potential of doing things right the first time because enough time was taken to do them right, or of cancelling a project that is not feasible right now, or not worth it given the known challenges.
    http://public.dhe.ibm.com/common/ssi/ecm/en/ras14021usen/RAS14021USEN.PDF

Presentation Transcript

  • Driving Innovative IT Metrics Focusing on the CONDITIONS for Success © 2013 Joe Hessmiller 1
  • 2 © 2013 Joe Hessmiller Schedule Check You Are Here
  • Joe Hessmiller • Over 30 years in IT: analyst, programmer, technical training manager, project manager, user interface designer, project management consultant, consulting sales and marketing manager. • Worked with dozens of IT managers from Fortune 50 corporations and large government agencies. • Current focus on software project risk management, specifically the identification, monitoring and use of project risk “early warning signs” (EWS). 3 © 2013 Joe Hessmiller
  • 1. To establish the need for monitoring the metrics that really matter. 2. To identify the types of metrics that really matter. 3. To show how a familiar framework can be adapted for metrics identification (and communication). 4. To give you enough to use back at your office to improve your metrics program. 4 © 2013 Joe Hessmiller Objectives of Session
  • Agenda 1. The Innovative Metrics Opportunity 2. Why Do These Opportunities Still Exist? 3. What Metrics Should We Monitor? 4. Working With Conditions Data 5. Developing Innovative Metrics for Your Organization 5 © 2013 Joe Hessmiller
  • Part One The Innovative Metrics Opportunity (in Project Management) 6 © 2013 Joe Hessmiller
  • 7 • Despite Everything We’ve Tried, Project Success Rates Have Changed Little in 30 Years – McKinsey (17% threaten the company’s existence) – IBM (only 40% met goals; 10X best-to-worst range) – KPMG (70% of orgs had a project failure) Source: http://calleam.com/WTPF/?page_id=1445 The Innovative Metrics Opportunity
  • 8 • There Is Plenty of Opportunity for Improving Project Success Rates Source: http://versionone.com/assets/img/files/ChaosManifesto2013.pdf The Innovative Metrics Opportunity
  • 9 • Strategic Opportunity to Increase Market Competitiveness Organizations with a mature PMO outperform those with an immature PMO by: 28% for on-time project delivery; 24% for on-budget delivery; and 20% for meeting original goals and business intent of projects. Source: www.metier.com, According to PMI The Innovative Metrics Opportunity
  • 10 Source: Paul D. Nielsen, “About Us: From Director and CEO Paul D. Nielsen,” Carnegie Mellon Software Engineering Institute, http://www.sei.cmu.edu/about/message/ • Operational Opportunity to Address Causes of Largest Project Cost Component According to the Carnegie Mellon Software Engineering Institute, “Data indicate that 60-80% of the cost of software development is in rework.” The Innovative Metrics Opportunity
  • Part Two Why Do These “Opportunities” Still Exist? 11 © 2013 Joe Hessmiller
  • 12 • The “99% Complete” Syndrome – Everything is Fine…Everything is Fine… – Oh, Oh, SURPRISE! We’re in Trouble! • The real challenge to successful software project management is: – having the information – that is very DIFFICULT to collect, analyze and act on – in time to make a difference. [Diagram: What You Know vs. What You Don’t Know… but Need To Know] Why Do These Opportunities Still Exist?
  • 13 • It’s DIFFICULT to Have Project Managers Who Can Predict the Future Source: http://versionone.com/assets/img/files/ChaosManifesto2013.pdf Why Do These Opportunities Still Exist?
  • 14 • It’s DIFFICULT to find project managers who can keep everyone on the same accurate, relevant and timely page. Source: http://versionone.com/assets/img/files/ChaosManifesto2013.pdf Why Do These Opportunities Still Exist?
  • 15 • It’s DIFFICULT to Trust the Service Providers You Rely On When You Can’t Verify Their Behavior/Performance Source: http://versionone.com/assets/img/files/ChaosManifesto2013.pdf Why Do These Opportunities Still Exist?
  • SO…What Does an Innovative Project Manager Need to Do? 16 • Address these DIFFICULT Project Management Challenges • By Having the Data That’s Really Needed to Be Successful
  • Part Three What Metrics Should We Monitor? 17 © 2013 Joe Hessmiller
  • What Metrics Should We Monitor? • Tracking Progress –Backward Looking • Managing Risk –Forward Looking 18 © 2013 Joe Hessmiller
  • Tracking Progress Looking Backward • Volume • Quality • Cost 19 © 2013 Joe Hessmiller What did we do?
  • • Alignment of IT Investments to Business Strategy • Cumulative Business Value of IT Investment • IT Spend Ratio – New Versus Maintenance • Critical Business Services – Customer Satisfaction – Service Level Performance • Operational Health – Outages – Security Incidents – Project Success Rate – Average Defect Rate Source: Craig Symons, Forrester Research, 4-4-08 20 © 2013 Joe Hessmiller Tracking Progress Looking Backward – Enterprise Level
  • 21 © 2013 Joe Hessmiller http://www.slideshare.net/anandsubramaniam/project-metrics-measures Tracking Progress Looking Backward – Project Level
  • 22 © 2013 Joe Hessmiller Managing Risk Looking Forward – Experts Agree on EWS What should we do?
  • Managing Risk Looking Forward – Kappelman Research • Kappelman Research – Derived List of • Six People Factors • Six Process Factors • For In-process Audits 23
  • 24 © 2013 Joe Hessmiller Managing Risk Looking Forward – Dominant Dozen
  • • Project Management Institute – 10 Knowledge Areas – Things You Should Know – Things You Should Do • For Gate Audits 25 Managing Risk Looking Forward – Practitioners Too
  • • What to collect, in addition to traditional Key Performance Status: – Key Performance Conditions (Intra-process Conditions, Inter-process Conditions) – Key Process/Practice Compliance 26 © 2013 Joe Hessmiller Summary: What Metrics Should We Monitor?
  • • How to Collect – Intuition – MBWA (management by walking around) – Survey Software – Purpose-Designed Software 27 © 2013 Joe Hessmiller What Metrics Should We Monitor?
  • • Pattern Recognition – Correlations Between Outcomes and Precedent Conditions • Feedback Loops – Correlations Drive KPI Algorithms Need to Collect the Data for the Insights 28 © 2013 Joe Hessmiller The Future: PMBD Project Management Big Data
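As a sketch of what this "PMBD" idea could look like in practice, the snippet below correlates weekly condition scores (of the kind described in Part Four) with a downstream outcome such as rework hours, which is the pattern recognition this slide describes. The data, the series names and the choice of a plain Pearson correlation are assumptions for illustration, not part of the presentation.

```python
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length numeric series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly condition scores (0-100) and a later outcome measure.
conditions = {
    "sponsor_participation": [90, 85, 70, 60, 80, 95],
    "requirements_stability": [80, 80, 70, 65, 75, 85],
}
rework_hours_next_week = [12, 15, 30, 42, 20, 10]

# Which precedent conditions move (inversely) with downstream rework?
for name, scores in conditions.items():
    r = pearson(scores, rework_hours_next_week)
    print(f"{name}: r = {r:+.2f}")
```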
  • Part Four Working with Conditions Data (The Four Missing Metrics) 29 © 2013 Joe Hessmiller
  • Three Important CONDITIONS to Monitor: • Expectations Management • Sponsor Involvement • Process Compliance. To Minimize Project Risk Factor: • Rework Probability. At 40% of total project costs, Rework is the leading cause of projects running over budget and beyond schedule. 30 © 2013 Joe Hessmiller Managing Risk – ESPR, “The Missing Metrics”
  • The Four Missing Metrics • SMART – Are expectations clear? • SMPL – Is sponsor engaged? • PAL - Are processes being followed? • PRPL – Are causes of Rework being avoided? 31 © 2013 Joe Hessmiller
  • The SMART Level: tracks the clarity of assignments. The higher the SMART Level, the higher the level of understanding of what is expected, and therefore the less Rework and management intervention required. [Chart: SMART Index over time plotted against upper and lower control limits] 32 © 2013 Joe Hessmiller
  • SMART Level: a metric derived from participant feedback on the perceived Specificity, Measurability, Achievability, Relevancy and Timeframes for their assignments/deliverables. Components of the SMART Level:

    Week       | SMART Index | Upper Control Limit | Lower Control Limit | Specific | Measurable | Achievable | Relevant | Time-Based
    8/10/2010  | 76.3 | 100 | 75 | 75  | 50  | 80  | 100 | 95
    8/17/2010  | 76.3 | 100 | 75 | 75  | 50  | 80  | 100 | 90
    8/24/2010  | 75.0 | 100 | 75 | 75  | 50  | 80  | 95  | 95
    8/31/2010  | 82.5 | 100 | 75 | 80  | 70  | 85  | 95  | 95
    9/7/2010   | 83.8 | 100 | 75 | 80  | 70  | 85  | 100 | 95
    9/14/2010  | 82.5 | 100 | 75 | 90  | 70  | 85  | 85  | 90
    9/21/2010  | 87.5 | 100 | 75 | 90  | 90  | 80  | 90  | 90
    9/28/2010  | 92.5 | 100 | 75 | 100 | 90  | 80  | 100 | 90
    10/5/2010  | 88.8 | 100 | 75 | 100 | 90  | 75  | 90  | 95
    10/12/2010 | 86.3 | 100 | 75 | 90  | 90  | 75  | 90  | 95
    10/19/2010 | 87.5 | 100 | 75 | 90  | 90  | 75  | 95  | 85
    10/26/2010 | 88.8 | 100 | 75 | 90  | 90  | 80  | 95  | 85
    11/2/2010  | 92.5 | 100 | 75 | 95  | 95  | 80  | 100 | 90
    11/9/2010  | 92.5 | 100 | 75 | 95  | 90  | 85  | 100 | 95
    11/16/2010 | 93.8 | 100 | 75 | 100 | 90  | 85  | 100 | 95
    11/23/2010 | 93.8 | 100 | 75 | 100 | 100 | 85  | 90  | 100
    11/30/2010 | 93.8 | 100 | 75 | 100 | 100 | 80  | 95  | 95
    12/7/2010  | 88.8 | 100 | 75 | 90  | 90  | 85  | 90  | 90
    12/14/2010 | 92.5 | 100 | 75 | 90  | 90  | 90  | 100 | 95
    12/21/2010 | 95.0 | 100 | 75 | 95  | 95  | 95  | 95  | 95
    12/28/2010 | 92.5 | 100 | 75 | 90  | 80  | 100 | 100 | 100

    33 © 2013 Joe Hessmiller
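A minimal sketch (not the presenter's actual formula) of how a composite SMART Level like the one above could be computed from weekly participant feedback and checked against control limits. The presentation does not state the component weighting, so an unweighted mean is assumed here, and the 75/100 control limits are taken from the chart as illustrative values.

```python
def smart_level(scores, weights=None):
    """Composite SMART Level: weighted mean of 0-100 component scores.

    `scores` maps component name -> rating from participant feedback.
    The actual weighting used in the presentation is not specified;
    an unweighted mean is assumed here.
    """
    weights = weights or {k: 1.0 for k in scores}
    total_w = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total_w

def check_control_limits(value, lower=75.0, upper=100.0):
    """Flag the metric when it drifts outside the control band."""
    if value < lower:
        return "attention needed: below lower control limit"
    if value > upper:
        return "above upper control limit"
    return "within control limits"

# One week of feedback, loosely patterned on the first row of the table.
week = {"Specific": 75, "Measurable": 50, "Achievable": 80,
        "Relevant": 100, "Time-Based": 95}
level = smart_level(week)
print(f"SMART Level {level:.1f} -> {check_control_limits(level)}")
```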
  • SMART Level Influencers: it looks like this project came back under control once the Relevance of the assignments was made clear to the staff. [Chart: SMART Index and its five component scores (Specific, Measurable, Achievable, Relevant, Time-Based) plotted against the control limits] 34 © 2013 Joe Hessmiller
  • The SMPL Line (Senior Management Participation Level): tracks the participation level of senior management and/or the sponsor. [Chart: SMPL over time against upper and lower control limits, with an “Attention Needed” dip flagged] 35 © 2013 Joe Hessmiller
  • Components of the SMPL Line: in this example, senior management's level of participation in meetings, surveys, and other activities is compared to the planned participation. 36 © 2013 Joe Hessmiller
  • SMPL Line Influences: in this example, the components of the SMPL (Status Meeting Attendance Rate, APO Response Rate, Decisions Made/Delegated Rate) are shown separately to identify individual areas needing improvement. [Chart: SMPL and its three component rates against the control limits] 37 © 2013 Joe Hessmiller
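A minimal sketch of an SMPL calculation in the spirit of these slides: actual sponsor participation compared with planned participation across the three components named above (status-meeting attendance, APO responses, decisions made/delegated). The equal weighting and the example counts are assumptions, not figures from the presentation.

```python
def smpl(actual, planned, weights=None):
    """Senior Management Participation Level on a 0-100 scale.

    `actual` and `planned` map a participation component to counts for the
    period, e.g. meetings attended vs. scheduled, APO responses vs. surveys
    sent, decisions made/delegated vs. decisions requested.
    """
    weights = weights or {k: 1.0 for k in planned}
    total_w = sum(weights.values())
    score = 0.0
    for component, plan in planned.items():
        rate = min(actual.get(component, 0) / plan, 1.0) if plan else 1.0
        score += weights[component] * rate * 100.0
    return score / total_w

planned = {"status_meetings": 4, "apo_responses": 10, "decisions": 6}
actual  = {"status_meetings": 3, "apo_responses": 7,  "decisions": 6}
print(f"SMPL = {smpl(actual, planned):.1f}")  # 81.7 with equal weights
```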
  • PAL (Process Adherence Likelihood): measures the likely level of process adherence based on conditions that would tend to lead to ‘short cuts’ on process. 38 © 2013 Joe Hessmiller
  • Components of the PAL Line: this table shows how problems with EVA (earned value analysis) lead to likely abandonment of process until the project gets back on track. That may be acceptable, but it must be identified to be managed. 39 © 2013 Joe Hessmiller
  • PAL This illustrates the various components of PAL. PAL Line Influences 40 © 2013 Joe Hessmiller
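The transcript does not enumerate PAL's components, but the slide above ties falling adherence to problems with EVA. The sketch below is therefore hypothetical: it assumes earned-value indexes (SPI/CPI) drive the likelihood, with performance below plan making ‘short cuts’ more likely. The scaling is illustrative only, not the presenter's formula.

```python
def pal(spi, cpi, floor=0.0):
    """Process Adherence Likelihood (0-100), a hypothetical model.

    spi, cpi: earned-value Schedule/Cost Performance Indexes (1.0 = on plan).
    The further either index falls below 1.0, the more likely the team is to
    take 'short cuts' on process, so the likelihood of adherence drops.
    """
    pressure = max(0.0, 1.0 - min(spi, 1.0)) + max(0.0, 1.0 - min(cpi, 1.0))
    return max(floor, 100.0 * (1.0 - pressure))

print(pal(spi=1.02, cpi=0.98))  # ~98: little pressure, adherence likely
print(pal(spi=0.80, cpi=0.85))  # ~65: behind plan, short cuts more likely
```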
  • The PRPL Line (Project Rework Probability Level): tracks the ‘probability’ of Rework based on changes in the conditions that are known to cause Rework. [Chart: PRPL over time against upper and lower control limits, with an “Attention Needed” spike flagged] 41 © 2013 Joe Hessmiller
  • Components of the PRPL Line: SME Involvement, Team Confidence Level, Technical Capability and Requirements Stability are the components of the PRPL Line.

    Week       | PRPL | Upper Control Limit | Lower Control Limit | SME Involvement | Technical Capability | Requirements Stability | Confidence Level
    8/10/2010  | 5.0  | 20 | 0 | 100 | 100 | 80  | 100
    8/17/2010  | 10.0 | 20 | 0 | 80  | 100 | 80  | 100
    8/24/2010  | 12.5 | 20 | 0 | 80  | 100 | 80  | 90
    8/31/2010  | 22.5 | 20 | 0 | 70  | 90  | 70  | 80
    9/7/2010   | 17.5 | 20 | 0 | 80  | 90  | 70  | 90
    9/14/2010  | 16.3 | 20 | 0 | 80  | 90  | 75  | 90
    9/21/2010  | 11.3 | 20 | 0 | 80  | 90  | 85  | 100
    9/28/2010  | 23.8 | 20 | 0 | 80  | 60  | 85  | 80
    10/5/2010  | 23.8 | 20 | 0 | 80  | 60  | 85  | 80
    10/12/2010 | 23.8 | 20 | 0 | 80  | 60  | 85  | 80
    10/19/2010 | 18.8 | 20 | 0 | 65  | 90  | 85  | 85
    10/26/2010 | 21.3 | 20 | 0 | 70  | 90  | 70  | 85
    11/2/2010  | 16.3 | 20 | 0 | 80  | 90  | 70  | 95
    11/9/2010  | 16.3 | 20 | 0 | 80  | 90  | 70  | 95
    11/16/2010 | 16.3 | 20 | 0 | 80  | 80  | 80  | 95
    11/23/2010 | 16.3 | 20 | 0 | 80  | 80  | 80  | 95
    11/30/2010 | 17.5 | 20 | 0 | 80  | 80  | 80  | 90
    12/7/2010  | 12.5 | 20 | 0 | 80  | 90  | 80  | 100
    12/14/2010 | 8.8  | 20 | 0 | 85  | 90  | 90  | 100
    12/21/2010 | 5.0  | 20 | 0 | 90  | 100 | 90  | 100
    12/28/2010 | 0.0  | 20 | 0 | 100 | 100 | 100 | 100

    42 © 2013 Joe Hessmiller
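In the sample data above, each week's PRPL value equals 100 minus the simple mean of the four condition scores, so a minimal sketch of the calculation could look like the following. Treat the formula and the 0-20 control band as read off the sample data and chart, not as the presenter's stated algorithm.

```python
def prpl(conditions):
    """Project Rework Probability Level: 100 minus the mean condition score.

    `conditions` maps the four rework-driving conditions (SME involvement,
    technical capability, requirements stability, team confidence) to 0-100
    health scores; weaker conditions push the rework probability up.
    """
    return 100.0 - sum(conditions.values()) / len(conditions)

# The 8/31/2010 row from the table above.
week = {"sme_involvement": 70, "technical_capability": 90,
        "requirements_stability": 70, "confidence_level": 80}
value = prpl(week)  # 22.5, above the 20-point upper control limit
print(f"PRPL = {value:.1f}" + ("  <- attention needed" if value > 20 else ""))
```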
  • PRPL Line Influences: SME Involvement, Team Confidence Level, Technical Capability and Requirements Stability influence rework probability. [Chart: PRPL and its four component scores plotted against the control limits] 43 © 2013 Joe Hessmiller
  • Part Five Developing Innovative Metrics for Your Organization 44 © 2013 Joe Hessmiller
  • Five “C”s of Effective Metrics • Connected to important goals (G-Q-M) • Complete so that all the factors important to achieving the goal are included • Current so that they reflect current goals • Communicated so that they drive behavior • Calibrated so that values are comparable across organizations and over time. Source: Bob Lewis, InfoWorld, 12-28-11 45 © 2013 Joe Hessmiller
  • A Gauge for Every Condition Automotive Engineers Long Ago Defined the Critical Measures for Safe, Effective Engine Operation. 46 © 2013 Joe Hessmiller
  • The Basic Automobile Dashboard. Automotive gauges: Odometer, Clock, Fuel Level, Speedometer, Tachometer, Oil Pressure, Oil Temperature, Water Pressure, Water Temperature, Voltmeter. 47 © 2013 Joe Hessmiller
  • What Questions Are We Asking?

    Automotive Gauge  | Asks the Question
    Odometer          | How far have we gone?
    Clock             | How much time has it taken?
    Fuel Level        | How far can we go?
    Speedometer       | How fast are we going now?
    Tachometer        | How intensely is the engine working?
    Oil Pressure      | Do we have enough ‘lubrication’ to keep parts working well together?
    Oil Temperature   | How smooth are interactions?
    Water Pressure    | Do we have enough coolant to keep the engine producing?
    Water Temperature | How effective is the coolant in keeping the engine cool?
    Voltmeter         | Is enough energy being applied to the other important systems?

    48 © 2013 Joe Hessmiller
  • The Basic Measures

    Automotive Gauge  | Asks the Question                                              | To Measure
    Odometer          | How far?                                                       | Deliverables Delivered
    Clock             | How long?                                                      | Duration
    Fuel Level        | How much further?                                              | Input Units Available
    Speedometer       | How fast?                                                      | Deliverables per Unit of Time
    Tachometer        | How intensely?                                                 | Effort Intensity
    Oil Pressure      | Do we have enough lubrication to smooth interactions?          | Supply of Lubricant to Smooth Interaction Between Components
    Oil Temperature   | How smooth are interactions?                                   | Ability of Lubricant to Smooth Interaction Between Components
    Water Pressure    | Do we have enough coolant to keep the engine producing?        | Supply of Coolant to Dissipate Excess Engine Heat
    Water Temperature | How effective is the coolant in keeping the engine cool?       | Ability of Coolant to Dissipate Engine Heat
    Voltmeter         | Is enough energy being applied to the other important systems? | Ability to Support Other Control and Comfort Systems

    49 © 2013 Joe Hessmiller
  • Comparative Metrics

    To Measure                                                    | Automotive Metric | IT Metric
    Deliverables Delivered                                        | Miles             | Service Level Achieved, Function Points Delivered
    Duration                                                      | Hour              | Hour
    Input Units Available                                         | Gallons           | Resource Hour
    Deliverables per Unit of Time                                 | Miles per Hour    | Earned Value per Clock Hour
    Effort Intensity                                              | RPM               | Hours Worked per Week / Available Hours
    Supply of Lubricant to Smooth Interaction Between Components  | PSI               | Stakeholder Interaction Satisfaction
    Ability of Lubricant to Smooth Interaction Between Components | Degrees           | Number of Open Issues from Stakeholder Interactions
    Supply of Coolant to Dissipate Excess Engine Heat             | PSI               | Duration to Close Issues / Number of Issues
    Ability of Coolant to Dissipate Engine Heat                   | Degrees           | Number of Escalated Issues
    Ability to Support Other Control and Comfort Systems          | Volts             | On-Time Process Deliverables (Status, Reporting, Training)

    50 © 2013 Joe Hessmiller
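One way to put the gauge analogy to work is to keep the gauge-to-IT-metric mapping as data, so a simple project "dashboard" can be rendered from whatever readings are available. The structure and names below are only an illustration of the table above, not something from the presentation.

```python
# Hypothetical dashboard mapping: automotive gauge -> (what it measures,
# the corresponding IT metric from the comparative-metrics table).
DASHBOARD = {
    "Odometer":          ("Deliverables Delivered", "Function Points Delivered"),
    "Clock":             ("Duration", "Hours Elapsed"),
    "Fuel Level":        ("Input Units Available", "Resource Hours Remaining"),
    "Speedometer":       ("Delivery Rate", "Earned Value per Clock Hour"),
    "Tachometer":        ("Effort Intensity", "Hours Worked / Available Hours"),
    "Oil Pressure":      ("Interaction Supply", "Stakeholder Interaction Satisfaction"),
    "Oil Temperature":   ("Interaction Quality", "Open Issues from Stakeholder Interactions"),
    "Water Pressure":    ("Issue Cooling Supply", "Duration to Close Issues / Number of Issues"),
    "Water Temperature": ("Issue Cooling Ability", "Escalated Issues"),
    "Voltmeter":         ("Support Energy", "On-Time Process Deliverables"),
}

def render(readings):
    """Print a one-line status per gauge from a dict of metric readings."""
    for gauge, (measure, it_metric) in DASHBOARD.items():
        value = readings.get(gauge, "n/a")
        print(f"{gauge:17} {measure:22} -> {it_metric}: {value}")

render({"Speedometer": "1.8 EV pts/hr", "Tachometer": "0.92"})
```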
  • The More Complex the Project Environment… 51 © 2013 Joe Hessmiller
  • • To establish the need for monitoring the metrics that really matter. • To identify the types of metrics that really matter. • To show how a familiar framework can be adapted for metrics identification (and communication). • To give you enough to use back at your office to improve your metrics program. 52 © 2013 Joe Hessmiller Objectives of Session