Measuring the Results of your Agile Adoption
Presentation by Per Kroll at the SG09 Conference and Expo.


Published in: Technology

Transcript

  • 1. Measuring the Results of Your Agile Adoption. Per Kroll, Chief Solutions Architect, IBM Measured Capability Improvement Framework
  • 2. Everybody wants to be agile, but…
    – How do you choose which agile practices are appropriate for you?
    – How do you accelerate your agile adoption so you can reap its benefits more rapidly?
    – How do you know how agile you are?
    – How do you show that your agile adoption is leading to better results?
    Most of us are guessing what works and what does not. Let's move from guessing to knowing!
  • 3. IBM is Going Agile: one of the world's largest agile transformations
    – Agile is key to continuing to "lead the pack"; tomorrow's leaders must adopt appropriate agile techniques
    – Large-scale transformation: 27,000+ developers
    – Agility at Scale is key: team size, distribution, compliance, application complexity, …
    Sample data:
    – 6,000+ developers trained
    – 200+ teams using IBM Rational Team Concert
    – 1,000+ members of the Agile@IBM community
    – Measurable improvement: capacity to deliver and productivity (revenue/headcount), higher quality (# of defects reaching production), better team morale
  • 4. Why Agile? Because it Works! [Slide shows results data divided into four charts]
  • 5. Agenda
    – Basics of measuring
    – Agility at Scale: choosing your agile approach
    – Effective retrospectives and subjective control measures
    – Objective measurements
    – Parting thoughts
  • 6. The importance of measurements
    – At Toyota, they say:
      – No work without process
      – No process without metrics
      – No metrics without measurement
      – No measurement without analysis
      – No analysis without improvement
      (Taiichi Ohno, 1912–1990)
    – Toyota became the world's largest car manufacturer in 2007
  • 7. Going to the doctor…
    – What you want to know: how long will I live?
    – What the doctor does: a health assessment
    – What the doctor will measure: heart rate, cholesterol, blood pressure, body fat, …
    – There is a correlation between the values a doctor measures and your life expectancy, but you can outlive somebody with lower cholesterol than you have…
    Experience shows that there is a similar correlation between software best practices and attainment of desired business results.
  • 8. Going to a personal trainer
    – What you want to do: run a marathon in a personal-best time
    – What the trainer will do: a fitness assessment to understand your strengths and weaknesses
    – Personalized plan based on where you are and where you want to be: run sprints/intervals, run medium distances, strength training, eat healthy, …
    – The plan will be adjusted based on progress in each area, with ongoing monitoring; your plan is personalized and continuously adapted, but based on known patterns of success…
    Business value is maximized through "personalized" software process improvement plans that are continually adapted based on measurable results.
  • 9. The wrong metrics can incent negative behavior
  • 10. Common Problems with Metrics
    – Either no metrics, or too many
    – Collecting metrics is time-consuming and error-prone
    – Metrics are for management, not for the team
    – Metrics are not used to steer: no actions associated with threshold values
  • 11. Software Measurement Status Today (Capers Jones, 2009)
    – Fortune 500 companies with productivity measures: 30%
    – Fortune 500 companies with quality measures: 45%
    – Fortune 500 companies with complete measures: 15%
    – Fortune 500 companies with missing measures: 85%
    – Number of software measurement personnel: 5,500
    – Number of software projects measured: 160,000
    – Number of software projects not measured: 50,000,000
  • 12. Does It Help to Measure? (Capers Jones, 2009)
    Companies that measure vs. companies that don't:
    – On-time projects: 75% vs. 45%
    – Late projects: 20% vs. 40%
    – Cancelled projects: 5% vs. 15%
    – Defect removal: >95% vs. unknown
    – Cost estimates: accurate vs. optimistic
    – User satisfaction: high vs. low
    – Software status: high vs. low
    – Staff morale: high vs. low
  • 13. Lessons from the factory world: knobs and meters
    – One sets the knobs hoping to achieve optimal meter readings
    – The meter readings are called outcome measures
    – Sometimes you need additional measures to ensure the system has responded to the knobs; these are called control measures
    [Diagram: settings (knobs) lead to better results (meters)]
  • 14. Meters for software and systems delivery improvement
    – Business value ($0M → $15M): Return on Investment (ROI), Return on Assets (ROA), profit, …
    – Operational objectives (0% → 100% operational efficiency): productivity, time to market, quality, predictability, …
    – Practice-based control measures (0% → 100% practice adoption):
      – Test Management: defect density, test coverage
      – Iterative Development: velocity, iteration burndown
      – Continuous Integration: build stability, build frequency
  • 15. Software and systems need a control framework
    [Diagram, top to bottom: Value (business objectives), Efficiency (operational objectives), Control (process definition / practices; process enactment / governance; enforcement / process awareness)]
  • 16. Software and systems need a control framework (continued)
    [Same diagram with performance-measurement feedback loops added: value metrics (e.g., ROI, ROA for software and systems delivery) feed back to business objectives; effectiveness measures (e.g., time to market, productivity) feed back to operational objectives; practice adoption/maturity measures (subjective: IBM Rational Self-Check; objective: artifacts) feed back to process definition and practices]
  • 17. Agenda (repeated from slide 5)
  • 18. Challenges that Many Teams Adopting Agile Face
    – Teams choose the wrong agile approach
    – Teams fail once (or twice) before asking for help; teams lack experienced mentors who can tell them how to avoid pitfalls
    – Teams have problems making the tools support the way they want to work, and making tools work with each other
    – Teams have no structured way to drive ongoing improvements, and no structured way to share lessons learned with other teams
    – Even if teams successfully adopt agile, they have no data showing what success means
    – Result: many companies struggle for months to make agile work, creating skepticism within the company about whether agile is right for them
  • 19. What is Agility@Scale? Disciplined Agile Delivery scales along several factors:
    – Team size: under 10 developers → hundreds of developers
    – Compliance requirement: low risk → critical, audited
    – Geographical distribution: co-located → global
    – Organization distribution: in-house → third party (outsourcing, partnerships)
    – Environmental complexity: simple → complex
    – Enterprise discipline: project focus → enterprise focus
  • 20. We Need a New Paradigm for Methods
    – No process got it "all right"
    – Avoid process wars
    – Allow mixing and matching between processes
    – Allow "right-sizing" the process
  • 21. Advantages of a practice-based approach
    – Each practice addresses one aspect of the software lifecycle (e.g., continuous integration, use-case-driven development, performance testing)
    – Practices can be incrementally and independently adopted
    – Practices can be mapped to operational objectives and development pain points
    – Adoption can be measured
    Results: avoids self-inflicting too much process; faster and more predictable results
  • 22. Incremental Practice Adoption: start small, add as needed. Teams can select a few practices to get started:
    – Agile Core: iterative development, release planning, whole team, continuous integration, test-driven development
    – Change and Release Management: team change management, formal change management
    – Requirements Management: shared vision, use-case-driven development, user-story-driven development, requirements management, business process sketching
    – Quality Management: concurrent testing, test management, independent testing, performance testing, application vulnerability testing
    – Governance and Compliance: risk-value lifecycle, practice authoring and tailoring, setting up a performance measurement system, managing performance through measurement
    – Architecture Management: evolutionary architecture, evolutionary design, component software architecture, design-driven implementation
  • 23. Incremental Practice Adoption (continued)
    [Same practice map as the previous slide, with overlays showing which practices Scrum and an "XP-ish" approach each cover]
  • 24. Incremental Practice Adoption (continued)
    [Same practice map, with an overlay showing which practices OpenUP covers]
  • 25. Tailor practices to context / scaling factors
    [Chart: VARIANCE (cost, schedule, …) on the vertical axis (low/medium/high) vs. SCALE (application size, team size, …) on the horizontal axis (small/medium/high)]
  • 26. Tailor practices to context / scaling factors (continued)
    – Low scale/variance key practices: continuous integration, change management, test management, test-driven development
  • 27. Tailor practices to context / scaling factors (continued)
    – Medium scale/variance adds key practices: iterative development, release planning, whole team, use-case-driven development, user-story-driven development, evolutionary architecture
  • 28. Tailor practices to context / scaling factors (continued)
    – High scale/variance adds key practices: risk-based lifecycle, component software architecture, requirements management
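The tiered practice selection on slides 26-28 amounts to a cumulative lookup: practices accumulate as scale and variance grow. The following is an illustrative sketch, not part of MCIF; the tier contents follow the slides, while the function and its string-valued parameters are assumptions made for this example.

```python
# Practice tiers transcribed from the slides; the selection logic
# below is a hypothetical reading of the chart, not an IBM artifact.
BASE = ["continuous integration", "change management",
        "test management", "test-driven development"]
MID = ["iterative development", "release planning", "whole team",
       "use-case-driven development", "user-story-driven development",
       "evolutionary architecture"]
TOP = ["risk-based lifecycle", "component software architecture",
       "requirements management"]

def key_practices(scale: str, variance: str) -> list[str]:
    """Cumulative key practices for a project, given its
    scale ('small'/'medium'/'high') and variance ('low'/'medium'/'high')."""
    practices = list(BASE)                       # every project starts here
    if scale != "small" or variance != "low":    # medium tier kicks in
        practices += MID
    if scale == "high" or variance == "high":    # top tier for large/volatile work
        practices += TOP
    return practices
```

For example, a small, low-variance project would get only the base tier, while a large or high-variance one accumulates all three.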
  • 29. Linking operational objectives to practices and metrics
    [Diagram: a goal decomposition from CEO business value, through CIO IT quality goals, down to development and operational quality goals. Pain points (e.g., high defect counts pre- and post-ship, high support/maintenance costs, customer downtime, low customer satisfaction, growing defect backlog, high requirements churn, aging enhancement-request backlog) map to sub-goals such as "increase defect prevention", "increase defect detection", "deliver on customer requirements", and "improve non-functional quality attributes". Each sub-goal carries measures (e.g., defect density, defect arrival/closure rates, defect backlog, fixes failing verification, requirements test coverage, test execution status, post-ship problem reports, customer satisfaction, support/maintenance costs) and supporting practices (e.g., test-driven development, test management, continuous integration, whole team, pair programming, shared vision, requirements management, use-case-driven development, risk-value lifecycle, evolutionary architecture, component architecture, iterative development, change and configuration management, review/inspection), each rated by value and cost (high/medium/low).]
  • 30. Choosing the Right Practices for YOUR Project
    – Understand your context (scaling factors, risk factors)
    – Understand your operational drivers
    – Understand your current state and how much change you can take on
  • 31. Agenda (repeated from slide 5)
  • 32. Effective Retrospectives: IBM Rational Self-Check for Software Teams
    – A systematic approach for teams to assess their adoption of desired practices
    – Enables teams to learn, improve their effectiveness, and share their learnings
    – "I'm a developer. I want to learn and remember practices. I want new ideas to stick."
    – "I'm a coach. I don't want to be beaten by a stick. I want efficient continuous improvement. I want to hear from quiet people on the team. I want to learn from teams like mine."
    – "I'm an executive. I want to understand if my teams are improving. I want to involve team members in the change effort."
  • 33. Make it simple: learn, act, share
    [Timeline: 30 minutes to share experience, 15 minutes to review actions implemented, 15 minutes to list new actions; outputs include topics listed, actions listed, and practices learned]
  • 34. A self-assessment questionnaire: example for Iterative Development
    KAIZEN: involve the team in a discussion of what works, what does not, and how to make small but frequent improvements
  • 35. Example: Are we iterative? The Big Picture ("We're iterative, right?")
    [Radar chart scoring practices on a 0–10 scale: timeboxed iterations scored high (10); user stories and use cases, and automated unit tests, moderate (7); Scrum meetings and working software low (3); reflections, non-solo work, prioritized backlog, customer feedback, and estimating near 0]
    – Context: 350 people, multi-shore, Java; 8-week "iterations"; 24-month project; new to agile
    – "Iterations let us realize the plan was severely overbooked, so prioritization of content could begin early"
  • 36. Look at deviation among team members: the big picture, then a deep dive on Iterative
    – Context: 30 people, distributed, Java; 2-week iterations; 6-month project
    – "Agile has enabled the project to be responsive to changing requirements and to deploy more function in a drastically reduced timeframe"
    – Sustainable pace: testers have worked at a hectic pace to make adjustments to scope
    – Recommendation: improve agile estimation so they have higher-quality iterations
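One way to surface the kind of divergence this slide highlights is to compute, per practice, the mean and spread of the team's 0-10 self-check scores; a wide spread signals that team members see the practice differently and it deserves discussion in the retrospective. A minimal sketch, with made-up practice names and scores (the threshold value is an assumption, not from the deck):

```python
from statistics import mean, stdev

def summarize_self_check(responses: dict[str, list[int]],
                         spread_threshold: float = 2.0) -> dict:
    """For each practice, report the mean score and flag practices where
    team members disagree (standard deviation above the threshold)."""
    summary = {}
    for practice, scores in responses.items():
        spread = stdev(scores) if len(scores) > 1 else 0.0
        summary[practice] = {
            "mean": round(mean(scores), 1),
            "spread": round(spread, 1),
            "discuss": spread > spread_threshold,  # flag for the retrospective
        }
    return summary

# Hypothetical survey: five team members score each practice 0-10.
scores = {
    "timeboxed iterations": [8, 8, 7, 9, 8],   # broad agreement
    "sustainable pace":     [2, 9, 3, 8, 2],   # wide disagreement
}
```

Here "sustainable pace" would be flagged for discussion even though its mean looks unremarkable, which is exactly the signal the slide's deep dive illustrates.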
  • 37. Agenda (repeated from slide 5)
  • 38. Five Dimensions of Delivery Efficiency
    [Radar chart comparing today vs. the 3-year goal across five dimensions: time-to-value, predictability, business value, quality, and cost]
  • 39. From In-Process (Team) to Business-Value (Executive) Metrics
    [Table mapping metric categories to audiences: Team (in process), Project Manager (project health), Management (product owner), Executive (business value)]
    – Time-to-Value / Schedule: user story points / use case points per iteration, iteration burndown, blocking work items (team); release burndown (project manager); cycle time / time to ROI (management/executive)
    – Business Value: product backlog, stakeholder feedback, # of change requests (team); tested and delivered requirements, stakeholder mapping (project manager); external measures such as client satisfaction and benefit realization (management); revenue, ROI, renewal rate (executive)
    – Velocity: iteration velocity (team); release velocity (project manager); business value velocity (management)
    – Cost: effort in man-hours (team); cost per unit of work (project manager); cost by revenue, development cost (executive)
  • 40. From In-Process (Team) to Business-Value (Executive) Metrics (continued)
    – Quality: technical debt (in-process), test status, test coverage of requirements (team); quality at ship (project manager); enhancement requests, customer-reported (post-ship) defects, age of enhancement requests, defect repair latency (management); customer quality (executive)
    – Cost and schedule variance: user story points / use case points, planned vs. actual cost (CPI) and velocity (SPI)
    – Team climate: reflection and Self-Check survey, actions from reflection (team); climate surveys (executive)
  • 41. Metrics-to-Practices Mapping
    [Matrix mapping metrics to the practices they give feedback on. Practices (columns): iterative development, TDD, release planning, continuous integration, concurrent testing, test management, team change management, requirements management. Metrics (rows) by dimension: time-to-value/schedule (release burndown, iteration burndown, iteration velocity, product backlog), cost (effort in ideal days, planned vs. actual cost), quality (test case execution status, test coverage of requirements, defect trend / defect density, build health / build status, enhancement request trend). For example, burndown and velocity metrics map to iterative development and release planning; build health maps to continuous integration; test coverage maps to test management; defect density maps to several testing and development practices.]
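A mapping matrix like the one above can be held as a plain table and queried in either direction, for example to answer "which metrics give feedback on the practice we just adopted?". The entries below are a plausible reading of a few rows of the slide (the flattened original makes the exact X placements uncertain), and the helper function is an illustrative sketch:

```python
# A few metric -> practices rows, as a plausible reading of the slide's matrix.
METRIC_PRACTICES = {
    "release burndown": ["release planning"],
    "iteration burndown": ["iterative development"],
    "iteration velocity": ["iterative development", "release planning"],
    "test coverage of requirement": ["test management"],
    "build health": ["continuous integration"],
}

def metrics_for_practice(practice: str) -> list[str]:
    """Invert the table: which metrics give feedback on a given practice?"""
    return sorted(m for m, ps in METRIC_PRACTICES.items() if practice in ps)
```

This inversion is how a team adopting, say, iterative development would decide which dashboard widgets to watch first.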
  • 42. Examples of What to Measure
    1. Measuring project progress (time): a burndown chart tracks progress as a trend view of work items assigned and completed during an iteration; iteration velocity measures the capability of the team
    2. Measuring project quality (in-process): code health shows information about the health of a build; test coverage tracks the percentage of requirements exercised by executed tests
    3. Measuring use of feedback (post-ship): enhancement request trends monitor requests from stakeholders and the resolution of those requests; defect trends monitor defects
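The two in-process progress measures named first, burndown and iteration velocity, reduce to simple arithmetic over the iteration's work items. A minimal sketch with made-up point values (the field names are assumptions for this example):

```python
def remaining_points(backlog: list[dict]) -> int:
    """One day's burndown value: total points of work items not yet done."""
    return sum(item["points"] for item in backlog if not item["done"])

def iteration_velocity(completed_points: list[int]) -> float:
    """Velocity: average points completed per iteration, a measure of
    team capacity used to plan future iterations."""
    return sum(completed_points) / len(completed_points)

# Hypothetical iteration backlog: one 5-point story done, 11 points remaining.
backlog = [
    {"points": 5, "done": True},
    {"points": 3, "done": False},
    {"points": 8, "done": False},
]
```

Plotting `remaining_points` once per day produces the burndown chart; feeding the per-iteration completed totals into `iteration_velocity` gives the planning number.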
  • 43. Interpreting Metrics: the Good, the Bad, and Actions to Take
    – Burndown chart. Good agile indicator: the trend line goes down as time progresses. Bad indicator: the trend line rises, hardly goes down, or stays flat. Actions: cut scope; understand why we are spinning the wheels.
    – Iteration velocity. Good indicator: velocity fluctuates within the first couple of iterations, then remains stable. Bad indicator: velocity fluctuates in almost every iteration throughout the development lifecycle. Actions: improve the way you do agile estimation; break work down into micro-increments of less than 3 days.
    – Enhancement requests. Good indicator: low throughout, or going down. Bad indicator: a big spike at the end of the project. Actions: demonstrate early builds to the customer; release code early; postpone non-critical enhancements to the next release.
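The good/bad indicators in this table can be checked mechanically, for instance by looking at the sign of the burndown trend and at how much recent velocities fluctuate around their mean. An illustrative sketch; the tolerance threshold is an assumption of this example, not a value from the slide:

```python
def burndown_slope(remaining: list[int]) -> float:
    """Average day-to-day change in remaining work; a healthy burndown
    has a negative slope (work is actually being completed)."""
    deltas = [b - a for a, b in zip(remaining, remaining[1:])]
    return sum(deltas) / len(deltas)

def velocity_is_stable(velocities: list[int], tolerance: float = 0.2) -> bool:
    """Treat velocity as stable when every iteration lands within
    +/- tolerance of the mean (the 20% threshold is an assumption)."""
    avg = sum(velocities) / len(velocities)
    return all(abs(v - avg) <= tolerance * avg for v in velocities)
```

A flat or rising slope would trigger the "cut scope / understand the churn" actions; persistent velocity fluctuation would trigger the estimation-improvement actions.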
  • 44. Applying Smarter Measurements: IBM Rational Software. Organize the dashboard around business and operational objectives.
  • 45. Applying Smarter Measurements: IBM Rational Software (continued). Organize the dashboard around business and operational objectives (release planning, improving project health); track whether teams exhibit behavior leading to the target objectives. Results for 2008:
    – Beta-reported defects fixed in GA releases grew from 19% to 84%
    – Percentage of RFEs included in GA releases grew from 2% to 26%
    – Reported defect arrivals for all products down 21%
    – Schedule adherence improved to 83% in 2008, and 98% to date in 2009
  • 46. Agenda (repeated from slide 5)
  • 47. The Jazz platform for collaborative software delivery
    [Diagram: products on the Jazz platform (Rational Insight, Rational Requirements Composer, Rational Team Concert, Rational Build Forge, Rational Quality Manager, your existing 3rd-party capabilities, and future IBM capabilities) on top of Jazz platform services: best-practice processes, administration (users, projects, process), collaboration, presentation (mashups), storage, discovery, and query]
    Jazz is…
    – A scalable, extensible team collaboration platform
    – A community at Jazz.net, where you can see Jazz-based products being built
    – An integration architecture, enabling mashups and non-Jazz-based products to participate
  • 48. Parting Thoughts
    – We can learn from other industries, including the Toyota Production System: measure, analyze, improve (kaizen)
    – Choose agile practices based on context and operational objectives: there is no one-size-fits-all; to make agile scale, add practices to address scaling factors
    – Apply a control framework to move from guessing to knowing: do you reach your expected outcome? do you do a good job implementing the practices that will lead to that outcome?
    – Involve teams in measuring (Self-Check): use simple measurements linked to ongoing improvements
    – Adopt a tool environment that allows you to scale your agile approach: collaborate, automate, report
  • 49. MCIF Resources
    – Agility@Scale ibm.com webpage: http://www.ibm.com/software/rational/agile/
    – Jazz platform ibm.com webpage: http://www-01.ibm.com/software/rational/jazz/
    – IBM Rational Process Solution e-Kit (includes the MCIF whitepaper): http://www.ibm.com/software/info/rmckit/index.jsp
    – Whitepaper "Enable the Agile Enterprise for Incremental Adoption of Practices": http://www.ibm.com/services/forms/preLogin.do?source=swg-eaetiaop
    – Self-Check article on developerWorks: http://www.ibm.com/developerworks/rational/library/edge/08/may08/kroll_kr
  • 50. Per Kroll: Background
    – Creator / technical lead, Measured Capability Improvement Framework
    – Formerly product / development manager, RUP / Rational Method Composer
    – Project lead, Eclipse Process Framework
    – Advisor to companies doing agile transformations
    – Co-author of "The Rational Unified Process Made Easy: A Practitioner's Guide to RUP" and "Agility and Discipline Made Easy: Practices from OpenUP and RUP"
  • 51. Per Kroll, pkroll@us.ibm.com