The Agile Revolution of IBM

Presentation at AgileWelly meetup in 2013.

http://www.meetup.com/AgileWelly/events/108516432/


  • Self introduction – I help my customers leverage IBM technology to deliver high-quality software faster. I have been at IBM for 5 years and have met some great people, such as Scott Ambler, Mike O'Rourke and Scott Rich. The information in this presentation comes primarily from talking to them.
  • IBM has three primary lines of business: hardware, software and services. We are talking about Software Group here. The scale is large, but the lessons are applicable to any organisation.
  • In 2006 we were dominantly using waterfall, RUP and the Eclipse Way. We set up an Agile Centre of Excellence (COE) in 2006, and management invested $5m/year over 2006–2008. The primary differentiators between Agile and plain iterative development were the use of user stories, a ranked product backlog, reprioritisation at each iteration and the daily standup. We also adopted other common agile practices: iterative, incremental development with reflection, adaptation and feedback. The practices were inspired by agile, Scrum and XP, plus some custom ones that work for us.
  • The rewards have been significant: on-time delivery, enhancements triaged, enhancements into release, and the maintenance/innovation ratio all improved.
  • IBM has acquired a lot of companies for their talent. We have a problem with geographically distributed teams being less effective, but there is no point forcing people to relocate, because they will leave. So we have to work with it.
  • Geographically distributed development and disintegrated teams become a quality problem. IBM figures are on the slide.
  • A quality problem means we spend a lot of resource on maintenance (bug fixing) that could have been spent on innovative projects (new products). The implication is that if we are behind our competitors on innovation, they will have a new product before we do. Eventually they have a few products and features that we don't, and we can't catch up. This is a big problem that has senior management attention: we are losing our competitive advantage!
  • We decided we needed to change in 2006.
  • Why Agile? The cost of change is lower: getting into Agile is easier, and getting out is easier.
  • It is good to have appropriate tools, trained-up scrum masters and people who know what they are doing. But the right process, enforcement, nurturing and trust from management are just as important for creating a successful agile environment.
  • Over 50% of teams failed because of water-scrum-fall, so we recognised that we needed to do more than just change the dev teams.
  • Dr. Dobb's (Scott Ambler) found that effective agile teams sit on the left of the scale; Mike O'Rourke found we were on the right on every count. Process-wise, we use Disciplined Agile Delivery (DAD). IBM had no way to deal with the situation using its current tools, so we needed some new tools.
  • The IPD process is followed by all hardware, software and service product creation. Funding, scope and release date are determined upfront; if anything changes, you ask for forgiveness (more money). Over 50% of product development teams came back for forgiveness, so obviously the model was not working too well. This convinced management to change: scope was unlocked to 70/30, while the other constraints were kept fixed.
  • Generic iteration definitions apply across teams within one product; they are not generic across IBM.
  • We document our best practices to share with other teams.
  • Agile means we need to move fast. The PMO idea of using Excel to do status reports after iterations doesn't work too well, and most PMs can't make the transition to agile. We need real-time planning in the tool; things like integration points, upgrades and migrations still sit with the PMs.
  • Poppendieck collaboration: train-the-trainer with DAD; Agile leadership and PM training for existing PMs; deep dives such as forced check-in, continuous build, TDD and coding for testers.
  • From 2006 to 2008 we added on-time delivery as a KPI, impacting performance reviews and bonuses. In 2009 we hit 100% on time, but quality went down, so we added quality as another KPI. In 2011 we were 94% on time, but with better quality.
  • The CFO mindset on offshoring: we need to convince them that productivity levels change, and that therefore the success rate of the project changes dramatically. Don't outsource for cost; hire the best talent from different locations.
  • Management can't just fund the change and walk away; they need to be there with us. Customers are happy with the 70/30 arrangement.
  • Three key success factors. Collaboration: a traceable conversation tied to its context (story/task), which helps distributed teams. Automation: forcing habits by putting them into the process. Optimisation: real-time reporting to help managers remove impediments quickly.
  • Let's say the team is largely based in Europe. The NZ guys need to wake up at 2am to do planning poker; it is not fun, and the quality of input from members suffers. This system helps geographically distributed teams do planning poker through pairwise comparison plus comments.
  • The system aggregates the votes and ranks the user stories. That is not the end of it, but it is a much better baseline for the conversation. Most of the time everyone agrees.
  • One source of truth
  • Capture done criteria with the user story.
  • The key is that it is auditable (you know when it was suggested) and actionable.
  • If an independent test team needs to connect to our user stories and create test cases from them to ensure traceability, they can. It is based on the Jazz platform.
  • If they use Rational tools, the built-in integration is tried and tested.
  • One way we are looking at organising metrics is around business and operational objectives, with direct exec-level input. There were too many candidates; we ended up with only 10.
  • These are what senior management look at. Revenue per Distinguished Engineer: up a bit. Employee/Revenue: down heaps, i.e. they need fewer people to make a dollar of revenue.
  • HC/Product GA: we only need half as many people now to create a product compared to 2006.
  • Defect backlog; beta defects fixed before GA.
  • Author Note: Mandatory Rational closing slide (includes appropriate legal disclaimer). Graphic is available in English only.
  • The Agile Revolution of IBM
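
The pairwise voting and ranking described in the notes above (comparison votes aggregated into a ranked backlog) can be sketched roughly as follows. This is a minimal illustration only, not the Rational tool's actual algorithm; the story names and the simple win-count (Copeland-style) aggregation are assumptions:

```python
from collections import defaultdict

def rank_stories(votes):
    """Rank user stories from pairwise comparison votes.

    votes: iterable of (winner, loser) pairs, one per comparison a team
    member made. Returns story ids sorted by win count, highest first
    (ties broken alphabetically). A real tool would also weight votes
    and carry comments alongside each comparison.
    """
    wins = defaultdict(int)
    seen = set()
    for winner, loser in votes:
        wins[winner] += 1
        seen.update((winner, loser))
    # Stories that never won a comparison still appear, with zero wins.
    return sorted(seen, key=lambda s: (-wins[s], s))

# Hypothetical votes from a distributed team comparing three stories.
votes = [
    ("007 Process tailoring", "044 Global collaboration"),
    ("007 Process tailoring", "027 Reporting"),
    ("027 Reporting", "044 Global collaboration"),
]
print(rank_stories(votes))  # "007 Process tailoring" ranks first here
```

Because the aggregate is only a baseline for conversation, the ranked list would be reviewed by the team rather than taken as final.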

    1. The Agile Revolution of IBM. Alan Kan, Technical Manager, IBM Software Delivery Solutions. alankan@nz1.ibm.com / alan@alankan.net
    2. Agenda: how Agile is IBM now; why IBM moved to Agile; our journey of becoming Agile (making the change in process, people and tools); measuring the impact.
    3. A global team: 27,106 staff across the US (10,400), Canada (3,354), Latin America (303), EMEA (4,713), AP (8,153) and Japan (282), spread over 89 labs in cities worldwide. 74 acquisitions, with 10,000 resources from acquisitions; 1,198 products; 506 releases per year; 92% growth since 2001; 11,867 customers; 7% YoY efficiencies. Growth markets 8,878 (33%), major markets 18,227 (67%).
    4. Practices we adopted: iterative development, API first, endgame, retrospectives, always have a client, continuous integration, community involvement, new & noteworthy, adaptive planning, continuous testing, consume your own output, drive with open eyes, live betas, feature teams, show progress, reduce stress, transparency, feedback, sign-off, end-of-iteration demos/reviews, ranked product backlog, burndown, user stories, daily standup, independent testing, exploratory testing, definition of done, PMC, TDD, planning poker.
    5. Rational's rewards (goals are either internal IBM statistics or industry benchmarks):
       Metric                       | Goal     | 2006      | 2011
       Maintenance / Innovation     | 50/50    | 42% / 58% | 31% / 69%
       Customer Touches / Product   | 100      | ~10       | ~400
       Customer Calls               | -5% YoY  | ~135,000  | ~100,000 (-19% since 2009)
       Customer Defect Arrivals     | -5% YoY  | ~6,900    | ~2,200
       On Time Delivery             | 65%      | 47%       | 94%
       Defect Backlog               | 3 months | 9+ months | 3 months
       Enhancements Triaged         | 85%      | 3%        | 100%
       Enhancements into Release    | 15%      | 1%        | 21%
       Customer Sat Index           | 88%      | 83%       | 88%
       Beta Defects Fixed Before GA | 50%      | 3%        | 95%
    6. 2006 IBM Software Group reality.
    7. Software Group acquisition milestones.
    8. Cost of poor quality: the cost per defect escalates the later it is found, from $25 and $100 per defect early on, through $450, up to $16,000, $158,000 and $241,000 per defect at the late stages. Charts: total dollars spent on escapes; trend of percentages in each area over time; trend of spend on L3 versus technical debt; trend of spend vs revenue.
    9. Insufficient spend on strategic projects. "A large UK bank initiated its APM effort to take a 90:10 ratio for run-the-bank / grow-the-bank down to a more reasonable 40:60 ratio. Dell shifted its maintenance-to-innovation ratio from 80:20 to 50:50." (The Application Portfolio Management Landscape — Combine Process And Tools To Tame The Beast, Phil Murphy, Forrester Research, Inc., April 15, 2011.)
    10. We needed to change: organize differently, develop differently, deliver differently, measure differently.
    11. Why move to agile? Respond to a fast-changing environment, reduce process overhead, better manage outsourcing/contractors, improve morale, enhance quality.
    12. Three areas of change: people, process, tools.
    13. Process.
    14. Initial issues: water-scrum-fall.
    15. Issues with Agility@Scale. Disciplined Agile Delivery scaling factors, each ranging from flexible to rigid: team size (under 10 developers to 1000s of developers), geographical distribution (co-located to global), compliance requirement (low risk to critical, audited), domain complexity (straightforward to intricate, emerging), organization distribution such as outsourcing and partnerships (collaborative to contractual), technical complexity (homogeneous to heterogeneous, legacy), enterprise discipline (project focus to enterprise focus), organizational complexity.
    16. Auditable processes needed change.
    17. Generic iteration definitions: a warm-up, then 4-week iterations (plan, develop, stabilize), each ending with a sign-off and an end-of-iteration demo; 8-week milestones (M1, M1a, ...) announced with New & Noteworthy and closed with a retrospective; an endgame (fix, test, spit & polish) leading to the release; then a retrospective and decompression before the next initial release plan.
    18. What is in a practice? Key concepts, work products, tasks, guidance, measurements, tool mentors.
    19. The agile planning onion (based on Mike Cohn, Agile Estimating and Planning): strategy, portfolio, product, project, iteration, daily. Agile teams plan at the innermost levels; planning is "required" at all levels.
    20. People.
    21. Lean training evolution. Poppendieck collaboration: two-day Disciplined Agile Workshop (9,000+ trained). Additional focused workshops: Leading Agile Teams and Project Management. Deep dives on practices: "show me how it's done right in SWG today". Lean series: complements the Agile curriculum. Collaborative leadership workshop: focused on middle management and executives to enable collaboration over isolation or coordination.
    22. People do what you inspect: on-schedule delivery rose from 47% in 2006 to 100% in 2009.
    23. Geographic allocation and mapping: analysis, design, construction, function & performance test, component test, deployment and project management split across on-site (San Jose), near-shore (Toronto) and off-shore (Bangalore) teams, with per-discipline allocations such as 100%, 80/20, 70/30, 60/40 and 20/20/60.
    24. Lessons for executives. A completion date is not a point in time, it is a probability distribution. Scope is not a requirements document, it is a continuous negotiation. A plan is not a prescription, it is an evolving, moving target. (Chart: actual path and precision of scope/plan, from the initial plan and state toward the uncertainty in the stakeholder-satisfaction space.)
    25. Tools.
    26. Keys across all disciplines. Collaboration: optimizing how people work while minimizing face-to-face interactions. Automation: increasing control by integrating workflows and "forcing" new habits. Optimization: continuously improve through real-time measurement and constant steering.
    27. Pair-wise story comparison: for each pair, vote on how important the left story is versus the right; the number of comparisons is shown.
    28. Fast voting and ranking. Examples: "007 Integrated process tailoring" is #1 for OSD directors but drops to #6 for Rational; "044 Global Collaboration" is #1 for the IT Tiger Team but drops to #7 for Rational; "027 Reporting" is #1 for the IT Accelerator team and remains on top for Rational.
    29. View plan by business value.
    30. Overall progress tracking: end-of-iteration demos, definition of done, done criteria in plan items, risk tracking in plan items, progress reporting across projects (planned).
    31. Done criteria.
    32. Continuous integration: multi-staged continuous integration at the developer (continuous), team (continuous), product (weekly) and composite-product (weekly) levels. A team's build dashboard.
    33. Composite build.
    34. Retrospectives.
    35. Cross-repository queries.
    36. Development and test relationship.
    37. Measuring Agile.
    38. Executive measurement: an executive dashboard spanning development health, business health, development quality and perceived quality. Metrics include defect backlog, test escapes, functional and system test trends, critical situations, S-curve progress, automation percentage, customer test cases, consumability scorecard, defect latency, quality plan commitments, test coverage, defect density, build health, project velocity, staffing variance, process timeliness, iteration/milestone status, severity analysis, security vulnerabilities, static code analysis, requirements met, IPD timeliness, transactional surveys, PMR/call rates, cost of support, installability, RFE SLAs, usability, scalability, integrations with other products, user experience/doc, time to resolution, APAR:PMR ratio, post-GA metrics, transparency, sales plays, partner/support/technical/sales enablement, MCIF index, alt packaging, OEMs, XL hits, tactics ROI, pipeline/multiplier and revenue. Practices: vulnerability assessment, concurrent testing, test-driven development, whole team, team change management, evolutionary architecture, requirements management.
    39. Improving bottom-line growth (chart, 2004–2009): SW revenue per Distinguished Engineer headcount, and E/R (expense as a percentage of revenue).
    40. Doing more with less (chart, 2003–2009): HC per product GA versus SWG revenue.
    41. Rational's rewards: the same metrics table as slide 5 (goals versus 2006 and 2011 measurements).
    42. © Copyright IBM Corporation 2012. All rights reserved. The information contained in these materials is provided for informational purposes only, and is provided AS IS without warranty of any kind, express or implied. IBM shall not be responsible for any damages arising out of the use of, or otherwise related to, these materials. Nothing contained in these materials is intended to, nor shall have the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering the terms and conditions of the applicable license agreement governing the use of IBM software. References in these materials to IBM products, programs, or services do not imply that they will be available in all countries in which IBM operates. Product release dates and/or capabilities referenced in these materials may change at any time at IBM's sole discretion based on market opportunities or other factors, and are not intended to be a commitment to future product or feature availability in any way. IBM, the IBM logo, Rational, the Rational logo, Telelogic, the Telelogic logo, and other IBM products and services are trademarks of the International Business Machines Corporation, in the United States, other countries or both. Other company, product, or service names may be trademarks or service marks of others. Alan Kan, Technical Manager, IBM Software Delivery Solutions. alan@alankan.net, www.linkedin.com/in/zenmaster/
    43. Thank you to our sponsors.
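
The multi-staged continuous integration shown on slide 32 can be sketched as a simple promotion pipeline: a build only advances to the next stage if the previous one passed. The stage names and cadences come from the slide; the gating logic below is an assumed illustration, not IBM's actual build system:

```python
# Stages from slide 32, innermost to outermost, with their cadence.
STAGES = [
    ("developer", "continuous"),
    ("team", "continuous"),
    ("product", "weekly"),
    ("composite product", "weekly"),
]

def run_pipeline(build_passes):
    """Run staged builds in order; build_passes maps stage name -> bool.

    Returns the list of stage names that ran, stopping at the first
    failing stage: a failure blocks promotion to the later stages.
    """
    ran = []
    for name, cadence in STAGES:
        ran.append(name)
        if not build_passes.get(name, False):
            break  # a failing stage blocks promotion
    return ran
```

For example, a build that passes the developer stage but fails the team stage never reaches the weekly product builds, which keeps broken changes out of the composite product.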
