Agile development at Maersk Line, by BestBrains



  1. Lean-agile practice adoption. Chris Berridge, 24th April 2012.
  2. About Maersk Line
     • World's largest container fleet
     • Truly global business
     • 14.5% world market share [1]
     • 570 container vessels
     • Turnover $24 billion [2]
     [1] Source: Alphaliner, Jan 2011. [2] Source: Annual Report 2010.
  3. Fragmented IT landscape
     • Thin outsourcing model
     • Tier 1 vendors only
     • 2,500 applications
     • Core applications are tightly coupled
     • 23,000 bookings/day
  4. How we started the lean-agile journey (diagram)
     • New project, platform, team: the revolutionary route
     • Existing project, platform, team: the evolutionary route
     • Both grounded in Lean Product Development
  5. X-leap: The goal
     "Under Maersk Line's umbrella strategy, streamLINE, a number of initiatives have been launched to ensure the carrier becomes even more competitive through the industry's best delivery reliability, continued CO2-reducing initiatives and, last but not least, by putting the customer in focus," Eivind Kolding tells the Klaus Lund & Partnere newsletter.
     X-Leap is the largest and most important of these programmes.
     "The purpose is to make booking a container with us as simple as booking a book." (Maersk Line CEO)
  6. X-leap: How we sold agile to our stakeholders
     Maersk is complex (diagram of dozens of interconnected systems: GCSS, MARS, SCV (MLIS), maersk.com, SAF marine, P&O Nedlloyd portals, and many more):
     • 100s of backend systems
     • Convoluted and unstable application architecture
     • Inconsistent master data
     • High product complexity: more than 20,000 lines in some contracts, more than 500 commodity types
     Two delivery approaches are common:
     1. Waterfall: no customer-facing functionality for the first 18-24 months
     2. Prototyping: lots of functionality early, but no connection to backend systems
     Our approach is fundamentally different: an agile SOA architecture delivering a minimal set of customer-facing functionality with true backend service-bus connections as early as possible (in our case 9-10 months).
  7. X-leap: What we got right from the outset
     • Strong customer focus: a clear customer experience vision created
     • Co-location
     • Shared Key Performance Indicators for the whole team
     • Onboarded experienced people
     • Willingness to experiment with new approaches
     • Great senior leadership support
  8. X-leap: 22 practices we (now) know we need to master
     • Visualised Flow and Process
     • Continuous Delivery
     • Continuous Integration
     • Test Driven Development
     • Automated Developer (Unit) Tests
     • Release Often
     • Evolutionary Design
     • Simple Design
     • Automated Acceptance (Functional) Tests
     • Refactoring
     • Collective Code Ownership
     • Definition of Done
     • End2End Iterations
     • Single Prioritised Backlog
     • Limit Work-in-Progress
     • Demo
     • Test Driven Requirements
     • Pair Programming (To Drive Standards)
     • Pair Programming (To Ease Platform Constraints)
     • Feature Teams
     • Customer (Proxy) Part Of The Team
     • Stand Up Meetings
  9. X-leap: A feature team in action
  10. X-leap: Learnings within the team
      Manage requirements
      • Prioritise effectively between functional & non-functional requirements
      • Break down requirements and agree on what size is appropriate
      • Need a process vision to support the customer experience vision
      Iteration 0 is surprisingly large
      • e.g. reducing the hardening phase took forever
  11. X-leap: Value stream analysis for a feature
      X-leap: Root cause analysis for why the hardening phase takes so long
  12. X-leap: Learnings within the team
      Manage the change
      • Engage advisors who focus on optimising the whole
      • Own and manage practice-adoption progress
      Minimise thrashing
      • e.g. we still struggle to measure velocity due to constant changes
  13. X-leap: Learnings outside the team
      Stakeholders need careful management
      • Reluctant to exchange predictability for speed
      • Difficult to explain refactoring & technical debt
      • High expectations of fast delivery
      Dependencies external to the development team are a headache
      • Feature teams help but are no silver bullet
      • There is no replacement for good project management to identify and manage external dependencies
      • Others have to change their working practices (architects, infrastructure, other applications)
  14. How we are completing the lean-agile journey (diagram)
      • New project, platform, team: the revolutionary route
      • Existing project, platform, team: the evolutionary route
      • Both grounded in Lean Product Development
  15. Cycle Time Analysis (histogram of # requirements vs days)
      • Median = 150 days
      • GCSS over the last 24 months: median = 280 days
      • GCSS over the last 12 months: median = 373 days
      Source: Focal Point. Requirements put into production over the last 2 years, measured from date of creation to when set to working-in-production.
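The cycle-time metric behind the chart (days from requirement creation to working-in-production, summarised by the median) can be sketched as follows. The dates here are invented for illustration; the real data comes from Focal Point.

```python
from datetime import date
from statistics import median

def cycle_times(requirements):
    """Days from requirement creation to 'working in production'."""
    return [(done - created).days for created, done in requirements]

# Hypothetical (created, in-production) date pairs, one per requirement.
reqs = [
    (date(2011, 1, 10), date(2011, 6, 1)),
    (date(2011, 3, 5), date(2011, 8, 20)),
    (date(2011, 2, 1), date(2012, 1, 15)),
]

print(median(cycle_times(reqs)))  # median cycle time in days
```

Using the median rather than the mean keeps the summary robust against the long tail of very slow requirements visible in the histogram.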
  16. GCSS: Framing the methodologies (nested layers)
      • Enterprise practices: Lean Product Development
      • Project practices: Agile
      • Team practices: Scrum
      • Engineering practices: XP (Extreme Programming)
  17. Enabling Agility
      Business agility: fast cycle time, smooth flow, fast feedback, value maximised, discovery mindset.
      • The customer doesn't really know what they want
      • The developer doesn't really know how to build it
      • Things change
  18. GCSS: 8 selected practices
      1. Get to initial prioritisation faster
      2. Improve prioritisation
      3. Pull requirements from a Dynamic Priority List
      4. Reduce the size of requirements
      5. Get to the point of writing code quickly
      6. Actively manage Work-In-Progress (WIP)
      7. Enable faster feedback
      8. Enable more frequent releases
  19. GCSS: Release Frequency
      The effect of creating large release batches upstream (diagram: requirements for releases R22-R25 queuing through the S, Des, Dev and T stages, Jan 2011 to Jan 2012).
      Development perspective: an estimated ~10,000 hours of idle time in 2010.
  20. GCSS: More Frequent Releases
      Enable the smooth flow of requirements (diagram: requirements flowing steadily through S, Des, Dev and T into small, frequent releases).
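The idle-time effect of large release batches can be illustrated with a toy model (the numbers below are invented, not the ~10,000-hour estimate from the slide): finished requirements sit idle until the next release ships, so more frequent releases mean less waiting.

```python
# Toy model: requirements finish development on given days; each release
# ships everything finished up to that day.
def total_wait(finish_days, release_days):
    """Sum of days each finished requirement idles before its release."""
    wait = 0
    for f in sorted(finish_days):
        ship = min(r for r in release_days if r >= f)  # next release after finish
        wait += ship - f
    return wait

finished = [10, 40, 70, 100, 130, 160]                       # finish days
quarterly = total_wait(finished, [90, 180])                  # two big batches
monthly = total_wait(finished, [30, 60, 90, 120, 150, 180])  # six small ones

print(quarterly, monthly)  # larger batches accumulate more idle time
```

The same amount of work flows through in both cases; only the batching of releases changes the waiting time, which is the point of the slide.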
  21. Faster Feedback: Eight Standard Measures (timeline diagram)
      Milestones: requirement captured → decided → started coding → completed coding → integrated & built → validated for launch → launched → in production.
      Corresponding states: requirements candidate, feasible, accepted, code complete, demonstrated, feature complete, launchable, launched.
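The eight measures amount to timestamping a requirement at each milestone and looking at the elapsed intervals. A minimal sketch, with milestone names as I read them off the slide (the exact labels may differ) and invented dates:

```python
from datetime import date

# Milestone names inferred from the slide; real labels may differ slightly.
MILESTONES = ["captured", "decided", "coding_started", "coding_completed",
              "integrated", "validated", "launched", "in_production"]

def measure(stamps):
    """Days between consecutive milestones for one requirement."""
    return {f"{a}->{b}": (stamps[b] - stamps[a]).days
            for a, b in zip(MILESTONES, MILESTONES[1:])}

# Illustrative timestamps for a single requirement.
req = {
    "captured":         date(2011, 1, 3),
    "decided":          date(2011, 1, 17),
    "coding_started":   date(2011, 2, 7),
    "coding_completed": date(2011, 3, 7),
    "integrated":       date(2011, 3, 14),
    "validated":        date(2011, 3, 28),
    "launched":         date(2011, 4, 4),
    "in_production":    date(2011, 4, 11),
}
print(measure(req))
```

Collecting these dictionaries across requirements is enough to produce the GCSS-vs-X-leap comparison on the next slide.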
  22. Faster Feedback: Comparing GCSS with X-leap on the eight measures (table; all times are in days).
  23. GCSS: Actively Manage Work-in-Progress
      (Board with a WIP limit of 8 on the bottleneck.)
  24. GCSS: Work-in-Progress reduced, whilst at least maintaining throughput
      (Chart, Oct 2010 to Jan 2012: average WIP falling from 8.8 to 5.2 across releases 19-28, measured from "Authorized" to "Launched", while throughput in guesstimate points/week is at least maintained.)
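The reason cutting WIP while holding throughput shortens cycle time is Little's Law: average cycle time = average WIP / average throughput. A sketch using the slide's WIP figures and an assumed, constant throughput (the real throughput number is not in the deck):

```python
def avg_cycle_time(avg_wip, throughput):
    """Little's Law: average cycle time = average WIP / average throughput."""
    return avg_wip / throughput

throughput = 1.3                          # requirements/week; assumed constant
before = avg_cycle_time(8.8, throughput)  # average WIP around release 19
after = avg_cycle_time(5.2, throughput)   # average WIP around release 28

assert after < before  # less WIP at the same throughput -> shorter cycle time
```

The exact throughput value does not matter for the conclusion: as long as it is unchanged, cycle time falls in direct proportion to WIP (here 5.2/8.8, roughly a 40% reduction).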
  25. GCSS: Requirement size variability
      (Histograms of # requirements vs guesstimate points, before and after; after, a maximum size of < 2 weeks is enforced.)
  26. GCSS: Standardized Upstream Process
      Get to initial prioritisation faster; get to the point of writing code quickly.
      Flow: Triage (< 1 week) → Dynamic Priority List → Refine (< 2 weeks) → Dev buffer (max 5) → pull to coding.
      • Quickly identify the ideas that will be the most profitable
      • Expect > 10% attrition, otherwise the upstream process is too heavy
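The upstream flow above can be sketched as a small pull system. The stage names and the buffer limit of 5 come from the slide; the class, method names and mechanics are my illustration, not Maersk's implementation:

```python
from collections import deque

class UpstreamProcess:
    """Sketch of the pull-based upstream flow:
    Triage -> Dynamic Priority List -> Refine buffer (max 5) -> pull to coding."""

    def __init__(self, refine_limit=5):
        self.priority_list = deque()   # the Dynamic Priority List
        self.refine_buffer = deque()   # limited buffer ahead of development
        self.refine_limit = refine_limit

    def triage(self, idea, keep):
        # >10% of ideas should be rejected here, otherwise the
        # upstream process is too heavy.
        if keep:
            self.priority_list.append(idea)

    def pull_to_refine(self):
        # Refill the refine buffer only up to its WIP limit.
        while len(self.refine_buffer) < self.refine_limit and self.priority_list:
            self.refine_buffer.append(self.priority_list.popleft())

    def pull_to_coding(self):
        # Developers pull the next refined idea when they have capacity.
        return self.refine_buffer.popleft() if self.refine_buffer else None
```

The key property is that work is pulled by downstream capacity rather than pushed by upstream output, so the buffer limit caps upstream WIP automatically.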
  27. GCSS: Quality improvements, releases 2010-2011
      Averages, releases 18-22 (up to June 2011) vs releases 23-28 (since July 2011):
      • E1+E2 defects raised in HOAT: 8.2 → 1.0 (-88%)
      • Production slippage (days): 11.2 → 2.2 (-80%)
      • Patches within 2 weeks of production: 2.0 → 0.3 (-85%)
  28. GCSS: Cycle time
      Average time elapsed from starting work to released (Refine → Realise → Release):
      • Releases 11 to 22*: 208 days
      • Releases 23 to 28: 104 days, half the time
      *No data for R18, R19.
  29. Rolling out!
      Timeline, Feb 2011 to Aug 2012: GCSS pilot (Feb 2011) → Hermes rollout (May 2011) → starter pack to all delivery streams (SOA, SAP, Masterdata, London, EDI). Systemic issues remain.
  30. (Image-only slide.)
  31. Engineering Quality Checklist (v1.0, 12-2-2012)
      New delivery teams need to adopt these as soon as possible in order to build quality in and establish a foundation for sustainable delivery of value.
      Development
      1. All assets are checked into a single repository (code, config., test scripts, schemas, migration scripts etc.)
      2. Developers check in code to the repository at least daily
      3. Test stubs ensure all automated tests are independent of other systems (network & integration tests)
      4. Non-functional requirements are identified and prioritised alongside other requirements
      5. User interface tests & unit tests are run by the developer before code check-in
      6. Source control branches are frequently merged (every 2 weeks or less)
      Technical debt
      7. The team regularly takes time to identify and record technical debt
      8. Repaying technical debt is prioritised alongside other requirements
      Build & test
      9. The build runs all unit tests, regression tests and all non-manual acceptance tests
      10. Broken builds are fixed (or the check-in is reverted) before more code is checked in
      11. All deployments are automated (including schemas & migrations)
      12. A build is completed within 20 minutes of code check-in and is then deployed to a non-production environment
      13. Test coverage and code quality metrics are monitored
      14. Testing is prioritised using a risk-based approach
      15. Some performance tests are run at least daily
      16. The load-to-failure threshold is identified
      17. All programmatic interfaces are permanently available to other systems for integration testing
      Deployment
      18. All batch testing of requirements and the subsequent deployment to production takes 7 days or less
      19. All environments can be recreated using the same automated process
      20. Updates are deployed to production without customer downtime
      21. Developers have collective code ownership & responsibility (excl. platform/application configuration)
      22. A developer's environment & tools are built from a standard configuration within 2 hours
      Environment provisioning
      23. Any new environments (excluding production) required are provisioned within a week
      24. Any standard production environments required are provisioned within a month
      Monitoring & improvement
      25. Build, test & deployment process performance is measured and continually improved upon
      26. How to monitor production health is an integral part of the design
  32. Learning from rollout so far
      • Practices seem to work everywhere
      • Mature teams are generally more receptive than newer ones: they know their process and that it needs improvement
      • As with all change programmes, a couple of key individuals in the team can make a huge difference
      • Personnel turnover makes changes hard to stick
      • There are systemic issues which need addressing
  33. What next for Maersk Line?
      • Legacy: roll out the 8 starter-pack practices to all legacy applications (max 90 days cycle time)
      • New: additional practices for our new service-oriented "vision platform" (max 30 days cycle time)
  34. (Image-only slide.)
  35. Slow burn: stakeholder education
  36. Key Performance Measures for IT
      Time
      • Typical measure: delivering on a predicted date
      • Usual outcome: incentivises hidden time buffers and slower delivery
      • Alternative: maximise speed in getting to the point where value starts to be realised
      Scope
      • Typical measure: delivering all of the originally predicted scope
      • Usual outcome: incentivises gold plating and discourages exploitation of learning
      • Alternative: minimise the size of work packages to maximise both learning and early release of value
      Cost
      • Typical measure: delivering at or below a predicted development cost
      • Usual outcome: incentivises hidden cost contingencies, pushing costs up
      • Alternative: maximise value delivered (trade development cost against the opportunity cost of delay)
      Quality
      • Typical measure: delivering changes with zero downtime and no errors
      • Usual outcome: resistance to making any changes; overinvestment in testing & documentation
      • Alternative: shorten feedback cycles at many levels (coding, defects, ...)
  37. KPIs (not strictly defined within the control cycle)
      Flow: Triage → Refine → Realise → Release, gated at the Dynamic Prioritised List, "authorised for development", "launch authorised" and "launched".
      Targets: Triage average < 1 week; Refine average < 2 weeks; 90% < 90 days; 70% < 90 days.
      Related measures: BPO and ML-IT cycle-time, new IBM quality and availability measures, on cost, on scope, on time.
  38. Questions?
      Chris Berridge, Programme Manager, Lean Product Development, Maersk Line IT
      +45 3363 8165, chris.berridge@maersk.com
      Agile Project/Programme Manager of the Year 2011
  39. BACKUP SLIDES
  40. Quick Win Opportunity: Increments of a Process Vision
      From idea, to live, to working-in-production: Triage → Refine → Realise → Release.
      • Idea generation and problem statements enter Triage
      • Ideas with no obvious solution and big changes go onto the Dynamic Priority List; teams pull ideas that need to be broken down into solution theories (timeboxed, < 2-4 weeks)
      • Small changes with an obvious candidate application pass through the triage filter straight to a team
      (Diagram: requirements RQ5620, RQ5827, RQ5864, RQ4659 and RQ6843 flowing through Triage → DPL → Refine → Realise for the Feature Team, the GCSS Team, the SCV Team and the Redirect Requests service, with a coordination/release team assembling the parts delivered by the teams.)