
Prioritizing Process Improvement And Ignoring The Rating


Presented at the 2008 Software Engineering Process Group Conference, Munich, Germany



  1. Prioritizing Process Improvement and Ignoring the Rating (Bruce Pedersen, 12 June 2008, SEPG Europe 2008)
  2. Case Study <ul><li>This is a case study of a small high-technology software company that decided to strategically apply CMMI process improvement to meet immediate business goals and ignore the rating. It will be shown that, for this company, applying just three of the Level 2 process areas (PPQA, MA, CM) struck the right balance: achieving cultural buy-in, allowing efficient utilization of available resources, and reaching a significant return on investment. </li></ul>
  3. Topics <ul><li>Background </li></ul><ul><li>What is the Problem? </li></ul><ul><li>Management Speaks </li></ul><ul><li>Determining the Best Approach </li></ul><ul><li>Implementation Strategy </li></ul><ul><li>Did All of This Work Make a Difference? </li></ul><ul><li>Next Steps </li></ul>
  4. Background <ul><li>NavCom Technologies Inc. is a John Deere company that supplies the enterprise with high-precision Global Positioning System (GPS) navigation solutions for integration into the farm tractor auto-track system </li></ul><ul><li>Software releases are transmitted to another John Deere company, Intelligent Vehicle Systems (IVS), which performs system integration of the GPS component with other components on the farm tractor </li></ul><ul><li>IVS also operates the customer service center and manages complaints from farmers about tractor operation </li></ul><ul><li>IVS attempts to separate out and communicate the GPS-related farmer “bug reports” to NavCom, as well as “bug reports” generated during system test </li></ul><ul><li>An increasingly high number of software defects is being reported back to IVS by farmers </li></ul><ul><li>The software release quality issue has become visible to farming customers around the world, and action has to be taken to correct the problem </li></ul><ul><li>NavCom is directed by John Deere corporate to “fix” the software release quality problem </li></ul>
  5. Who is NavCom Technology? <ul><li>125 employees located at the Torrance, California facility </li></ul><ul><li>35 software engineers supporting six concurrent GPS software development projects </li></ul><ul><li>Average GPS product development cycle: 6 months to 2 years </li></ul><ul><li>Process improvement has been ongoing for less than two years </li></ul><ul><li>Established in 1992, NavCom Technology, Inc., a John Deere Company, is a leading provider of advanced GPS receivers </li></ul><ul><li>NavCom provides John Deere with a centimeter-level GPS position solution that is integrated into the farming tractor Autotrack system (40,000 units in the field) </li></ul><ul><li>Autotrack combined with terrain compensation provides straight-line operations on hillsides or rough terrain, “hands free” </li></ul><ul><li>For more information visit: </li></ul>
  6. GPS at Work for John Deere
  7. Software Bundle Releases <ul><li>Software releases are tied to farming seasons </li></ul><ul><li>Two bundle releases (summer/winter) are made per year </li></ul><ul><li>Major activity by both NavCom and IVS in pre-bundle release testing (see diagram) </li></ul><ul><li>NavCom performs an accelerated software development life cycle </li></ul><ul><li>After a bundle release, reported software bugs from both IVS and farmers are fixed and saved for the next bundle release </li></ul><ul><li>Feature requests by farmers are also saved for a bundle release </li></ul><ul><li>Show-stopper bugs found after a bundle release are fixed and bundle updates made </li></ul> [Diagram: NavCom Software Engineering Test; IVS Product Verification & Validation; bug reports from both IVS and farmers; NavCom analysis and implementation of bug fixes]
  8. What’s the Problem? Software Defect Visualized <ul><li>NavCom </li></ul><ul><ul><li>Reactive software development culture (i.e. firefighting) </li></ul></ul><ul><ul><ul><li>Bug fixes and schedules agreed to over the phone without requirement traceability </li></ul></ul></ul><ul><ul><ul><li>Feature requests taken over the phone for the next release </li></ul></ul></ul><ul><ul><li>Lack of engineering test resources (personnel and tools) available to apply the necessary pre-release software engineering test rigor </li></ul></ul><ul><li>IVS - System Integrator </li></ul><ul><ul><li>Lack of GPS expertise to accurately report the bug details necessary for bug environment recreation at NavCom </li></ul></ul><ul><ul><li>Lack of system test capabilities to isolate bugs caused by GPS vs. other integrated tractor component sources </li></ul></ul><ul><li>General </li></ul><ul><ul><li>Poor communication path between NavCom and IVS (very little paper trail) </li></ul></ul><ul><ul><ul><li>Priority differences based on different business models </li></ul></ul></ul><ul><ul><ul><li>Trust </li></ul></ul></ul>
  9. Management Speaks <ul><li>Management defines the business goal related to software releases </li></ul><ul><ul><li>“Significantly reduce the total number of software defects released to the customer in delivered software” </li></ul></ul><ul><li>Choose a methodology that will minimize </li></ul><ul><ul><li>Cost to Implement </li></ul></ul><ul><ul><li>Business Disruption </li></ul></ul><ul><ul><li>Cultural Impact </li></ul></ul><ul><ul><li>Time for improvement visibility </li></ul></ul><ul><ul><ul><li>Noticeable reduction in customer-reported defects </li></ul></ul></ul><ul><li>Make recommendations for what’s needed to apply additional software engineering test rigor </li></ul><ul><ul><li>Internal infrastructure changes </li></ul></ul><ul><ul><li>Additional test tools </li></ul></ul><ul><li>There is no corporate or customer pressure for: </li></ul><ul><ul><li>A particular methodology </li></ul></ul><ul><ul><li>A schedule target for a particular methodology level or rating </li></ul></ul>
  10. Determining the Best Approach <ul><li>What methodology? </li></ul><ul><ul><li>What improvement scope? </li></ul></ul><ul><ul><li>What representation? </li></ul></ul><ul><li>What implementation strategy? </li></ul><ul><ul><li>What organizational infrastructure changes are needed to support the initiative? </li></ul></ul><ul><ul><li>What software engineering infrastructure changes? </li></ul></ul><ul><ul><li>What additional test tools are needed? </li></ul></ul>
  11. What Methodology? <ul><li>The CMMI model was selected for implementation </li></ul><ul><ul><li>CMMI is used by other John Deere divisions </li></ul></ul><ul><ul><li>Worldwide software industry acceptance of the CMMI model and its best practices </li></ul></ul><ul><ul><li>CMMI model flexibility </li></ul></ul><ul><ul><ul><li>Tailorable for unique business cases </li></ul></ul></ul><ul><ul><ul><li>Selectable representation (staged vs. continuous) to better fit company objectives </li></ul></ul></ul><ul><ul><li>The “I” in CMMI allows phased integration of systems and hardware into the process improvement scope </li></ul></ul><ul><ul><li>Management directives can be satisfied using CMMI </li></ul></ul>
  12. What Improvement Scope? <ul><li>Software only, with future integration of systems and hardware engineering </li></ul><ul><li>Product development software projects in scope of CMMI process improvement </li></ul><ul><ul><li>Research and development software projects out of scope </li></ul></ul><ul><li>Test group in scope </li></ul><ul><li>Hardware-only projects out of scope </li></ul><ul><li>Maintenance software projects to use portions of the CMMI model that make sense (e.g. peer reviews, test) </li></ul>
  13. What Representation? <ul><li>The representation selected should allow the process improvement focus to be directly tied to the company business goal </li></ul><ul><ul><li>“Significantly reduce the total number of software defects released to the customer in delivered software” </li></ul></ul><ul><li>The representation selected should also take into consideration Management directives </li></ul><ul><ul><li>Cost to Implement </li></ul></ul><ul><ul><li>Business Disruption </li></ul></ul><ul><ul><li>Cultural Impact </li></ul></ul><ul><ul><li>Time for improvement visibility (noticeable reduction in customer-reported defects) </li></ul></ul>
  14. Representation Comparison <ul><li>Continuous Representation </li></ul><ul><ul><li>Provides freedom to select the order of improvement that best meets the organization’s business objectives </li></ul></ul><ul><ul><li>Increased visibility into the capability achieved within each individual process area </li></ul></ul><ul><ul><li>Quick wins increase buy-in </li></ul></ul><ul><ul><li>Comparison across divisions can only be made on a process-area-by-process-area basis (no maturity rating) </li></ul></ul><ul><ul><li>A relatively new approach with limited data to demonstrate its ROI </li></ul></ul><ul><li>Staged Representation </li></ul><ul><ul><li>Provides a predefined sequence of improvement areas based on a proven path of successive levels, each serving as a foundation for the next </li></ul></ul><ul><ul><li>Provides visibility at the maturity level with limited visibility at the process area level </li></ul></ul><ul><ul><li>Permits easy comparison across divisions due to a single maturity rating </li></ul></ul><ul><ul><li>Builds on a relatively long history of use, with case studies and data that demonstrate proven ROI </li></ul></ul>
  15. Representation Selection <ul><li>The continuous representation was chosen </li></ul><ul><ul><li>Satisfies Management directives </li></ul></ul><ul><ul><li>No customer pressure for a CMMI capability level rating </li></ul></ul><ul><ul><li>Organic process improvement growth works for NavCom </li></ul></ul><ul><li>Mapping of process areas to business goals is imperative </li></ul><ul><ul><li>Process areas chosen that focus on the immediate business goals </li></ul></ul><ul><ul><li>5-year plan created to phase in future process areas to support projected business goals </li></ul></ul><ul><ul><li>Plan updated every year to reflect changing business strategy </li></ul></ul> [Diagram: CMMI process areas mapped to Business Goals]
  16. Implementation Strategy <ul><li>Process Area Selection </li></ul><ul><li>Establish a Capability Level Profile </li></ul><ul><li>Define the current project processes </li></ul><ul><li>Establish an organizational infrastructure that will support continuous process improvement </li></ul><ul><ul><li>Involve practitioners early on to foster process ownership </li></ul></ul><ul><li>Develop plans and processes that will enable selected process area support of the current project processes </li></ul><ul><li>Pilot processes and obtain practitioner feedback for constant improvement </li></ul><ul><li>Measure improvement progress using metrics </li></ul>
  17. Process Area Selection <ul><li>Select process areas based on: </li></ul><ul><ul><li>Direct contribution to the business goal </li></ul></ul><ul><ul><li>Providing a foundation for future process area integration </li></ul></ul><ul><ul><li>Dependencies between process areas and generic practices </li></ul></ul><ul><ul><li>Ease of implementation (complexity) </li></ul></ul><ul><ul><li>Number of supporting processes already in place at NavCom </li></ul></ul><ul><li>Choose the desired capability level for each of the selected process areas (1-5) </li></ul><ul><ul><li>The continuous model allows additional capability levels to be achieved on focused process areas </li></ul></ul><ul><li>Construct a capability level profile </li></ul><ul><ul><li>Demonstrate process improvement progress to management </li></ul></ul><ul><ul><li>Guide process improvement activities </li></ul></ul><ul><ul><ul><li>Process improvement can be conducted on individual PAs at different rates </li></ul></ul></ul><ul><ul><ul><li>Resources can be focused on lagging PA activities </li></ul></ul></ul>
  18. Process Area Selection (cont.) <ul><li>The process areas that meet the selection criteria are a subset of the categorized “Basic Support” process areas: </li></ul><ul><ul><li>Process and Product Quality Assurance (PPQA) </li></ul></ul><ul><ul><li>Configuration Management (CM) </li></ul></ul><ul><ul><li>Measurement & Analysis (MA) </li></ul></ul><ul><li>Peer Reviews and Test were also selected from the Verification process area (Specific Goals 2 & 3) to directly support “reducing the number of defects in software releases” </li></ul>
  19. Process Area Mapping <ul><li>Management requirements to minimize, rated against the selected Level 2 PAs: </li></ul>
Minimize | MA | PPQA | CM
Cost to Implement | Med | Med | Low
Business Disruption | Med | Low | Low
Cultural Impact | Med | Low | Low
Time for ROI Visibility | Low | Med | Med
[Diagram: Level 2 PAs (REQM, PP, PMC, SAM, MA, PPQA, CM) map to management products (Project Managers) and the code product (Practitioners); VER provides Peer Reviews and a partial implementation of testing]
  20. Process Area Description <ul><li>PPQA – provide staff and management with objective insight into processes and associated work products via independent auditing </li></ul><ul><li>CM – establish and maintain the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits </li></ul><ul><li>MA – develop and sustain a measurement capability that is used to support management information needs </li></ul><ul><li>Peer Reviews and Test – ensure that selected work products meet their specified requirements (VER process area) </li></ul><ul><li>Note: Test refers to system test prior to Product Verification and Validation (PV&V) testing </li></ul>
  21. Establish a Capability Level Profile <ul><li>Determine what capability level to target for each selected process area </li></ul><ul><li>Establish a method for overlaying the current achievement profile onto the target </li></ul><ul><ul><li>Determination of current % complete reported to Management </li></ul></ul><ul><ul><li>Used for appraisal readiness checks </li></ul></ul><ul><li>Use target staging to establish a path of process improvement to be followed by the organization </li></ul>
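The "% complete" overlay described above can be sketched as a simple calculation. The process areas, target levels, and current levels below are invented for illustration and are not NavCom's actual profile data:

```python
# Hypothetical capability level profile: target vs. currently achieved
# capability level per selected process area (values are made up).
TARGET = {"PPQA": 3, "CM": 3, "MA": 2, "VER": 2}
CURRENT = {"PPQA": 2, "CM": 3, "MA": 1, "VER": 1}

def percent_complete(target, current):
    """Overall progress toward the target profile, as a percentage."""
    achieved = sum(min(current[pa], target[pa]) for pa in target)
    return 100.0 * achieved / sum(target.values())

# Per-PA gap report, usable as a crude appraisal readiness check.
for pa in TARGET:
    gap = TARGET[pa] - CURRENT[pa]
    status = "on target" if gap <= 0 else f"{gap} level(s) behind"
    print(f"{pa}: level {CURRENT[pa]} of {TARGET[pa]} ({status})")

print(f"Overall: {percent_complete(TARGET, CURRENT):.0f}% of target profile")
```

A report like this also shows where resources could be focused on lagging PAs, as the selection criteria on the previous slide suggest.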
  22. Capability Level Profile [Chart: capability level profile; GG3 – Institutionalize a Defined Process]
  23. Define the Current Project Process <ul><li>The Current Project Process defines the way a project develops software today </li></ul><ul><li>The Current Project Process is unique to each project (Level II) </li></ul><ul><li>Provides awareness to project practitioners of gaps in the current unmanaged project software process </li></ul><ul><ul><li>Written processes not consistently followed (e.g. skipped in a schedule crunch) </li></ul></ul><ul><ul><li>Unwritten processes verbally communicated by peers </li></ul></ul><ul><ul><li>Non-existent processes made up by individuals to do their jobs </li></ul></ul><ul><li>Starting point for application of the selected CMMI process area support </li></ul><ul><li>Opens up a communication path between the process group and project practitioners </li></ul><ul><ul><li>Flow chart used to communicate the current process at the 50,000-foot level </li></ul></ul><ul><ul><li>Allows further drill-down in focused process areas using non-CMMI jargon or concepts </li></ul></ul>
  24. PA Support of Current Project Process <ul><li>Fosters practitioner process ownership in helping the process group define needed processes </li></ul><ul><li>Identifies areas where application of process improvement could be focused for near-term ROI </li></ul><ul><li>Allows groups supporting the project to have a voice </li></ul><ul><li>Can be used for training new project personnel </li></ul> [Diagram: the Current Project Process (L3) supported by PR + Test, PPQA, MA, and CM (L2) via peer reviews & testing, defect reports, quality and noncompliance issues, standards and procedures, measurements and analysis, information needs, baselines and audit reports, and configuration items and change requests]
  25. Establishing an Organizational Process Improvement Infrastructure <ul><li>SPI Sponsor selected (Director of Systems) </li></ul><ul><li>Software Process Improvement group formed </li></ul><ul><ul><li>SPI Manager </li></ul></ul><ul><ul><li>PPQA Lead Auditor </li></ul></ul><ul><ul><li>Process Engineer </li></ul></ul><ul><ul><li>Administrator (training records, PAL administrator, etc.) </li></ul></ul><ul><li>SEPG formed with representation from each project and functional group (7 members) </li></ul><ul><ul><li>SPI Manager (Chair) </li></ul></ul><ul><ul><li>Program Management </li></ul></ul><ul><ul><li>Software Development </li></ul></ul><ul><ul><li>Test </li></ul></ul><ul><li>Technical Working Groups managed by the SEPG to accomplish work </li></ul><ul><li>3-member reserve audit team established to support the PPQA Lead </li></ul><ul><ul><li>Made up of project practitioners </li></ul></ul><ul><ul><li>Reserve auditor training performed by the PPQA Lead </li></ul></ul><ul><ul><li>Auditors perform cross-project audits only (not their own project) </li></ul></ul>
  26. Focused Efforts at Cultural Buy-In <ul><li>Early participation by practitioners in defining processes fosters process ownership </li></ul><ul><li>Cross-functional representation in the SEPG </li></ul><ul><ul><li>Membership rotation scheme introduced to expose a greater percentage of the practitioners to process improvement activities </li></ul></ul><ul><li>Tool deployed to facilitate practitioners’ process improvement suggestions </li></ul><ul><li>Monthly process improvement newsletter published, featuring SPI topics and progress </li></ul><ul><li>SCAMPI B and C appraisals conducted every 6 months </li></ul><ul><ul><li>Focus process improvement efforts toward a near-term goal </li></ul></ul><ul><ul><li>Align the culture to the appraisal process </li></ul></ul><ul><li>Periodic workshops held with individual projects to discuss new process deployment and lessons learned </li></ul><ul><li>Training program initiated to teach new processes to project personnel </li></ul>
  27. Process Creation and Deployment <ul><li>The SPI Group creates plans and processes in direct support of the project’s defined process </li></ul><ul><ul><li>Project practitioners participate in defining the current project process </li></ul></ul><ul><ul><li>Practitioners participate in content review of project plan/process documents </li></ul></ul><ul><li>The SPI Group creates workshops to familiarize project practitioners with new processes </li></ul><ul><li>New processes are piloted first, and practitioner feedback is used for process improvement updates </li></ul><ul><li>Metrics are defined to measure process effectiveness </li></ul><ul><li>Lessons learned are managed and used for process updates </li></ul>
  28. What Else Do We Need? <ul><li>Test – additional rigor focused on finding defects before release to the customer </li></ul><ul><ul><li>Formalize testing using historical test scenarios </li></ul></ul><ul><ul><li>Dedicated group formed to perform the software engineering test function </li></ul></ul><ul><li>Tools – leverage the process improvement advantage </li></ul><ul><ul><li>Artifact storage </li></ul></ul><ul><ul><li>Automated CM </li></ul></ul><ul><ul><li>Bug tracking </li></ul></ul><ul><ul><li>Central repository for process improvement documents (PAL) </li></ul></ul> [Diagram: CMMI + Test + Tools + Cultural Buy-In → Reduced Customer-Reported Software Defects]
  29. Establishing a Test Organization <ul><li>A Software Engineering Test Group (SETG) was formed with a charter to formalize the organization’s pre-release software engineering test process </li></ul><ul><ul><li>SETG Lead </li></ul></ul><ul><ul><li>Three Test Engineers </li></ul></ul><ul><li>In parallel, two tasks were conducted to support the SETG </li></ul><ul><ul><li>A PC-based GUI tool was developed to automate regression testing against baseline and new releases </li></ul></ul><ul><ul><ul><li>The GUI allows a test engineer to select and control the battery of tests to be run </li></ul></ul></ul><ul><ul><ul><li>Automatic report generation and plotting are used for analyzing test results </li></ul></ul></ul><ul><ul><li>A regression library was created to furnish regression test scenarios that had been created to solve problems during the past seven years </li></ul></ul><ul><ul><ul><li>Scenarios generated to recreate conditions that would reproduce reported bugs </li></ul></ul></ul>
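The regression-library idea can be illustrated with a minimal sketch. Everything here is hypothetical: the scenario names, the stand-in solver, and the tolerances are invented for illustration; the actual SETG tool was the PC-based GUI described above, not this script:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """A recorded condition that once reproduced a reported bug."""
    name: str
    expected_fix_m: float  # known-good position error, metres
    tolerance_m: float

def run_scenario(scenario, solver):
    """Replay a scenario through the solver under test and compare
    the result against the known-good baseline."""
    error_m = solver(scenario.name)
    return abs(error_m - scenario.expected_fix_m) <= scenario.tolerance_m

# Toy solver standing in for the receiver software under test.
def fake_solver(name):
    return {"hillside_multipath": 0.02, "signal_reacquire": 0.05}[name]

library = [
    Scenario("hillside_multipath", 0.02, 0.01),
    Scenario("signal_reacquire", 0.10, 0.01),  # result has drifted from baseline
]

results = {s.name: run_scenario(s, fake_solver) for s in library}
print(results)  # a failing scenario flags a reintroduced defect
```

Running the whole library against each new release catches regressions of previously fixed field bugs before a bundle ships.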
  30. What Test Tools are Needed? <ul><li>A CM tool (Microsoft Team Foundation Server) was selected to help software development teams communicate and collaborate more effectively </li></ul><ul><ul><li>Streamlines the implementation of the CM process area across the organization </li></ul></ul><ul><ul><li>Enforces CM on both source code and documents via predefined rules </li></ul></ul><ul><ul><li>Provides a central repository for process documents, templates (Process Assets Library), and artifacts supporting the appraisal </li></ul></ul><ul><ul><li>Integrated CMMI methodology component for a “head start” in process definition and implementation </li></ul></ul><ul><ul><li>Integrated work item tracking and reporting (e.g. bug tracking) </li></ul></ul><ul><ul><li>Extensive suite of “out of the box” reports for tracking progress and metrics mining (e.g. defect phase containment, density) </li></ul></ul>
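One of the metrics named above, defect phase containment, can be sketched as a simple calculation: the fraction of defects detected no later than the phase that introduced them. The phase names and sample defect data below are illustrative, not mined from TFS:

```python
def phase_containment(defects, phases):
    """defects: list of (introduced_phase, detected_phase) pairs.
    Returns the fraction contained, i.e. detected no later than
    the phase that introduced them."""
    order = {p: i for i, p in enumerate(phases)}
    contained = sum(1 for intro, det in defects if order[det] <= order[intro])
    return contained / len(defects)

# Hypothetical life-cycle phases and defect records.
PHASES = ["requirements", "design", "code", "engineering_test", "field"]
sample = [
    ("code", "code"),                # caught in peer review: contained
    ("design", "engineering_test"),  # leaked two phases downstream
    ("code", "engineering_test"),    # leaked one phase downstream
    ("code", "code"),                # contained
]
print(f"containment: {phase_containment(sample, PHASES):.0%}")
```

Tracking this number per bundle release gives visibility into whether peer reviews and the SETG are catching defects earlier over time.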
  31. Did All of This Work Make a Difference?
  32. [Chart: defects reported per software bundle release; SSB = Summer SW Bundle, WSB = Winter SW Bundle; annotations mark when NavCom’s SETG came on-line and when the IVS test group was revamped]
  33. Customer Defect Metric Explanation <ul><li>The total reported defects (green column) are what the system integrator (IVS) reported back to NavCom </li></ul><ul><ul><li>Customer-reported defects are included in the green column </li></ul></ul><ul><li>More feature requests and bug fixes are included in the winter bundles, thus generating more defects </li></ul><ul><li>With each bundle release comes increased complexity from newly implemented GPS features and capabilities </li></ul><ul><li>A major technology upgrade to increase both GPS accuracy and reliability was made in the Winter Software Bundle 07 </li></ul>
  34. Bottom Line <ul><li>The drop in customer-reported bugs between the Winter and Summer bundle 06 is due to the activation of the SETG and the early adoption of peer reviews </li></ul><ul><ul><li>The SETG is able to detect complex navigation-related bugs via simulation </li></ul></ul><ul><li>The dramatic increase in defect detection by IVS in SSB06 was due to the addition of testing resources and training by NavCom on effective GPS testing techniques </li></ul><ul><li>Although NavCom’s defect metric database is still immature, indirect evidence indicates that NavCom’s cultural change has started and defect detection and correction are getting better </li></ul><ul><li>NavCom’s cultural change is rippling over to IVS, causing them also to get better at detecting defects and reporting back accurate test results </li></ul><ul><li>The farmer is benefiting from all of this: the rate of defects seen in the field dropped from 17.7% to 1.6% in two years </li></ul><ul><li>The result is a downward trend in total software defects released to the customer </li></ul>
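The 17.7% to 1.6% change above can be restated as a relative reduction. The arithmetic below uses only the two endpoints given in the slide; the per-bundle values in between are not given and are not invented here:

```python
# Field defect rate at the start and end of the two-year period (from the slide).
before, after = 17.7, 1.6  # percent of units with field-reported defects

# Relative reduction: how much of the original defect rate was eliminated.
relative_reduction = (before - after) / before
print(f"relative reduction over two years: {relative_reduction:.0%}")
```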
  35. Next Steps <ul><li>Align the process improvement 5-year plan to the newly created 5-year strategic business plan for NavCom </li></ul><ul><li>Based on lessons learned over the past two years, select the next group of process areas for future implementation </li></ul><ul><ul><li>Requirements Management </li></ul></ul><ul><ul><li>Technical Solution </li></ul></ul><ul><ul><li>Project Planning </li></ul></ul><ul><li>Continue refining the metrics database to give better visibility into defect detection effectiveness </li></ul><ul><li>Continue laying the infrastructure foundation to allow a seamless transition to Level 3 process areas </li></ul>