After an agile pilot, ETM decided to fully roll out the agile way of working. All projects have set up cross-functional teams, some including I&V. Stakeholders such as product managers, system managers, and project managers enable the teams to produce the right software. Support functions like CM, QA, CPI, managers, and process owners also use agile techniques to prioritize work and deliver value to the projects. So how does this impact process management? This presentation shows that processes are still important at ETM, but in a different way: agile integrated into enhanced Streamline development. The main goal is a culture in which every employee continuously improves our value stream and efficiency.
Color-coded agenda, so that the audience can see where we are.
Intro: problem statement, approach as SEI affiliate, overview of technologies used
Business Cases: defect slippage as target, overview of the BBN/Monte Carlo model
Quality Factors: details of the model, what influences quality
Pilot: survey to determine the improvement area (BBN), agile improvement (Monte Carlo)
Conclusions: is this approach useful, what did we learn?
Link between improvement and a business case: Investment/benefit
Why the current approach doesn’t work. What do we need to solve it?
Show quickly to let the audience know that this was a collaboration (maybe too much text?)
Position Fault Slip Through as the main indicator of quality, since it:
• Is accepted within Ericsson
• Indicates significant savings
• Can also be used to derive cost/time savings
First sheet of the Quality Factors part: zoom into the factors that determine quality. This is the BBN model.
Overview of the quality phases:
• Management factors
• Defect insertion
• Defect detection
Background: this model was made to investigate quality improvement at Ericsson R&D The Netherlands. It includes the quality factors that most probably have an impact on quality at that development site. It is by no means intended to be a complete model, and different factors may be applicable for other companies.
Zoom into the 4 areas of management factors. Briefly explain why the influence of strategic and operational line management is indirect (via project/process).
Briefly show the main phases in software development where defects are potentially introduced.
Show the main phases where defects are detected. Explain that the number of defects left is input for the defect slippage (the main indicator of quality).
Show some detail on how the quality phase performance is built up, with code inspection as an example.
How the assessment was done:
• Survey based
• SEI survey tool
• Questions reviewed with the SEI
• Representation of the complete R&D flow
• 2 axes: relevance & performance
First sheet of the pilot part of the presentation. Explain the 2 steps:
1. Assessment to determine potential areas of improvement (BBN)
2. Actual improvement (agile requirements); calculate & validate results
Very low number of requirement defects, of which:
• Some concerned not-yet-released functionality (implemented later)
• Some were due to changes in platform software and could not be prevented
• Only 1 defect could have been prevented in the planning game
Mention the technologies used, and that the project was run in a Six Sigma way.
Animated sheet! BBN technology is used to model the quality phase performance, based upon quality factors. This is used to model the target that we focus upon: defect slippage. Historical and industry data have been used to quantify the relationship between quality phase performance and fault slippage. The BBN models current performance. Based upon expert opinion, using Monte Carlo, the quality phase performance after the improvement is calculated. This is fed into the BBN to calculate the impact on defect slippage, and the potential savings.
Explain how the quality phase performance is linked to defect slippage, through the use of industry and historical data. This will probably trigger questions; propose to discuss them after the presentation (BoF or …)?
Problem Statement (Introduction)
Quality improvement needed in many organizations
Business case:
• Identification of problem areas
• Selected improvement
• Decision
Quantified:
• Costs & benefits
• Lead time to result
Agile Requirements, Agile Consortium Benelux, Sep 30, 2009, (C) Ben Linders
Quantification problems (Introduction)
• Much time needed to gather data
• Difficult to measure things
• Hard to keep management commitment
• Expensive
Required: a business case, with limited but sufficient measurement effort, to gain management commitment and funding
Affiliate Collaboration (Introduction)
SEI, Pittsburgh, PA: Software Engineering Measurement & Analysis Group
Ericsson Netherlands: Market Unit Northern Europe & Main R&D Center
The Software Engineering Institute Affiliate Program provides sponsoring organizations with an opportunity to contribute their best ideas and people to a uniquely collaborative peer group who combine their technical knowledge and experience to help define superior software engineering practices.
Affiliates: http://www.sei.cmu.edu/collaborating/affiliates/affiliates.html
Two models (Introduction)
Defect Estimation Model:
• Data, tuned with expert opinion
• Defect flow: resident defects in design base → design process (defects inserted in documentation and code; measured as defect density; influenced by competence, skills, tools, environment) → test process (defects detected in inspection and test; measured as detection rate, fault slip through, defect classification; influenced by competence, skills, test capacity, tools, environment) → resident defects in delivered product → (un)happy customers
Quality Factor Model:
• Expert opinion, extended with data
• Quick Quality Scan
• Prediction of Fault Slip Through
• Improvement areas
(Diagram legend: process, inputs and outputs, influencing factors, defect level measurement)
Measuring quality (Business Cases)
• Insertion: where are defects made? How to prevent them?
• Detection: where are defects found? Early/economic removal?
• Quality: how many defects are left in the product at release?
Process View (Business Cases)
Diagram of the defect flow: resident defects in design base → design process (defects inserted in documentation and code; defect density; influenced by competence, skills, tools, environment) → test process (defects detected in inspection and test; detection rate, fault slip through, defect classification; influenced by competence, skills, test capacity, tools, environment) → resident defects in delivered product → (un)happy customers
Legend: process, inputs and outputs, influencing factors, defect level measurement
Fault Slip Through (Business Cases)
Diagram: FST drives lead time, cost, and quality.
Fault Slip Through = number of defects detected in integration & customer test that should have been detected earlier.
"Should" implies that the defect is more cost-effective to find earlier.
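The definition above can be sketched in code. A minimal Python sketch, assuming (hypothetically) that each defect record notes the phase where it was found and the phase where it would have been most cost-effective to find; the phase names and record layout are illustrative, not the actual Ericsson classification:

```python
# Hypothetical sketch of a Fault Slip Through count.
# Phase names, record fields, and data are illustrative only.

PHASES = ["inspection", "unit_test", "function_test",
          "integration_test", "customer_test"]
LATE_PHASES = {"integration_test", "customer_test"}

# Each defect records where it was found and where it should
# (most cost-effectively) have been found.
defects = [
    {"found_in": "integration_test", "should_have_been": "inspection"},
    {"found_in": "integration_test", "should_have_been": "integration_test"},
    {"found_in": "customer_test",    "should_have_been": "unit_test"},
    {"found_in": "unit_test",        "should_have_been": "unit_test"},
]

def fault_slip_through(defects):
    """Count defects detected late that belong to an earlier phase."""
    return sum(
        1 for d in defects
        if d["found_in"] in LATE_PHASES
        and PHASES.index(d["should_have_been"]) < PHASES.index(d["found_in"])
    )

print(fault_slip_through(defects))  # 2 of the 4 defects slipped through
```

A defect found in integration test that also belongs there does not count as slippage; only defects that should have been caught in an earlier, cheaper phase do.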
Quality performance assessment (Agile Req.)
Survey based upon quality factors:
• 34 respondents from management & technical roles
• 4 management areas & 7 technical areas
2 sub-questions for each quality factor:
• How relevant is the factor when we want to improve quality? "little if any," "moderate," "substantial," or "extensive"
• How well are we doing currently? "poor," "fair," "good," or "excellent"
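The two answer scales can be combined into a simple gap score to rank factors: a factor is an improvement candidate when respondents rate it highly relevant but poorly performed. The scoring below is an illustrative sketch with invented answers, not the actual SEI survey tool's method:

```python
# Illustrative gap scoring of survey answers (relevance vs. performance).
# Scales follow the slide; the gap formula and data are made up.

RELEVANCE = {"little if any": 1, "moderate": 2, "substantial": 3, "extensive": 4}
PERFORMANCE = {"poor": 1, "fair": 2, "good": 3, "excellent": 4}

answers = {  # quality factor -> list of (relevance, performance) responses
    "requirements stability": [("extensive", "poor"), ("substantial", "fair")],
    "tooling":                [("moderate", "good"), ("moderate", "excellent")],
}

def gap_score(responses):
    """Average relevance minus average performance: higher = bigger gap."""
    rel = sum(RELEVANCE[r] for r, _ in responses) / len(responses)
    perf = sum(PERFORMANCE[p] for _, p in responses) / len(responses)
    return rel - perf

ranked = sorted(answers, key=lambda f: gap_score(answers[f]), reverse=True)
print(ranked[0])  # factor with the largest relevance/performance gap
```

With these invented answers, "requirements stability" tops the ranking, matching the improvement area the pilot selected.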
Pilot "Business Case for Quality" (Agile Req.)
Context:
• Process management
• Quality steering
• Starting with agile
Pilot: Agile for Requirements
• Calculate value of process change
• Run the pilot
• Evaluate the result
Improve: Requirements Stability (Agile Req.)
Requirements Stability – the inverse of the amount of requirement changes over time. (The fewer changes, the higher the stability.)
Agile deployment:
• Backlog with prioritized user stories
• Product manager as Product Owner
• (Pre-)planning game
• Architecture team
• Stand-up meetings
Improve: Scope Stability (Agile Req.)
Scope Stability – the impact of major changes in projects that are related to changes in the product roadmap, including stability of the products to be developed, development teams involved in projects, and major changes in project funding or delivery dates.
Agile deployment:
• Backlog
• Responsibility of agile teams and Product Owner
• (Pre-)planning game
• Retrospectives
Improve: Requirements Definition Capability (Agile Req.)
Requirements Definition Capability – the skill and experience level of the people doing requirements definition (e.g., product managers).
Agile deployment:
• (Pre-)planning game
• Stand-up meetings
• Collaborative culture
• Retrospectives
Steering Agile Quality (Agile Req.)
• Estimate latent defects after demo (planning game)
• Collect defects during test (after demo)
• Classify defects: "introduction phase" and "should have been detected phase"
• Root cause analysis: prevention
• Decide improvement actions and communicate
• Re-estimate and predict release quality
Results Agile for Requirements (Agile Req.)
• Very low number of requirement defects
• Previous projects also had a low number
• Based upon the data, no conclusion could be drawn
Root cause analysis:
• Understanding of requirements increased: planning game & stand-up meetings
• Improvements from retrospectives increased cooperation between development team and product owner
Requirements quality performance increased!
Conclusions (Conclusions)
Quicker business case:
• Quality factors/performance
• Fault Slip Through
• Combining data and expert opinion
Improved requirements performance:
• Agile increased requirements quality
• Fewer defects after release
• Increased flexibility and collaboration
More information (Conclusions)
Publications:
• Building Process Improvement Business Cases, SEI Technical Note: http://www.sei.cmu.edu/library/abstracts/reports/09tn017.cfm
• Controlling Project Performance by Using the Project Defect Model, in proceedings of the PSQT West Conference, 2005
• The Business Benefit of Root Cause Analysis, in proceedings of the SM/ASM conference, 2003
• SPI, the agile way! To be presented at the SPIder conference, October 2009: www.spiderconferentie.nl
Contact:
• Email: email@example.com
• http://www.linkedin.com/in/benlinders
Solution (Introduction)
Technologies:
• Bayesian Belief Networks (BBN)
• Monte Carlo simulation
• Root cause analysis
• Cost of quality, defect slippage
Six Sigma DMAIC approach:
• Modeling business cases
• Research quality factors & quantify quality improvement
• Validate the "Business Case for Quality"
Building a business case (Business Cases)
BBN: quality factors → quality phase performance → Fault Slip Through, quantified with historical project data and industry data
Monte Carlo: current quality phase performance → improved quality phase performance, based on subjective expert opinion
Bayesian Belief Network (BBN) (Business Cases)
• Probabilistic graphical model, to model uncertainty
• Diagnose and explain why an outcome happened
• Predict outcomes based on insight into one or more factors
Used for:
• Modeling quality factors
• Predicting quality phase performance
• What-if scenarios
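To illustrate how such a network predicts an outcome, here is a minimal hand-rolled sketch: one quality factor influencing quality-phase performance, which in turn influences fault slippage. All probabilities are invented for the example; the real model has many more factors and states:

```python
# Minimal Bayesian-network sketch (enumeration, no library).
# All probabilities are made up for illustration.

# Prior: is the quality factor (e.g. requirements stability) "good"?
p_factor_good = 0.6

# CPT: P(phase performance high | factor good?)
p_perf_high = {True: 0.8, False: 0.3}

# CPT: P(a fault slips through | phase performance high?)
p_slip = {True: 0.1, False: 0.4}

# Predict: marginal P(slip), summing over the unobserved states.
p = 0.0
for factor in (True, False):
    pf = p_factor_good if factor else 1 - p_factor_good
    for perf in (True, False):
        pp = p_perf_high[factor] if perf else 1 - p_perf_high[factor]
        p += pf * pp * p_slip[perf]

# What-if scenario: observe the factor as good, recompute P(slip).
p_whatif = sum(
    (p_perf_high[True] if perf else 1 - p_perf_high[True]) * p_slip[perf]
    for perf in (True, False))

print(round(p, 3), "->", round(p_whatif, 3))  # 0.22 -> 0.16
```

The what-if step is the mechanism the slides describe: fixing a factor's state (e.g. after an improvement) and recomputing the downstream probability of slippage.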
Monte Carlo Simulation (Business Cases)
• Compute a result based on random sampling
• Modeling distributions of data
• Can make uncertainty visible
Used for:
• Calculating the value of process changes
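As a sketch of the technique: propagate uncertainty in two inputs (defects inserted, early detection rate) into a distribution of defect slippage. The triangular distributions and their parameters are illustrative, not project data:

```python
# Monte Carlo sketch: uncertainty in inputs -> distribution of slippage.
# Distribution parameters are invented for illustration.
import random

random.seed(42)  # reproducible runs

N = 100_000
slipped = []
for _ in range(N):
    inserted = random.triangular(80, 160, 120)        # defects inserted
    detect_rate = random.triangular(0.6, 0.95, 0.85)  # fraction found early
    slipped.append(inserted * (1 - detect_rate))

slipped.sort()
mean = sum(slipped) / N
p90 = slipped[int(0.9 * N)]
print(f"mean slippage ~ {mean:.1f}, 90th percentile ~ {p90:.1f}")
```

The point of the simulation is the spread, not the mean: the 90th percentile makes the downside risk visible, which a single-point estimate hides.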
Quality Prediction (Business Cases)
Current model: estimation
• Extrapolate past performance
• Based on inserted/detected defects
• Plan & track
Wanted: prediction
• Causes of defects
• What-if scenarios
• Decision making
"All models are wrong, some models are useful" – George Box
Step 2: Defect Prediction (Business Cases)
Fault Slip Through: a defect found in a (later) test phase that should have been found earlier. "Should": more cost-effective (economical) to find earlier.
Predict defect reduction:
• Determine process impact
• Simulate quality change
• Predict savings
Pilots:
• Agile
• Model Driven Development
Quantify Quality Improvement (Quality Factors)
Connect defect data with quality performance:
• Maximum quality factor => industry best in class, using published industry data from various sources
• Distribution: linear (keep it simple)
Extend the BBN to calculate remaining defects after each phase.
Result: model for what-if scenarios
• Calculate defects in released products when quality performance improves
• Cost-of-quality data to calculate savings
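The "remaining defects after each phase" calculation can be sketched as a simple chain: each phase removes a fraction of the defects entering it, and a what-if run changes one phase's effectiveness. The phase list and all numbers below are invented for illustration:

```python
# Illustrative what-if: remaining defects after each quality phase,
# given per-phase detection effectiveness. All numbers are made up.

def remaining_after_phases(inserted, effectiveness):
    """Apply each phase's detection effectiveness in order."""
    remaining = inserted
    for phase, eff in effectiveness:
        remaining *= (1 - eff)
        print(f"after {phase}: {remaining:.1f} defects remain")
    return remaining

phases_now = [("inspection", 0.40), ("unit test", 0.30),
              ("function test", 0.50)]
# What-if: a requirements improvement raises inspection effectiveness.
phases_improved = [("inspection", 0.55), ("unit test", 0.30),
                   ("function test", 0.50)]

r_now = remaining_after_phases(100, phases_now)
r_improved = remaining_after_phases(100, phases_improved)
print(f"release defects: {r_now:.1f} -> {r_improved:.1f}")
```

The difference between the two runs, multiplied by a cost-of-quality figure per escaped defect, is what turns the quality change into a business-case number.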
Monte Carlo: Quality performance (Quality Factors)
Monte Carlo simulation:
• Input from 5 experts
• Estimated chance of occurrence and impact on FST (1-5 scale)
• Simulation done to calculate impact on quality factors
• Result used in BBN model to calculate effect on defect slippage
Expected result:
• Reduced number of requirement defects introduced
• Increased effectiveness of late testing phases
• Fewer defects in products shipped to customers
• Cost saving:
  – Limited saving in the project
  – Major saving during maintenance
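The expert-opinion step above can be sketched as follows: five experts each give a chance-of-occurrence and an impact-on-FST estimate on a 1-5 scale, and Monte Carlo turns these into an expected FST reduction. The expert values, the mapping from the 1-5 impact scale to a percentage, and the sampling scheme are all illustrative assumptions, not the actual model:

```python
# Sketch of Monte Carlo over expert opinions. Expert values and the
# impact-to-percentage mapping are invented for illustration.
import random

random.seed(1)

# (chance of occurrence 1-5, impact on FST 1-5) per expert
experts = [(4, 3), (3, 4), (5, 2), (4, 4), (2, 5)]

def simulate(experts, runs=50_000):
    """Expected FST reduction across sampled expert opinions."""
    total = 0.0
    for _ in range(runs):
        chance, impact = random.choice(experts)  # sample one opinion
        if random.random() < chance / 5:         # does the change take effect?
            total += impact / 5 * 0.25           # map impact to 0-25% reduction
    return total / runs

print(f"expected FST reduction ~ {simulate(experts):.1%}")
```

The resulting reduction distribution would then be fed into the BBN (as on the previous slides) to translate improved quality phase performance into defect slippage and savings.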