Feedback-focussed process improvement (2006)

EuroSTAR conference Dec 2006, Manchester. #3 in what I pretentiously call my Goldratt Trilogy.


  1. Feedback-Focussed Process Improvement — Neil Thompson. 14th European Conference on Software Testing, Analysis & Review, 4–7 December 2006: Manchester, England, UK. Track Session T6.
  2. Can Process Improvement for Information Systems learn from these? [Images: Toyota Prius; Toyota Celica GT4]
  3. Contents of presentation:
     1. Traditional process improvement in STAR
     2. "New" methods in manufacturing: how Toyota has been so successful; comparison with Goldratt's Theory of Constraints
     3. How that new paradigm translates into IT/IS: agile methods, and lessons for process improvement in general…
     4. Example: extending TPI® (using Strengths, Weaknesses, Opportunities & Threats)
     5. Thinking tools & feedback loops
     6. Tipping points
     7. Other ways to apply the above (eg TMM℠, TOM™, DIY)
     8. Integrating Goldratt-Dettmer Thinking Tools into Systems Thinking
     9. Implications for Time-Cost-Quality-Scope
  4. "Traditional" Process Improvement in Information Systems review & testing [diagram comparing models, with some medical analogies]: Test Maturity Model℠ (maturity levels), Test Organisation Maturity™, Test Process Improvement® (predefined subject areas).
     Sources: TMM℠ – http://www.stsc.hill.af.mil/crosstalk/1996/09/developi.asp; TPI® – based on http://www.sogeti.nl/images/TPI_Scoring_Tool_v1_98_tcm6-30254.xls; TOM™ – based on http://www.evolutif.co.uk/tom/tom200.pdf, as interpreted by Reid, S. Test Process Improvement – An Empirical Study (EuroSTAR 2003)
  5. How Toyota progressed through Quality to Global dominance, and now Innovation:
     • Quality (top reputation): has dominated the JD Power satisfaction survey for a decade; Toyota Production System (TPS): 14 principles across Philosophy, Problem-Solving, Process, and People & Partners
     • Global dominance: market value greater than GM, Chrysler & Ford combined; on track to become (2006) the world's largest-volume car manufacturer
     • Innovation (fast): Lexus invaded the "quality" market and won; Prius was not evolutionary but revolutionary – launched 2 months early and sold above expectations; Toyota Product Development System (TPDS): 13 (!) principles across Process, People, and Tools & Technology
  6. Toyota's TPS & TPDS merged with a Balanced Score-Card view [diagram]. The TPS principles are grouped as: PHILOSOPHY (Long-Term) – 1. …even at short-term expense; PROCESS (Eliminate Waste) – 2. Continuous process flow to surface problems, 3. Pull to avoid over-production, 4. Level workload, 5. Stop-to-fix: "right first time", 6. Standardised tasks for continuous improvement & empowerment, 7. Visual control to see problems, 8. Reliable tested technology that serves Process & People; PEOPLE & PARTNERS (Respect, Challenge & Grow) – 9. Leaders, 10. People & teams (functional expertise & cross-functional integration, build a learning culture), 11a. Partners, 11b. Suppliers; PROBLEM-SOLVING (Continuous Learning & Improvement) – 12. See for yourself to thoroughly understand, 14. Learning organisation via reflection & improvement. These map onto Financial, Process, User and Product quality.
  7. The "new" paradigm in manufacturing: value flow, pull not push, problem-solving [diagram pairing Toyota practices with Goldratt equivalents]. Toyota: customer-defined value (to separate value-added from waste); 3. pull to avoid over-production; 4. level workload; Takt (rhythm); low inventory ("lean"); Just-In-Time; minimise waste; 2. continuous process flow to surface problems; Andon (stop and fix); 7. visual control to see problems; Kanban cards; 12. see for yourself to thoroughly understand; chain of 5 "why"s; 13. decide slowly (all options) by consensus; 14. learning organisation via reflection & improvement; Plan-Do-Check-Act. Goldratt: Drum-Buffer-Rope; maximise throughput; Critical Chain management; tagging slow movers; monitoring buffers; one-page metrics; cause-effect trees; conflict resolution diagrams; identify the constraint, "elevate" & iterate.
     • These principles have now been successfully applied beyond actual manufacturing, into product development
     • But what about development of software?…
  8. The new paradigm in IS development: agile methods
     • Alistair Cockburn: increasing feedback & communication reduces the need for intermediate deliverables; efficiency is expendable in non-bottleneck activities
     • Mary & Tom Poppendieck: map the value stream to eliminate waste; Critical Chain project management; decide as late as possible
     • David J Anderson: throughput of value through the stages of specification, development & test; "Stratagrams" (my term)
     • But whether or not we are using agile methods, we can use new-paradigm principles to improve processes…
     Sources: Cockburn, A. Agile software development (Addison-Wesley Pearson Education 2002); Poppendieck, M&T. Lean software development (Addison-Wesley 2003); Anderson, David J. Agile management for software engineering (Prentice Hall 2004)
  9. Extending the new paradigm to "STAR": by rearranging TPI's key areas (1. Test strategy, 2. Lifecycle model, 3. Moment of involvement, 4. Estimating & planning, 5. Test specification techniques, 6. Static techniques, 7. Metrics, 8. Test automation, 9. Test environment, 10. Office environment, 11. Commitment & motivation, 12. Test functions & training, 13. Scope of methodology, 14. Communication, 15. Reporting, 16. Defect management, 17. Testware management, 18. Test process management, 19. Evaluation, 20. Low-level testing) …we can begin to see cause-effect trees…
  10. Cause-effect trees: can start with TPI's inbuilt dependencies [diagram linking the 20 key areas via the level-A (and some level-B) requirements each depends on] – eg for getting to at least level A throughout (slightly simplified).
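Slide 10's idea – that TPI's inbuilt dependencies between key areas already form cause-effect trees – can be prototyped directly. A minimal sketch, assuming Python 3.9+ for graphlib and an invented, much-simplified subset of dependencies (not the official TPI dependency matrix); it derives an order in which to raise key areas to level A:

```python
# Sketch only: a hypothetical, simplified subset of TPI key-area dependencies.
# Each entry reads "to raise this key area to level A, these areas should be
# at level A first". Real TPI dependencies would replace these guesses.
from graphlib import TopologicalSorter  # Python 3.9+

depends_on = {
    "4. Estimating & planning":  {"1. Test strategy"},
    "5. Test spec techniques":   {"4. Estimating & planning"},
    "7. Metrics":                {"5. Test spec techniques", "16. Defect management"},
    "1. Test strategy":          {"3. Moment of involvement"},
    "16. Defect management":     {"14. Communication"},
    "3. Moment of involvement":  set(),
    "14. Communication":         set(),
}

# static_order() yields a valid "improve this before that" sequence,
# provided the dependencies really do form a tree/DAG (no loops yet).
print(list(TopologicalSorter(depends_on).static_order()))
```

With a real dependency matrix substituted in, the same ordering shows which key areas to tackle first; the later slides show what happens when the graph stops being a tree.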
  11. Can add extra "key areas", lifecycle inputs & outputs, and general categories [diagram]. Around the TPI key areas sit: Inputs & influences on STAR; Techniques in general (eg + Risk-Based STAR); Lifecycle in general; Organisation in general; Infrastructure in general (eg + Test data); and Outputs from STAR – eg TPI / TMap's four "cornerstones".
  12. Can go beyond the fixed questions: SWOT each subject area [two example grids, one for "Inputs & influences on STAR" and one for "4. Estimating & planning"; solid borders denote items as in TPI, dashed borders denote additional items]. Example entries across the Strengths, Weaknesses, Opportunities & Threats quadrants: monitored, and adjustments made if needed; some managers are considering agile methods; business analysts may be motivated by UML training; too busy for well-considered estimating & planning; not substantiated, just "we did it as in the previous project"; system requirements are agreed too late; system specs & designs are defective, just timeboxed; system specs are heavy text documents; release dates are fixed; the most experienced business analysts are leaving, and more may follow; the squeeze on testing is likely to worsen; can't recruit more staff. (Small Post-it® notes are good for this.)
  13. Some cause-effect trees may "re-root" to form loops [diagram; solid borders denote items as in TPI, dashed borders additional, unbordered items added after considering the SWOT]. Example chain: too busy for well-considered estimating & planning (and also too busy to do Risk-Based STAR) → system specs & designs are defective, just timeboxed ("if they can timebox, so can we") and remain heavy text documents, whose text-only format makes defect detection less likely → testing starts later than when the test basis is complete (the culture of our testers is to prefer large text documents to diagrams) → live systems are buggy → key testers are still involved in "firefighting" the previous release, spend much time helping diagnose and then retest, and management demand inquests so even more time is "wasted" (or is that an opportunity for causal analysis?) → work takes longer than "planned", with various adverse effects and knock-ons to other areas → everyone is even busier, closing the loop.
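Once the SWOT-derived causes and effects are added, the structure may no longer be a tree at all, which is exactly what slide 13 shows. A small sketch (the links are paraphrased and heavily simplified from the slide, and the graph shape is an assumption) that reports whether the cause-effect graph now contains a loop:

```python
# Hypothetical cause-effect links, loosely paraphrasing slide 13.
effects = {  # cause -> list of effects it contributes to
    "Too busy for well-considered estimating & planning":
        ["Specs & designs are defective, just timeboxed"],
    "Specs & designs are defective, just timeboxed":
        ["Work takes longer than planned"],
    "Work takes longer than planned":
        ["Too busy for well-considered estimating & planning"],  # closes the loop
}

def find_loop(graph):
    """Return one cause-effect cycle (list of nodes) if any, else None."""
    def dfs(node, path, seen):
        if node in path:                      # revisited a node on the current path
            return path[path.index(node):]    # the loop
        if node in seen:
            return None
        seen.add(node)
        for nxt in graph.get(node, []):
            loop = dfs(nxt, path + [node], seen)
            if loop:
                return loop
        return None
    seen = set()
    for start in graph:
        loop = dfs(start, [], seen)
        if loop:
            return loop
    return None

print(find_loop(effects))
```

Finding such a loop matters because, as the later slides argue, loops are where a single change can tip the whole system.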
  14. A second example: Test Design [cause-effect diagram]. Too busy "firefighting", so no time to attend courses; not trained in formal test techniques (and formal techniques are difficult even after training); "informal" techniques are not really defined; so we don't know how to define & use test coverage, are unable to tabulate coverage for new tests, and have no time to tabulate coverage from existing scripts. Not knowing actual coverage, we write numerous tests "to be safe", so tests probably include low-value conditions & cases, with omissions and overlaps. Test specs become large & "texty", difficult to review and difficult to select regression tests from; when execution time runs short we are unsure which tests are safest to omit; coverage omissions & overlaps become more likely to escape detection and rectification; test execution takes a long time, we add even more tests, and too many failures reach Live – feeding the firefighting. Test Design does appear in TPI, TMM & TOM, but does this loop indicate it deserves more prominence?
  15. New paradigm problem-solving: the Goldratt-Dettmer* "Thinking Tools" [diagram of five linked trees]. What to change: (1) Current Reality Tree – undesirable effects trace back through intermediate effects to root causes and a core problem. What to change to: (2) Conflict Resolution – objective, requirements, prerequisites, conflicts and injections; (3) Future Reality Tree – injections lead through intermediate effects to desired effects. How to change: (4) Prerequisite Tree – objective, obstacles, intermediate objectives; (5) Transition Tree – specific actions plus needs produce intermediate effects leading to the objective. (* very slightly paraphrased here)
     Sources: Dettmer, W. Goldratt's Theory of Constraints (ASQ 1997); Thompson, N. "Best Practices" & Context-Driven – building a bridge (StarEast 2003)
  16. Applying the Thinking Tools to information from the SWOT analysis (using extracts from both the 1st & 2nd examples) [diagram]. Current reality: the culture of our testers is to prefer large text documents to diagrams; the SDLC method does not encourage diagrams; system specs are heavy text documents; test specs are large & "texty"; test coverage omissions & overlaps; too many failures in Live. Use Strengths to help amplify Opportunities (some managers are considering agile methods; business analysts may be motivated by UML training; coverage can still be improved with informal techniques) and use Threats to help identify obstacles, anticipating & overcoming them. Resulting action planning: TACTICAL – address culture by worked examples of diagrams; include tables & diagrams in test specifications; ONGOING – techniques training & coaching; STRATEGIC (80/20) – improve the SDLC method. The SWOT method can be "nested", eg aggregated up from individual subject areas to the whole lifecycle.
  17. A third example: defect fixing & retesting (source: Ennis, M. Managing the end game of a software project, StarWest 2000 – "understand the relationship between metrics") [diagram]. Five quantities: code turmoil, test completion slowness, defect detection rate, defect backlog, test failure rate. If code fixes are good (fixing faults), fewer defects mean fewer failing tests, fewer fixes needed, the backlog can reduce and there is less slowness. Reverse two labels to make all five quantities bad things: code turmoil plus too many bad fixes and knock-on faults mean more defects; failing tests need fixes and add to the backlog; failing tests slow completion; tests fail if they expose defects, whether bad or good. The shape of these metrics, changing over time, could show us whether we have a feedback loop – then the "risk spider" might be renamed a "correlation amoeba".
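The "correlation amoeba" suggestion can be tried with nothing more than the standard library: correlate the five quantities week by week and see whether they all move together, which would be one hint that a reinforcing loop is operating. A minimal sketch, assuming Python 3.10+ for statistics.correlation; the weekly figures are invented for illustration, not taken from the talk or from Ennis:

```python
# Correlate the five "bad things" from slide 17 over time. Strong positive
# correlations everywhere would be one hint that a reinforcing loop is at work.
from itertools import combinations
from statistics import correlation  # Pearson's r, Python 3.10+

weekly = {  # hypothetical numbers, one value per week
    "code turmoil (files changed)":  [40, 55, 70, 85, 100, 120],
    "test completion slowness (%)":  [10, 15, 25, 30, 45, 60],
    "defect detection rate":         [12, 18, 22, 30, 35, 41],
    "defect backlog":                [20, 28, 40, 55, 75, 98],
    "test failure rate (%)":         [8, 11, 15, 19, 26, 33],
}

for (name_a, a), (name_b, b) in combinations(weekly.items(), 2):
    print(f"{name_a} vs {name_b}: r = {correlation(a, b):.2f}")
```

A real analysis would of course use the project's own weekly metrics and look at how the correlations change over time, which is what the slide's "shape, changing over time" refers to.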
  18. Tipping Points (source: Gladwell, M. The tipping point, Abacus 2000)
     • Reversing the vicious loop into a virtuous loop: the example on the previous slide required only one relationship change – ie the effect of code turmoil on defect detection rate tipping from positive to negative
     • Examples of Tipping Points: removing New York graffiti reduced the serious crime rate; the Stanford "Prison" Experiment's spiral of vindictiveness
     • Achieving a Tipping Point involves: concentrating resources on a few key areas; a way to make a lot out of a little; fresh thinking, sometimes counterintuitive
     Note the similarities with Goldratt's Theory of Constraints, and more generally the Pareto principle (80-20).
  19. Fourth example: Documentation
     • Documentation can also become a vicious loop: documentation tends to get out of date; "do you want us to fix the documentation or deliver software?"; it can then become unclear whether tests passed or failed – what is the system meant to do?; no-one reads the documentation any more, so defects in it are no longer detected; so it gets even further from reality…
  20. Documentation and the flow of value [diagram]. Levels of documentation, pushed by specifiers: Requirements + Functional Spec + Technical Design + Unit/Component specifications, plus Test Specifications. Flow of fully-working software, pulled by customer demand: Unit/Component-tested → Integrated → System-tested → Accepted.
  21. Difficulties in problem-solving: conflict resolution (eg for documentation) [conflict-resolution diagram]. The conflict: "we need more documentation" ("if it's not written, it can't be signed off"; "test reports need to be formal documents"; "reviews are powerful at finding defects early, but it's difficult to review just speech"; documentation is still needed for maintenance after go-live – "they will read it when their memories have faded or when there's a contract dispute"; "are test analysts writing tests for others to run?"; "will the live system be maintained by its developers?") versus "we need less documentation" ("people never read documentation"; "signed-off requirements are counterproductive to systems meeting real user needs now"; "documented test plans are counterproductive to the best testing"; specifications are like inventory, with no end value). Injections to resolve it: the objectives of documentation are to help build & maintain a fit-for-purpose system by knowing & agreeing what was built and what was tested; agree in a workshop what is needed; ask what documentation is needed for contractual reasons, and whether there is still time to negotiate; sign-off can be by agreed meeting outcomes; are there few enough people to make frequent widespread meetings practical, and can users be on-site with the project throughout?; documentation doesn't have to be paper – use wikis etc; distinguish necessary from unnecessary documentation, and quality from quantity; make maximum use of tables & diagrams; exploratory & scripted testing can be mixed.
     Developed further to Daich, GT. Software documentation superstitions (StarEast 2002). See also Rüping, A. Agile documentation (Wiley 2003).
  22. Resolving the "conflict" between agile and outsourcing [conflict-resolution diagram]. One side: "low cost is the most important factor for us", so "outsourcing / offshoring is the way to go". The other: "fast delivery is the most important factor for us", so "agile is the way to go". Questions for resolving it: how will we communicate across inter-company / international boundaries?; how much documentation will we need to control the project & products?; what are the regulatory requirements?; how confident are we in knowing the requirements?; does the time-cost-scope-quality/risk pyramid always apply? (No!). The objectives of a methodology are to help build and maintain a system to an appropriate time-cost-scope-quality/risk balance etc.
     Inspired by Guckenheimer, S. with Perez, JJ. Software engineering with Microsoft Visual Studio Team System (Addison-Wesley Pearson Education 2006)
  23. Can use these principles to focus TPI, TMM, TOM etc…
     • TPI (as in earlier slides): use "off-question" information directly via SWOT; use relationships between key areas to identify causes & effects, loops, and hence the biggest-payback improvements
     • TMM: choose between staged & continuous; consider moving selected items between levels, up or down
     • TOM: the "low = 1" scores are weaknesses (but you may be able to think of others); similarly the "high = 5" scores are strengths; look for additional symptoms (via SWOT)
     • Generally: look for Tipping Point improvements from among the many candidates; seek Tipping Point insights into the change-management challenges (Connectors, Mavens & Sellers)
  24. …or invent your own methodology… [diagram of practices feeding "Effective STAR", modified after Thompson, N. "Best Practices" & Context-Driven – building a bridge (StarEast 2003)]. Practices include: build lessons learned into checklists; improve the efficiency of STAR; risk management & quality management (insurance & assurance); decide process targets; assess where errors were originally made & improve over time; be pragmatic over quality targets; plan early, then rehearse-run acceptance tests; define & use metrics; use handover & acceptance criteria; give confidence (AT) and define & detect errors (UT, IT, ST); V-model (what to test against) and W-model (quality management); use independent system & acceptance testers; list & evaluate risks, tailoring risks & priorities etc to factors; use an appropriate skills mix; use Risk-Based STAR & analysis; do reviews; refine test specifications progressively (plan based on priorities & constraints, design flexible tests to fit); define & agree roles & responsibilities; use appropriate techniques & patterns; define & measure test coverage; allow appropriate script format(s); use synthetic + lifelike data; use appropriate tools; allow & assess for coverage changes; document execution & management procedures; distinguish problems from change requests; measure progress & problem significance; prioritise urgency & importance; quantify residual risks & confidence; distinguish retesting from regression testing.
  25. …or just do simple lifecycle-focussed process improvement [diagram mapping the TPI key areas onto "7 Habits of Adequately Effective STAR-ers"]: (a) manage STAR end-to-end (whole systems, whole lifecycle, architecture as part of QM/QA); (b) use Risk-Based STAR; (c) use structured reviews (involving roles & checklists); (d) manage positive coverage of tests; (e) use formal techniques; (f) use metrics and act accordingly; (g) use tools appropriately.
  26. Process Improvement is itself a reinforcing feedback loop… The negative loop of stagnation: "no point in these improvements because we're not responsible end-to-end"; "no point in writing checklists because people know they're just wishful thinking"; "can't do structured reviews because we're not trained in risk analysis"; "can't do test coverage because we're not trained in formal techniques". The positive loop: "now we've got end-to-end responsibility, these improvements will be even more effective"; "now we've got root cause analysis, we can make our checklists even better"; "now we're trained in risk analysis, we can do structured reviews even better"; "now we're trained in formal techniques, we can do test coverage even better". …but first we may need to reverse the negative loop of process stagnation (through its Tipping Point).
  27. Systems thinking: a more holistic view, eg people issues [diagram]. Long working hours and "stretch goals" get the project "delivered" on time with overall "adequate" work and a tolerable (or too high?) degree of live failures, so management want even more work and more work is absorbed; rewards are money now (overtime) and money later (promotion prospects), reinforced by cultures such as "Pizza Hero", "Pizza Parasite", "Mad but Exploitable" and "must need a Time Management Course". Meanwhile fatigue brings more mistakes, defects, faults and failures in development & test, so more remedial work is needed; higher "efficiency" (per week) masks lower effectiveness (per hour) – so what, there are more hours: "zombie" working, abandoning healthy shopping, regular exercise & social activities, guilt and feelings of inadequacy, psychological and health damage, alternating illness & hard work, lower output (per year), expert or replaceable (or both?), give up! Is this a vicious or virtuous feedback loop?
  28. Systems Thinking notation (each author seems to vary; this is Neil Thompson's, incorporating elements of Gerald Weinberg's & Dennis Sherwood's) [diagram]. Variables such as duration of working hours, degree of live failures, health, capacity of staff, efficiency (per week), effectiveness (per hour), "coping" mechanisms for fatigue, overall acceptability of projects, quality targets, and peer-approval, management-approval, short-term and long-term financial rewards are connected into balancing and reinforcing loops. A loop is balancing if it contains an odd number of opposing links; else it is reinforcing.
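The balancing-versus-reinforcing rule at the end of this slide is mechanical enough to code. A small sketch of loop classification; the example links and their signs are the editor's illustrations (loosely based on the working-hours discussion on slide 27), not a transcription of the slide's diagram:

```python
# Slide 28's rule: a loop is balancing if it contains an odd number of
# opposing ("minus") links, otherwise it is reinforcing.
def classify(loop_links):
    """loop_links: list of (cause, effect, sign) tuples, where sign is '+' or '-'."""
    opposing = sum(1 for _, _, sign in loop_links if sign == "-")
    return "balancing" if opposing % 2 == 1 else "reinforcing"

# Illustrative loop (signs assumed): longer hours -> more fatigue ->
# lower effectiveness per hour -> even longer hours to compensate.
overwork_loop = [
    ("working hours", "fatigue", "+"),
    ("fatigue", "effectiveness per hour", "-"),
    ("effectiveness per hour", "working hours", "-"),
]
print(classify(overwork_loop))   # 2 opposing links (even) -> "reinforcing"

# Illustrative balancing loop (signs assumed): a growing defect backlog
# attracts management attention, which then reduces the backlog.
backlog_loop = [
    ("defect backlog", "management attention", "+"),
    ("management attention", "defect backlog", "-"),
]
print(classify(backlog_loop))    # 1 opposing link (odd) -> "balancing"
```

Applied to slide 30's Quality-Speed-Scope-Low-cost loop, which has an even number of opposing links, the same rule says it should be reinforcing – which is exactly why it could turn out either vicious or virtuous.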
  29. Connections between & beyond feedback loops [diagram]. The defect-fixing loop (code turmoil, too many bad fixes & knock-on faults, more defects, failing tests adding to the backlog and slowing completion) now has "dangles" attached: no dedicated configuration management team, so CM is not properly coordinated and mistakes are made in CM; we don't know whether problems are bad fixes or CM mistakes, and faults are difficult to diagnose; everyone is too busy to arrange a CM team or to produce concise overviews & distil expertise; system documentation is too "texty", so regression tests are difficult to select.
     • Goldratt's Theory of Constraints: sometimes cause-effect trees can form loops
     • Systems Thinking: loops can have "dangles"
     • So let's fit the two together…!
  30. What the new paradigm means for the traditional Time-Cost-Quality-Scope pyramid [diagram]. Old paradigm: the pyramid of Time, Cost, Scope and Quality/Risk trades each off against the others. Working through Current Reality Trees, a Conflict Resolution Diagram and Future Reality Trees leads to the new paradigm: a loop linking Quality, Speed, Scope and Low-cost – quality is inbuilt not imposed, there is less waste (eg rework), and working software is iterated; per The Mythical Man-Month, small teams can be surprisingly productive. Note: this loop has an even number of opposing links, so it should be reinforcing – but could it be vicious or virtuous?
     Inspired by Guckenheimer, S. with Perez, JJ. Software engineering with Microsoft Visual Studio Team System (Addison-Wesley Pearson Education 2006). Old pyramid reprised from Gerrard, P. & Thompson, N. Risk-Based Testing (EuroSTAR 2002).
  31. Summary
     • Traditional process improvement in testing may be improved by building in the principles which made Toyota (and others) global successes
     • This involves thinking beyond just STAR: we need to consider the whole lifecycle and quality regime
     • You can either fine-tune an existing method (eg TPI, TMM, TOM) or build your own
     • The principles are not proprietary and require no special training
     • Focussing on feedback loops is an example of the Pareto principle (80-20)
  32. Way forward & where to find out more
     • Try it for yourself: Strengths, Weaknesses, Opportunities & Threats on Post-it™ notes; draw pencil connectors for current & future causes & effects; move items around and look for loops; ask what it would take to balance or reverse vicious loops
     • Toyota: see slide 5
     • Agile methods: see slide 8
     • Selected reading on Goldratt: Goldratt himself wrote mostly "novels" (everyone quotes The Goal); William Dettmer wrote the first of two books on the Thinking Tools
     • For Systems Thinking: Gerald Weinberg (IT/IS context); Peter Senge (business management context); Dennis Sherwood (many examples of loops)
     • Practical application of some new-paradigm principles: Sam Guckenheimer
  33. Thanks for listening! Contact details: Neil Thompson, Thompson information Systems Consulting Ltd, www.TiSCL.com, NeilT@TiSCL.com, 23 Oast House Crescent, Farnham, Surrey GU9 0NP, England, UK, +44 (0)7000 NeilTh (634584). Questions?
