Neil Thompson - Thinking tools: from top motors, through software process improvement, to context-driven
 


Visit SoftTest Ireland www.softtest.ie and sign up for access to free Irish Software Testing events.




Presentation Transcript

    • Thinking tools: from top motors, through software process improvement, to context-driven. Neil Thompson, Thompson information Systems Consulting Ltd, 23 Oast House Crescent, Farnham, Surrey, England, UK GU9 0NP, www.TiSCL.com. SoftTest Ireland, with the support of the All Ireland Software Network, Belfast, 20 September 2007.
    • Can software process improvement learn from these? [Images: Toyota Prius, Toyota Celica GT4]
    • How Toyota progressed through quality to global dominance, and now innovation
      – Quality (top reputation): has dominated the JD Power satisfaction survey for a decade; Toyota Production System (TPS): 14 principles across Philosophy, Problem-Solving, Process and People & Partners
      – Global dominance: market value > GM, Chrysler & Ford combined; on track (2006) to become the world’s largest-volume car manufacturer
      – Innovation (fast): Lexus invaded the “quality” market and won; Prius: not evolutionary but revolutionary – and launched 2 months early and sold above expectations; Toyota Product Development System (TPDS): 13 (!) principles across Process, People and Tools & Technology
    • Agenda
      – Contents: analogies between world-leading improvements in manufacturing and what we may do in the software development lifecycle (SDLC):
        1. Things that flow through a process: inventory, value (EuroSP3 2004)
        2. Constraints on process, and thinking tools to improve (EuroSTAR 2006)
        3. From process improvement to process definition, e.g. context-driven (STAREast 2003)
      – Acknowledgements: Jens Pas (EuroSTAR 1998) – my introduction to Goldratt; Greg Daich (STAREast 2002) – I generalised his idea, then worked backwards to the roots
      – Objectives for the audience: entertainment? – something a bit different; appreciate some fundamental principles; take away a set of simple diagrammatic thinking tools which are useful in many situations; think about your particular SDLC – where are the constraints?; go on to read some of the references; benefit by then improving your own processes; be more ready to learn from other disciplines & industries
    • The “new” paradigm in manufacturing: value flow, pull not push, problem-solving
      [Slide table mapping TOYOTA (TPS & TPDS) principles to GOLDRATT concepts, approximately:]
      – Customer-defined value (to separate value-added from waste); minimise waste ↔ maximise throughput
      – 3. Pull to avoid over-production; 4. Level workload; Takt (rhythm), Just-In-Time, Kanban cards ↔ Drum-Buffer-Rope, low inventory (“lean”)
      – 2. Continuous process flow to surface problems; Andon (stop and fix) ↔ Critical Chain management
      – 7. Visual control to see problems ↔ tagging slow movers, monitoring buffers, one-page metrics
      – 12. See for yourself to thoroughly understand ↔ chain of 5 “why”s, cause-effect trees, conflict resolution diagrams
      – 14. Learning organisation via reflection & improvement; Plan-Do-Check-Act ↔ identify constraint, “elevate” & iterate
      – 13. Decide slowly (all options) by consensus; front-load product development to explore alternatives thoroughly while maximising design space
      – And now these principles have been successfully applied beyond actual manufacturing, into product development. But what about development of software?... I prefer Goldratt for thinking tools...
    • Goldratt’s Theory of Constraints: an analogy to explain
      – Goal: to win the war
      – Objective: to maximise throughput (right soldiers doing right things)
      – Constraint on throughput: the slowest marcher
      – Drum-Buffer-Rope (diagram based on those in “The Race”, E.M. Goldratt & R. Fox 1986)
      – Critical chain: the weakest link is all we need to fix, by means of...
      – Five focussing steps: identify the constraint, exploit it, subordinate all else to it, elevate it (i.e. strengthen it so it is no longer the weakest), then... identify the next constraint
      – But now it’s no longer simple, so we need iterative tools for: what to change, what to change to, and how
      – Five thinking tools (based on sufficient causes & necessary conditions)
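
      The five focussing steps above are essentially an iterative improvement loop. A minimal illustrative sketch in Python (the stage names, capacities and the simple "slowest stage" throughput model are assumptions for illustration, not from the slides):

```python
# Illustrative sketch of Goldratt's five focussing steps as an improvement loop.
# Throughput is modelled as limited by the slowest stage (the constraint).

def find_constraint(capacities):
    """Step 1: identify the constraint (the stage with the lowest capacity)."""
    return min(capacities, key=capacities.get)

def improve(capacities, rounds=3, boost=5):
    for _ in range(rounds):
        constraint = find_constraint(capacities)   # 1. identify
        drumbeat = capacities[constraint]          # 2./3. exploit & subordinate:
        # pace all other stages to the constraint's rate (the "drumbeat")
        capacities[constraint] += boost            # 4. elevate the constraint
        print(f"constraint={constraint}, drumbeat={drumbeat}/week, "
              f"new capacity={capacities[constraint]}")
        # 5. go back to step 1 -- the constraint may now be somewhere else

if __name__ == "__main__":
    # Hypothetical SDLC stage capacities (features per week)
    stages = {"specify": 12, "build": 9, "system test": 6, "acceptance test": 10}
    improve(stages)
```

      Running it shows the constraint moving from stage to stage as each is elevated, which is why the steps have to be iterated rather than applied once.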
    • Applicability of TOC beyond manufacturing
      – Military logistics
      – Marketing, sales & distribution
      – Project management
      – Measurements, human relationships, medicine etc.
      – Using technology, e.g. assessing benefits of functionality
      – IT systems development:
        • focussing of Goldratt’s Critical Chain on hi-tech projects – Robert C. Newbold
        • methodology design – Alistair Cockburn
        • “Lean software development” – Mary & Tom Poppendieck
        • Agile management using Feature Driven Development – David J. Anderson
    • But software development isn’t like manufacturing?
      – Software isn’t like hardware
      – Intellect adds value less predictably than machines do
      – The manufacturing part of software development is disk duplication: “development” is really a design activity
      – People are more important than processes
      – Software development doesn’t repeat exactly; people are always tinkering with the processes
      – Development involves discovery; production involves reducing variation
      – But I say: does that make all the analogies worthless, or do they just need interpreting? I suggest the latter…
    • A code factory and a bug factory
      – A no-waste factory: stated requirements (a, b, c) → programming → demonstrations & acceptance tests
      – Now here’s some waste: implicit requirements (a, b, c) must go through meeting/escalation to agree (and loaded personal memories) to become documented requirements (a, b’, d) before programming and acceptance tests – is that meeting/escalation, or inventory?
    • Specs & unfinished software are inventory
      – Specifications are generally not needed after go-live (I will come to exceptions later), so they are not end-product, they are work-in-progress (especially intermediate levels like functional & non-functional specs)
      – Untested software, and even finished software not yet paid for, is also inventory
      – Iterative lifecycles help if “adaptive” (product-based) rather than “transformational” (where specifications multiply!)
      [Diagram: revised & new requirements (a, b → a’, b’, c) flowing into programming, which may include redesign]
    • The full traditional W-model bulges with inventory!
      [Diagram: a W-model in which business objectives are made into a Requirements Statement, Functional Spec, Technical Design and Module Specs; each is verified & validated (incl. “QA”) and turned into specified acceptance, system, integration and unit tests; each test level then executes, retests, fixes & regression-tests, retesting lower levels where necessary; code is static-checked; and a post-implementation review tests against the business objectives.]
    • In a factory, small batches reduce inventory
      [Diagram, based on Goldratt, The Race (North River Press 1986): five stages with differing rates (roughly 1.3, 10.0, 1.0, 10.0 and 2.0 units/hour) process 1000 units either as a single batch (i.e. “waterfall”) or as five batches of 200 (i.e. “iterative”); the inventory-over-months charts show far less inventory held, for less time, in the multi-batch case.]
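
      A rough worked example of the batch-size effect, assuming just two stages in series with whole-batch hand-offs (the rates and batch sizes are illustrative, not the slide's exact figures):

```python
# Rough sketch: why smaller transfer batches reduce inventory and lead time.
# Two stages in series; a batch must finish a stage before moving to the next.

def lead_time(total_units, batch_size, rates_per_hour):
    """Hours until the last unit is finished, with batch hand-offs between stages."""
    batches = total_units // batch_size
    finish = 0.0   # when the latest batch leaves the first stage
    done = 0.0     # when the latest batch leaves the second stage
    for _ in range(batches):
        finish = finish + batch_size / rates_per_hour[0]           # stage 1
        done = max(done, finish) + batch_size / rates_per_hour[1]  # stage 2 waits for hand-off
    return done

rates = (10.0, 2.0)   # hypothetical units/hour for the two stages
for batch in (1000, 200):
    print(f"batch size {batch}: last unit done after {lead_time(1000, batch, rates):.0f} h")
# Smaller batches let the downstream stage start sooner, so work-in-progress
# (inventory) is held for less time even though the total work is identical.
```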
    • Drum-buffer-rope approach to constraints
      – Optimise throughput by: (1) a drumbeat based on the constraining stage (a) and orders (b); (2) a buffer to protect the constraining stage from upstream disruptions; (3) a rope to prevent the leader extending the gap on the constraining stage
      – For subassemblies feeding in, have additional buffers
      [Diagram: “troops marching” / materials flow from raw materials in, through the constraining stage and assembly (with subassembly feeding in), to orders, with the drum, buffer and rope marked.]
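
      A small sketch of the drum-buffer-rope idea as release control (the rates, buffer size and disruption probability are invented for illustration):

```python
# Minimal sketch of drum-buffer-rope release control (all numbers hypothetical).
# Drum: the constraint's steady rate. Buffer: work held in front of the constraint
# to absorb upstream disruption. Rope: upstream releases only enough work to keep
# the buffer at its target, instead of pushing at full speed.

import random

def simulate(hours=40, drum_rate=5, upstream_capacity=8, buffer_target=15, seed=1):
    random.seed(seed)
    buffer = buffer_target
    produced = starved_hours = 0
    for _ in range(hours):
        # Rope: release no more than is needed to refill the buffer
        planned = min(upstream_capacity, max(0, buffer_target - buffer))
        # Upstream disruption: sometimes the released work doesn't arrive this hour
        arrived = 0 if random.random() < 0.2 else planned
        buffer += arrived
        # Drum: the constraint consumes from the buffer at its own beat
        work = min(drum_rate, buffer)
        if work < drum_rate:
            starved_hours += 1   # buffer ran dry -> lost throughput
        buffer -= work
        produced += work
    print(f"produced {produced} units; constraint starved in {starved_hours} of {hours} hours")

simulate()
```

      The point of the sketch is the trade-off: a bigger buffer protects the constraint from disruption but is itself inventory, which is why the buffer sits only in front of the constraint rather than everywhere.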
    • In software development & testing, small batches = agile methods: consider inventory moving through the SDLC
      [Cumulative-flow diagram, based on Anderson, Agile management for software engineering (Prentice Hall 2004): amount of functionality plotted against date for each stage – requirements, specification, design, programming & unit testing, integration testing, system testing, acceptance testing, live and paid-for. The vertical gap between two lines is the inventory in that stage; the horizontal gap is the lead time of that stage; the gap between the top and bottom lines is the inventory in the process overall. Within each stage of testing you can subdivide by pass/fail, bug states etc. If the lines are not approximately parallel, inventory is growing.]
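
      A sketch of how that cumulative-flow reading works, with invented monthly counts (the stage names and numbers are illustrative only):

```python
# Sketch of reading a cumulative-flow view of the SDLC (data below is invented).
# Each list is the cumulative number of features that have passed that stage,
# sampled monthly. Vertical gaps = inventory; horizontal gaps = lead time.

cumulative = {
    "specified":       [20, 40, 60, 80, 100],
    "programmed":      [ 5, 25, 45, 65,  85],
    "system tested":   [ 0, 10, 30, 50,  70],
    "live & paid-for": [ 0,  5, 15, 35,  55],
}

for month in range(len(cumulative["specified"])):
    wip = cumulative["specified"][month] - cumulative["live & paid-for"][month]
    print(f"month {month}: overall inventory (specified but not yet live) = {wip}")

# Lead time is read horizontally: e.g. the 40th feature was specified in month 1
# but only went live in month 4, so it spent roughly three months as inventory.
```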
    • Agile methods: pull value instead of pushing documentation
      [Diagram: on one side, levels of documentation pushed by specifiers – requirements, + functional spec, + technical design, + unit/component specifications, + test specifications; on the other, the flow of fully-working software pulled by customer demand – unit/component-tested, integrated, system-tested, accepted WORKING SOFTWARE.]
    • But even if our context is not suitable (or ready) for agile methods, we should understand flow
      – Where is/are the constraining stage(s)? Where should buffers be / not be?
      [Diagram: the SDLC drawn as the factory picture – raw materials in (requirements, specification, design), sub-assemblies (programming, unit testing), assembly (integration testing, system testing), through to the order(s) (acceptance testing).]
    • New-paradigm problem-solving: the Goldratt-Dettmer* “Thinking Tools” (*very slightly paraphrased here)
      – What to change: (1) CURRENT REALITY tree – undesirable effects traced through intermediate effects to root causes and a core problem
      – What to change to: (2) CONFLICT RESOLUTION diagram – requirements, prerequisites and conflicts resolved by injections; (3) FUTURE REALITY tree – injections leading through intermediate effects to desired effects and the objective
      – How to change: (4) PREREQUISITES tree – obstacles and intermediate objectives leading to the objective; (5) TRANSITION tree – specific actions, needs and intermediate effects leading to the objective
      Sources: Dettmer, W., Goldratt’s Theory of Constraints (ASQ 1997); Thompson, N., “Best Practices” & Context-Driven – building a bridge (STAREast 2003)
    • The thinking tools are complementary diagrams
      – Causes and effects
      – Necessary and sufficient conditions
      (Image: http://www.osaka-gu.ac.jp/php/nakagawa/TRIZ/eTRIZ/eforum/eETRIACon2003/Fig11TillmannB.jpg)
    • Why better than “traditional” process improvement in software testing
      [Diagram: some analogies (and degrees of flexibility) for the maturity levels and predefined subject areas of Test Organisation Maturity (TOM™), the Test Maturity Model (TMM℠) and Test Process Improvement (TPI®).]
      Sources: TMM℠ – http://www.stsc.hill.af.mil/crosstalk/1996/09/developi.asp; TPI® – based on http://www.sogeti.nl/images/TPI_Scoring_Tool_v1_98_tcm6-30254.xls; TOM™ – based on http://www.evolutif.co.uk/tom/tom200.pdf, as interpreted by Reid, S., Test Process Improvement – An Empirical Study (EuroSTAR 2003)
    • Extending the new paradigm to testing: by rearranging TPI’s key areas…
      1. Test strategy; 2. Lifecycle model; 3. Moment of involvement; 4. Estimating & planning; 5. Test specification techniques; 6. Static techniques; 7. Metrics; 8. Test automation; 9. Test environment; 10. Office environment; 11. Commitment & motivation; 12. Test functions & training; 13. Scope of methodology; 14. Communication; 15. Reporting; 16. Defect management; 17. Testware management; 18. Test process management; 19. Evaluation; 20. Low-level testing
      …we can begin to see cause-effect trees…
    • Cause-effect trees: can start with TPI’s inbuilt dependencies
      [Diagram (slightly simplified): the TPI key areas and their level-A/B requirements (e.g. 1 Test strategy A “single high-level test”; 2 Lifecycle model A “plan, spec, exec”; 4 Estimating & planning A “substantiated”; 5 Test specification techniques A “informal techniques”, B “formal techniques”; 18 Test process management A “planning & execution”, B “+ monitoring & adjustment”; 16 Defect management A “defects”, B “+ progress, activities, prioritised defects”; 17 Testware management A “managed-controlled”) drawn as a dependency tree – e.g. for getting to at least level A throughout.]
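
      One way to work with such dependency trees programmatically is as a small graph; a sketch in Python, with dependencies invented for illustration (not TPI's actual dependency matrix):

```python
# Sketch: treating TPI key-area dependencies as a small cause-effect graph and
# ordering improvements so that prerequisites come first.

from graphlib import TopologicalSorter   # Python 3.9+

# "X depends on Y" means Y should reach its level before X can
depends_on = {
    "1 Test strategy (A)":         {"3 Moment of involvement (A)"},
    "4 Estimating & planning (A)": {"2 Lifecycle model (A)"},
    "7 Metrics (A)":               {"15 Reporting (A)", "16 Defect management (A)"},
    "19 Evaluation (A)":           {"1 Test strategy (A)"},
}

for key_area in TopologicalSorter(depends_on).static_order():
    print(key_area)
# Prints prerequisite key areas before the ones that depend on them, which is one
# way to read an improvement route off the cause-effect tree.
```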
    • Can add extra “key areas”, lifecycle inputs & outputs, general categories
      [Diagram: the rearranged key areas grouped under TPI / TMap’s four “cornerstones” – Techniques (4 Estimating & planning + risk-based, 5 Test specification techniques, 6 Static techniques, 7 Metrics); Lifecycle (1 Test strategy, 2 Lifecycle model, 3 Moment of involvement, 18 Test process management, 19 Evaluation, 20 Low-level testing); Organisation (11 Commitment & motivation, 12 Test functions & training, 13 Scope of methodology, 14 Communication, 15 Reporting, 16 Defect management); Infrastructure (8 Test automation + test data, 9 Test environment, 10 Office environment, 17 Testware management) – plus STAR inputs & influences on testing in general, and outputs from testing in general.]
    • Can go beyond the fixed questions: SWOT each subject area (small Post-it® notes are good for this)
      [Example SWOT grids (solid borders denote items as in TPI; dashed borders denote additional items), e.g. for “Inputs & influences on STAR” and “4. Estimating & planning”:
      – Strengths: monitored, and adjustments made if needed
      – Weaknesses: too busy for well-considered estimating & planning; not substantiated, just “we did it as in the previous project”; system specs are heavy text documents; system specs & designs are defective, just timeboxed
      – Opportunities: some managers are considering agile methods; business analysts may be motivated by UML training
      – Threats: system requirements are agreed too late; release dates are fixed; the most experienced business analysts are leaving, more may follow; can’t recruit more staff; the squeeze on testing is likely to worsen]
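
      A tiny sketch of what “nesting” the SWOTs could look like in practice – aggregating per-subject-area notes into a whole-lifecycle view (the example notes are taken loosely from the slide; the data structure itself is an assumption):

```python
# Sketch of nesting SWOT analyses: aggregate per-subject-area notes (the Post-it
# notes) up to a whole-lifecycle SWOT.

from collections import defaultdict

swot_by_area = {
    "Estimating & planning": {
        "weaknesses": ["not substantiated, just 'as in previous project'"],
        "threats": ["release dates are fixed"],
    },
    "Inputs & influences": {
        "opportunities": ["some managers are considering agile methods"],
        "threats": ["the squeeze on testing is likely to worsen"],
    },
}

lifecycle_swot = defaultdict(list)
for area, quadrants in swot_by_area.items():
    for quadrant, notes in quadrants.items():
        lifecycle_swot[quadrant].extend(f"{note} ({area})" for note in notes)

for quadrant, notes in lifecycle_swot.items():
    print(quadrant.upper(), "->", notes)
```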
    • Applying the thinking tools to information from SWOT analysis (using extracts from both the 1st & 2nd examples)
      [Diagram: CURRENT REALITY (culture of our testers is to prefer large text documents to diagrams; SDLC method does not encourage diagrams; system specs are heavy text documents; test specs are large & “texty”; test coverage omissions & overlaps; too many failures in live) feeding, via CONFLICT RESOLUTION and FUTURE REALITY, into action planning and TRANSITION / PREREQUISITES – using Strengths to help amplify Opportunities, and Threats to help identify obstacles, anticipating & overcoming obstacles:
      – TACTICAL: address culture by worked examples of diagrams; include tables & diagrams in test specifications
      – ONGOING: techniques training & coaching; can still improve coverage at macro level with informal techniques
      – STRATEGIC (80/20): improve the SDLC method
      The SWOT method can be “nested”, e.g. aggregate up from individual subject areas to the whole lifecycle.]
    • Difficulties in problem-solving: conflict resolution (e.g. for documentation)
      [Conflict resolution diagram: “We need more documentation” vs “We need less documentation”.
      – For more: “If it’s not written, it can’t be signed off”; “Test reports need to be formal documents”; “Reviews are powerful at finding defects early, but it’s difficult to review just speech”; “They will read it when their memories have faded or when there’s a contract dispute”; “Are test analysts writing tests for others to run?”; “Will the live system be maintained by its developers?”; “Our users cannot be on-site with the project throughout”.
      – For less: “Signed-off requirements are counterproductive to systems meeting real user needs”; “People never read documentation”; “Documented test plans are counterproductive to the best testing”; “Specifications are like inventory, no end value”.
      – Resolutions (injections): agree in a workshop what documentation is needed; sign-off can be by agreed meeting outcomes; what documentation is needed for contractual reasons – is there still time to negotiate?; are there few enough people to make frequent widespread meetings practical?; the objectives of documentation are to help build and maintain a fit-for-purpose system by knowing and agreeing what is built and what is tested; documentation doesn’t have to be paper – use wikis etc; documentation varies – distinguish necessary from unnecessary, and distinguish quality of documentation, not just quantity; can mix exploratory & scripted testing; make maximum use of tables & diagrams; documentation is still needed for maintenance after go-live.]
      Developed further from Daich, G.T., Software documentation superstitions (STAREast 2002); see also Rüping, A., Agile documentation (Wiley 2003)
    • Not only process improvement – we can apply the thinking tools to defining “appropriate” practices!
      [Diagram: the five thinking tools re-used as a framework – (1) Context (CURRENT REALITY, e.g. unhappy with methodology, unsure how best to test); (2a) always-good principles and (2b) good practices in this context (CONFLICT RESOLUTION, upper and lower levels, with positioning, extremes, sub-requirements – valid & invalid assumptions – and justifications); (3) what “appropriate” means in this context (FUTURE REALITY); (4) questions to consider (PREREQUISITES); (5a/5b) choice categories, actions and interactions (TRANSITION).]
    • This is a structure I argued could build a bridge between “best practices” and context-driven
      [Diagram: “Best Practice” and Context-Driven (with the mutual caricatures “fossilised thinking” and “formalised sloppiness”) joined by unifying points – “always-good” principles (the what), Goldratt’s “thinking tools” (the how), and constraints, requirements, objectives etc – giving expert pragmatism with structure.]
    • 1 Context (CURRENT REALITY)
      [Diagram of context factors: quality/risk factors and scope, cost, time – business/organisation sector, application type, corporate culture, nation (e.g. USA); job type & size (project/programme, bespoke/product, new/maintenance); technology; legal constraints (e.g. regulation, standards); moral constraints (e.g. human safety; money, property; convenience); process constraints (e.g. quality management, configuration management); resources (money → skills, environments; time); and the starting position – methodology happy with, methodology unhappy with, unsure how best to test.]
    • 2a Always-good principles (CONFLICT RESOLUTION, upper level)
      [Diagram grouping “always-good” principles under effectiveness, efficiency, risk management and quality management (insurance vs assurance), e.g.:
      – Decide process targets; assess where errors were originally made & improve over time; be pragmatic over quality targets; define & use metrics
      – Give confidence (acceptance testing); define & detect errors (unit, integration & system testing); plan early, then rehearse-run acceptance tests; use handover & acceptance criteria
      – V-model: what testing is against; W-model: quality management; use independent system & acceptance testers
      – Risks: list & evaluate; tailor risks & priorities etc. to the factors; use an appropriate skills mix; define & agree roles & responsibilities
      – Refine test specifications progressively: prioritise tests based on risks; plan based on priorities & constraints; design flexible tests to fit; allow appropriate script format(s); use synthetic + lifelike data
      – Use appropriate techniques & patterns; define & measure test coverage; use appropriate tools; allow & assess for coverage changes; document execution & management procedures; optimise efficiency
      – Distinguish problems from change requests; measure progress & problem significance; prioritise urgency & importance; quantify residual risks & confidence; distinguish retesting from regression testing]
    • Conflicting interpretations of these principles
      – The next diagram will take each box from the previous diagram and assess it on a formal-informal continuum, so in preparation for this: what do we mean by “formality”?
        • adherence to standards and/or proprietary methods
        • consistency
        • contracted-ness
        • detail
        • amount of documentation
        • scientific-ness
        • degree of control
        • repeatability
        • trained-ness and certification of staff
        • “ceremony”, e.g. degree to which tests need to be witnessed, results audited, progress reported
        • any others?
    • 2b Good practices in this context (CONFLICT RESOLUTION, lower level)
      [Example conflict: “Use a V-model” vs “Don’t use a V-model” (with the caricatures “we’re too lazy to think – use a waterfall” and “we want to be trendy”).
      – Needs behind using one: all systems are integrated from parts; we want baselines to test against; we want to test the viewpoints of users, someone expert & independent, designers and programmers; two heads are better than one; different levels mitigate different risks.
      – Needs behind not using one: we’re doing iterative development; we’re doing adaptive development (no specs); documentation must be minimised; we have little time; we’re object-oriented; the V-model is discredited anyway.
      – Injections – APPROPRIATE USE OF the V-model: they are levels, not stages; some specs are out of date / imperfect, but we cope; there need not be a 1-1 correspondence between specs and levels; there need not be 4 levels; multiple partial passes; can use exploratory testing against a consensus basis; the V-model is implicit in Binder’s book Testing object-oriented systems: models, patterns & tools; many people stand by the V-model.]
    • 3 What “appropriate” means in this context (FUTURE REALITY)
      [Example: the system has users, (potentially) expert & independent testers, designers and programmers (where significant); our user requirements are out of date and were vague when written; we do have a good functional spec and independent testers available; our system is very simple, so we don’t need a separate integration test level; we do need separate development & acceptance; our programmers hate documentation. Conclusion: a V-model with only 3 levels – acceptance (against consensus), system (against the spec), unit (informal) – since there need not be 4 levels.]
    • And so on… overall what we have done is deconstruct then reconstruct: the framework is a “meta-V-model”
      [Diagram: all possible contexts → your context (CURRENT REALITY) → each practice to examine, and questions to consider (PREREQUISITES), answered via choice categories & actions (CONFLICT RESOLUTION and TRANSITION, upper) and choices (CONFLICT RESOLUTION and TRANSITION, lower), leading to what “appropriate” means in your context (FUTURE REALITY).]
    • Conclusions
      – Summary: Toyota’s success (and penetration of Just In Time); “the Goldratt trilogy”:
        1. Things that flow through a process: inventory, value
        2. Constraints on process, and thinking tools to improve
        3. From process improvement to process definition, e.g. context-driven
      – Lessons learned: three papers are enough?
      – Take away: read the references – Dettmer is key
      – Way forward: examples!
    • Key references
      – Context-Driven: Kaner, Bach & Pettichord (2002) Lessons learned in software testing, Wiley
      – Best Practice: ISEB, ISTQB??
      – My inspiration: Jens Pas (EuroSTAR 1998) Software testing metrics; Gregory Daich (STAREast 2002) Software documentation superstitions
      – Theory of Constraints understanding: Eliyahu M. Goldratt (1984, then 1992 with Jeff Cox) The Goal; (1986 with R. Fox) The Race; (1997) Critical Chain
      – TOC overview and the thinking tools: H. William Dettmer (1997) Goldratt’s Theory of Constraints: a systems approach to continuous improvement, ASQ
      – Related (but differently-specialised) thinking from the Agile community: Alistair Cockburn, A methodology per project, www.crystalmethodologies.org; Mary Poppendieck, Lean development: an agile toolkit, www.poppendieck.com