Value Flow ScoreCards - For better strategies, coverage & processes (2008)
British Computer Society Specialist Interest Group in Software Testing (SIGiST) Sep 2008, London. Co-authored with Mike Smith.

Presentation Transcript

  • BCS SIGiST, British Computer Society Specialist Interest Group in Software Testing, 18 Sep 2008, "Testers of Tomorrow", v1.1. Value Flow ScoreCards: for better strategies, coverage and processes. Neil Thompson, Thompson information Systems Consulting Ltd, 23 Oast House Crescent, Farnham, Surrey GU9 0NP, England, UK, www.TiSCL.com; and Mike Smith, Testing Solutions Group Ltd, St Mary's Court, 20 St Mary at Hill, London EC3R 8EE, England, UK, www.testing-solutions.com
  • What is a Value Flow ScoreCard? SIX VIEWPOINTS of what stakeholders want: Supplier; Process; Product; Customer; Financial; Improvement & Infrastructure. Against these, four rows: Objectives (WHY we do things); Measures and Targets (WHAT will constitute success); Initiatives (HOW to do things well). It's a simple table which we can use to help control our work: do things "well enough" for an appropriate balance of stakeholders – in this presentation, for test strategy, test coverage, process improvement and process definition (but arguably it can be applied to anything!)
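The six-viewpoints-by-four-rows table described above can be modelled as a small data structure. A minimal sketch (the class and method names here are illustrative, not from the presentation):

```python
from dataclasses import dataclass, field

VIEWPOINTS = ["Supplier", "Process", "Product", "Customer",
              "Financial", "Improvement & Infrastructure"]
ROWS = ["Objectives", "Measures", "Targets", "Initiatives"]  # WHY / WHAT / WHAT / HOW

@dataclass
class ValueFlowScoreCard:
    """One scorecard: a grid of cells indexed by (row, viewpoint)."""
    cells: dict = field(default_factory=dict)

    def add(self, row: str, viewpoint: str, entry: str) -> None:
        assert row in ROWS and viewpoint in VIEWPOINTS
        self.cells.setdefault((row, viewpoint), []).append(entry)

    def column(self, viewpoint: str) -> dict:
        """Everything recorded under one stakeholder viewpoint."""
        return {row: self.cells.get((row, viewpoint), []) for row in ROWS}

# Cells taken from the slide's later "simple example":
card = ValueFlowScoreCard()
card.add("Objectives", "Process", "Maintain minimum compliance")
card.add("Measures", "Process", "Frequency of audits")
card.add("Targets", "Process", "1 audit per year")
print(card.column("Process")["Targets"])
```

Starting from repositionable paper notes (as the authors suggest later) and only then moving to a spreadsheet or structure like this keeps the table flexible while it is being negotiated.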
  • Background. Mike Smith: white papers on test process (1999 & 2002); keynote presentation to Ericsson on measurement in testing (2007). Neil Thompson: Organisation before automation (EuroSTAR 1993, multidimensionality of test coverage); Goldratt's Theory of Constraints & Systems Thinking in process definition (STAREast 2003), the SDLC (EuroSP3 2004), and process improvement (EuroSTAR 2006). Both of us: participation in the Software Testing Retreat – "Test entities" and "Appropriate Testing" (ApT). Holistic Test Analysis & Design (STARWest 2007): a flexible tabular format used for test coverage, relating this to Balanced ScoreCards (Kaplan & Norton, business strategy etc). Separating "what" from "how" (ICST 2008): Test Conditions as the keystone test entity. Value Flow ScoreCards take this further…
  • Rationale: why invent Value Flow ScoreCards? Trends in Information Systems: more agility – lean lifecycles, rapid testing, "good enough quality" (eg James Bach); more control – outsourcing, offshoring, Sarbanes-Oxley. However, these trends seem to pull in opposite directions!? See "Balancing Agility and Discipline" (Boehm & Turner)… but agile is also disciplined! (or should be). So – what can IS development & testing learn from: Business Performance Measurement & Management? Lean manufacturing / agile & Systems Thinking? Our agenda for doing things "well enough", then better: the Systems Development LifeCycle as a flow of value; Balanced ScoreCards beyond strategy, & six-sigma; test & measurement models combined – the Treble-V model, informing development through early Test Analysis; practical uses of Value Flow ScoreCards in test strategy, coverage, process improvement & definition.
  • The SDLC as a flow of value. Working systems have value; documents in themselves do not; so the direct route (stated requirements → demonstrations & acceptance tests → programming) is the quickest! SDLCs are necessary, but introduce impediments to value flow: misunderstandings, disagreements… documents are like inventory/stock, or "waste". (Diagram: raw materials to finished product; implicit and documented requirements, acceptance tests, meetings / escalation to agree intermediate documentation.)
  • Lean manufacturing, Goldratt's Theory of Constraints… agile IS methods… customers should pull "good enough" value. (Diagram: levels of documentation pushed by specifiers – Requirements plus Test Specifications, Functional Spec, Technical Design, Unit / Component specifications – versus a flow of fully-working software pulled by customer demand: unit / component-tested, integrated, system-tested, accepted WORKING SOFTWARE.)
  • But customers are not the only stakeholders. ScoreCards – first published by Kaplan & Norton: "Translating Strategy into Action", using four complementary views (Financial; Customer; Internal Processes; Learning & Growth) around Vision & Strategy. Intentions: to drive behaviour, measure outcomes, and improve predictability. Now software testing is not only finding bugs but also measuring quality, so ScoreCards seem useful here…
  • ScoreCard principles we can use. For all four views (Financial, Customer, Internal & Learning) – "what" needs doing, and "why": Objectives, with associated Measures & Targets; "how" to achieve that: Initiatives. Cascading ScoreCards: one person's "how" is another person's "what"; Measures & Targets are cascaded down to subordinates. Lead & Lag indicators (Measures & Targets): "Goal" indicators (reactive, known when achieved); "Performance" indicators (proactive, ongoing monitoring).
  • Taking Balanced ScoreCard beyond strategy: TSG's views of quality. The www.balancedscorecard.org version (© Paul Arveson 1998, after Kaplan & Norton) becomes a Software Quality version published by Isabel Evans (www.testing-solutions.com), adapted here by Neil Thompson. The views of quality: Financial (efficiency, productivity, on-time, in budget); Customer / User (value, benefits, acceptance, satisfaction, complaints); Process ("Manufacturing": compliance eg ISO9000, predictability, repeatability, mistakes); Product (risks, faults, failures); Improvement (eg TPI/TMM, learning, innovation). We can apply these (complementary) WHY / WHAT / HOW views of quality to testing.
  • Cascading Balanced ScoreCards: "Translating strategy into action", eg: the organisation's objectives (WHY / WHAT / HOW) cascade into the objectives of an IS/IT project (WHY / WHAT / HOW), which cascade into the Project Test Plan (WHY / WHAT / HOW). (a) A top-down view: down the business / organisation.
  • But then also: the test process as a scorecard. System development starts with the logical ("what") before specifying the physical ("how"), so let's do this for testing also: TEST BASIS (eg system spec) → TEST ANALYSIS → TEST DESIGN → TEST EXECUTION, each step linking WHY → WHAT → HOW. (b) This is a left-to-right view (complementary to the top-down view, which is aligned with the project's and business / organisation's objectives).
  • And then: if we add distinct Test Analysis to the W-model… the Treble-V model! At each level – project requirements spec, logical design, physical design, component design, build – there is static testing, then dynamic test analysis, dynamic test design and dynamic test execution. (c) This is a third "cascade" view – down then up, through layers of stakeholders and levels of integration. (It's not only for waterfall SDLCs, eg iterative…)
  • The Treble-V model develops cascading ScoreCards a little further. From organisation & project objectives, via the Test Policy, Strategy and Project Test Plan (incl. reviews, inspections etc): 1: "translating strategy into action"; 2: the test process as a scorecard (at each test level); 3: the scorecard applied to activities in the test process; and (even more interestingly!) 4: the scorecard applied to activities in the development process – across project requirements, functional specification, technical design, component specifications and build.
  • There exists a "Six-Sigma Business ScoreCard": but is Six-Sigma applicable to IS value flow? Principles: in a multi-step manufacturing process, if 'quality' of any step is <100%, overall quality falls dramatically with numerous steps & components; for overall quality to be 'good enough', each step / component should be within 6σ, ie 99.9996% perfect. IS is not exactly like manufacturing (the development model runs from the real world through simplification into requirements, refinement into a functional specification with risk of distortion, technical design with risk of compromises, component specs, and programming with risk of mistakes, down to software and back up through test execution to a working system), but we can learn…
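The compounding effect the slide describes is just repeated multiplication: if each of n steps is a fraction q "perfect", overall quality is q to the power n. A quick illustration (the step counts are hypothetical, chosen only to show the shape of the curve):

```python
def overall_quality(step_quality: float, steps: int) -> float:
    """Overall quality of a chain of equally reliable steps: q ** n."""
    return step_quality ** steps

# Even "pretty good" steps compound badly over a long chain:
print(round(overall_quality(0.99, 100), 3))      # ~0.366
# ...whereas steps at the slide's 99.9996% barely degrade:
print(round(overall_quality(0.999996, 100), 4))  # ~0.9996
```

This is why per-step quality targets become so extreme once a process has many steps, and why the authors ask whether the analogy transfers to IS development at all.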
  • We learn two things from Six-Sigma: (i) it confirms we need Validation in addition to Verification. The development model pairs with a test model: Requirements ↔ Acceptance Test analysis & design, then execution (validation: customer testing); Functional Specification ↔ System Test; Technical Design ↔ Integration Test; Component Spec ↔ Component Test (verification: product testing). (Based on a flipchart drawn by Neil Thompson at the Software Testing Retreat, Llangadog, Wales.)
  • (ii) The Six-Sigma ScoreCard includes Suppliers. Value chain ≈ supply chain! In the IS SDLC, each participant should try to 'manage their supplier' (upward management, cost of quality, risks, information gathering) – this is an instance of the (test) scorecard applied to activities in the development process. We add this to the other five, giving a sixth view of quality. Now each step in the value chain can manage its inputs, outputs and other stakeholders. (Six-Sigma Business ScoreCard published by Praveen Gupta, 2nd ed., McGraw-Hill 2007; this slide shows Neil Thompson's version.)
  • The Treble-V model is a cascade of Value Flow ScoreCards. (1) Organisation & project objectives flow into the Test Policy, Strategy and Project Test Plan. (2) Coverage objectives come from the requirements, functional spec, technical design and module specs. (3) Static testers' and test analysts' own objectives feed initiatives for the next stage of the test process (static testing → dynamic test analysis → etc). (4) Feedback objectives go to Business Analysts, Architects & Developers, as initiatives for the next level down the Treble-V model.
  • The potential number of ScoreCards depends on how your SDLC is handled by different roles: Business Analysts, Requirements Reviewers, Acceptance Test Analysts, AT Designers & Scripters, Acceptance Testers; Architects, Func Spec Reviewers, Sys Test Analysts, ST Designers & Scripters, Sys Testers; Designers, Tech Design Reviewers, Int Test Analysts, IT Designers, Scripters & Executers; Component Test Analysts, Designers & Executers (via pair programming?); Developers. Pieces of a jig-saw! This example is a "full-ish" set: higher-level tests are scripted – other staff may then execute.
  • The Value Flow ScoreCard in action. Yes – it's just a table (Supplier, Process, Product, Customer, Financial, Improvement & Infrastructure)… into which we can put useful things. We start with repositionable paper notes, then can put it into spreadsheet(s).
  • Value Flow ScoreCard contents. Columns: Supplier (upward management; info from other levels of the Treble-V model; cost of quality); Process (compliance eg ISO9000; repeatability; predictability; mistakes); Product (VERIFICATION; risks; test coverage; faults); Customer (VALIDATION; risks; benefits; acceptance; satisfaction; failures; complaints); Financial (efficiency; productivity; on-time, in budget); Improvement & Infrastructure (eg TPI/TMM; learning; innovation). Rows: Objectives (WHY); Measures ("Indicators") and Targets (WHAT); Initiatives (HOW). What kind of useful things? Here's a simple example: Objective – give input to upstream reviews; Measure – staff-days invested; Target – 1 staff-day per Test Policy; Initiative – send Denis every time.
  • Example set of Objectives, Measures, Targets & Initiatives (these are for testing in general, in a project context):
    – Supplier: Objective – give input to upstream reviews; Measure – staff-days invested; Target – 1 staff-day per Test Policy; Initiative – send Denis every time.
    – Process: Objective – maintain minimum compliance; Measure – frequency of audits; Target – 1 audit per year; Initiative – react to correspondence from auditors.
    – Product: Objective – run enough tests (?); Measure – test cases executed; Target – 1479 (?) <see next slide!>; Initiative – <see later slides!>.
    – Customer: Objective – get users to "sign off"; Measure – signatures; Targets – one for User Acceptance, one for Operational Acceptance; Initiative – invite users & operators to specify Acceptance Tests.
    – Financial: Objective – appear successful as Project Manager; Measures – go-live date, expenditure; Targets – go-live date as originally planned, expenditure < budget.
    – Improvement & Infrastructure: Objective – gain industry-standard respectability; Measure – maturity levels; Targets – Level 2 by 2008, Level 3 by 2010; Initiative – improvement actions.
  • Lag & Lead indicators; Goal–Question–Metric; making Measures & Targets SMART. Objectives correspond to GOAL, Measures to QUESTION, Targets to METRIC. The six viewpoints assure "Relevance". Measures & Targets should be SMART: Specific, Measurable, Achievable, Relevant (in the language of stakeholders), Timely. Many of the example indicators are "lag" indicators: reactive, known only when achieved. Two are "lead" indicators, proactive (eg staff-days invested per year, test conditions agreed): they exert timely influence on quality, in advance, and help assess & maintain achievability.
  • Four practical uses of Value Flow ScoreCards:
    A. Test coverage – extend control to stakeholders; transcend the small 'textbook' repertoire of techniques; holistic Test Analysis & Design integrates and clarifies test items, features, bases and product risks; better information traceability.
    B. Test Policy, Strategy & Planning – ensure alignment with organisational objectives; help completeness, no subjects forgotten; Goal–Question–Metric traceability.
    C. Process improvement – not just test, but the whole lifecycle; prioritised treatment of symptoms; transcend limitations of TMM™ or TPI®.
    D. Process definition – "Appropriate Testing" (ApT) in different project/product circumstances.
  • A. Test coverage: do you control your testing, or does your testing control you? Too often: Test Cases are thought of, Scripts / Procedures are written, and the expectation is that "those are the tests" – for the remainder of your life (on that project). The alternative: flexible, risk-managed test execution, driven by what you really want to cover – governance / management needs and product risks.
  • Test coverage: other common problems. Have you seen any of these? Important tests omitted; large numbers of low-value tests; higher levels of testing merely repeating Component Testing; insufficient attention to non-functional tests; unstructured piles of detailed scripts; difficult-to-maintain testware…
  • Numerous test cases & scripts are almost meaningless to stakeholders without a "map". "1479 test cases, so it must be good, right?" Instead, let's start the test specification process with a classification tree: documentation to agree coverage.
  • Testware: not a rigid hierarchy. System's specifications → Test Conditions → Test Cases → Scripts / Procedures → Test Execution Schedule (WHY → WHAT → HOW). Does a strict hierarchy work? No, because each of these entities forms its own hierarchy. So we need many-to-many entity relationships between the system's specifications, Test Conditions, Test Cases, Scripts / Procedures and the Test Execution Schedule.
  • An easy way to get many-to-many relationships: a flexible table linking the system's specifications, Test Conditions, Test Cases, Scripts / Procedures and the Test Execution Schedule. But merely decomposing the system's specification is not a recipe for very good tests. What about: TEST ITEMS? FEATURES TO BE TESTED? PRODUCT RISKS? We want a Validation (customer) view of quality in addition to the traditional Verification (product) view…
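The many-to-many relationships between testware entities can be held as simple link tables rather than a tree. A sketch of the idea, with hypothetical identifiers (REQ-7, TC-1 etc are invented for illustration):

```python
# Many-to-many links kept as pairs rather than a rigid hierarchy:
# one spec item can drive many test conditions, and one test
# condition can trace back to many spec items (likewise for cases).
spec_to_condition = {("REQ-7", "TC-1"), ("REQ-7", "TC-2"), ("REQ-9", "TC-2")}
condition_to_case = {("TC-1", "CASE-10"), ("TC-2", "CASE-10"), ("TC-2", "CASE-11")}

def conditions_for(spec: str) -> list:
    """All test conditions traced to one specification item."""
    return sorted(c for s, c in spec_to_condition if s == spec)

def cases_for(condition: str) -> list:
    """All test cases that exercise one test condition."""
    return sorted(k for c, k in condition_to_case if c == condition)

print(conditions_for("REQ-7"))   # ['TC-1', 'TC-2']
print(cases_for("TC-2"))         # ['CASE-10', 'CASE-11']
```

This is essentially what the "flexible table" gives you on paper or in a spreadsheet; a relational tool would hold the same pairs as join tables.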
  • So – Value Flow ScoreCards can measure test coverage: first for Test Analysts (working from the Level Test Plan and Test Bases). Objectives: test items (level of integration); features to be tested; Test Basis references; product benefits; constraints – with product risks running across the viewpoints. Measures: Test Conditions (we could cover). Targets: Test Conditions we intend to cover. Initiatives: objectives for Test Cases (passed to test design & execution). NB this is the "manual" Holistic Test Analysis & Design spreadsheet perspective; formal relational database implementations, eg T-Plan, may require a more rigorous treatment.
  • … then for Test Designers (plus Scripters, if used) and Executers (working from Test Analysis). Objectives: for Test Cases and for the Test Execution Schedule. Measures: S-curves of Test Cases executed, passed, failed; incidents fixed, retested, closed. Targets: coverage of Test Conditions executed; S-curves projected to target dates. Initiatives: to the next level of the Treble-V model; objectives for incident-fixers (to Developers).
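Projecting an execution S-curve to a target date can be sketched crudely as a trend extrapolation. This is only a mid-curve linear approximation under assumed numbers (real S-curves flatten at both ends, and the counts below are invented):

```python
# Given cumulative test-case counts so far, estimate the day on which
# a target count will be reached, assuming the average daily rate holds.
def projected_completion_day(cumulative: list, target: int) -> float:
    days = len(cumulative)
    rate = cumulative[-1] / days          # average cases per day so far
    remaining = target - cumulative[-1]
    return days + remaining / rate

executed = [40, 95, 160, 230, 300]        # cumulative cases after days 1..5
print(projected_completion_day(executed, 600))  # 10.0
```

Comparing such a projection against the planned go-live date is one way to turn the slide's "S-curves projected to target dates" into a concrete, reviewable number.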
  • The Holistic Test Analysis & Design spreadsheet centres on Test Conditions: usable also with Exploratory Testing? Columns: 1. Test Items & sub-items; 2. Test Features & sub-features; 3. Test Basis references; 4. Product Risks; 5. Test Conditions (plus whether Behavioural or Structural); Ver/Val mechanism; test data; technique indications; test names & objectives; then either a Test Script reference or an Exploratory Test Execution record. The testing mission spans: model the test space, determine oracles, determine coverage, determine test procedures, configure, operate and observe the test system, evaluate the results, report. (Elements from the "Heuristic Test Strategy Model", "Universal Testing Method v2.0" & "Improving By Doing", quoted from Rapid Software Testing v2.1.2, training from James Bach & Michael Bolton, www.satisfice.com, www.developsense.com, cross-referred here by Neil Thompson.)
  • B. Test Policy, Strategy & Planning: some common problems. Testing not obviously (or at all) aligned to the organisation's objectives. Test Policy, Strategy & Plan documents which are: cut-and-paste "boilerplate", the same for all projects, copied from textbooks…; tedious, dreary verbiage, too long – or too short!; wishful thinking; unbalanced; etc?
  • Value Flow ScoreCards for Test Policy, Strategy & Planning. Organisation level: the organisation's ScoreCards, goals, objectives… feed the Test Policy and Organisational Test Strategy. Programme level: Programme Test Strategy. Project / product level: Project Test Strategy and Project Test Plan. Test levels: Requirements Reviewers, Acceptance Test Analysts, AT Designers & Scripters, Acceptance Testers; Func Spec Reviewers, Sys Test Analysts, ST Designers & Scripters, Sys Testers (continues as on the earlier slide).
  • Example for Test Policy, starting from the organisation's goals & objectives (via the organisation's ScoreCards). Objective (GOAL): constant improvement of development & test processes. Measure (QUESTION): TMM levels (and something similar for development?). Targets (METRIC): TMM level 2 at least, now; TMM level 3 within 2 years. Initiatives: for the Test Strategy / Strategies.
  • Test Policy (more): have we thought of all the viewpoints? Do we have Measures, Targets & Initiatives?
    – Supplier: Objective – IS actively supports employees.
    – Process: Objectives – products to satisfy specified requirements; testing prioritised & managed with a Test Policy; both static & dynamic testing; planning, preparation & evaluation. Measures – frequency of process adjustments; heeding metrics.
    – Product: Objectives – products to be fit for purpose (comprehensive scope); detect defects early, with test-type & defect-source analysis. Measures – Defect Detection Percentage; product risks; importance of requirements & related work products.
    – Customer: Objectives – the Project Manager is responsible for quality; Business Management is responsible for enforcing the Test Policy.
    – Financial: Objective – staff must be certified. Measure – ISTQB qualifications (Advisors: Expert; Managers: Advanced; Analysts: Foundation).
    – Improvement & Infrastructure: Objectives – constant improvement of dev & test processes; independence increases; use TestFrame for test analysis & execution; automate regression tests as much as possible. Measure – TMM levels. Targets – TMM level 2 at least, now; level 3 within 2 years; review twice per year.
    (Source: summarised from an example in TestGrip by Marselis, van Royen, Schotanus & Pinkster, CMG, 2007.)
  • Summary for Test Policy, Strategy & Planning: what's the point of Value Flow ScoreCards? Remind the authors of all the viewpoints which should be considered. Encourage balance across those viewpoints. Focus on ways of measuring success – expose vague / wishful-thinking assertions. Get the key points recorded & agreed before writing indigestible documents. Pave the way for achievably implementing these aspirations. Consider qualitative measures, eg rubrics, where quantitative seems inappropriate (the "Tyranny of Numbers").
  • C. Process improvement. In software testing, the popular process improvement methods have fixed subject areas ("moment of involvement" etc). TMM™ and TPI® ascend through maturity levels – "these things are good for you, in this sequence". TOM™ is symptom-driven but still has a fixed structure and suggested causes (a "built-in improvement model"). But some other fields (eg manufacturing, supply chains) use Goldratt–Dettmer (or Toyota-style equivalents) on symptoms, giving flexible focus. Driving principle: the "constraint" is the weakest link – address that first, before attacking everything. Look for causes, not just symptoms…
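The "address the constraint first" principle reduces, in its simplest form, to finding the lowest-throughput step in the flow. A toy sketch (the step names and numbers are invented for illustration):

```python
# Goldratt's principle applied naively: in a chain of steps, the
# constraint is the step with the lowest throughput; improving any
# other step first does not raise end-to-end flow.
throughput = {                   # work items per week each step can handle
    "test analysis": 120,
    "test design": 80,
    "scripting": 45,
    "execution": 90,
}
constraint = min(throughput, key=throughput.get)
print(constraint)                # the step to improve first
print(min(throughput.values())) # end-to-end flow is capped at this rate
```

Fixed-sequence maturity models prescribe the same improvement order everywhere; the constraint view instead lets the current weakest link pick the next improvement, which is the "flexible focus" the slide contrasts against TMM/TPI.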
  • Process improvement with Goldratt–Dettmer thinking tools. CURRENT ILLS: symptoms x, y and z trace down through intermediate causes n and o to root causes a and b. CONFLICT RESOLUTION: "It seems here that…" / "On the other hand…" → revelation(s), distilled "truth". FUTURE REMEDIES: alleviation of symptoms; fixes for intermediate causes n and o; a fix for root cause a (where root cause b can't be fixed). For use where: different stakeholders want different things; evidence seems contradictory; there are rival theories for remedies.
  • An easy route into Goldratt–Dettmer: Strengths, Weaknesses, Opportunities & Threats. CURRENT ILLS (Weaknesses): the culture of our testers is to prefer large text documents to diagrams; the SDLC method does not encourage diagrams; system specs are heavy text documents; testers not trained in techniques; test specs are large & "texty"; test coverage omissions & overlaps; too many failures in Live. OPPORTUNITIES (use Strengths to help amplify them): some managers are considering agile methods; business analysts may be motivated by UML training; can still improve coverage at the macro level with informal techniques. FUTURE REMEDIES: TACTICAL – address culture by worked examples of diagrams; include tables & diagrams in test specifications. ONGOING – techniques training & coaching; action planning. STRATEGIC (80/20) – improve the SDLC method. PRE-REQUISITES & TRANSITION: use Threats to help identify obstacles; anticipate & overcome obstacles. (Actually, as shown here, Goldratt–Dettmer has five diagram sets in its full version, to cater for "making change stick".)
  • Value Flow ScoreCards allow us to "swimlane" symptoms & causes (and proposed remedies): the six viewpoint columns (approximating TMap/TPI's lifecycle, organisation, infrastructure, and techniques & tools areas) against rows for CURRENT ILLS, CONFLICT RESOLUTION, FUTURE REMEDIES, PRE-REQUISITES and TRANSITION. (Note: this is similar to Kaplan & Norton's "Strategy Maps", Harvard Business School Press 2004.)
  • Systems Thinking: some cause-effect trees "re-root" to form vicious / virtuous feedback loops. Example (vicious): too many failures in Live → testers too busy with live failures to "weed" tests → more tests added, test suites grow ever larger → redundant regression tests take too long to run → ignorance of test coverage, omissions & overlaps → more failures in Live: a REINFORCING LOOP driven by the size of the test suites, regression tests not being automated, large "texty" test specs, heavy text system specs, testers not trained in techniques, and reluctance to remove tests from suites. Time spent "firefighting" forms a BALANCING LOOP against it. A loop is balancing if it contains an odd number of opposing links; else it is reinforcing. (Each author seems to vary the Systems Thinking notation; this is Neil Thompson's, incorporating elements of Jerry Weinberg's & Dennis Sherwood's.)
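The slide's classification rule is mechanical enough to state as code. A minimal sketch, representing a loop as its sequence of link polarities ("+" for a same-direction link, "-" for an opposing link):

```python
# A causal loop is balancing if it contains an odd number of
# opposing ("-") links; otherwise it is reinforcing.
def loop_kind(link_signs: list) -> str:
    opposing = link_signs.count("-")
    return "balancing" if opposing % 2 == 1 else "reinforcing"

# All same-direction links, like the vicious loop on the slide:
print(loop_kind(["+", "+", "+", "+"]))   # reinforcing
# One opposing link (eg "weeding" tests reduces suite size):
print(loop_kind(["+", "+", "-"]))        # balancing
```

Counting opposing links this way makes it easy to check by hand whether a loop sketched on a whiteboard will amplify a problem or damp it.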
  • Value Flow ScoreCards give two nested sets of feedback loops: do things well enough now, then improve. Via the cascade of Value Flow ScoreCards (roles 1, 2, … n, balancing time, cost, scope, risk & quality): find vicious loops… balance them, and/or maybe even reverse them ("Tipping Point"!)… but tailor quality to the balance of stakeholders. Then iterate – "It's a Test Policy, Jim, but… is it a good Test Policy?" – with process improvements focussed on the "constraints", ie where they will have most payback. You can still use the structure of TMM™, TPI®, TOM™… if desired; they may be mapped onto these value flow columns.
  • D. Process definition: where we want to be in the range formal–informal for these circumstances, and how. From context / circumstances: legal (regulation, standards; moral: safety); process constraints (eg quality mgmt, configuration mgmt, methodology); application characteristics (sector, culture, technical risks, business risks, technology); job type & size; resources (money, time, skills, environments); and the drivers of efficiency (insurance) and effectiveness (assurance). (Note: comparable with Paul Gerrard's "Axioms", but in our version these are the only two axioms.) CURRENT SITUATION: unhappy with efficiency; unsure how best to test. Measures: type of "V-model", handover & acceptance criteria etc (about 30 categories). CONFLICT RESOLUTION → DESIRED SITUATION: Targets – where in the formal–informal range (for specific aspects). Initiatives: Appropriate Testing in this context / these circumstances.
  • References & acknowledgements. ScoreCards: Kaplan & Norton; Isabel Evans; Praveen Gupta. Lean & agile: Toyota; the Poppendiecks; Alistair Cockburn; David Anderson. Goldratt: Jens Pas; William Dettmer; Greg Daich. Systems Thinking: Jerry Weinberg; Dennis Sherwood. Sources of testing ideas (colour-coded in the original): Rob Sabourin. Appropriate Testing (ApT) & test entities: the Software Testing Retreat.
  • Take-away messages. Think of any SDLC as a flow of value, from requirements to working and maintained software. It should be a jig-saw of cascading, value-adding pieces. In the pieces in which you participate, consider your inputs, outputs and other influences in terms of different stakeholder views. Balance those views for your situation. Look at your measures and targets for success before deciding exactly how to do things. Write them in your Value Flow ScoreCard. Note: an organisation / project does not necessarily need a complete set; some selected ScoreCards are immediately useful!
  • British Computer Society Specialist Interest Group in Software Testing, 18 Sep 2008, "Testers of Tomorrow". Value Flow ScoreCards – thanks for listening! For further information: Neil Thompson, Thompson information Systems Consulting Ltd, www.TiSCL.com; Mike Smith, Testing Solutions Group Ltd, www.testing-solutions.com.