Holistic Test Analysis & Design (2007)
 


STARWest conference Oct 2007, near Los Angeles. Co-authored & co-presented with Mike Smith.


Upload Details

Uploaded as Microsoft PowerPoint

Usage Rights

© All Rights Reserved


    Holistic Test Analysis & Design (2007): Presentation Transcript

    • STARWest 2007, Track presentation T17 v1.4a. Neil Thompson & Mike Smith: Holistic Test Analysis & Design
      – Neil Thompson, Thompson information Systems Consulting Ltd, 23 Oast House Crescent, Farnham, Surrey GU9 0NP, England, UK; www.TiSCL.com
      – Mike Smith, Testing Solutions Group Ltd, St Mary's Court, 20 St Mary at Hill, London EC3R 8EE, England, UK; www.testing-solutions.com
    • Do you control your testing, or does your testing control you?
      Either:
      – Test Cases thought of
      – Scripts / Procedures written
      – Expectation that "those are the tests"
      or:
      – FLEXIBLE, RISK-MANAGED TEST EXECUTION
      – covering what you really want to cover: governance / management needs, product risks
      ...for THE REMAINDER OF YOUR LIFE (ON THAT PROJECT)
    • Agenda
      Contents:
      – Standards' & textbooks' guidance on test coverage; Test Cases [NEIL]
      – Test specification process: what then how
      – Physical & Logical test coverage: Test Conditions
      – Holistic Test Analysis & Design method: a spreadsheet!
      – Example
      – Scripted & exploratory testing
      – Formal & informal test techniques; mixing techniques
      – Full process, and route-mappable shortcuts
      – Fixed "test entities model"? [MIKE]
      – Business Performance Management; Scorecards
      – Information traceability; Measurement
      – Tools
      – Conclusions [NEIL]
      Learning objectives for audience:
      – understand why deriving test cases is not as simple as many believe
      – appreciate the distinction between logical & physical test coverage (just like development!)
      – take away a flexible table-driven template for analysing Test Conditions & designing Test Cases
      – think about the test entities model: is it fixed?
      – be ready to mix multiple techniques, and scripted & exploratory approaches
    • IEEE 829 (1998!) still wags many
      – The diagram titled "Relationship of test documents to testing process" gives the overview, but you need to search the text for more process detail
      – TEST PLAN includes:
        • Test Items: software items (source/object/job control code, control data, or a collection of these) which are objects of testing (stated with reference to their specifications)
        • Features to be tested: distinguishing characteristics of software items, eg performance, portability, functionality (stated individually and in combinations to be tested)
      – TEST DESIGN SPEC includes:
        • Features to be tested (including nominated Test Items)
        • approach refinements: including techniques & rationale, result checking method, inter-case relationships
      – TEST CASE SPEC includes:
        • Test Items (including Features, and optionally specification references)
      – TEST PROCEDURE SPEC: ...to execute a set of TCSs or to analyse a software item to evaluate Features; includes:
        • purpose (including specification references)
      – At first sight a one-to-many hierarchy of Plan-Design-Case, but:
        • see those Λ symbols! and...
        • (Intro & p7) "a TCS may be referenced by several TDSs"
    • Test Items, Features, Conditions, Cases... what do standards & textbooks tell us?
      This is not a complete summary, just highlights noticed. Columns in order: Item, Feature, Basis, Risk, Condition, Objective, Case.
      – 1973 Hetzel (yes, there was a "book" before Myers 1976 & 1979): First
      – 1979 Myers: Yes
      – 1982 Beizer (2nd ed. 1990): First "Tests"
      – 1983 ANSI/IEEE: First, Yes (in Example), (kind-of), Yes
      – 1984 Hetzel: Yes, Yes, First, Yes
      – 1992 Quentin: (kind-of), (kind-of), First, (kind-of), Yes
      – 1993 Kaner-Falk-Nguyen: (kind-of), Yes, Yes, Yes
      – 1994 Marick: "Clues→Req'ts", (kind-of)
      – 1995 Perry (2nd ed. 2000): (kind-of), (kind-of), (kind-of), (kind-of)
      – 1995 Kit: Yes, Yes, First, Yes, (kind-of), Yes, Yes
      – 1998 BS 7925-1 (working draft v6.3): (not really), Yes
      – 1998 IEEE 829: Yes, Yes (in Example), (kind-of), Yes
      – 1999 Black: (kind-of), (kind-of), (kind-of), Yes, First, Yes
      – 2000 Binder: states compliance with IEEE 829
      – 2002 Craig & Jaskiel: Yes, Yes, Yes, "Inventory", Yes, Yes
      – 2003 Hutcheson ("dictionary" preferred): (not really), Yes, "Inventory" of units, Yes
      – 2007 ISTQB Foundation: Yes, Yes, Yes, Yes, Yes, Yes, Yes
    • But what is a Test Case?
      – ISTQB definition (Glossary v1.3, 31 May 2007): "Test Case: A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement. [After IEEE 610]" (highlighting & line spacing added by Neil Thompson)
      – Examples in textbooks tend to be for simple on-line functions
      – But consider a batch job; is a test case:
        • a field?
        • a group of fields?
        • a whole record / row?
        • the entire file / table?
      – Amazingly, no-one seems to care!
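    The ISTQB definition above lists four ingredients: input values, execution preconditions, expected results and execution postconditions, all tied to an objective or test condition. A minimal sketch of that definition as a data structure (all field names and the batch-job example are illustrative, not from any standard or tool):

    ```python
    from dataclasses import dataclass, field

    # Hypothetical sketch of the ISTQB test-case definition as a record.
    # Field names mirror the glossary wording; they are not a standard API.
    @dataclass
    class TestCase:
        objective: str                                        # objective / test condition served
        preconditions: list[str] = field(default_factory=list)
        inputs: dict[str, object] = field(default_factory=dict)
        expected_results: list[str] = field(default_factory=list)
        postconditions: list[str] = field(default_factory=list)

    # For a batch job, the "input" could plausibly be a field, a whole record,
    # or an entire file -- which is exactly the ambiguity the slide points out.
    tc = TestCase(
        objective="Reject a record whose amount field exceeds the credit limit",
        preconditions=["Customer account exists", "Nightly batch window open"],
        inputs={"record": {"account": "A-123", "amount": 10_000}},
        expected_results=["Record written to rejects file with reason code CR-LIMIT"],
    )
    print(tc.objective)
    ```

    Whether `inputs` should hold one field or one file is a design decision the definition leaves open.
    
    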
    • We want to know test coverage
      – 1479 test cases, so it must be good, right?
      – Now, let's start with a classification tree:
        • Test specification process
        • Documentation to agree coverage
    • What's needed in the test specification process?
      – To get concise coverage documentation, Test Design is important (test case design techniques, eg decision tables)
      – But test design should be preceded by Test Analysis: "what" to test
      – Test scripts / procedures are optional (eg some testers do exploratory testing)...
      – But we cannot do without documenting test coverage in a way which:
        • is reviewable with stakeholders
        • allows use of informal techniques in addition to formal
        • gives a holistic view (risk-based but including "requirements-based")
    • Logical & Physical coverage
      (Diagram: from Test Policy, Test Strategy and Master Test Plan, each test level — Acceptance Testing from Requirements, System Testing from the Functional Specification, Integration Testing from the Technical Design, Component Testing from Module Specs — has a Plan, then LOGICAL Analysis & Design, then PHYSICAL Scripts / Procedures.)
    • But test coverage is just decomposing the system requirements, isn't it? No!
      (Diagram: the DEVELOPMENT MODEL refines the REAL WORLD (desired) into Requirements, Functional Specification, Technical Design and Module Specs — simplification with risk of distortion — and programming produces the observed software with risk of mistakes. The TEST MODEL mirrors it: Acceptance, System, Integration and Component Test Analysis & Design feeding Test Execution. Validation coverage needs to be multi-dimensional: the verified/validated test model is compared back against the organisation and real world before automation. After Neil Thompson, EuroSTAR 1993, and Paul Jorgensen, "Software Testing: A Craftsman's Approach".)
    • So what are we proposing for logical & physical coverage?
      (Diagram, for a Test Level such as System Testing:)
      – 1. Test Items: delivered modules
      – 2. Test Features: Functionality; Non-Functionality (Performance, Security, Stress...), with useful sub-divisions
      – 3. Test bases: Requirements sections, Service entities (incl Functional System areas), events affecting the system (eg start of day)...
      – 4. Product Risks, as a 4th dimension
      – 5. Test Conditions (hierarchical, but also multi-dimensional): whole system, on-line transactions, batch runs, hardware interfaces
      – PHYSICAL COVERAGE OF TEST CASES: Test Suites / Packs of SCRIPTED Tests and Test Cases, or EXPLORATORY Sessions
      – LOGICAL COVERAGE OF TEST CONDITIONS
    • Holistic method: how we show logical & physical coverage (spreadsheet representation)
      – HEADER INFORMATION: Test Level; Test Level Objectives; Test Script or Exploratory Regime
      – WHAT IS TO BE TESTED: 1. Test Items & Sub-items; 2. Test Features & Sub-features; 3. Test Basis References; 4. Product Risks; 5. Test Conditions
      – HOW TEST CASES ARE DESIGNED: Ver / Val Method; Test Data Indications; Technique Names (+ whether Behavioural or Structural); Test Objectives; Test Cases
      – TEST SCRIPT REFERENCE (or Exploratory Test Execution Record); MODIFICATIONS FOR CURRENT RELEASE
      – Multiple dimensions are handled by:
        • allowing flexible many-many relationships (yes, it's just a table!)
        • allowing hierarchy in Test Conditions
        • high-level Test Conditions in an Overview sheet
        • low-level Test Conditions in detail sheets
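    One row of the spreadsheet described above can be sketched as a plain record, with the many-many relationships falling out naturally because Test Case identifiers can recur across rows. This is an assumed illustration of the shape, not the authors' actual template; all values are invented:

    ```python
    # Minimal sketch of the holistic coverage spreadsheet: each row links one
    # Test Condition to its items, features, basis, risks and Test Cases.
    rows = [
        {
            "test_item": "Delivered modules",          # 1. Test Items
            "test_feature": "Functionality",           # 2. Test Features
            "test_basis_ref": "Req 4.2",               # 3. Test Basis References (invented)
            "product_risk": "R-07",                    # 4. Product Risks (invented)
            "test_condition": "Customer age bands handled correctly",  # 5.
            "ver_val_method": "view screens, audit print",
            "technique": "Equivalence Partitioning (Behavioural)",
            "test_cases": ["ST-9.7.1", "ST-9.7.2"],    # cases may recur in other rows
        },
    ]

    # Because it's just a table, the mapping can be inverted: which conditions
    # does each Test Case cover?
    cases_to_conditions = {}
    for row in rows:
        for case in row["test_cases"]:
            cases_to_conditions.setdefault(case, []).append(row["test_condition"])
    print(cases_to_conditions)
    ```

    The flat-table choice is the point: no hierarchy is baked in, so one case serving several conditions (or vice versa) is just another row.
    
    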
    • Logical left→right flow: but flexible (1 of 2)
      (Diagram: columns run TEST ITEMS → TEST FEATURES → BEHAVIOURAL / STRUCTURAL → TEST BASIS REFERENCES → PRODUCT RISKS ADDRESSED → TEST CONDITIONS, shown per level:)
      – ACCEPTANCE: service to stakeholders; functional & non-functional features (validation, navigation, performance, security...); behavioural; basis = requirements & workshops
      – SYSTEM: streams, threads, data sets; functional & interface features; behavioural & structural; basis = requirements & system spec
      – INTEGRATION: pairs / clusters of modules; behavioural & structural; basis = technical design & interface spec
      – COMPONENT: modules and public operations; behavioural & structural; basis = module specs & programming standards
      (continued on next slide)
    • Logical left→right flow: but flexible (2 of 2)
      (Diagram: columns run TEST CONDITIONS → MANUAL / AUTOMATED VERIFICATION / VALIDATION & RESULT CHECKING METHOD → TEST DATA & CONSTRAINTS / INDICATIONS → TEST CASE DESIGN TECHNIQUES → TEST CASE / TEST OBJECTIVES → TEST SUITE / SCRIPT / PROCEDURE IDENTIFIERS, for example:)
      – ACCEPTANCE: manual (screen images viewed, test log hand-written); copy of live data, timing important, users all have access; Use Cases technique (main success scenario, extensions 2a, 4a, 4b, 6a); identifiers AT-8.5.1 to AT-8.5.4 etc
      – SYSTEM INTEGRATION: manual for changes (examine interface log prints, view screens in each system); ad-hoc data, unpredictable content, check early with system contacts; auto regression test via in-house test harness (update with care, documentation out of date)
      – SYSTEM: manual for changes (database spot-checks, view screens, audit print), with data separation arranged between teams; auto regression test with sanitised live data extracts, approved tool; State Transitions technique (all transitions, Chow 0-switch): MPTU ST-9.7.1, MPTC ST-9.7.2, MPC, MC, MN ST-9.7.3 etc
      – COMPONENT / COMPONENT INTEGRATION: manual + auto, data varies under team or individual control; auto regression test per component, tailored to each component, approved tool; Boundary Value Analysis: 1.0 over CT-2.4.1, 0.1 over CT-2.4.2, on CT-2.4.3, 0.1 under CT-2.4.4, 1.0 under CT-2.4.5 etc
      – If you use an informal technique, state so here. You may even invent new techniques!
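    The SYSTEM row above names State Transition testing with Chow 0-switch coverage, which simply means one test step per individual transition. A sketch under invented assumptions (the state machine and event names are made up, not from the slides):

    ```python
    # Hedged sketch of 0-switch (all-transitions) coverage, as in the Chow
    # method named on the slide. The account-lifecycle machine is invented.
    transitions = {
        ("Active", "suspend"): "Suspended",
        ("Suspended", "resume"): "Active",
        ("Active", "disconnect"): "Disconnected",
        ("Suspended", "disconnect"): "Disconnected",
    }

    # 0-switch coverage: derive exactly one test step per transition.
    tests = [
        {"start": src, "event": event, "expected_end": dst}
        for (src, event), dst in transitions.items()
    ]
    for t in tests:
        print(f"{t['start']} --{t['event']}--> {t['expected_end']}")
    ```

    Higher n-switch coverage would instead enumerate chains of n+1 consecutive transitions, which grows quickly; 0-switch is the cheap baseline.
    
    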
    • So... what is a Test Condition?
      – CURRENT ISTQB GLOSSARY (v1.3): "An item or event of a component or system that could be verified by one or more Test Cases, eg a function, transaction, feature, quality attribute or structural element." (Italic bold parts are questioned here.)
      – OUR CURRENT WORKING DEFINITION: "A part / aspect of behavioural, non-functional or structural test coverage in a system (or the service in which it is used) which is proposed by stakeholders and could be tested by one or more Test Cases (or a part of a Test Case)."
        • May be qualified by specific data limitations / combinations.
        • Test Conditions are derived by analysis of Items Under Test, Testable Features, Test Bases and Product Risks.
        • Test Conditions may be specified in a hierarchical way.
    • Relationships between Test Conditions & Test Cases (example)
      (Spreadsheet example, adapted from "Foundations Of Software Testing – ISTQB Certification", Graham, Van Veenendaal, Evans & Black:)
      – High (1st)-level Test Conditions, by "drawing a matrix" (informal; do you have this?):
        • customer ages: pre-teenager, teenager, grumpy old person
        • customer domicile: Pacific, Mid-West, Mid-East, Atlantic
      – 2nd level, by "drawing another matrix" (informal): eg teenager in the Mid-East; tariffs; credit; gender
      – 3rd level Test Cases:
        • male teenager in the Mid-East on pay-as-you-go tariff with <$10 credit: Jim Yellow
        • female grumpy old person in Pacific on sunset tariff: Prudence Brown
        • male pre-teenager in Atlantic on pay-as-you-go tariff with >=$10 credit: Foxy Red
        • plus further conditions: customer who has a criminal record; customer with record of threatening behaviour to call centre staff (covered via Jim Yellow, Boxy Black...)
    • Relationships between Test Conditions & Test Cases (example continued)
      (Spreadsheet example continued; some columns omitted for clarity:)
      – Test Item: whole system (changes + regression). Suites of test scripts:
        • Male teenager, Mid-East, pay-as-you-go, <$10 credit, criminal: Yellow
        • Female old, Pacific, sunset tariff: Brown
        • Male pre-teen, Atlantic, pay-as-you-go, >=$10 credit, threatening: Red
        • Male pre-teen, Atlantic, pay-as-you-go, <=$10 credit, criminal + threatening: Black
        • plus multi-customer suites
      – Sub-Item: the changes (test basis: Requirements Document; Changed Design Meeting). Features: Functionality (mailshot, field validation, navigation between windows, business logic, help text, billing, call centre enquiries); Non-Functional (usability)
      – Test Cases (with pre- and post-conditions) include: two customers (Yellow, Brown); three customers (Yellow, Brown, Red); three customers, one different (Yellow, Red, Black); selection of paths for one customer (Chow) (Red); Select → Credit history for three customers; Credit history → Call history for the same three (Yellow, Brown, Black); Call history → Offer for four customers (Brown, Yellow, Black, Red); already agreed to join promotion (Brown); system may not warn, or not warn in time, about disconnection, voluntary (Red) or forced (Black)
      – Sub-Item: system after the changes. Features: Functionality (overall); Non-Functional (performance, contention). Multi-customer regression thread tests.
    • Holistic method: potential use in exploratory testing (can mix with scripted)
      – TESTING MISSION: Model test space → Determine oracles → Determine coverage → Determine test procedures → Configure test system → Operate test system → Observe test system → Evaluate test results → Report test results
      – PRODUCT DOMAIN (Product Elements, Quality Criteria); PROBLEM DOMAIN (Project Environment); TEST LAB (Tests, Perceived Quality)
      – The spreadsheet columns (1. Test Items & Sub-items; 2. Test Features & Sub-features; 3. Test Basis References; 4. Product Risks; 5. Test Conditions; Ver/Val Mechanism; Test Data Indications; Technique Names + whether Behavioural or Structural; Test Objectives) feed an EXPLORATORY TEST EXECUTION RECORD (and/or Test Script reference)
      – Elements from "Heuristic Test Strategy Model", "Universal Testing Method v2.0" & "Improving By Doing", quoted from Rapid Software Testing v2.1.2, training from James Bach & Michael Bolton, www.satisfice.com, www.developsense.com; cross-referred here by Neil Thompson
    • Now let's look at techniques
      – BEHAVIOURAL ("black-box"): Equivalence Partitioning; Boundary Value Analysis; Cause/Effect Graphing – Decision Table Testing; State Transition Testing; Classification Tree Method; Random Testing; Syntax Testing; Process Cycle Testing; Thread Testing; Elementary Comparison Testing; One of Each; Pairwise Testing; Inductive Testing
      – STRUCTURAL ("white- or clear-box"): Linear Code Sequence and Jump (LCSAJ); Branch Condition Combination Testing; Modified Condition Decision Testing; Branch Condition Testing; Branch/Decision Testing; Statement Testing; Dataflow Testing
      (Lists compiled by Chris Comey, Testing Solutions Group. Diagram: a decision-table example over transaction, customer type and account type, and a component-under-integration-test example with calling and called modules.)
      – FORMAL techniques: surprisingly little-used! (or used "implicitly")
      – INFORMAL: "the freedom to just analyse stuff"
      – Test Analysis is WHAT IS TO BE TESTED; techniques are HOW TEST CASES ARE DESIGNED
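    Of the black-box techniques listed above, Boundary Value Analysis is the easiest to show mechanically: pick values on and just either side of each boundary of a valid range. A minimal sketch (the credit-amount range and step size are invented for illustration):

    ```python
    # Illustrative Boundary Value Analysis: for a valid range [lo, hi],
    # classic BVA selects values on, just below and just above each boundary.
    def boundary_values(lo, hi, step=1):
        """Return the six classic BVA test values for the range [lo, hi]."""
        return sorted({lo - step, lo, lo + step, hi - step, hi, hi + step})

    # Hypothetical requirement: valid credit amounts are 1..100 units.
    print(boundary_values(1, 100))   # [0, 1, 2, 99, 100, 101]
    ```

    The two out-of-range values (0 and 101) are the ones most likely to expose off-by-one faults in the implementation.
    
    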
    • Formal techniques are not carved in stone tablets
      – Published techniques have evolved over time; names and representations come and go
      – Some major publications on techniques: Myers 1979; Beizer 1982 (2nd ed 1990); Beizer 1995 (black-box); British Standard 7925-2; Binder (object-oriented) 2000; Copeland 2003
      – But why does object orientation seem a different world? And now SOA!
      – What are patterns, actually?
      – What about Hetzel; Kaner-Falk-Nguyen; Marick; Jorgensen? Their books don't seem to talk about "techniques" as such, but contain great value
      – In exploratory testing, we hear more about heuristics
      – What about "techniques" for non-functional testing?
    • So where have "Techniques" been, where are they going?
      (* Modified from Neil Thompson's minutes of Software Testing Retreat 2, 2003. The diagram spans the PROCEDURAL / SCRIPTED PARADIGM to the EXPLORATORY PARADIGM, from which this hierarchy was inverted!)
      – Heuristics (definition based on Chambers 1981): art of discovery in logic; education method in which the student discovers for self; principles used in making decisions when all possibilities cannot be fully explored. Eg "We tend to find different bugs when we do this"; "If after a while you don't succeed, try something different"; "This seems a useful way to divide the system's structure & behaviour into manageable parts". Guidance on testing non-functional systems seems fragmented and not "techniqued"; testing object-oriented systems is similar at the higher test levels, but we need a new approach for lower levels!?
      – Patterns (definition*: each must contain all of catchy title; description of the problem which the pattern addresses; solution to the problem; context in which the pattern applies; one or more (preferably >3) examples). Not applied to software testing until object orientation? Classes are like traditional "modules", but inherited features cannot be trusted in all circumstances; methods are like traditional "units"; polarity-switching "coloured" boxes. "Test Design Patterns" from Binder 2002, Testing OO Systems (selected): non-modal class; quasi-modal class; modal class; polymorphic server; modal hierarchy; category-partition; combinational function; recursive function; polymorphic message
      – Techniques (definition based on Chambers 1981): methods of performance; manipulation; mechanical part of an artistic performance! Black-box: Equivalence Partitioning; Boundary Value Analysis; Cause/Effect Graphing & Decision Tables; State Transitions; Classification Trees; Random; Syntax; Process Cycles; Threads; Elementary Comparisons; One of Each; Pairwise; Inductive. "White-" or clear-box: Linear Code Sequence and Jump (LCSAJ); Branch Condition Combinations; Modified Condition Decisions; Branch Conditions; Branch/Decisions; Statements; Dataflows. "These techniques may also be applied in an exploratory way"
      – Exploratory polarities (from "Testing Outside the Bachs", STAREast 2006): invariant boundaries; warming up v. cruising v. cooling down; doing v. describing v. thinking; careful v. quick; data gathering v. data analysis; solo work v. team effort; your ideas v. other peoples' ideas; current version v. old versions; testing v. touring; individual tests v. general; PROCESS v. PRODUCT
    • The holistic method: process flow? Route options!
      (Diagram: the full process flows left to right across the spreadsheet columns: refine Test Conditions from the Stage Test Plan → Test Analysis (Test Conditions) → derive Ver/Val Method, Techniques, Test Data Constraints / Indications → Test Case Design (Test Cases, in procedure / Script) → Test Execution, with tests added / modified / made obsolete and regression fed back.)
      – Six route-mappable shortcuts are shown, each omitting some columns: eg omit Risks; omit Data Constraints / Indications; omit formal Techniques; omit Conditions (Cases direct); omit Objectives; down to route 6, Execution direct from Conditions, used in an Exploratory Charter / Session as appropriate
      – Sponsor/Owner measurement framework; Tester measurement framework
    • So: is a fixed entity model possible / desirable?
      – The Software Testing Retreat (UK) has discussed this point
      – The initial reaction was that this was already clear, trivial!
      – Fierce debate showed otherwise
      – We sought both an entity model and a process
      – This entity model is only a workshop draft; the conversations continue...
      – The main issues seem to be:
        • how many intermediate entities are needed (eg "test coverage item")
        • how dependent this all is on a fairly rigorous set of test basis documents
      – WE SAY FOR NOW:
        • BUILD THE ENTITIES AROUND IEEE 829 PLUS "TEST CONDITION"
        • DO NOT BE BOUND BY TEST BASIS DOCUMENTS; ALLOW EXPLORATORY
    • Test Model described by 'methods' group at large European financial institution
      – WHY → WHAT → HOW: Test Requirements → Test Design Specifications → Test Case Specifications, held in a test management tool
      – "From Requirements to Test Case Specification"
    • Role of test coverage in business performance measurement & management
      – Some people are talking about testing becoming a profession...
      – Professional managers want a clear idea of what is well-covered, what is less well-covered, and why (counts of Test Cases and bugs mean little in themselves)
      – In particular, they do not want:
        • important tests omitted
        • large numbers of low-value tests
        • higher levels of testing merely repeating Component Testing
        • insufficient attention to non-functional tests
        • unstructured piles of detailed scripts
        • difficult-to-maintain testware...
      – Business managers are increasingly governed by structured objectives, eg Balanced Scorecards...
    • Software Quality version of the Balanced Scorecard
      (Diagram: version after Kaplan & Norton, www.balancedscorecard.org © Paul Arveson 1998; Software Quality Scorecards published by Isabel Evans, www.testing-solutions.com, adapted here by Neil Thompson. Perspectives: Financial (efficiency, productivity, on-time in budget, cost of quality); Customer / VALIDATION (acceptance, satisfaction, complaints, benefits v. risks); Process Improvement (compliance eg ISO9000, eg TPI/TMM, repeatability, predictability); Learning & Innovation / VERIFICATION (mistakes, product risks, test coverage, faults, failures).)
      – Based on feedback loops (Deming): not only output feedback but "outcome" (information, more dimensions)
      – Goal / Question / Metric: VERIFICATION & VALIDATION test coverage incl Risks v Benefits; METRIC (information): Test Conditions & residual failures; MEASUREMENT (data): Test Cases & fault-fixing
    • Principles of Business Performance Measurement & Management
      – 'Translating Strategy into Action' (Kaplan & Norton)
      – Drives behaviour
      – Measures outcomes
      – Links actions to strategy
      – Leads to predictable outcomes
      – Comes with a 'Health Warning'! Powerful but dangerous if misused; can drive the wrong behaviour if measures are badly constructed
    • Five key principles of Business Performance Measurement & Management
      – Generic application
      – Objectives, Measures & Targets, Initiatives
      – What & How
      – Cascading Scorecards: one person's 'How' is another person's 'What'; Measures & Targets become objectives for the next person
      – Lead & Lag Indicators: Goal Indicators (reactive); Performance Indicators (predictive)
    • If Test Cases are not a good "measure", why are Test Conditions better?
      (Diagram: 5. Test Conditions sit at the centre — whole system, on-line transactions, batch runs, hardware interfaces — linking 1. Test Items (delivered modules), 2. Test Features (Functionality; Non-Functionality: Performance, Security, Stress..., with useful sub-divisions), 3. Test bases (Requirements sections, Service entities incl Functional System areas, delivered benefits, events such as start of day...), and mitigation of 4. Product Risks, to SCRIPTED Tests / Test Cases and EXPLORATORY Sessions.)
      – Objectives → Measures → Targets → Initiatives: VERIFICATION & VALIDATION test coverage incl Risks v Benefits
    • Test Entity Model needs flexible many-many relationships, not a fixed hierarchy
      – WHY → WHAT → HOW: Test Requirements → Test Conditions → Test Cases → Test Scripts / Procedures → Test Execution Schedule
      – Does this work as a simple chain? No, because of the hierarchies within each entity; we need flexible many-many relationships between them instead
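    The many-many traceability argued for above can be sketched with two plain mappings; tracing a condition forward through cases to scripts is then a set union rather than a walk down one fixed tree. All entity names here are invented for illustration:

    ```python
    # Sketch of many-many traceability between Test Conditions, Test Cases
    # and Scripts / Procedures, instead of a fixed one-to-many hierarchy.
    condition_to_cases = {
        "C1: teenager tariff": ["TC-Yellow", "TC-Red"],
        "C2: criminal record": ["TC-Red", "TC-Black"],   # TC-Red serves two conditions
    }
    case_to_scripts = {
        "TC-Yellow": ["Script-A"],
        "TC-Red": ["Script-A", "Script-B"],              # one case runs in two scripts
        "TC-Black": ["Script-B"],
    }

    def scripts_covering(condition):
        """Trace a Test Condition forward to the scripts that exercise it."""
        return sorted({script
                       for case in condition_to_cases[condition]
                       for script in case_to_scripts[case]})

    print(scripts_covering("C2: criminal record"))   # ['Script-A', 'Script-B']
    ```

    A fixed hierarchy could not express `TC-Red` belonging to two conditions and two scripts at once; the table-driven spreadsheet can.
    
    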
    • Analysis, Measurement and Information Traceability
      – Generic Model
      – Separates 'What' from 'How'
      – Links hierarchies and relates Logical to Physical
      – Horizontal & vertical traceability within Dev & Test Models
      – Testing integrated into development
      – 'Pure' Test Analysis:
        • is difficult, requires expertise
        • is a driver for further test activities
        • is a driver for further development activities
        • is a driver for predictable outcomes
      – Programme & Project Measurement Framework for sponsors AND owners of measures!
      – We can learn from the world of Business Performance Measurement & Management, and they can learn from us!
    • 'What' and 'How' give us a Treble-V Model! Proactive 'Testing' influence on SDLC
      (Diagram: at each level — Project Requirements Spec, Logical Design, Physical Design, Component Design, down to Build — the flow runs STATIC TESTING → DYNAMIC TEST ANALYSIS → DYNAMIC TEST DESIGN → DYNAMIC TEST EXECUTION → FIX, RETEST, REGRESSION TEST. Test Conditions and explicit / implicit Test Cases link the levels. At the higher levels, 'What' and 'How' are further apart.)
    • Tools to support this method
      – Spreadsheet (Microsoft Excel) has been used so far; it is flexible and good for graphs, but labour-intensive and not scaleable
      – A tailored RDBMS would be difficult if we do not accept a fixed ERD (limiting route mapping)
      – Leading proprietary tools handle the relationships, but in different ways:
        • Most are 'Tester Measurement & Management Frameworks' (eg Quality Center / Test Director): Test Case focus
        • T-Plan (Sponsor/Stakeholder Measurement & Management Framework): Test Condition focus
    • Conclusions
      – Summary:
        • There is more to software test specification than Test Cases & standard techniques (and this surprises most of us)
        • Like development, tests benefit from logical specification (Analysis → Test Conditions) before physical Design
        • Our Holistic Method allows hierarchies of Test Conditions, and integrates multiple techniques & scripted-exploratory mixtures
        • Supports Test-Driven Design & Development, traditional & exploratory testing
        • A measured and risk-based view of test coverage is increasingly important for business performance measurement & management
      – Lessons learned:
        • The original IEEE 829 is better than may be thought, but it needs to be read carefully; textbooks have tended merely to quote it without adding much value in terms of how to apply it
        • No "one size fits all" for an entity model
        • IEEE 829 is Test Case focussed; it supports testers' measurement framework
      – Take away:
        • The Holistic Method spreadsheet (it's in use already at a major client)
    • Way forward
      – A new ISO working group is creating a set of international software testing standards in collaboration with the IEEE
      – Mindset change required
      – Explicit v implicit Test Cases
      – Need flexibility to cater for sponsors' / owners' measurement framework in addition to testers'
      – Enhanced tool support for testing-business linkage
    • Holistic Test Analysis & Design: thanks for listening! For further information...
      – Neil Thompson, Thompson information Systems Consulting Ltd, 23 Oast House Crescent, Farnham, Surrey GU9 0NP, England, UK; www.TiSCL.com; NeilT@TiSCL.com
      – Mike Smith, Testing Solutions Group Ltd, St Mary's Court, 20 St Mary at Hill, London EC3R 8EE, England, UK; www.testing-solutions.com; msmith@testing-solutions.com