Value-Inspired Testing - renovating Risk-Based Testing, & innovating with Emergence (2012)
 

EuroSTAR conference Nov 2012, Amsterdam



    Presentation Transcript

    • Value-Inspired Testing: Renovating Risk-Based Testing, and Innovating with Emergence. Neil Thompson, Thompson information Systems Consulting Ltd. @esconfs www.eurostarconferences.com #esconfs
    • Value-Inspired Testing v1.1a: Renovating Risk-Based Testing, and Innovating with Emergence. Neil Thompson © NeilT@TiSCL.com, @neilttweet (Twitter), neiltskype (Skype)
    • Are reports of testing’s death “greatly exaggerated”?
      – Deming: survival is “not compulsory”
      – Tim Rosenblatt (Cloudspace blog, 22 Jun 2011): “Testing Is Dead – A Continuous Integration Story For Business People”
      – James Whittaker (STARWest, 05 Oct 2011): “All That Testing Is Getting In The Way Of Quality”
      – Alberto Savoia (Google Test Automation Conference, 26 Oct 2011): “Test Is Dead”
      – (There *may* be others?)
    • But those definitions of testing seem too narrow – my Agenda instead...
      – To renovate the use of Risk in testing:
        – collate current variants, eg “Risk-Based”, “Risk-Driven”
        – use a context-driven mix of principles
        – grade testing from high to low (not truncate)
        – balance risk against benefits, giving net Value
        – use risk throughout the testing “process”
        – integrate risk into the SDLC using Value Flow ScoreCards
      – To innovate in testing:
        – consider evolution in Nature – also a value flow?
        – appreciate the concept of Memes; evolving “memeplexes” in testing
        – find the emergent path between “too much chaos” & “too much order”
        – creativity: where good ideas come from (Johnson)
    • So, when holistic & evolving, testing will not die? (Image based on http://www.needham.eu/wp-content/uploads/2011/01/ascent-of-man1.jpg)
    • Start renovation of “Risk” by collating current variants
      [Timeline diagram, 1970s–2002: implicit risk principles → “testing is risk-based” → risks as prioritisation of features etc (RISK-BASED TEST MANAGEMENT, 1984–1988) and risks as entities to test, driving techniques (RISK-BASED TEST DESIGN, c. 1990) → how to do it → “risk, schmisk!” (2002)]
    • Use a context-driven mix of available principles
      [Diagram linking RISK-BASED TEST MANAGEMENT and RISK-BASED TEST DESIGN within the project environment. Risk workshops: why, whether, who, where? when, what risks, how to handle? Business risks and risk factors (eg usage, newness, complexity) drive prioritisation; quality criteria and product elements shape perceived quality; what to prioritise & focus on: test items? features? data items? test conditions? Technical risks drive the choice of test techniques.]
      After: Heuristic Test Strategy Model v4.8, James Bach
    • Prioritisation: better than truncating “low-risk” tests, *grade* coverage
      [Graphs of test coverage & effort against riskiness:]
      – Even distribution: does this make sense? No!
      – Random / spurious priorities: even less sense!
      – Risk-truncated: better, but dangerous to omit some areas completely?
      – Risk-graded: this is the most responsible way
      After: Chris Comey, Testing Solutions Group
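The risk-graded approach above can be sketched in code. This is an illustrative sketch, not from the presentation: the `grade_effort` function, the area names and the risk scores are all hypothetical.

```python
# Risk-graded (not risk-truncated) allocation of test effort:
# every area keeps at least a floor of coverage, and effort above
# the floor scales in proportion to the area's risk score.

def grade_effort(risk_scores, total_effort, floor=1):
    """Allocate test effort across areas in proportion to risk,
    but never below a minimum floor (no area is omitted completely)."""
    allocation = {}
    total_risk = sum(risk_scores.values())
    for area, risk in risk_scores.items():
        share = round(total_effort * risk / total_risk)
        allocation[area] = max(floor, share)
    return allocation

# Hypothetical risk scores, for illustration only.
areas = {"payments": 9, "reports": 4, "help_pages": 1}
plan = grade_effort(areas, total_effort=40)
```

The floor is what distinguishes grading from truncation: even the lowest-risk area retains some coverage, which is exactly the slide’s objection to cutting “low-risk” tests off completely.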
    • Consider not only risks – balance against benefits to give net value
      [Diagram: business benefits → project objectives → features etc; product risks are graded and opened/closed, tests are opened/closed against them; value = project objectives, hence business benefits, available for release now]
      After: Paul Gerrard & Neil Thompson, book Risk-Based E-Business Testing
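The net-value idea (benefits balanced against open product risks) can be illustrated with a small sketch. The feature names, scores and the `net_value` helper are hypothetical, not taken from the book.

```python
# Rank features by net value = business benefit minus open product risk,
# so release decisions are value-inspired rather than risk-only.

def net_value(features):
    """Return (name, benefit - open_risk) pairs, highest net value first."""
    scored = [(f["name"], f["benefit"] - f["open_risk"]) for f in features]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical feature list for illustration.
features = [
    {"name": "checkout", "benefit": 10, "open_risk": 3},
    {"name": "search",   "benefit": 7,  "open_risk": 5},
    {"name": "banner",   "benefit": 2,  "open_risk": 1},
]
ranking = net_value(features)
```

A high-benefit feature with moderate open risk can still outrank a low-risk but low-benefit one, which is the slide’s point about balancing rather than minimising.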
    • Apply risk principles throughout software lifecycle
      [W-model diagram: REAL WORLD (desired) → Requirements → Functional Specification → Technical Design → Module Spec → programming (SOFTWARE), with risk of distortion at each simplification/refinement step and risk of bugs in programming. On the test side, Acceptance Test (AT), System Test (ST), Integration Test (IT) and Component Test (CT) each have Test Analysis & Design then Execution; verification relates models to each other, validation relates them back to the REAL WORLD (observed)]
      After Software Testing: A Craftsman’s Approach, Paul Jorgensen. So:
      – remember overlapping models
      – we need both verification & validation (of both Specs & Designs)
      – this is not “the” V-model!
    • Bear in mind causes and effects of risks
      [Diagram of causes and effects, with definitions:]
      – Mistake: a human action that produces an incorrect result (eg in spec-writing, program-coding)
      – Defect: incorrect definition or information in specifications
      – Fault: an incorrect step, process or data in a computer program (ie executable software)
      – Failure: an incorrect result
      – Error: amount by which a result is incorrect
      – Anomaly: an unexpected result during testing
      Risk combines the probability of making mistakes, of defects causing faults, of faults causing failures etc, with the consequence of the risk if it happens. Knock-on effects continue on the real world after go-live.
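The probability/consequence point at the end of the slide is the standard risk-exposure calculation. A minimal sketch, with hypothetical numbers:

```python
# Risk exposure = probability of the failure x consequence if it happens.

def exposure(probability, consequence):
    """Risk exposure as probability (0..1) times consequence (a cost)."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    return probability * consequence

# A likely-but-cheap failure vs an unlikely-but-costly one:
typo_risk = exposure(0.9, 100)          # frequent, low consequence
data_loss_risk = exposure(0.05, 50000)  # rare, high consequence
```

Note how the rare-but-costly risk dominates: prioritising on probability alone would invert the sensible ordering.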
    • Risk principles apply throughout the testing “process”
      [Diagram of the DEV & TEST “processes”: write/model requirements → static validation & verification of Specification → static analysis → test design → test execution → bug management (may be all or partially exploratory). Prevention via “Peopleware” principles; detect omissions, distortions and rogue additions; fix, test fixes, regression-test; adjust test coverage; prioritise by both urgency & importance. After go-live, knock-on effects continue on the real world: mistake → defect → fault → failure / anomaly / error]
    • A framework for managing value through the lifecycle: the “Value Flow ScoreCard”
      – Seven viewpoints (Supplier, Process, Product, Customer, Financial, Improvement, Infrastructure) crossed with WHO, WHY, WHAT, WHEN, WHERE and HOW – “the seven watchwords of highly effective people!”
      – In action, the ScoreCard is a 7x4 table:
        – uses include setting / balancing test policy, strategy, coverage, troubleshooting & improvement
        – can start with repositionable paper notes, or use spreadsheet software
        – NB the measures & targets need not be quantitative; they may be qualitative, eg rubrics
    • Risk can be integrated into the scorecard
      – Columns: SEVEN VIEWPOINTS of what stakeholders want – Supplier, Process, Product, Customer, Financial, Improvement, Infrastructure
      – Rows: Objectives (WHY we do things); Threats to success (HOW they may fail – a Risk cell per viewpoint); Measures & Targets (WHAT will constitute success, WHEN & WHERE); Initiatives (HOW to do things well)
      – Now it’s a 7x5 table
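The 7x5 structure can be represented directly as a nested table. A minimal sketch; the cell contents below are placeholders for illustration, not taken from the presentation.

```python
# The Value Flow ScoreCard with risk integrated: seven stakeholder
# viewpoints (columns) crossed with five rows, including the added
# "Threats to success" (risk) row.

VIEWPOINTS = ["Supplier", "Process", "Product", "Customer",
              "Financial", "Improvement", "Infrastructure"]
ROWS = ["Objectives",          # WHY we do things
        "Threats to success",  # HOW they may fail (risk)
        "Measures",            # WHAT will constitute success
        "Targets",             # WHEN & WHERE
        "Initiatives"]         # HOW to do things well

scorecard = {row: {vp: None for vp in VIEWPOINTS} for row in ROWS}

# Cells may be qualitative (eg rubrics), not only quantitative:
scorecard["Threats to success"]["Product"] = "specs may contain defects"
scorecard["Targets"]["Customer"] = "rubric: 'delighted' on UAT survey"
```

Starting from an all-empty grid mirrors the repositionable-notes workflow the slide describes: the structure is fixed, the contents are negotiated per project.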
    • Types of risk
      [Diagram: each type of risk may cause the others]
      – Project risk, eg: supplier may deliver late; key staff may leave
      – Process risk, eg: configuration management may install the wrong version of the product
      – Product risk, eg: specifications may contain defects; software may contain faults
    • So: we’ve renovated “risk-based testing” into a whole-lifecycle structure
      [The scorecard with the Threats-to-success row filled in per viewpoint: Supplier → project risk; Process → process risk; Product → product risk; Customer → project risk; Financial → project risk; Improvement & Infrastructure → (process risks)]
    • Now to move on to innovation
      – The double feedback loop of the ScoreCard:
        – not only is our scorecard, and its cascading, converging on desired targets for current projects
        – but also: how we are planning to improve for the next & future projects
    • How does Nature innovate?
      – Lamarck: acquired characteristics, usage, inheritance
      – Darwin: mutation, fitness, reproduction
      – (various authors): Emergence...
      Images from Wikipedia
    • A scientific view of emergence
      [“Cosmic Ouroboros” diagram: Physics (gravity end) → Chemistry: Inorganic → Chemistry: Organic → Biology → Social sciences → Physics (quantum end). Ouroboros: Greek Οὐροβόρος or οὐρηβόρος, from οὐροβόρος ὄφις, “tail-devouring snake”]
      Sources: Daniel Dennett, “Darwin’s Dangerous Idea”; “cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc). Image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF
    • Is emergence like value flow? (and it looks better this way up!)
      [Levels diagram: 0: Maths?! → 1: Chemistry & Physics → 2: Biology → 3: Brains → 4: Technology → 5: Bio methods integrated into technology? → 6: Intelligence into matter/energy patterns? (“SINGULARITY”)]
      – Each level of progress generates possibilities, which are tested
      – Then, each level is a platform which, when established, is easily built upon by “cranes” (without having to worry about the details below)
      – After the science levels... humans made tools, talked and co-operated; printing gave us another level; now, software is following exponential growth
      – So, software testing should surf the wave of evolution (not flounder in the shallows behind it)
      After: Kurzweil, The Singularity is Near, 2005
    • The Darwinian view of evolution – but does this explain all emergence? Image from www.qwickstep.com
    • Biological evolution as sophistication rising with diversity
      [Graph: sophistication vs diversity, over time]
    • But evolution is not smooth?
      [Graphs of sophistication vs diversity: “gradual” Darwinism vs punctuated equilibria – (equilibrium) → “explosion” in species, eg Cambrian → (equilibrium) → spread into new niche, eg Mammals → mass extinction, eg Dinosaurs → (equilibrium); also number of species vs sophistication]
      “Punctuated equilibria” idea originated by Niles Eldredge & Stephen Jay Gould. Images from www.wikipedia.org
    • So... evolution of sciences overall?
      – Arguably other sciences have not evolved smoothly either
      – Sudden advances, akin to punctuated equilibria in biological evolution
      [Graph: sophistication vs diversity – Physics → Chemistry (Inorganic → Organic) → Biology → Social sciences]
      Per Bak, “How Nature Works”, 1996 (image: Tracey Saxby, Integration and Application Network, University of Maryland Center for Environmental Science, ian.umces.edu/imagelibrary/)
    • OK, what’s all this got to do with software testing?
      [Graph: evolution of the social sciences – Tools → Language → Books → Computers – with Tipping Points (Malcolm Gladwell)]
      – We have an important and difficult job to do here!
    • Testing needs to evolve / emerge / innovate to keep up with complexity
      [Graph: evolution of computers – 1GL → 2GL → 3GL → 4GL, Object Orientation, Internet, Mobile devices, Artificial Intelligence?!]
      – For example, are we ready to test AI??
    • How has testing evolved so far? (PERIOD: EXEMPLAR / OBJECTIVES / SCOPE / “SCHOOL”?)
      – Pre-1957: DEBUGGING (Psychology) – Weinberg (1961 & 71) – Test + Debug – Programs – “no schools, but...”
      – 1957: DEMONSTRATION (Method) – Hetzel (1972) – Show meets requirements – Programs – (Control)
      – 1976: DESTRUCTION (Art) – Myers (1976 & 79) – Find bugs – Programs, System, Acceptance – ?
      – 1983: EVALUATION (Engineering?) – ? – Measure quality – Analytic
      – 1984: PREVENTION (Craft?) – Beizer (1984) – Find bugs, + show meets requirements, + prevent bugs – + Integration – Quality, Factory
      – 2000: AUTOMATION? (Technology?) – Agile? – (Test-Driven); HUMANISATION? (Social Science?) – Kaner et al (1988 & 99) – Find bugs, in service of improving quality, for customer needs – Context-Driven
      – 2011: UNIFICATION?? (Science?) – Experiment & Evolve? – Neo-Holistic?
      Overall periods developed after Gelperin & Hetzel, “The Growth of Software Testing”, 1988, CACM 31 (6), as quoted on Wikipedia
    • Another way of thinking about evolution: genes...
      [Graph: sophistication vs diversity, driven by replication & selection plus mutation]
      Images from www.qwickstep.com and schools.wikipedia.org
    • ...and for humans, “memes”, as an extension of the genes concept (Lamarckian??)
      [Diagram: biological evolution (replication & selection, mutation) extended by mental, social & cultural evolution – ideas, beliefs, practices, symbols, gestures, rituals, speech, writing, “other imitable phenomena” – rising via platforms and cranes]
      Image from www.salon.com; taxonomy from www.wikipedia.org. Theme developed from Daniel Dennett, “Darwin’s Dangerous Idea”
    • Considering memes in testing: here is an example “memeplex”
      [Mind-map of interlinked testing memes under the headings Effectiveness and Efficiency, eg: risk management & quality management; insurance & assurance; decide process targets; assess where errors were originally made & improve over time; be pragmatic over quality targets; plan early, then rehearse-run handover & acceptance tests; define & use metrics; give confidence (AT); define & detect errors (UT, IT, ST); V-model: what testing is against; W-model: quality management; use independent system & acceptance testers; list & evaluate risks; tailor risks & priorities etc to factors; prioritise tests based on risks; plan based on priorities & constraints; refine test specifications progressively; design flexible tests to fit; use appropriate techniques & patterns, tools, and skills mix; define & agree roles & responsibilities; define & measure test coverage; allow & assess for coverage changes; use synthetic + lifelike data; allow appropriate script format(s); document execution & management procedures; optimise efficiency; distinguish problems from change requests; measure progress & problem significance; prioritise urgency & importance; distinguish retesting from regression testing; quantify residual risks & confidence]
      Source: Neil Thompson, STAREast 2003 (not “best practices” but reference points for variation?)
    • Another example memeplex for testing
      – (Grouped here by chapter for illustration, and coloured by theme)
      – 293 individual “lessons” selectable by testers according to context
      [Chapters: The role of the tester; Thinking like a tester; Testing techniques; Bug advocacy; Automating testing; Documenting testing; Interacting with programmers; Managing the testing project; Managing the testing group; Your career in software testing; Planning the testing strategy]
      Source: Neil Thompson, BCS SIGiST 2002 review of Lessons Learned in Software Testing (Kaner, Bach & Pettichord)
    • So, do we have punctuated equilibria in the evolution of testing?
      [Graph, sophistication vs diversity, rising from software analysis & psychology:]
      – DEBUGGING (Psychology)
      – DEMONSTRATION (Method), eg V-model – acknowledgement of testing as a distinct discipline
      – DESTRUCTION (Art), eg test techniques – establishment of textbooks
      – EVALUATION (Engineering?), eg metrics initiatives – publication of ANSI/IEEE standards
      – PREVENTION (Craft?), eg reviews, root cause analysis – belief in cost-of-failure curves
      – AUTOMATION? (Technology?), eg test-driven development – open-source tools
      – HUMANISATION? (Social Science?), eg Context-Driven school – mass-market software
      – UNIFICATION?? (Science?)
      – Where were the Platforms? What were the CRANES? Tipping points?
      Sources: Gelperin & Hetzel 1988 etc. But... is there something wrong with this picture?...
    • One of the existing views of innovations in software testing
      – Concepts: hierarchy; products / processes; testing & quality; testing (20th C); testing innovations in specific subjects
      – Factors: invention / application; individuals / organisations; bottom-up / top-down; synthesis of precursors; adjacent possibilities; role of testing!
      – Aids: population size; diversity / interdiscipline; free time / freedom to fail; psychology & serendipity; recording media
      After: Lines of Innovation in Software Testing, Stuart Reid 2010/2011, testing-solutions.com
    • Arguably, emergence is more than just Lamarckian / Darwinian
      – Emergences at coarser scales are not explained by “reductionism” to finer scales
      – For best innovation & progress, we need neither too much order nor too much chaos
      – Examples: galaxy development, phase transitions, Gaia, autocatalysis, amino acids → proteins, political swings, AI & IA?
      – Might this also apply to testing??
      [Graph: sophistication vs diversity over time – Physics → Chemistry → Biology → Social sciences]
      Extrapolation from various sources, esp. Stuart Kauffman, “The Origins of Order”, “Investigations”
    • History of testing is intertwined in “ecosystems” with technology, software lifecycles, etc
      [Graph: sophistication vs diversity – Development: Structured methodologies → CASE tools → immature Agile → mature Agile?; Testing & Quality: Psychology → Method → Art → Engineering → Craft → Technology → Social science]
    • And within testing, different contexts have so far evolved in separate streams?
      [Graph: within Testing & Quality, TRADITIONAL “SCHOOLS” (Method → Art → Engineering → Craft → Technology) and CONTEXT-DRIVEN (Psychology → Social science) as separate streams]
      – Limited dialogue, mutual mistrust, “language” differences
      – Recent changes regarding “school” & “approach”
    • An “emergent” view of innovation
      – Eight related ideas from the history of human innovation, including: 1. Adjacent possible; 2. Liquid networks; 3. Slow hunch; 4. Serendipity; 5. Error; 6. Exaptation; 7. Platforms
      [Graphic: “0” → Reef → City → Web; Johnson’s ideas overlaid here on Neil Thompson’s graphic]
    • Emergent view: (a) innovation framework
      – 1. Adjacent possible: things happen wherever they can happen; “patterns of innovation are fractal”
      – 2. Liquid networks: ideas flowing without friction
      – 7. Platforms: once a new level is established, can build on it, almost without thinking
      – Coral reefs are a surprisingly diverse habitat, because they are a crowded, wave-washed boundary zone
      – Cities concentrate minority interests where they can communicate
      – Tech innovations used to take 10 years; on the web, 1 is enough
      [Graphic: “0” → Reef → City → Web]
    • Emergent view: (b) innovation “techniques”
      – 3. Slow hunch: many innovations are not eureka moments; they take time to evolve & establish
      – 4. Serendipity: you may find something different, but it’s important to be seeking something
      – 5. Error: noise can make us focus more; OK to fail, but try to fail fast
      – 6. Exaptation: modifications can be hi-jacked for unexpected things (and beneficially)
      [Graphic: “0” → Reef → City → Web; image tattoos99.com]
    • A brief history of human innovation
      Source: Steven Johnson, “Where Good Ideas Come From: the natural history of innovation”
      – 1400-1600: most discoveries by “amateur individuals”, eg supernovae (Brahe)
      – 1600-1800: rise of amateur communities, eg the Milky Way (Al-Biruni, Galileo, Herschel & his sister)
      – 1800-current: rise of market communities, eg radio (Marconi, Tesla, Braun, Hertz etc)
    • So, what could software testing learn from the history of innovation? (HUMAN HISTORY → SOFTWARE TESTING)
      – Reef, City, Web: even if introvert, use LinkedIn, Twitter etc
      – Adjacent possible: try modifying / combining / hybridising techniques; they’re not set in stone (eg 2-D classification trees)
      – Slow hunch, Exaptation: keep a notebook – you never know what may come in handy eventually (see also Jerry Weinberg’s Fieldstone method)
      – Serendipity: if a trail goes cold, turn your nostrils in some other direction
      – Platforms: seek new uses of previous achievements, eg test automation in new ways (high-volume random)
      – Communities (market & amateur): even competitors in this market seem to collaborate and mutually respect each other – keep it up! Attend conferences etc
    • An additional thought
      – Testing contexts will of course continue to differ: traditional, risk-averse sectors vs market-chasing, product-oriented, risk-tolerant / risk-embracing sectors, but...
      – More mutual dialogue may increase innovation, on both sides...
      – ...if we can all share understanding across varied contexts
      [Graph: sophistication vs diversity – renovated risk & Science, as UNIFICATION?]
    • A brief history of testing innovation?
      – 1950s-1999?: guru individuals?
      – 2000-2012?: communities in relative isolation? (Analytic; Agile (Test-Driven); Quality; Factory; Context-Driven)
      – 2012 onwards?: communities interacting more?
    • Key references & acknowledgements (NB this is not a full bibliography)
      – Use of Risk in testing (yes, other sources are available!):
        – Kaner, Bach & Pettichord: Lessons Learned in Software Testing
        – Craig & Jaskiel: Systematic Software Testing
        – Gerrard & Thompson: Risk-Based E-Business Testing
      – Principles contributing to the Value Flow ScoreCard:
        – Kaplan & Norton: The Balanced Scorecard – Translating Strategy into Action
        – Isabel Evans, Mike Smith, Software Testing Retreat
      – History & innovations in testing:
        – Gelperin & Hetzel: The Growth of Software Testing
        – (Meerts: testingreferences.com incl. timeline – see Paper)
        – Stuart Reid: Lines of Innovation in Software Testing
      – Emergence:
        – Dennett: Darwin’s Dangerous Idea
        – Eldredge & Gould: Punctuated Equilibria... (in Models in Paleobiology)
        – Kauffman: The Origins of Order; Investigations; etc
        – Johnson: Where Good Ideas Come From
        – (+ Kurzweil: The Singularity is Near?!)
    • Takeaway ideas
      – All testing is risk-based / value-inspired, whether or not you recognise it yet (so, make a virtue of it)
      – Embrace diversity; discuss! Don’t dismiss, disrespect or just “agree to differ”
      – Mix with lots of non-testers
      – Seek out analogies & metaphors
      – Depending on your personality:
        – read lots of books (eg “things to read together” = the adjacent possible)
        – do lots of thinking – deliberate & unintended
        – participate in blogs, discussion groups
      – Remember: change is accelerating, and innovation is fractal!