The Science of Software Testing - Experiments, Evolution & Emergence (2011)


Published: British Computer Society Specialist Interest Group in Software Testing (SIGiST), Jun 2011, London

Published in: Technology, Education


  1. The Science of Software Testing: Experiments, Evolution & Emergence
     SIGiST (Specialist Interest Group in Software Testing), 21 Jun 2011
     via Value Flow, v1.0. Neil Thompson, © Thompson information Systems Consulting Ltd
  2. In the beginning testing was Methods (or Psychology?) – then came the Arts & Crafts movement(s)!
     • Contrary to popular belief, the first book devoted to software testing was 1973, ed. Bill Hetzel (technically a conference proceedings, Chapel Hill, North Carolina 1972) – contains the first V-model?!
     • But NB Jerry Weinberg had written in 1961 & 1971 of testing as intriguing, a puzzle, a psychological problem
     • Glenford Myers (1979) The Art...: but 1976 “unnatural, destructive process”... “problem in economics”
     • Brian Marick (1995) The Craft...: specifically for subsystem testing & object-oriented
     • Paul Jorgensen (1995) ...A Craftsman’s Approach: “Mathematics is a descriptive device that helps us better understand software to be tested”
  3. More about Gerald M. Weinberg
     • Ph.D. in Psychology (dissertation 1965, “Experiments in Problem Solving”)
     • In 1961’s Computer Programming Fundamentals with Herbert Leeds (revised 1966 & 1970): “testing... is by far the most intriguing part of programming”; “seldom a step-by-step procedure”... “normally must circle around”
     • 1967 Natural Selection as applied to Computers & Programs (!)
     • 1971 The Psychology of Computer Programming: “testing is first and foremost a psychological problem”; “one way to guard against... stopping testing too soon... is to prepare the tests in advance of testing and, if possible, in advance of coding”
     • General Systems Thinking (1975): the science of modelling and simplifying complex, open systems
     • Systems Thinking (1992): “observe what’s happening and ...understand the significance”; feedback loops, locking on to patterns, controlling & changing
     Acknowledgement to James Bach’s summary in “The Gift of Time” (2008 essays in honour of Jerry Weinberg on his 75th birthday)
  4. But what is software testing now?
     • Engineering? Graham Bath & Judy McKay (2008): but “engineering” isn’t in the glossary, or even the index! “What is a Test Analyst? Defining a role at the international level is not easy...”
     • Profession? EuroSTAR 2010, Isabel Evans and others; magazine(s)
     • Context-Driven? Yes, but: Context-Driven school and artistic? C-D and engineering??? “Craft” is often used in Context-Driven discussions. Plus science & passion!
     • Also, more later about the “schools” of software testing!
  5. Defining Quality is even more difficult!
     Robert M Pirsig:
     • Zen and the Art of Motorcycle Maintenance – an Inquiry into Values (Bodley Head 1974; also see http://en.wikipedia.org/wiki/Zen_and_the_Art_of_Motorcycle_Maintenance )
     • Lila – an Inquiry into Morals (Bantam 1991; also see http://en.wikipedia.org/wiki/Lila:_An_Inquiry_into_Morals )
  6. “Quality is value to some person(s)” – Jerry Weinberg, Quality Software Management 1992
     [Slide image: a crowd of stakeholders on a summit, each saying “Quality is value to me” – “Yes, but...”. “Summit” image from www.topnews.in]
  7. “The Science of Software Testing” isn’t a book yet, but...
     • Boris Beizer (1984) experimental process, and (1995) falsifiability (the well-known Popper principle)
     • Rick Craig & Stefan Jaskiel (2002) black-box science & art, “white-box” science
     • Marnie Hutcheson (2003) software art, science & engineering
     • Kaner, Bach & Pettichord (2002) explicit science:
       – theory that software works, experiments to falsify
       – testers behave empirically, think sceptically, recognise limitations of “knowledge”
       – testing needs cognitive psychology, inference, conjecture & refutation
       – all testing is based on models
  8. Some bloggers have been more specific and detailed
     • Paul Carvalho (www.staqs.com) – testing skills include: learning / relearning the scientific method (multiple sources!); knowledge of probability & statistics
     • Randy Rice (www.riceconsulting.com) – science rigour decreasing? but: testing analogies with observation, experiment, hypothesis, law etc
     • David Coutts (en.wikipedia.org/wiki/User:David_Coutts) – yes, “context”, but: both science & software testing have “right / wrong answers”, so a test passes/fails based on its requirements. However, science & testing go beyond falsificationism to economy (few theories explaining many observations), consistency, maths foundation & independent verification; retesting is a different theory
     • BJ Rollison (blogs.msdn.com/b/imtesty/ – note this is on his old blog): too many hypotheses to test in a reasonable time; debugging is scientific also
     • Cem Kaner (www.kaner.com) – software testing as a social science: software is for people, we should measure accordingly; bad theories & models give blind spots & impede trade-offs
     Apologies to anyone I’ve so far missed!
  9. Part A: Experiments
 10. Why should it be useful to treat testing as a science?
     [Slide diagram: two parallel flows]
     • Testing: the System Requirements / Specification / Design gives the Expected result; the Test is run against the Product, giving the Test result. Is the Test result = Expected? Yes: the test “passes”. No: the test “fails”.
     • Science: a Hypothesis gives the Expected result; the Experiment is run against part of the cosmos, giving the Experiment result. Is the result = Expected? Yes: hypothesis confirmed. No: hypothesis rejected.
     Note: this is starting with the “traditional” views of testing & science
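The slide's parallel between a test and an experiment can be sketched in a few lines of code. This is an illustrative sketch, not from the presentation: the function names are invented, and the leap-year rule is deliberately incomplete so that one "experiment" refutes the hypothesis that the software works.

```python
def run_test(system_under_test, test_input, expected):
    """Run one test as a miniature experiment: the hypothesis is that the
    system meets its specification; comparing actual against expected
    either supports or contradicts that hypothesis."""
    actual = system_under_test(test_input)   # perform the experiment
    verdict = "passes" if actual == expected else "fails"
    return actual, verdict

def leap_year(y):
    # Deliberately incomplete: ignores the 400-year rule, so 2000 is misjudged
    return y % 4 == 0 and y % 100 != 0

print(run_test(leap_year, 2012, True))   # (True, 'passes')  - hypothesis survives
print(run_test(leap_year, 2000, True))   # (False, 'fails')  - hypothesis refuted
```

A passing test, like a confirmed prediction, only fails to refute the hypothesis; the failing case is the one that carries new information.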
 11. What is software testing? Definitions through the ages
     • Pre-1957, DEBUGGING (Psychology) – exemplar Weinberg (1961 & 71); objectives: test + debug; scope: programs; approach: think, iterate
     • 1957, DEMONSTRATION (Method) – exemplar Hetzel (1972); objectives: show meets requirements; approach: verify, + maybe prove, validate, “certify”
     • 1976, DESTRUCTION (Art) – exemplar Myers (1976 & 79); objectives: find bugs; scope: programs, systems, acceptance; approach: + walkthroughs, reviews & inspections
     • 1983, EVALUATION – objectives: measure quality
     • 1984, PREVENTION (Craft?) – exemplar Beizer (1984); objectives: find bugs, + show meets requirements, + prevent bugs; scope: + integration
     • 2000, SCHOOL(S) – exemplar Kaner et al (1988 & 99); objectives: find bugs, in service of improving quality, for customer needs; approach: realistic, pragmatic, normal
     • 2011, Science? – experiment & evolve? Neo-Holistic?
     Overall periods developed after Gelperin & Hetzel, “The Growth of Software Testing”, 1988 CACM 31 (6), as quoted on Wikipedia
 12. So, how would these “methods” look if we adopt Myers & Popper?
     • Testing: the aim is to find bugs. Is the Test result “as aimed”? Yes: the test is “successful”. No: the test is so far “unsuccessful”.
     • Science: the aim is to falsify the hypothesis. Is the Experiment result “as aimed”? Yes: falsification confirmed. No: hypothesis not yet falsified.
     Note: this is starting with the “traditional” views of testing & science
 13. A current hot topic: testing versus “just checking”
     • A “check”: the System Requirements / Specification / Design gives an Expected result. Is the Check result = Expected? Yes: the check “passes”. No: the check “fails”.
     • A test: draws also on other oracles, other quality-related criteria, and the ways the system could fail. Is the Test result appropriate? Yes: quality-related information. No: information on quality issues.
 14. Exploratory testing is more sophisticated than pre-designed, and does not demand a system specification
     • Draws on: context, heuristics, epistemology, cognitive psychology, abductive inference, other oracles, other quality-related criteria, ways the system could fail, bug advocacy
     • Test Framing:
       – context, mission, requirements, principles, oracles, risks
       – models, value ideas, skills, heuristics, cost/value/time “issues”
       – mechanisms, techniques, procedures, execution methods
       – deliverables
 15. Science should help to understand overlapping models, and to derive better test models
     • Development models & test models each cover subsets of actual & potential reality: REAL WORLD (desired), DEV MODEL (expected), TEST MODEL (verified / validated), SOFTWARE (observed)
     • Examples of development model techniques: entity relationships; state transitions
     • Examples of test model techniques – the above plus: equivalence partitioning, domain testing & boundaries; transaction, control & data flows; entity life history (CRUD); classification trees; decision tables; timing; opportunity for more? (after Software Testing: A Craftsman’s Approach, Paul Jorgensen)
     • In “checking”, the test model tries to cover the development model
     • Testing (sapient) should expand its model beyond that, as far into actual/potential real behaviour as stakeholders want and can pay for
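Of the test-model techniques listed, equivalence partitioning with boundary values is the easiest to sketch in code. The exam-marks partitions below are invented for illustration:

```python
def boundary_values(partitions):
    """Given equivalence partitions as inclusive (low, high) ranges,
    derive classic boundary test values: each edge plus its neighbours."""
    values = set()
    for low, high in partitions:
        values.update({low - 1, low, low + 1, high - 1, high, high + 1})
    return sorted(values)

# Illustrative spec: marks 0-39 fail, 40-69 pass, 70-100 distinction
marks = boundary_values([(0, 39), (40, 69), (70, 100)])
print(marks)
# [-1, 0, 1, 38, 39, 40, 41, 68, 69, 70, 71, 99, 100, 101]
```

Note how the values just outside the whole range (-1 and 101) fall out automatically: the model deliberately reaches beyond the development model into "actual/potential real behaviour", as the slide argues.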
 16. Heuristics, patterns & techniques – and scientific analogues?
     Testing side:
     • Heuristics †: art of discovery in logic; education method in which the student discovers for self; principle used in making decisions when all possibilities cannot be fully explored!
     • Patterns *: catchy title; description of the problem addressed; solution to the problem; context in which the pattern applies; some examples
     • Techniques †: method of performance; manipulation; mechanical part of an artistic performance!
     Scientific analogues:
     • Conjectures w: proposition that is unproven but is thought to be true and has not been disproven
     • Hypotheses w: testable statement based on accepted grounds
     • Theories w (+ Laws!): proposed explanation of empirical phenomena, made in a way consistent with scientific method and satisfactorily tested or proven
     • Tests ↔ Experiments
     † Definitions based on Chambers 1981; * definitions based on Software Testing Retreat #2, 2003; w definitions based on Wikipedia 2011
 17. But wait! Is there one, agreed, “scientific method”?
     • No! These are the first dozen I found (unscientifically*)
     • Only two are near-identical, so here are eleven variants, all with significant differences (extent, structure & content)
     • The philosophy of science has evolved (see later slides)
     * Images from various websites, top-ranking of Google image search, May 2011
 18. So... a post-Popper view of theories: how science could help coverage
     • Coverage is multi-dimensional – so General Systems Thinking helps analyse dimensions, eg choose 2-D projections
     • And models need to map this multi-dimensional space – not quite a hierarchy:
       – Development Model, of which...
       – Test Model should be a superset, of which...
       – Real World is an unattainable superset – but Test should get as close as appropriate
     • Hybridise & innovate new techniques based on heuristics & patterns – and taking inspiration from conjectures, hypotheses, theories...
     • Remember the multiple ways software can fail: program state (including uninspected outputs); system state; intended inputs vs monitored outputs; configuration and system resources vs impacts on connected devices / system resources; from/to other cooperating processes, clients or servers (+ see various bug taxonomies)
     Sources: Neil Thompson EuroSTAR 1993; Doug Hoffman via www.testingeducation.org
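The "choose 2-D projections" idea can be made concrete: take a multi-dimensional test space and enumerate, for each pair of dimensions, the value combinations that the projection asks you to cover. The dimensions below are invented for illustration:

```python
from itertools import combinations, product

# Invented test dimensions for a hypothetical application
dimensions = {
    "browser": ["Firefox", "IE"],
    "locale":  ["en", "fr", "de"],
    "user":    ["guest", "admin"],
}

# The full combinatorial space: 2 * 3 * 2 = 12 test configurations
full_space = list(product(*dimensions.values()))

# 2-D projections: for each pair of dimensions, the value pairs to cover
projections = {
    (a, b): list(product(dimensions[a], dimensions[b]))
    for a, b in combinations(dimensions, 2)
}

print(len(full_space))                              # 12
print({pair: len(v) for pair, v in projections.items()})
# {('browser', 'locale'): 6, ('browser', 'user'): 4, ('locale', 'user'): 6}
```

Covering every 2-D projection (pairwise coverage) typically needs far fewer runs than the full space, which is how analysing dimensions helps manage multi-dimensional coverage.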
 19. Part B: Evolution & Value Flow ScoreCards
 20. Traditional Darwinian evolution (ie biological)
     [Slide image from www.qwickstep.com]
     • Nearly everyone is familiar with this, but...
 21. ...arguably Darwinian evolutionary principles apply beyond biology
     • There is a cascade (& approximate symmetry!):
       – Biology depends on Organic Chemistry
       – Organic chemistry depends on the special properties of Carbon
       – Chemical elements in the upper part of the periodic table come from supernovae
       – Elements in the lower part of the periodic table come from ordinary stars
       – Elements are formed from protons, neutrons, electrons (Physics)
       – ... quarks... string theory?? etc
     • It just so happens that humans are about equidistant in scale from the smallest things we can measure to the largest
     • Humans have evolved to use tools, build societies, read, invent computers...
     • So, it is possible to think of pan-scientific evolution as a flow of value – eg inventions by humans, Social Sciences
     • Now, back to software lifecycles...
     (Ouroboros: Greek Οὐροβόρος or οὐρηβόρος, from οὐροβόρος ὄφις, “tail-devouring snake”)
     Sources: Daniel Dennett, “Darwin’s Dangerous Idea”; the “cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc); image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF
 22. The software lifecycle as a flow of value
     • Working systems have value; documents in themselves do not; so the direct route from raw materials (stated requirements) through programming to the finished product (demonstrations & acceptance tests) is the quickest route!
     • SDLCs are necessary, but introduce impediments to value flow: misunderstandings, disagreements… implicit requirements, documented requirements, meetings / escalations to agree, intermediate documentation! Documents are like inventory/stock, or “waste”
 23. To improve value flow: agile methods following principles of lean manufacturing
     • Levels of documentation, pushed by specifiers: Requirements, + Functional Spec, + Technical Design, + Unit / Component specifications, + Test Specifications
     • Flow of fully-working software, pulled by customer demand: Unit / Component-tested → Integrated → System-tested → Accepted WORKING SOFTWARE
 24. But any lifecycle should be improvable by considering the value flow through it
     • The context influences what deliverables are mandatory / optional / not wanted
     • Use reviews to find defects & other difficulties fast
     • Do Test Analysis before Test Design (again, this finds defects early, before a large pile of detailed test scripts has been written)
     • Even if pre-designed testing is wanted by stakeholders, do some exploratory testing also
     • “Agile Documentation”*: use tables & diagrams; consider wikis etc; take care with structure
     * These points based on a book of that name, by Andreas Rüping
 25. Testing has a hierarchy, eg...
     (Levels of testing, with their specifications, stakeholders, and risks & responsibilities:)
     • Acceptance Testing – business processes & requirements; business, users, business analysts, acceptance testers; users may be unhappy (so generate confidence)
     • System Testing – functional & non-functional specifications; architects, “independent” testers; the system may contain bugs not found by lower levels (so seek bugs of type z)
     • Integration Testing – technical spec, high-level design; designers, integration testers; units may not interact properly (so seek bugs of type y)
     • Unit Testing – detailed designs; developers, unit testers; individual units may malfunction (so seek bugs of type x)
     Remember: this is not only for waterfall or V-model SDLCs; iterative / incremental lifecycles also go down & up through layers of stakeholders, specifications & system integrations
 26. ...Quality and Science can also be seen as hierarchies, which testing can parallel
     • Static values (after Pirsig): Intellectual, Social, Biological, Inorganic
     • Layers of science: Philosophy; Social sciences; Biology (& systems thinking); Chemistry: Organic; Chemistry: Inorganic; Physics
     • These sit alongside the layers of stakeholders and levels of system & service integration from the previous slide (business, users, business analysts, acceptance testers; architects, “independent” testers; designers, integration testers; developers, unit testers)
 27. Value flows down through, then up through, these layers
     • Desired quality flows down through the layers of stakeholders (business, users, business analysts, acceptance testers; architects, “independent” testers; designers, integration testers; developers, unit testers)...
     • ...and tested (“known”) quality flows back up through them
 28. So, test appropriately to your scale
     (Understanding of the solution, mapped from the quantum end of Physics to the gravity end:)
     • Physics (quantum end) – Unit Testing
     • Chemistry: Inorganic – Integration Testing
     • Organic chemistry / Biology – System Testing
     • Social sciences / systems thinking – Acceptance Testing
 29. How different sciences can inspire different levels of testing
     • First, Unit / Component Testing (Physics): think quanta (the smallest things you can do to the software), equivalence partitions, data values
     • For Integration Testing (Chemistry): think about interactions, what reactions should be, symmetry, loops, valencies, performance of interfaces
     • For System Testing (Biology): fitness for purpose, entity life histories, ecosystems, palaeontology (historic bugs)
     • For Acceptance Testing: think “Social Sciences” – what are the contractual obligations?
     • For each test level, consider and tune the value which that level is adding...
 30. Value Flow ScoreCards (...have been presented previously, so these slides are for background and will be skimmed through quickly in the presentation)
     • Based on Kaplan & Norton’s Balanced Business Scorecard and other “quality” concepts
     • Value chain ≈ supply chain: in the IS SDLC, each participant should try to ‘manage their supplier’; for example, development supplies testing (in traditional lifecycles, at least!); we add the supplier viewpoint to the other 5, giving a 6th view of quality
     • So, each step in the value chain can manage its inputs, outputs and other stakeholders
     • The six views: Supplier (upward management, information gathering); Process (compliance eg ISO9000, repeatability, VERIFICATION – risks, mistakes); Product (test coverage – faults, failures); Customer (VALIDATION – risks, benefits, acceptance, satisfaction, complaints); Financial (efficiency, productivity, on-time in budget, cost of quality); Improvement (eg TPI/TMM…, predictability, learning, innovation)
 31. Value Flow ScoreCards can be cascaded (...but you don’t necessarily need all of these!)
     • Business Analysts → Requirements Reviewers → Acceptance Test Analysts → AT Designers & Scripters → Acceptance Testers
     • Architects → Func Spec Reviewers → Sys Test Analysts → ST Designers & Scripters → Sys Testers
     • Designers → Tech Design Reviewers → Int Test Analysts → IT Designers, Scripters & Executers
     • Developers → Component Test Analysts, Designers & Executers? (via pair programming?)
     • Pieces of a jig-saw: in addition to “measuring” quality information within the SDLC, can use to align SDLC principles with higher-level principles from the organisation
 32. The Value Flow ScoreCard in action
     (The six views – Supplier, Process, Product, Customer, Financial, Improvement – laid out as a grid)
     • Yes – it’s just a table! …Into which we can put useful things…
     • We start with repositionable paper notes, then can put in spreadsheet(s)
 33. Use #1, Test Policy: all views included? Why-What-How (Goal-Question-Metric) thought through?
     The organisation’s goals & objectives cascade into the six scorecard views (Supplier, Process, Product, Customer, Financial, Improvement & Infrastructure):
     • Objectives (Why – GOAL), eg: IS actively supports employees; products to satisfy specified requirements and be fit for purpose; Project Manager is responsible for quality, Business Management for enforcing the Test Policy; staff must be certified; independence increases; constant improvement of development & test processes; detect defects early; use TestFrame for test analysis & execution; automate regression tests as much as possible; testing prioritised & managed with comprehensive scope per test type
     • Measures (What – QUESTION), eg: defect source analysis; both static & dynamic; Defect Detection Percentage; product risks; importance of requirements; ISTQB; TMM levels; frequency of process adjustments; planning, preparation & evaluation metrics
     • Targets & Initiatives (How – METRIC), eg: software & related work products; advisors Expert, managers Advanced, analysts Foundation; TMM level 2 at least now, TMM level 3 within 2 years; heeding twice per year
     Source: summarised from an example in TestGrip by Marselis, van Royen, Schotanus & Pinkster (CMG, 2007)
 34. Use #2, test coverage: test conditions as measures & targets (not test cases!)
     (From the Level Test Plan and test bases, with information from other levels of the Treble-V model:)
     • Objectives: test items (level of integration); features to be tested / not to be tested; test basis references; constraints; product benefits & product risks
     • Measures: areas we could cover
     • Targets: the test conditions we intend to cover – agreed with stakeholders
     • Initiatives: objectives for test cases (to test design & execution, and to the next level of system design)
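The slide's point, that coverage should be measured over agreed test conditions rather than counted test cases, can be sketched as a tiny calculation (the condition names are invented):

```python
def condition_coverage(agreed, covered):
    """Coverage measured against agreed test conditions (the targets),
    not against a count of test cases (the initiatives)."""
    agreed_set = set(agreed)
    hit = agreed_set & set(covered)
    return len(hit) / len(agreed_set), sorted(agreed_set - hit)

agreed = ["login ok", "login locked after 3 failures",
          "password reset", "audit trail written"]
covered_so_far = ["login ok", "password reset"]

pct, gaps = condition_coverage(agreed, covered_so_far)
print(f"{pct:.0%} of agreed conditions covered; gaps: {gaps}")
# 50% of agreed conditions covered; gaps: ['audit trail written', 'login locked after 3 failures']
```

Running a thousand test cases against the two covered conditions would not move this number, which is exactly the distinction the slide draws.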
35. Use #3: process improvement, eg via Goldratt's Theory of Constraints: "swimlane" symptoms, causes & proposed remedies
[Diagram: swimlanes running CURRENT ILLS → CONFLICT RESOLUTION → FUTURE REMEDIES → PRE-REQUISITES → TRANSITION, mapped against the ScoreCard rows Objectives, Measures, Targets and Initiatives.]
Note: this is similar to Kaplan & Norton's "Strategy Maps" (Harvard Business School Press, 2004). When cause-effect branches form feedback loops, this becomes part of Systems Thinking.
36. Use #4a: context-driven testing, eg Goldratt conflict resolution on process areas with choices
[Diagram: from context/circumstances to appropriate testing. CURRENT SITUATION inputs: legal factors (regulation, standards) and moral factors (safety); process constraints (eg quality management, configuration management); application characteristics; technical risks and business risks; sector; culture; technology; job type & size; resources (money, skills, environments, time). CHOICE AREAS (about 30 categories), eg test specifications and handover & acceptance criteria, each a range from informal to formal. Conflict resolution over where in each range to sit (on specific aspects) leads to the DESIRED SITUATION: appropriate testing in this context/circumstances.]
37. Use #4b: lifecycle methodology selection / design, Value Flow ScoreCard as unifying framework
[Diagram: risks appear in every ScoreCard column and must BALANCE. Candidate approaches range from Game Theory to "methodology per project" (any other approaches?). Weighing the conflicts & balances – "one hand" against "the other" – leads to an appropriate lifecycle methodology in this context/circumstances.]
38. A further use, #4c??
• Could Value Flow ScoreCard ideas help discuss & bridge the (arguably) growing divide between traditional & agile software practitioners, eg:
  – waterfall, V-model, W-model, iterative, incremental...?
  – "schools" of software testing, eg Analytic, Standard, Quality, Context-Driven, Agile... Factory, Oblivious...?
  – scripted (or at least pre-designed) & exploratory testing?
• To attempt this, let's extend the evolution & value flow concepts to...
39. Part C: Emergence & Value Flow Science
40. Evolution as Sophistication plotted against Diversity
[Diagram: a graph with Sophistication on one axis and Diversity on the other.]
Source: Daniel Dennett, "Darwin's Dangerous Idea"
41. Punctuated equilibria
[Diagram: "gradual" Darwinism (number of species rising smoothly against diversity) contrasted with punctuated equilibria on Sophistication-versus-Diversity axes: an equilibrium, then an "explosion" in species (eg the Cambrian), another equilibrium, a mass extinction (eg the dinosaurs), then spread into a new niche (eg mammals), and a further equilibrium.]
The "punctuated equilibria" idea originated with Niles Eldredge & Stephen Jay Gould. Images from www.wikipedia.org
42. Evolution of Science overall
[Diagram: sciences layered by scale – Physics, then Chemistry (inorganic, then organic), then Biology, then the Social sciences.]
43. Not only Evolution, but Emergence: progress along the order-chaos edge
• For best innovation & progress, need neither too much order nor too much chaos
• The "Adjacent Possible"
[Diagram: the Physics → Chemistry → Biology → Social sciences layers progressing along the edge between order and chaos.]
Extrapolation from various sources, especially Stuart Kauffman, "The Origins of Order" and "Investigations"
44. OK, what's this got to do with software testing?
[Diagram: further layers built on the Social sciences – Tools, Language, Books, Computers.]
• We have an important and difficult job to do here!
45. ...and computers are evolving, in both sophistication and diversity, faster than software testing?
[Diagram: computing generations layered upwards – 1GL, 2GL, 3GL, 4GL, Object Orientation, the Internet, mobile devices, and Artificial Intelligence?!]
• Are we ready to test AI??
46. The Philosophy of Science is also evolving!
[Diagram: philosophies of science layered upwards – Classical, Logical Positivism, Empiricism, Popper, Kuhn, Lakatos, Laudan, then Bayesianism, Grounded Theory...]
• So, perhaps the Philosophy of Software Testing could learn from this; perhaps it's also evolving?...
47. Part D: Platforms & Cranes (Genes to Memes)
48. Biological reproduction & evolution is controlled by Genes
[Diagram: replication & selection plus mutation driving movement across Sophistication-versus-Diversity axes.]
Images from www.qwickstep.com and schools.wikipedia.org
49. Memes as an extension of the Genes concept
[Diagram: mental, social & cultural evolution stacked above biological evolution, driven by the same replication & selection plus mutation, and lifted by "cranes" standing on "platforms". Memes include writing, speech, rituals, gestures, symbols, ideas, beliefs, practices and "other imitable phenomena".]
Image from www.salon.com; taxonomy from www.wikipedia.org. Theme developed from Daniel Dennett, "Darwin's Dangerous Idea"
50. Some candidates for Memes in software testing
• Effectiveness, efficiency; always consider risk management & quality management
• Decide process targets; assess where errors were originally made & improve over time
• Insurance, assurance; be pragmatic over quality targets
• Plan early, then rehearse-run acceptance tests; define & use metrics
• Give confidence (acceptance testing); define & detect errors (unit, integration & system testing)
• Use handover & acceptance criteria
• V-model: what testing is against; W-model: quality management
• Use independent system & acceptance testers
• Risks: list & evaluate; tailor risks & priorities etc to factors
• Use appropriate skills mix; define & agree roles & responsibilities
• Prioritise tests based on risks; plan based on priorities & constraints
• Refine test specifications progressively; design flexible tests to fit
• Use appropriate techniques & patterns; allow appropriate script format(s)
• Define & measure test coverage; use synthetic + lifelike data; use appropriate tools
• Allow & assess for coverage changes; document execution & management procedures
• Optimise efficiency; distinguish problems from change requests
• Measure progress & problem significance; prioritise urgency & importance
• Quantify residual risks & confidence; distinguish retesting from regression testing
Source: Neil Thompson, STAREast 2003
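Two of the memes above, "define & measure test coverage" and "allow & assess for coverage changes", can be sketched together. This is an illustrative sketch, not from the talk; the condition names are hypothetical, and coverage is counted over agreed test conditions (as slide 34 recommends) rather than over code.

```python
# Coverage of agreed test conditions (not code lines), reassessed when the
# agreed set changes mid-project. Condition names are illustrative.
agreed_conditions = {"login lockout", "payment rounding", "audit trail"}
covered_by_tests = {"login lockout", "payment rounding"}

coverage = len(agreed_conditions & covered_by_tests) / len(agreed_conditions)
print(f"{coverage:.0%} of agreed test conditions covered")  # 67%

# A coverage change: stakeholders agree a new condition mid-project,
# so the same tests now cover a smaller share of the agreed set.
agreed_conditions.add("currency conversion")
coverage = len(agreed_conditions & covered_by_tests) / len(agreed_conditions)
print(f"{coverage:.0%} after the change")  # 50%
```

Recomputing against the current agreed set is what "assess for coverage changes" amounts to: the denominator moves, not just the numerator.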
51. Four, five, six... schools of software testing?
(Updated version, March 2007. Copyright © 2003-2007 Bret Pettichord; permission to reproduce granted with attribution)
• Analytic: emphasis on analytical methods for assessing the quality of the software, including improvement of testability by improved precision of specifications, and many types of modelling
• Standard (Control): emphasis on standards and processes that enforce or rely heavily on standards
• Quality: emphasis on policing developers and acting as "gatekeeper"
• Factory: emphasis on reduction of testing tasks to routines that can be automated or delegated to cheap labour
• Context-Driven: emphasis on adapting to the circumstances under which the product is developed and used
• Agile (Test-Driven): emphasis on code-focused testing by programmers
• Oblivious / Groucho?
• Neo-Holistic? Axiomatic? (like Context-Driven)
Annotations by Neil Thompson after the Bret Pettichord ppt (blue text), the list in Cem Kaner's blog, December 2006 (black text), and other sources (red text)
52. Learning from the "Schools" situation
• Think in terms of memes: evolution and transmission
• Separate "what people have been taught" from:
  – what their bosses say they want (or should want??)
  – what their personalities push them towards
• Is school behaviour volatile? (context-driven!?)
• Things we could adapt from other disciplines:
  – see various conference talks, eg oil exploration
  – but what about insurance, and their actuaries?!
• Preparing for the future, eg testing Artificial Intelligence:
  – what happened to Genetic Algorithms?
  – what's the latest Bayesian application?
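As one answer to the slide's "what's the latest Bayesian application?" question, a Bayesian update also connects to the earlier meme "quantify residual risks & confidence". This is an illustrative sketch of my own, not from the talk; the prior and the per-test detection probability are assumed numbers.

```python
# Bayesian update of the belief that a component is buggy, given that
# n independent tests all passed. Assumptions (illustrative only):
#   prior P(buggy) = 0.3
#   each test fails with probability 0.2 if the component is buggy,
#   so P(one test passes | buggy) = 0.8, and P(passes | ok) = 1.0.

def posterior_buggy(prior, p_pass_given_buggy, n_passes):
    # Bayes' rule: P(buggy | all pass) proportional to prior * likelihood
    like_buggy = prior * p_pass_given_buggy ** n_passes
    like_ok = (1 - prior) * 1.0 ** n_passes
    return like_buggy / (like_buggy + like_ok)

belief = posterior_buggy(prior=0.3, p_pass_given_buggy=0.8, n_passes=10)
print(round(belief, 3))  # residual belief in "buggy" after 10 passes
```

The posterior drops from 0.3 towards zero as passing tests accumulate, which gives "confidence" a number instead of a feeling, exactly the kind of quantification actuaries would ask testers for.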
53. Conclusions & Summary
54. Conclusions
• The Ouroboros looks better with software testing at the top! Value flows upwards:
  – (right) from the Big Bang to planet Earth and human habitations; and
  – (left) from subatomic particles to humans
  – (the "origin", Quantum Gravity?, is yet to be agreed)
• Value Flow ScoreCards are useful, but this talk is more about applying the layered principles of science to IT quality
• Humans now evolve in terms of technology-aided Memes, and we can use that to understand & develop the future of software testing
55. Recap of messages for Testing & Quality
• When strategising, planning and performing testing:
  – test according to your scale, using analogies from different sciences to help "frame" your tests
  – use Value Flow ScoreCards to understand and balance your stakeholders
  – design experiments to seek different bug types at different levels (don't just "falsify" the opposite experiment)
• When considering your position & future in the testing industry:
  – it's not just teaching but also psyche, and what bosses want
  – "stand on the shoulders of giants", ie make use of the platforms which give huge leverage (eg exploratory automation)
56. Some wider advice
• When reading new material, use the Adjacent Possible – consider reading two authors at once (or maybe three):
  – either different representations of similar opinions, or
  – apparently opposing opinions
• (I've been quoting pairs of books on Twitter, "Things To Read Together")
57. References
• Already quoted slide-by-slide – for a summary of main sources see the associated article in "The Tester"
• Stop Press (since preparing this talk) – see also related views:
  – The Software Testing Timeline, www.testingreferences.com
  – Stuart Reid, "Lines of Innovation in Software Testing", 2011 paper
  – Jurgen Appelo, "Management 3.0" (2011 book and website, about agile leadership practices – makes significant use of complexity theory, eg Kauffman)
58. Thanks for listening! Questions?