Memes & Fitness Landscapes - analogies of testing with sci evol (2011)
Conference of the Association for Software Testing (CAST) Aug 2011 (Emerging Topics track), near Seattle.

Presentation Transcript

  • Memes and Fitness Landscapes, down at the Schools-yard: Analogies of testing context with scientific evolution (Emerging Topics track, v1.0). Slides 15 minutes (not “lightning”, but...), discussion 5 minutes (plus after, whenever!). Neil Thompson, © Thompson information Systems Consulting Ltd
  • Four, five, six... schools of software testing? (Updated version, March 2007. Copyright © 2003-2007 Bret Pettichord; permission to reproduce granted with attribution.)
    – Analytic: emphasis on analytical methods for assessing the quality of the software, including improvement of testability by improved precision of specifications, and many types of modeling
    – Quality (Control): emphasis on standards and processes that enforce or rely heavily on standards; policing developers and acting as “gatekeeper”
    – Factory: emphasis on reduction of testing tasks to routines that can be automated or delegated to cheap labour
    – Agile (Test-Driven): emphasis on code-focused testing by programmers
    – Context-Driven: emphasis on adapting to the circumstances under which the product is developed and used
    – Possible additions: Oblivious (“Groucho”?), Axiomatic?, Neo-Holistic? (like Context-Driven)
    Annotations by Neil Thompson after the Bret Pettichord ppt (blue text), the list in Cem Kaner’s blog December 2006 (black text), and other sources (red text)
  • Why I want to talk about this
    – To understand better why and how different people work and communicate in different ways (eg balance of personal and situational factors)
    – To try to broker more interactive and detailed debate (with less emotional confrontation?) between people exhibiting the characteristics of different Schools
    – To help pave the way for the evolution of testing into the future, eg using power tools such as Grounded Theory with statistical analysis, Artificial Intelligence concepts such as Bayesian belief networks and genetic algorithms; testing AI itself!
  • Precursor talk to this: “The Science of Software Testing”
    – Myers view of testing: from System Requirements/Specification/Design, “aim to find bugs”; run a Test on the Product; if the test result is “as aim” (a bug found), the test is “successful”; if not, the test is so far “unsuccessful”
    – Popper view of science: from a Hypothesis, “aim to falsify the hypothesis”; run an Experiment on part of the cosmos; if the experiment result is “as aimed”, falsification is confirmed; if not, the hypothesis is not yet falsified
    Note: this is starting with some “traditional” views of testing & science
  • This comparison informs the hot topic: testing versus “just” “checking”
    – “Check”: from System Requirements/Specification/Design, derive an Expected result; if the check result on the Product equals the expected result, the check “passes”; otherwise it “fails”
    – Test: also draws on other oracles, other quality-related criteria and ways the system could fail; if the test result is appropriate, it yields quality-related information; otherwise, information on quality issues
    Sources: Michael Bolton blog and various Context-Driven material
  • A complementary view: Testing as facilitating value flow through SDLC
    – Levels of documentation, pushed by specifiers: Requirements, Functional Spec, Technical Design, Unit / Component specifications (+ Test Specifications)
    – Flow of fully-working software, pulled by customer demand: Unit / Component-tested, Integrated, System-tested, Accepted WORKING SOFTWARE
  • Value Flow in SDLC as layers with four characteristics
    – Each layer has: levels of specification, desired quality, stakeholders, and levels of tested (“known” quality) system & service integration
    – Business processes: Business, Users, Business Analysts, Acceptance Testers
    – Requirements: Architects, “independent” testers
    – Functional & NF specifications: Designers, integration testers
    – Technical spec, hi-level design; detailed designs: Developers, unit testers
    – But, how to measure & manage?
    – Remember: not only for waterfall or V-model SDLCs; rather, iterative / incremental go down & up through layers of stakeholders, specifications & integrations
  • Managing Value Flow through SDLC using VF ScoreCards
    – Based on Kaplan & Norton Balanced Business Scorecard and other “quality” concepts
    – Value chain ≈ Supply chain: in the IS SDLC, each participant should try to ‘manage their supplier’; for example, development supplies testing (in trad lifecycles, at least!)
    – We add the supplier viewpoint to the other 5, giving a 6th view of quality
    – So, each step in the value chain can manage its inputs, outputs and other stakeholders
    – Scorecard facets: Financial (efficiency, productivity, on-time, in budget, cost of quality); Customer / VALIDATION (risks, benefits, acceptance, satisfaction, complaints); Upward management (information gathering); Improvement (eg TPI/TMM, predictability, learning, innovation); Process (compliance, eg ISO9000, repeatability, mistakes); Product / VERIFICATION (risks, test coverage, faults, failures)
  • Value Flow ScoreCards can be cascaded (...but don’t necessarily need all of these!)
    – Business Analysts → Requirements Reviewers → Acceptance Test Analysts → AT Designers & Scripters → Acceptance Testers
    – Architects → Func Spec Reviewers → Sys Test Analysts → ST Designers & Scripters → Sys Testers
    – Designers → Tech Design Reviewers → Int Test Analysts → IT Designers, Scripters & Executers
    – Developers → Component Test Analysts, Designers & Executers? (via pair programming?)
    – Pieces of a jig-saw puzzle! In addition to “measuring” quality information within the SDLC, can use to align SDLC principles with higher-level principles from the organisation
  • Using “Metaphysics” & Science as analogies to inform Value Flow
    – Static values: Intellectual, Social, Biological, Inorganic
    – Layers of stakeholders: Business & processes, Users, Business Analysts, Acceptance Testers; Architects, “independent” testers; Designers, integration testers; Developers, unit testers
    – Corresponding layers of science: (Philosophy); Social sciences (& systems thinking); Biology; Chemistry: Organic; Chemistry: Inorganic; Physics
  • What was that about layers of Science?
    – There is a cascade (and approximate symmetry!):
      – Biology depends on Organic Chemistry
      – Organic chemistry depends on the special properties of Carbon
      – Chemical elements in the upper part of the periodic table come from supernovae
      – Elements in the lower part of the periodic table come from ordinary stars
      – Elements are formed from protons, neutrons, electrons (Physics)
      – ...quarks... string theory?? etc
    – It just so happens that humans are about equidistant in scale from the smallest things we can measure to the largest (Ouroboros: Greek Οὐροβόρος or οὐρηβόρος, from οὐροβόρος ὄφις, "tail-devouring snake")
    – Then... humans have evolved to use tools, build societies, read, invent computers... eg Social Sciences, Inventions
    – So, it is possible to think of pan-scientific evolution as a flow of value by humans
    – Now, back to software lifecycles...
    Sources: Daniel Dennett “Darwin’s Dangerous Idea”; “cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc)
  • Invert that picture: Testing is really at the top, and should diverge fast!
    – Kurzweil epochs: +0: Maths?!; 1: Chemistry & Physics; 2: Biology; 3: Brains; 4: Technology; 5: Bio methods integrated into technology?; 6: Intelligence into matter/energy patterns? (“SINGULARITY”)
    – Each level of progress generates possibilities, which are tested
    – Then, each level is a platform which, when established, is easily built upon by “cranes” (without having to worry about the details below)
    – After the science levels... humans made tools, talked and co-operated; printing gave us another level; now, software is following exponential growth
    – So, software testing should surf the wave of evolution (not flounder in the shallows behind it)
  • Evolution: first, traditional Darwinian (ie biological) (image slide)
  • Biological reproduction & evolution are controlled by Genes
    – Replication & Selection → Sophistication
    – Mutation → Diversity
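  The replication/selection/mutation loop this slide describes is the same loop a genetic algorithm runs, one of the AI concepts the talk proposes testers explore. A minimal sketch, assuming an invented toy fitness function (count of 1-bits) and made-up parameters that are not from the talk:

    ```python
    import random

    random.seed(42)

    GENOME_LEN = 20       # genome length (arbitrary for this example)
    POP_SIZE = 30
    MUTATION_RATE = 0.02  # per-bit chance of flipping ("mutation -> diversity")
    GENERATIONS = 60

    def fitness(genome):
        # Toy fitness: number of 1-bits (the "sophistication" to climb toward)
        return sum(genome)

    def mutate(genome):
        # Flip each bit independently with small probability
        return [bit ^ 1 if random.random() < MUTATION_RATE else bit
                for bit in genome]

    def evolve():
        pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
               for _ in range(POP_SIZE)]
        for _ in range(GENERATIONS):
            # Selection: the fitter half survives...
            pop.sort(key=fitness, reverse=True)
            survivors = pop[:POP_SIZE // 2]
            # ...replication with mutation refills the population
            pop = survivors + [mutate(random.choice(survivors))
                               for _ in range(POP_SIZE - len(survivors))]
        return max(pop, key=fitness)

    best = evolve()
    print(fitness(best))  # typically close to GENOME_LEN after 60 generations
    ```

  The point of the sketch is only the shape of the loop: selection pressure climbs a fitness landscape, while mutation keeps enough diversity to escape local peaks, which is the tension the following slides map onto schools of testing.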
  • Biological Evolution as Sophistication plotted against Diversity (chart: sophistication on the vertical axis, diversity on the horizontal)
    Source: Daniel Dennett “Darwin’s Dangerous Idea”
  • But evolution is not smooth?
    – “Gradual” Darwinism: sophistication grows smoothly with diversity
    – Punctuated equilibria: periods of equilibrium interrupted by jumps, eg “explosion” in species (Cambrian), spread into a new niche (Mammals), mass extinction (Dinosaurs)
    – (Third chart: number of species against sophistication and diversity)
    “Punctuated equilibria” idea originated by Niles Eldredge & Stephen Jay Gould
  • So... Evolution of Science overall
    – Arguably other sciences have not evolved smoothly either
    – Sudden advances, akin to punctuated equilibria in biological evolution
    – (Layers shown: Physics → Inorganic Chemistry → Organic Chemistry → Biology → Social sciences)
  • OK, what’s this got to do with software testing?
    – Social sciences evolution: Tools → Language → Books → Computers, with Tipping Points (Malcolm Gladwell)
    – We have an important and difficult job to do here!
  • ...and computers are evolving, in both sophistication and diversity, faster than software testing?
    – 1GL → 2GL → 3GL → 4GL; Object Orientation; Internet, Mobile devices; Artificial Intelligence?!
    – Are we ready to test AI??
  • How software testing has evolved so far?
    – Pre-1957, DEBUGGING: Weinberg (Psychology) (1961 & 71); Test + Debug; Programs; Think, Iterate
    – 1957, DEMONSTRATION: Hetzel (Method) (1972); Show meets requirements; Programs; Verify, + maybe Prove, Validate, “Certify”
    – 1976, DESTRUCTION: Myers (Art) (1976 & 79); Find bugs; Programs, Sys, Acceptance; + Walkthroughs, Reviews & Inspections
    – 1983, EVALUATION: ?; Measure quality
    – 1984, PREVENTION: Beizer (Craft?) (1984); Find bugs, + show meets requirements, + prevent bugs; + Integration
    – 2000, SCHOOL(S): Kaner et al (1988 & 99); Find bugs, in service of improving quality, for customer needs; Realistic, pragmatic, normal
    – 2011, Science?: Experiment & Evolve?; Neo-Holistic?
    Overall periods developed after Gelperin & Hetzel, “The Growth of Software Testing”, 1988 CACM 31 (6), as quoted on Wikipedia
  • Can we retrofit the Schools to this history?
    – Pre-1957, DEBUGGING: Weinberg (Psychology) (1961 & 71); Test + Debug; Programs; “no schools, but...”
    – 1957, DEMONSTRATION: Hetzel (Method) (1972); Show meets requirements; Programs; Standard (Control)
    – 1976, DESTRUCTION: Myers (Art) (1976 & 79); Find bugs; Programs, Sys, Acc; ?
    – 1983, EVALUATION: ?; Measure quality; Analytic
    – 1984, PREVENTION: Beizer (Craft?) (1984); Find bugs, + show meets requirements, + prevent bugs; + Int; Quality, Factory
    – 2000, SCHOOL(S): Kaner et al (1988 & 99); Find bugs, in service of improving quality, for customer needs; Agile (Test-Driven), Context-Driven
    – 2011, Science?: Experiment & Evolve?; Neo-Holistic?
  • The Philosophy of Science is also evolving!
    – Again, progress jerky, not smooth; paradigm shifts akin to punctuated equilibria
    – Succession (roughly): Classical → Logical Positivism → Empiricism → Popper → Kuhn → Lakatos → Laudan → Hull → Bayesianism, Grounded Theory...
    – So, perhaps the Philosophy of Software Testing could learn from this; perhaps it’s also evolving?...
  • Memes as an extension of the Genes concept
    – Biological evolution: Replication & Selection; Mutation
    – Mental, social & cultural evolution adds meme vehicles: Writing, Speech, Rituals, Gestures, “Other imitable phenomena”
    – Meme types: Symbols, Ideas, Beliefs, Practices
    – Platforms and “cranes” lift each level
    Theme developed from Daniel Dennett “Darwin’s Dangerous Idea”
  • Some candidates for Memes in software testing
    – Always-consider: Effectiveness, Efficiency; Risk management; Quality management; Insurance / Assurance
    – Decide process targets; assess where errors originally made & improve over time; be pragmatic over quality targets
    – Plan early, then rehearse-run acceptance tests; define & use metrics; give confidence (AT); define & detect errors (UT, IT, ST); use handover & acceptance criteria
    – V-model: what testing against; W-model: quality management; use independent system & acceptance testers
    – Risks: list & evaluate; tailor risks & priorities etc to factors; prioritise tests based on risks; plan based on priorities & constraints
    – Refine test specifications progressively; design flexible tests to fit; use appropriate techniques & patterns; allow appropriate script format(s); use synthetic + lifelike data; use appropriate tools
    – Define & measure test coverage; allow & assess for coverage changes; use appropriate skills mix; define & agree roles & responsibilities
    – Document execution & management procedures; optimise efficiency; distinguish problems from change requests; measure progress & problem significance; prioritise urgency & importance; distinguish retesting from regression testing; quantify residual risks & confidence
    Source: Neil Thompson STAREast 2003 (not “best practices” but reference points for variation?)
  • An example of a different software testing “memeplex”
    – Thinking like a tester; the role of testing; testing techniques; automating testing; documenting testing; planning the testing strategy; bug advocacy; interacting with programmers; managing the testing project; managing the testing group; your career in software testing
    Source: Neil Thompson BCS SIGiST 2002 review of “Lessons Learned in Software Testing” (Kaner, Bach & Pettichord)
  • Memeplexes and Fitness Landscapes: fixed / flexible?
    – Scorecard facets (Supplier, Process, Product, Customer, Financial, Improvement & Infrastructure, Upward management; Verification & Validation; risks, test coverage, faults, failures, complaints, cost of quality etc; Treble-V model)
    – RUGGED & FIXED? Standard (Control): standards and processes; Analytic: analytical methods, testability, precision of specs, many types of modeling, assessing quality of software; Quality: policing developers, acting as “gatekeeper”; Factory: reduction of tasks to routines delegated to cheap labour or automated
    – FLEXIBLE? Agile (Test-Driven): code-focused testing by programmers (needs automation); Context-Driven: emphasis on adapting to circumstances under which the product is developed & used
  • Are these separate species which cannot interbreed?
    – (Diagram: schools such as Factory, (Control), (Test-Driven) shown as separate peaks)
    – ...or is each part of an ecosystem with its suppliers and customers?
  • Shouldn’t (or doesn’t) Context-Driven subsume practices of other schools *when context is appropriate*?
    – (Diagram: Factory, (Control), (Test-Driven) enclosed by Neo-Holistic? (or Context-Driven itself?))
  • Examples of how memeplexes can help ascend peaks of fitness landscapes
    – Analytic memes (testability, precision of specs, analytical methods, many types of modeling, assessing quality of software) mapped onto the Supplier / Process / Product / Customer / Financial / Improvement & Infrastructure scorecard facets
    – PLUS... OR INSTEAD! Context-Driven memes, eg: W-model: quality management; assess where errors originally made; measure progress & problem significance; risks: list & evaluate; prioritise tests based on risks; allow & assess for coverage changes; quantify residual risks & confidence
    – Context-Driven thinking could address gaps and rebalance the scorecard; or it might prefer its own scorecard
  • A Tester’s Taxonomy for Meme generation & transmission
    – Replication & Selection via meme vehicles:
      – Writing: books, papers, standards, blogs
      – Speech: courses, conference talks, weekend testing, in bar with workmates, working on a project
      – Rituals: specific methods & processes, standup meetings
      – Gestures: V/W model
      – “Other imitable phenomena”: in bed/park/bar, thinking
    – Mutation of: Symbols, Ideas, Beliefs, Practices
  • Speculation on meme variations in software testing
    – Favoured meme vehicles may vary by school group:
      – Writing: books, papers, standards, blogs
      – Speech: courses (Certification v Proficiency), conference talks (but not necessarily the same conferences!), weekend testing
      – Rituals: specific methods & processes, standup meetings
      – Gestures: V/W model
    – Favoured meme types? Symbols, Practices, Beliefs, Ideas
    – School groups: Standard (Control), Analytic, Quality, Factory, Agile (Test-Driven), Context-Driven
  • Not only Evolution, but Emergence: progress along the order-chaos edge
    – For best innovation & progress, need neither too much order nor too much chaos
    – The “Adjacent Possible”
    – Might this also apply to testing? (Layers: Physics → Chemistry → Biology → Social sciences)
    Extrapolation from various sources, esp. Stuart Kauffman, “The Origins of Order”, “Investigations”
  • So... What’s the message, here?
    – Regarding schools of software testing: I believe they are a very useful concept, but:
      – please see analogies with species, fitness landscapes & ecosystems
      – don’t just preach to testers, educate the customers!
      – are schools really fixed around beliefs, or are they more flexible?
    – When you think & communicate, try using the memes framework to better understand what you are building on, where you are innovating, and what you want to achieve, with what audience:
      – could “old school” people please blog and tweet more?
      – could Context-Driven people write more books please?
      – let’s share our experiences on projects
    – Software testing is evolving, and should continue to evolve:
      – future jumps could be quite big (“platforms & cranes”)
      – there may be sudden paradigm shifts (cf punctuated equilibria, Per Bak’s sandpiles)
      – most fruitful path is on the chaos-order boundary?
  • Next steps already considering
    – Analyse more specific examples of memes in congenial & hostile environments: the “Extended Phemotype”!
    – More analogies of testing with history & philosophy of science, both for individual strategies and for improvement
    – Practical uses of Bayesianism to focus testing: already are some? What can we actually use? What’s coming?
    – Is there correlation between personality (eg Myers-Briggs, Belbin) and “membership” of schools of software testing? (Myers-Briggs fixed, Belbin can vary with situation & mood??)
    – Is the Cynefin construct of any use here? (pronounced “kanavin”?)
      – quadrants of systems/situations (simple, complicated, complex & chaotic) may be suitable for different school-type behaviour, or different responses of Context-Driven
      – suggests steps to approach, which vary in usage & sequence: Sense, Categorise, Analyse, Probe, Respond, Just-act
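  As a toy illustration of the “practical uses of Bayesianism to focus testing” idea above: a single Bayes-rule update of belief that a module is defect-prone, given one failed smoke test. All the numbers (prior and likelihoods) are invented for the example, not from the talk:

    ```python
    def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
        """Bayes' rule: P(H | E) for hypothesis H given evidence E."""
        p_evidence = (p_evidence_given_h * prior
                      + p_evidence_given_not_h * (1 - prior))
        return p_evidence_given_h * prior / p_evidence

    # Hypothetical numbers: prior belief the module is defect-prone,
    # and how likely a smoke-test failure is in each case.
    prior = 0.30             # P(defect-prone) before any testing
    p_fail_if_prone = 0.80   # P(smoke test fails | defect-prone)
    p_fail_if_clean = 0.10   # P(smoke test fails | not defect-prone)

    belief = posterior(prior, p_fail_if_prone, p_fail_if_clean)
    print(round(belief, 3))  # 0.774: one failure sharply raises suspicion
    ```

  Repeating the update as each new test result arrives (feeding the posterior back in as the next prior) is the kind of mechanism a Bayesian belief network would formalise; whether and how to use this in day-to-day test prioritisation is exactly the open question the slide poses.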
  • Questions to think about / discuss
    – Are (some?) people “stuck” in a school because of:
      – their personalities, upbringing, education/inculcation, deeply-held beliefs?...
      – the company they keep, or the jobs they tend to get recruited for?
      – what their boss wanted yesterday?
    – Are the schools themselves evolving? If so, how? Eg...
      – Context-Driven, from origins to book(s) to blogs to this conference?
      – Factory school in response to “maturing” of the outsourcing / offshoring market?
      – current Agile movements?
    – Do some memes replicate in spite of not really helping their hosts?
  • Questions to think about / discuss (continued)
    – Shouldn’t Context-Driven, “by definition”, embrace practices of all the other schools *where appropriate*? (Or are other schools bad even in their own context? Are cultural / ethical divisions insoluble?)
    – Analogies between testing schools and schisms in science, eg string theory v the others, frequentists v Bayesians
    – How should individuals in software testing evolve?
    – How should software testing be preparing for the future, eg testing Artificial Intelligence:
      – what happened to Genetic Algorithms?
      – what can we do with Grounded Theory?
      – and (as above) Bayesian?
    – What other big innovations could be coming? (see Steven Johnson)
    – Any memes about to die out?
    – Do we want to go beyond testing only software?
  • Main references & Acknowledgements
    – Bret Pettichord: “Four Schools” presentation
    – Dave Gelperin & Bill Hetzel: “The Growth of Software Testing” (paper); testing timeline
    – Stuart Reid: “Lines of Innovation in Software Testing” (paper)
    – Cem Kaner: blog, and “Software Testing as a Social Science” presentations
    – James Bach & Michael Bolton: blogs; plus Kaner, Bach & Pettichord: “Lessons Learned in Software Testing” (book)
    – Mike Smith: originating motivation & ideas, then co-development, of Value Flow ScoreCards; plus key input from Isabel Evans
    – Robert Pirsig: “Zen and the Art of Motorcycle Maintenance”, “Lila”
    – Sheldon Glashow: cosmic Ouroboros
    – Charles Darwin: books
    – Richard Dawkins: various books, esp. “The Extended Phenotype” & “Climbing Mount Improbable”
    – Daniel Dennett: various books, esp. “Darwin’s Dangerous Idea”
    – Susan Blackmore, Robert Aunger, Kate Distin etc: various books on Memes
    – Matt Heusser: blog post 31 Jul 2009, esp. comments by Laurent Bossavit & James Christie
    – Peter Godfrey-Smith: “Theory and Reality” (Philosophy of Science)
    – Stuart Kauffman: various books, eg “Investigations”
    – Ray Kurzweil: “The Singularity is Near”
    – Jurgen Appelo: “Management 3.0” (book & website)
    – Sharon Bertsch McGrayne: “The Theory That Would Not Die” (Bayes)
    – Dave Snowden: Cynefin
  • Thanks for listening! Questions & discussion?