A Rapid Introduction to Rapid Software Testing

You're under tight time pressure and have barely enough information to proceed with testing. How do you test quickly and inexpensively, yet still produce informative, credible, and accountable results? Rapid Software Testing, adopted by context-driven testers worldwide, offers a field-proven answer to this all-too-common dilemma. In this one-day sampler of the approach, Paul Holland introduces you to the skills and practice of Rapid Software Testing through stories, discussions, and "minds-on" exercises that simulate important aspects of real testing problems. The rapid approach isn't just testing with speed or a sense of urgency; it's mission-focused testing that eliminates unnecessary work, ensures that the most important things get done, and constantly asks how testers can help speed up the successful completion of the project. Join Paul to learn how rapid testing focuses on both the mind-set and skill-set of the individual tester, who uses tight loops of exploration and critical thinking to continuously re-optimize testing to match clients' needs and expectations.

  1. 1. MA Full-Day Tutorial 9/30/2013 8:30:00 AM "A Rapid Introduction to Rapid Software Testing" Presented by: Paul Holland Testing Thoughts Brought to you by: 340 Corporate Way, Suite 300, Orange Park, FL 32073 888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
  2. 2. Paul Holland Testing Thoughts An independent software test consultant and teacher, Paul Holland has more than sixteen years of hands-on testing and test management experience, primarily at Alcatel-Lucent where he led a transformation of the testing approach for two product divisions, making them more efficient and effective. As a test manager and tester, Paul focused on exploratory testing, test automation, and improving testing techniques.
  3. 3. A Rapid Introduction to Rapid Software Testing James Bach, Satisfice, Inc. james@satisfice.com www.satisfice.com +1 (360) 440-1435 Michael Bolton, DevelopSense mb@developsense.com www.developsense.com +1 (416) 656-5160 Paul Holland, Testing Thoughts paul@testingthoughts.com www.testingthoughts.com +1 (613) 297-3468 Acknowledgements • Some of this material was developed in collaboration with Dr. Cem Kaner, of the Florida Institute of Technology. See www.kaner.com and www.testingeducation.org. • Doug Hoffman (www.softwarequalitymethods.com) has also contributed to and occasionally teaches from this material. • Many of the ideas in this presentation were also inspired by or augmented by other colleagues including Jonathan Bach, Bret Pettichord, Brian Marick, Dave Gelperin, Elisabeth Hendrickson, Jerry Weinberg, Noel Nyman, and Mary Alton. • Some of our exercises were introduced to us by Payson Hall, Jerry Weinberg, Ross Collard, James Lyndsay, Dave Smith, Earl Everett, Brian Marick, Cem Kaner and Joe McMahon. • Many ideas were improved by students who took earlier versions of the class going back to 1996. 2 1
  4. 4. Assumptions About You • You test software, or any other complex human creation. • You have at least some control over the design of your tests and some time to create new tests. • You are worried that your test process is spending too much time and resources on things that aren’t important. • You test under uncertainty and time pressure. • Your major goal is to find important problems quickly. • You want to get very good at (software) testing. 3 A Question What makes testing harder or slower? 2
  5. 5. Premises of Rapid Testing 1. Software projects and products are relationships between people. 2. Each project occurs under conditions of uncertainty and time pressure. 3. Despite our best hopes and intentions, some degree of inexperience, carelessness, and incompetence is normal. 4. A test is an activity; it is performance, not artifacts. Premises of Rapid Testing 5. Testing’s purpose is to discover the status of the product and any threats to its value, so that our clients can make informed decisions about it. 6. We commit to performing credible, cost-effective testing, and we will inform our clients of anything that threatens that commitment. 7. We will not knowingly or negligently mislead our clients and colleagues or ourselves. 8. Testers accept responsibility for the quality of their work, although they cannot control the quality of the product. 3
  6. 6. Rapid Testing: Rapid testing is a mind-set and a skill-set of testing focused on how to do testing more quickly, less expensively, with excellent results. This is a general testing methodology. It adapts to any kind of project or product. How does Rapid Testing compare with other kinds of testing? (A diagram plots four approaches against two axes: More Work & Time (Cost) and Better Thinking & Better Testing (Value).) Ponderous: slow, expensive, and easier; when testing is turned into an elaborate set of rote tasks, it becomes ponderous without really being thorough. Slapdash: much faster, cheaper, and easier; you can always test quickly... but it might be poor testing. Exhaustive: slow, very expensive, and difficult; management likes to talk about exhaustive testing, but they don’t want to fund it and they don’t know how to do it. Rapid: faster, less expensive, still challenging. Rapid testing may not be exhaustive, but it is thorough enough and quick enough. It’s less work than ponderous testing. It might be less work than slapdash testing. It fulfills the mission of testing.
  7. 7. Excellent Rapid Technical Work Begins with You. When the ball comes to you… Do you know you have the ball? Can you receive the pass? Do you know your options? Do you know what your role and mission is? Do you know where your teammates are? Is your equipment ready? Can you read the situation on the field? Are you aware of the criticality of the situation? Can you let your teammates help you? Are you ready to act, right now? …but you don’t have to be great at everything. • Rapid test teams are about diverse talents cooperating • We call this the elliptical team, as opposed to the team of perfect circles. • Some important dimensions to vary: technical skill, domain expertise, temperament (e.g. introvert vs. extrovert), testing experience, project experience, industry experience, product knowledge, educational background, writing skill • Diversity makes exploration far more powerful • Your team is powerful because of your unique contribution
  8. 8. What It Means To Test Rapidly • Since testing is about finding a potentially infinite number of problems in an infinite space in a finite amount of time, we must… • understand our mission and obstacles to fulfilling it • know how to recognize problems quickly • model the product and the test space to know where to look for problems • prefer inexpensive, lightweight, effective tools • reduce dependence on expensive, time-consuming artifacts, while getting value from the ones we’ve got • do nothing that wastes time or effort • tell a credible story about all that One Big Problem in Testing: Formality Bloat • Much of the time, your testing doesn’t need to be very formal* • Even when your testing does need to be formal, you’ll need to do substantial amounts of informal testing in order to figure out how to do excellent formal testing. • Who says? The FDA. See http://www.satisfice.com/blog/archives/602 • Even in a highly regulated environment, you do formal testing primarily for the auditors. You do informal testing to make sure you don’t lose money, blow things up, or kill people. * Formal testing means testing that must be done to verify a specific fact, or that must be done in a specific way.
  9. 9. EXERCISE: Test the Famous Triangle. What is testing? Serving Your Client: if you don’t have an understanding of, and an agreement on, the mission of your testing, then doing it “rapidly” would be pointless.
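
The "famous triangle" recalls the classic triangle-testing problem popularized by Glenford Myers: a program accepts three side lengths and reports whether they form an equilateral, isosceles, or scalene triangle, or no triangle at all. The slides do not include an implementation, so the Python sketch below is a hypothetical stand-in for the kind of program the exercise asks you to test; the function name and its exact behaviour are assumptions for illustration only.

```python
def classify_triangle(a: float, b: float, c: float) -> str:
    """Hypothetical program under test for the triangle exercise."""
    sides = sorted((a, b, c))
    # Reject non-positive sides and violations of the triangle inequality.
    if sides[0] <= 0 or sides[0] + sides[1] <= sides[2]:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# A few of the many cases a rapid tester might probe:
print(classify_triangle(3, 3, 3))  # equilateral
print(classify_triangle(1, 2, 3))  # degenerate: not a triangle
print(classify_triangle(2, 3, 4))  # scalene
```

Even a program this small invites far more test ideas than most people expect at first (boundaries, non-numeric input, huge values, floating-point quirks), which is much of the point of the exercise.
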
  10. 10. Not Enough Product and Project Information? Where do we get test information? What Is A Problem? A problem is… 8
  11. 11. How Do We Recognize Problems? An oracle is… a way to recognize a problem. Learn About Heuristics Heuristics are fallible, “fast and frugal” methods of solving problems, making decisions, or accomplishing tasks. “The engineering method is the use of heuristics to cause the best change in a poorly understood situation within the available resources.” Billy Vaughan Koen Discussion of the Method 9
  12. 12. Heuristics: Generating Solutions Quickly and Inexpensively • Heuristic (adjective): serving to discover or learn • Heuristic (noun): a fallible method for solving a problem or making a decision “Heuristic reasoning is not regarded as final and strict but as provisional and plausible only, whose purpose is to discover the solution to the present problem.” - George Polya, How to Solve It Oracles An oracle is a heuristic principle or mechanism by which we recognize a problem. “It works!” really means… “...it appeared at least once to meet some requirement to some degree” “...uh, when I ran it” “...that one time” “...on my machine.” 10
  13. 13. Familiar Problems If a product is consistent with problems we’ve seen before, we suspect that there might be a problem. Explainability If a product is inconsistent with our ability to explain it (or someone else’s), we suspect that there might be a problem. 11
  14. 14. World If a product is inconsistent with the way the world works, we suspect that there might be a problem. History Okay, so how the #&@ do I print now? If a product is inconsistent with previous versions of itself, we suspect that there might be a problem. 12
  15. 15. Image If a product is inconsistent with an image that the company wants to project, we suspect a problem. Comparable Products WordPad Word When a product seems inconsistent with a product that is in some way comparable, we suspect that there might be a problem. 13
  16. 16. Claims When a product is inconsistent with claims that important people make about it, we suspect a problem. User Expectations When a product is inconsistent with expectations that a reasonable user might have, we suspect a problem. 14
  17. 17. Purpose When a product is inconsistent with its designers’ explicit or implicit purposes, we suspect a problem. Product When a product is inconsistent internally—as when it contradicts itself—we suspect a problem. 15
  18. 18. Statutes and Standards: When a product is inconsistent with laws or widely accepted standards, we suspect a problem. Consistency (“this agrees with that”) is an important theme in oracle principles: • Familiarity: The system is not consistent with the pattern of any familiar problem. • Explainability: The system is consistent with our ability to describe it clearly. • World: The system is consistent with things that we recognize in the world. • History: The present version of the system is consistent with past versions of it. • Image: The system is consistent with an image that the organization wants to project. • Comparable Products: The system is consistent with comparable systems. • Claims: The system is consistent with what important people say it’s supposed to be. • Users’ Expectations: The system is consistent with what users want. • Product: Each element of the system is consistent with comparable elements in the same system. • Purpose: The system is consistent with its purposes, both explicit and implicit. • Standards and Statutes: The system is consistent with applicable laws, or relevant implicit or explicit standards. Consistency heuristics rely on the quality of your models of the product and its context.
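
To make the "Comparable Products" heuristic above concrete: when a trusted comparable implementation exists, it can serve as an inexpensive oracle. A minimal sketch, assuming a hypothetical fast_sort() under test and using Python's built-in sorted() as the comparable product:

```python
def fast_sort(items):
    """Hypothetical product code under test (placeholder so the sketch runs)."""
    return sorted(items)

# Comparable-product oracle: Python's built-in sorted() plays the role that
# WordPad plays for Word on the earlier slide, a trusted point of comparison.
cases = [[], [1], [3, 1, 2], [2, 2, 1], [-5, 0, 5, -5]]
for case in cases:
    actual, expected = fast_sort(list(case)), sorted(case)
    status = "ok" if actual == expected else "POSSIBLE PROBLEM"
    print(f"{status}: input={case} product={actual} comparable={expected}")
```

A disagreement does not prove the product is broken; like every oracle, the comparable product is fallible, and a mismatch only tells the tester that there might be a problem worth investigating.
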
  19. 19. All Oracles Are Heuristic An oracle doesn’t tell you that there IS a problem. An oracle tells you that you might be seeing a problem. An oracle can alert you to a possible problem, but an oracle cannot tell you that there is no problem. Consistency heuristics rely on the quality of your models of the product and its context. Rely solely on documented, anticipated sources of oracles, and your testing will likely be slower and weaker. Train your mind to recognize patterns of oracles and your testing will likely be faster and your ability to spot problems will be sharper. General Examples of Oracles (things that suggest “problem” or “no problem”): People: a person whose opinion matters; an opinion held by a person who matters; a disagreement among people who matter. Mechanisms: a reference document with useful information; a known good example output; a known bad example output; a process or tool by which the output is checked; a process or tool that helps a tester identify patterns. Feelings: a feeling like confusion or annoyance. Principles: a desirable consistency between related things.
  20. 20. Oracles from the Inside Out (a diagram arranges oracles along two dimensions, tacit vs. explicit and tester vs. other people: Experience, your feelings and mental models; Inference, observable consistencies; Conference, stakeholders’ feelings and mental models; Reference, shared artifacts such as specs and tools). Oracle Cost and Value: Some oracles are more authoritative, but less responsive to change. Some oracles are more consistent, but maybe not up to date. Some oracles are more immediate, but less reliable. Some oracles are more precise, but the precision may be misleading. Some oracles are more accurate, but less precise. Some oracles are more available, but less authoritative. Some oracles are easier to interpret, but more narrowly focused.
  21. 21. Feelings As Heuristic Triggers For Oracles • An emotional reaction is a trigger to attention and learning • Without emotion, we don’t reason well • See Damasio, The Feeling of What Happens • When you find yourself mildly concerned about something, someone else could be very concerned about it • Observe emotions to help overcome your biases and to evaluate significance An emotion is a signal; consider looking into it All Oracles Are Heuristic • We often do not have oracles that establish a definite correct or incorrect result, in advance. Oracles may reveal themselves to us on the fly, or later. That’s why we use abductive inference. • No single oracle can tell us whether a program (or a feature) is working correctly at all times and in all circumstances. That’s why we use a variety of oracles. • Any program that looks like it’s working, to you, may in fact be failing in some way that happens to fool all of your oracles. That’s why we proceed with humility and critical thinking. • We never know when a test is finished. That’s why we try to maintain uncertainty when everyone else on the project is sure. • You (the tester) can’t know the deep truth about any result. That’s why we report whatever seems likely to be a bug. 38 19
  22. 22. Oracles are Not Perfect And Testers are Not Judges • You don’t need to know for sure if something is a bug; it’s not your job to decide if something is a bug; it’s your job to decide if it’s worth reporting. • You do need to form a justified belief that it MIGHT be a threat to product value in the opinion of someone who matters. • And you must be able to say why you think so; you must be able to cite good oracles… or you will lose credibility. MIP’ing VS. Black Flagging 39 Coping With Difficult Oracle Problems • Ignore the Problem • Ask “so what?” Maybe the value of the information doesn’t justify the cost. • Simplify the Problem • Ask for testability. It usually doesn’t happen by accident. • Built-in oracle. Internal error detection and handling. • Lower the standards. You may be using an unreasonable standard of correctness. • Shift the Problem • Parallel testing. Compare with another instance of a comparable algorithm. • Live oracle. Find an expert who can tell if the output is correct. • Reverse the function. (e.g. 2 x 2 = 4, then 4/2 = 2) • Divide and Conquer the Problem • Spot check. Perform a detailed inspection on one instance out of a set of outputs. • Blink test. Compare or review overwhelming batches of data for patterns that stand out. • • • • Easy input. Use input for which the output is easy to analyze. Easy output. Some output may be obviously wrong, regardless of input. Unit test first. Learn about the pieces that make the whole. Test incrementally. Learn about the product by testing over a period of time. 40 20
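
As a small illustration of "Reverse the function" from the list above (and not code from the course itself): a hypothetical square-root routine is checked by squaring its result, which sidesteps the need to know the correct answer in advance.

```python
import math

def my_sqrt(x: float) -> float:
    """Hypothetical function under test (placeholder implementation)."""
    return math.sqrt(x)

def reverse_function_check(x: float, tolerance: float = 1e-9) -> bool:
    """Reverse-the-function oracle: if my_sqrt(x) is right, squaring the
    result should bring us (approximately) back to the input."""
    result = my_sqrt(x)
    return math.isclose(result * result, x, rel_tol=tolerance)

for value in (0.0, 1.0, 2.0, 1e6, 1e-6):
    print(value, "->", "ok" if reverse_function_check(value) else "possible problem")
```

The same round-trip idea applies to encoders and decoders, serializers, unit converters, and anything else with a usable inverse.
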
  23. 23. “Easy Input” • Fixed Markers. Use distinctive fixed input patterns that are easy to spot in the output. • Statistical Markers. Use populations of data that have distinguishable statistical properties. • Self-Referential Data. Use data that embeds metadata about itself. (e.g. counterstrings) • Easy Input Regions. For specific inputs, the correct output may be easy to calculate. • Outrageous Values. For some inputs, we expect error handling. • Idempotent Input. Try a case where the output will be the same as the input. • Match. Do the “same thing” twice and look for a match. • Progressive Mismatch. Do progressively differing things over time and account for each difference. (code-breaking technique) 41 Oracles Are Linked To Threats To Quality Criteria Capability Scalability Reliability Compatibility Usability Performance Charisma Installability Security Development Any inconsistency may represent diminished value. Many test approaches focus on capability (functionality) 42 and underemphasize the other criteria. 21
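
To make "Self-Referential Data" concrete: a counterstring is a string in which each asterisk is immediately preceded by its own position in the string, so if a field silently truncates the input, the visible tail tells you how many characters survived. The generator below is a minimal sketch in the spirit of James Bach's counterstring idea, not the original perlclip tool.

```python
def counterstring(max_len: int) -> str:
    """Build a self-describing string: each '*' is preceded by its own
    1-based position, so truncation points are easy to read off."""
    out, length = [], 0
    while True:
        # The next '*' lands at position: current length + its digit count + 1.
        digits = 1
        while len(str(length + digits + 1)) != digits:
            digits += 1
        star_pos = length + digits + 1
        if star_pos > max_len:
            break
        out.append(f"{star_pos}*")
        length = star_pos
    return "".join(out)

print(counterstring(35))   # 2*4*6*8*11*14*17*20*23*26*29*32*35*
```
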
  24. 24. Oracles Are Linked To Threats To Quality Criteria Supportability Testability Maintainability Portability Localization Any inconsistency may represent diminished value. Many test approaches focus on capability (functionality) 43 and underemphasize the other criteria. Focusing on Preparation and Skill Can Reduce Documentation Bloat 3.0 Test Procedures 3.1 General testing protocol. • In the test descriptions that follow, the word “verify" is used to highlight specific items that must be checked. In addition to those items a tester shall, at all times, be alert for any unexplained or erroneous behavior of the product. The tester shall bear in mind that, regardless of any specific requirements for any specific test, there is the overarching general requirement that the product shall not pose an unacceptable risk of harm to the patient, including an unacceptable risk using reasonably foreseeable misuse. • Test personnel requirements: The tester shall be thoroughly familiar with the generator and workstation FRS, as well as with the working principles of the devices themselves. The tester shall also know the working principles of the power test jig and associated software, including how to configure and calibrate it and how to recognize if it is not working correctly. The tester shall have sufficient skill in data analysis and measurement theory to make sense of statistical test results. The tester shall be sufficiently familiar with test design to complement this protocol with exploratory testing, in the event that anomalies appear that require investigation. The tester shall know how to keep test records to credible, professional standard. 22
  25. 25. Remember… For skilled testers, good testing isn’t just about pass vs. fail. For skilled testers, testing is about problem vs. no problem. Where Do We Look For Problems? Coverage is… how much of the product has been tested. 23
  26. 26. What IS Coverage? Coverage is “how much of the product we have tested.” It’s the extent to which we have traveled over some map of the product. MODELS Models • A model is an idea, activity, or object… such as an idea in your mind, a diagram, a list of words, a spreadsheet, a person, a toy, an equation, a demonstration, or a program • …that heuristically represents (literally, re-presents) another idea, activity, or object… such as something complex that you need to work with or study • …whereby understanding something about the model may help you to understand or manipulate the thing that it represents. - A map is a model that helps to navigate across a terrain. - 2+2=4 is a model for adding two apples to a basket that already has two apples. - Atmospheric models help predict where hurricanes will go. - A fashion model helps understand how clothing would look on actual humans. - Your beliefs about what you test are a model of what you test. 24
  27. 27. There are as many kinds of test coverage as there are ways to model the system. Intentionally OR Incidentally One Way to Model Coverage: Product Elements (with Quality Criteria) • Structure • Function • Data • Interfaces • Platform • Operations • Time Capability Reliability Usability Charisma Security Supportability Scalability Testability Compatibility Performance Maintainability Installability 25
  28. 28. To test a very simple product meticulously, part of a complex product meticulously, or to maximize test integrity… 1. 2. 3. 4. 5. 6. Start the test from a known (clean) state. Prefer simple, deterministic actions. Trace test steps to a specified model. Follow established and consistent lab procedures. Make specific predictions, observations and records. Make it easy to reproduce (automation may help). 51 General Focusing Heuristics • use test-first approach or unit testing for better code coverage • work from prepared test coverage outlines and risk lists • use diagrams, state models, and the like, and cover them • apply specific test techniques to address particular coverage areas • make careful observations and match to expectations To do this more rapidly, make preparation and artifacts fast and frugal: leverage existing materials and avoid repeating yourself. Emphasize doing; relax planning. You’ll make discoveries along the way! 26
  29. 29. To find unexpected problems, elusive problems that occur in sustained field use, or more problems quickly in a complex product… That’s a PowerPoint bug! 1. 2. 3. 4. 5. 6. Start from different states (not necessarily clean). Prefer complex, challenging actions. Generate tests from a variety of models. Question your lab procedures and tools. Try to see everything with open expectations. Make the test hard to pass, instead of easy to reproduce. 53 General Defocusing Heuristics • diversify your models; intentional coverage in one area can lead to unintentional coverage in other areas—this is a Good Thing • diversify your test techniques • be alert to problems other than the ones that you’re actively looking for • welcome and embrace productive distraction • do some testing that is not oriented towards a specific risk • use high-volume, randomized automated tests 27
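
The last defocusing heuristic above mentions high-volume, randomized automated tests. A minimal sketch of the idea, assuming a hypothetical normalize_whitespace() function under test; instead of precomputed expected results, the oracles here are cheap consistency checks such as idempotence.

```python
import random
import string

def normalize_whitespace(text: str) -> str:
    """Hypothetical function under test: collapse runs of whitespace into
    single spaces and trim the ends (placeholder implementation)."""
    return " ".join(text.split())

def random_text(max_len: int = 40) -> str:
    alphabet = string.ascii_letters + "  \t\n"   # biased toward whitespace
    return "".join(random.choice(alphabet) for _ in range(random.randint(0, max_len)))

# High-volume randomized checking: no single test knows the "correct" output,
# but each one can still flag suspicious behaviour via consistency oracles.
for i in range(10_000):
    text = random_text()
    out = normalize_whitespace(text)
    suspicious = (
        out != normalize_whitespace(out)   # not idempotent
        or out != out.strip()              # leading/trailing whitespace remains
        or "  " in out                     # a run of spaces remains
    )
    if suspicious:
        print(f"possible problem on input {text!r} -> {out!r}")
        break
else:
    print("10,000 random inputs, no oracle fired")
```

Silence from such a run is weak evidence rather than proof; as the deck stresses, a product that looks fine may simply be fooling all of your oracles.
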
  30. 30. DISCUSSION How Many Test Cases? What About Quantifying Coverage Overall? • A nice idea, but we don’t know how to do it in a way that is consistent with basic measurement theory • • If we describe coverage by counting test cases, we’re committing reification error. If we use percentages to quantify coverage, we need to establish what 100% looks like. • • But we might do that with respect to some specific models. Complex systems may display emergent behaviour. 28
  31. 31. Extent of Coverage • Smoke and sanity • Can this thing even be tested at all? • Common, core, and critical • Can this thing do the things it must do? • Does it handle happy paths and regular input? • Can it work? • Complex, harsh, extreme and exceptional • Will this thing handle challenging tests, complex data flows, and malformed input, etc.? • Will it work? How Might We Organize, Record, and Report Coverage? • • • • • • automated tools (e.g. profilers, coverage tools) annotated diagrams and mind maps coverage matrices bug taxonomies Michael Hunter’s You Are Not Done Yet list James Bach’s Heuristic Test Strategy Model • described at www.satisfice.com • articles about it at www.developsense.com • Mike Kelly’s MCOASTER model • product coverage outlines and risk lists • session-based test management • http://www.satisfice.com/sbtm See three articles here: http://www.developsense.com/publications.html#coverage 29
  32. 32. What Does Rapid Testing Look Like? Concise Documentation Minimizes Waste Testing Heuristics Risk Catalog General Coverage Model Risk Model ProjectSpecific Testing Playbook Schedule Issues Bugs Test Strategy Reference Status Dashboard 59 Rapid Testing Documentation • Recognize • a requirements document is not the requirements • a test plan document is not a test plan • a test script is not a test • doing, rather than planning, produces results • Determine where your documentation is on the continuum: product or tool? • Keep your tools sharp and lightweight • Obtain consensus from others as to what’s necessary and what’s excess in products • Ask whether reporting test results takes priority over obtaining test results • note that in some contexts, it might • Eliminate unnecessary clerical work 30
  33. 33. Visualizing Test Progress Visualizing Test Progress 31
  34. 34. Visualizing Test Progress See “A Sticky Situation”, Better Software, February 2012 What IS Exploratory Testing? • Simultaneous test design, test execution, and learning. • James Bach, 1995 But maybe it would be a good idea to underscore why that’s important… 32
  35. 35. What IS Exploratory Testing? • I follow (and to some degree contributed to) Kaner’s definition, which was refined over several peer conferences through 2007: Exploratory software testing is a style of software testing that emphasizes the personal freedom and responsibility of the individual tester to continually optimize the value of his or her work by treating test design, test execution, test result interpretation, and test-related learning as mutually supportive activities that run in parallel throughout the project. (So maybe it would be a good idea to keep it brief most of the time…) See Kaner, “Exploratory Testing After 23 Years”, www.kaner.com/pdfs/ETat23.pdf Why Exploratory Approaches? • Systems are far more than collections of functions • Systems typically depend upon and interact with many external systems
  36. 36. Why Exploratory Approaches? • Systems are too complex for individuals to comprehend and describe • Products evolve rapidly in ways that cannot be anticipated In the future, developers will likely do more verification and validation at the unit level than they have done before. Testers must explore, discover, investigate, and learn about the system. Why Exploratory Approaches? • Developers are using tools and frameworks that make programming more productive, but that may manifest more emergent behaviour. • Developers are increasingly adopting unit testing and test-driven development. • The traditional focus is on verification, validation, and confirmation. The new focus must be on exploration, discovery, investigation, and learning. 34
  37. 37. Why Exploratory Approaches? • • • • • • • • • • We don’t have time to waste preparing wastefully elaborate written plans for complex products built from many parts and interacting with many systems (many of which we don’t control… …or even understand) where everything is changing over time and there’s so much learning to be done and the result, not the plan, is paramount. Questions About Scripts… arrows and cycles What happens when the unexpected happens during a script? Where do scripts come from? What do we do with what we learn? Will everyone follow the same script the same way? (task performing) 35
  38. 38. Questions About Exploration… arrows and cycles What happens when the unexpected happens during Where does exploration come from? exploration? What do we do with what we learn? Will everyone explore the same way? (value seeking) Exploration is Not Just Action arrows and cycles 36
  39. 39. You can put them together! arrows and cycles You can put them together! arrows and cycles 37
  40. 40. What Exploratory Testing Is Not • Touring • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-1touring/ • After-Everything-Else Testing • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-2-aftereverything-else-testing/ • Tool-Free Testing • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-3-toolfree-testing/ • Quick Tests • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-4-quicktests/ • Undocumented Testing • http://www.developsense.com/blog/2011/12/what-exploratory-testing-is-not-part-5undocumented-testing/ • “Experienced-Based” Testing • http://www.satisfice.com/blog/archives/664 • defined by any specific example of exploratory testing • http://www.satisfice.com/blog/archives/678 Exploratory Testing The way we practice and teach it, exploratory testing… • IS NOT “random testing” (or sloppy, or slapdash testing) • IS NOT “unstructured testing” • IS NOT procedurally structured • IS NOT unteachable • IS NOT unmanageable • IS NOT scripted • IS NOT a technique • IS “ad hoc”, in the dictionary sense, “to the purpose” • IS structured and rigorous • IS cognitively structured • IS highly teachable • IS highly manageable • IS chartered • IS an approach 38
  41. 41. Contrasting Approaches. Scripted Testing: is directed from elsewhere; is determined in advance; is about confirmation; is about controlling tests; emphasizes predictability; emphasizes decidability; like making a speech; like playing from a score. Exploratory Testing: is directed from within; is determined in the moment; is about investigation; is about improving test design; emphasizes adaptability; emphasizes learning; like having a conversation; like playing in a jam session. Exploratory Testing IS Structured • Exploratory testing, as we teach it, is a structured process conducted by a skilled tester, or by lesser skilled testers or users working under supervision. • The structure of ET comes from many sources: test design heuristics, chartering, time boxing, perceived product risks, the nature of specific tests, the structure of the product being tested, the process of learning the product, development activities, constraints and resources afforded by the project, the skills, talents, and interests of the tester, and the overall mission of testing. (Not procedurally structured, but cognitively structured. In other words, it’s not “random”, but systematic.)
  42. 42. Exploratory Testing IS Structured In excellent exploratory testing, one structure tends to dominate all the others: Exploratory testers construct a compelling story of their testing. It is this story that gives ET a backbone. 79 To test is to compose, edit, narrate, and justify THREE stories. A story about the status of the PRODUCT… …about how it failed, and how it might fail... …in ways that matter to your various clients. A story about HOW YOU TESTED it… …how you configured, operated and observed it… …about what you haven’t tested, yet… …and won’t test, at all… A story about how GOOD that testing was… …what the risks and costs of testing are… …what made testing harder or slower… …how testable (or not) the product is… …what you need and what you recommend. 80 40
  43. 43. What does “taking advantage of resources” mean? • Mission • The problem we are here to solve for our customer. • Information • Information about the product or project that is needed for testing. • Developer relations • How you get along with the programmers. • Team • Anyone who will perform or support testing. • Equipment & tools • Hardware, software, or documents required to administer testing. • Schedule • The sequence, duration, and synchronization of project events. • Test Items • The product to be tested. • Deliverables • The observable products of the test project. 81 “Ways to test…”? General Test Techniques • • • • • • • • • Function testing Domain testing Stress testing Flow testing Scenario testing Claims testing User testing Risk testing Automatic checking 82 41
  44. 44. Cost as a Simplifying Factor Try quick tests as well as careful tests A quick test is a cheap test that has some value but requires little preparation, knowledge, or time to perform. • Happy Path • Tour the Product • • • • • • Sample Data Variables Files Complexity Menus & Windows Keyboard & Mouse • • • • • • • Interruptions Undermining Adjustments Dog Piling Continuous Use Feature Interactions Click on Help 83 Cost as a Simplifying Factor Try quick tests as well as careful tests A quick test is a cheap test that has some value but requires little preparation, knowledge, or time to perform. • • • • • Input Constraint Attack Click Frenzy Shoe Test Blink Test Error Message Hangover • • • • Resource Starvation Multiple Instances Crazy Configs Cheap Tools 84 42
  45. 45. Touring the Product: Mike Kelly’s FCC CUTS VIDS • • • • • • Feature tour Complexity tour Claims tour Configuration tour User tour Testability tour • • • • • Scenario tour Variability tour Interoperability tour Data tour Structure tour 43
  46. 46. Summing Up: Themes of Rapid Testing • • • • • • • • • • • Put the tester's mind at the center of testing. Learn to deal with complexity and ambiguity. Learn to tell a compelling testing story. Develop testing skills through practice, not just talk. Use heuristics to guide and structure your process. Replace “check for…” with “look for problems in…” Be a service to the project community, not an obstacle. Consider cost vs. value in all your testing activity. Diversify your team and your tactics. Dynamically manage the focus of your work. Your context should drive your choices, both of which evolve over time. 88 44
