Conquering the Largest Challenge of Software Testing: Too Much to Test & Not Enough Time to Test It All
Justin Hunter’s presentation at the 12th annual international software testing conference in Bangalore, India, December 2012.

Link to the video of this 30-minute presentation: http://hexawise.tv/2013/02/too-much-to-test-and-not-enough-time-to-test-it-all/

The techniques discussed focus on how to reduce the time spent selecting and documenting test scripts, reduce the number of tests needed for execution by creating unusually powerful tests, and thereby increase the thoroughness of software test suites.

The talk explored the significant risks, for users, companies, and employees, of failing to catch software application failures before release (for example, the release of Apple Maps with significant defects), and discussed how combinatorial (also known as orthogonal array or pairwise) software testing can be used to create test plans that cover a large number of parameters/factors quickly.
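As a concrete illustration of the combinatorial approach described in the talk, here is a minimal greedy pairwise generator in Python. The parameters and values are hypothetical (loosely inspired by the maps example), and this naive greedy sketch is not how Hexawise or any particular tool actually works:

```python
from itertools import combinations, product

def required_pairs(params):
    """Every cross-parameter pair of values that must co-occur in at least one test."""
    pairs = set()
    for (i, vals_i), (j, vals_j) in combinations(list(enumerate(params)), 2):
        for a in vals_i:
            for b in vals_j:
                pairs.add(((i, a), (j, b)))
    return pairs

def pairs_in(test):
    """All cross-parameter value pairs exercised by a single test."""
    return {((i, test[i]), (j, test[j]))
            for i, j in combinations(range(len(test)), 2)}

def greedy_pairwise(params):
    """Repeatedly pick the candidate test that covers the most still-uncovered pairs."""
    uncovered = required_pairs(params)
    candidates = list(product(*params))  # fine for tiny examples; real tools avoid full enumeration
    tests = []
    while uncovered:
        best = max(candidates, key=lambda t: len(pairs_in(t) & uncovered))
        uncovered -= pairs_in(best)
        tests.append(best)
    return tests

# Hypothetical parameters, not taken from the presentation.
params = [
    ["Chrome", "Firefox", "IE", "Opera"],  # Browser
    ["Map", "Satellite"],                  # View
    ["Walk", "Drive", "Transit"],          # Mode of transport
    ["KM", "Miles"],                       # Distance units
]
suite = greedy_pairwise(params)
print(len(suite), "pairwise tests instead of", len(list(product(*params))), "exhaustive tests")
```

Exhaustive testing of these four parameters would take 48 tests; a pairwise suite covers every valid pair of values in far fewer, which is the core trade the talk describes.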

Related Presentations:

Combinatorial Testing Beyond Pairwise Testing: http://www.slideshare.net/JustinHunter/combinatorial-software-testdesignbeyondpairwisetesting

Detailed Example for Creating Pairwise Test Plans Using Hexawise

New Features of Hexawise "2.0" - (Nov 2012): http://www.slideshare.net/JustinXHunter/introduction-to-hexawise-new-features-20121109

How to Think about Test Inputs in Software Testing: http://hexawise.tv/2012/10/how-to-think-about-test-inputs-in-software-testing/

Also see: http://hexawise.com (a pairwise / orthogonal array test design tool with powerful additional test design features).


Presentation Transcript

  • Conquering the Single Largest Challenge Facing Today's Testers. Justin Hunter, CEO of Hexawise. (Tuesday, March 5, 13)
  • Many slides in this presentation might not make sense on their own. The YouTube video of this presentation is available here. Photo credit on opening slide: it's my own photo (taken in the fantastic Ranakpur Jain temple in Rajasthan) http://www.flickr.com/photos/82153534@N05/8515803041/sizes/o/in/set-72157632883997160/
  • Single Biggest Challenge: "There's too much to test and not enough time to test it all." According to a recent survey conducted by Robert Sabourin, this is the single largest challenge facing test managers today.
  • I. "Maps Mayhem" 1. What Happened? 2. Avoidable? 3. Practical Implications
  • Most Careers [timeline chart]
  • Scott's Career [timeline chart]
  • Even worse... 9 / 10
  • 2nd to Go (He's also Amazing)
  • Nightmare Worsens: 123,000
  • CEO's Apology Letter: "We are extremely sorry..." "While we're improving Maps, you can try alternatives... like Bing, MapQuest and Waze, or use Google or Nokia maps..."
  • Everyday Fails (Cont.)
  • Missing Details http://www.itsagadget.com/2012/09/apple-google-maps-ios-6.html
  • Squiggly Roads http://www.fastcompany.com/3003446/apple-reportedly-fires-their-maps-man
  • On the water http://news.yahoo.com/blogs/technology-blog/apple-ceo-apologizes-maps-recommends-google-instead-182143889.html
  • In the water http://machineslikeus.com/news/get-lost-apple-maps-road-nowhere
  • Missing water
  • Water Turned into Beaches http://theamazingios6maps.tumblr.com/page/6
  • Melted Streets http://theamazingios6maps.tumblr.com/page/4
  • Social Media Mockery... http://www.crowdsourcing.org/images/resized//editorial_19902_780x0_proportion.jpg?1349379876
  • Even Mocked by These Guys! http://blogs.telegraph.co.uk/technology/micwright/100007771/apple-moronic-new-maps-this-is-turning-into-a-disaster/
  • Impact to Sales? http://www.businessinsider.com/google-maps-apple-maps-2012-10
  • In Fairness to those Involved: Extreme Complexity; Unimaginably Large Scope; Highly Visible Mistakes; Google Had a Huge Head Start. http://img.photobucket.com/albums/v40/Dragonrider1227/chainsawsonfire.jpg
  • Could this have been Avoided? Imminent Disaster. Sometimes it's just better to grab a beer and watch...
  • I. More Smart Testers: This man, Harry Robinson, is a genius. He helped lead testing for Google Maps. IMO, he'd be a bargain to Apple at $1 million / year. http://model-based-testing.info/2012/03/12/interview-with-harry-robinson/
  • II. Using Smart Test Design. [Slide: two regions of a web page annotated with the product of their parameters' option counts, one yielding 884,736 possible tests and one yielding 13,824 possible tests.] This single web page could be tested with 72,477,573,120 possible tests.
  • II. Using Pairwise Test Design
  • First, users input details of an application to be tested...
  • Next, users create tests that will cover interactions of every valid pair of values in as few tests as possible. (1) Browser = "Opera" tested with (2) View = "Satellite"? Covered. (1) Mode of Transport = "Walk" tested with (2) Show Photos = "Y"? Covered. (1) Avoid Toll Roads = "Y" tested with (2) Show Traffic = "Y (Live)"? Covered. (1) Browser = "IE6" tested with (2) Distance in = "KM" and (3) Zoom in = "Y"? That is a 3-way interaction. It might not be covered in these 35 tests. See next page.
  • The third screen provides objective coverage data, which is useful in determining when to stop testing. Every test plan has a finite number of valid combinations of parameter values (involving, in this case, 2 parameter values). The chart shows, at each point in the test plan, what percentage of the total possible number of relevant combinations has been covered. In this set of test cases, as in most, there is a significantly decreasing marginal return. [Chart: % coverage by number of tests, rising steeply and then flattening toward 100%]
  • Testing each feature to "see if it works" is not enough. Does it work in combination with every other feature?
  • Every pair of test inputs gets tested in at least one test!
  • Prioritization: How many test inputs are needed to trigger defects in production? 5%: 1; 11%: 2 ("pairwise"); 51%: 3; 33%: 4, 5, or 6. Sources:
    • Medical Devices: D.R. Wallace, D.R. Kuhn, "Failure Models in Medical Device Software: an Analysis of 15 Years of Recall Data," International Journal of Reliability, Quality, and Safety Engineering, Vol. 8, No. 4, 2001.
    • Browser, Server: D.R. Kuhn, M.J. Reilly, "An Investigation of the Applicability of Design of Experiments to Software Testing," 27th NASA/IEEE Software Engineering Workshop, NASA Goddard SFC, 4-6 December 2002.
    • NASA database: D.R. Kuhn, D.R. Wallace, A.J. Gallo, Jr., "Software Fault Interactions and Implications for Software Testing," IEEE Trans. on Software Engineering, vol. 30, no. 6, June 2004.
    • Network Security: K.Z. Bell, "Optimizing Effectiveness and Efficiency of Software Testing: a Hybrid Approach," PhD Dissertation, North Carolina State University, 2006.
  • Banking / Capital Markets Case Study: This part of the plan involved the following 6 parameters, each of which had between 2 and 4 values. [Table of the 6 parameters and their values]
  • [Pair-coverage matrix of the parameter values]
  • [Pair-coverage matrix] After 5 Hexawise Tests
  • [Pair-coverage matrix] After 10 Hexawise Tests
  • [Pair-coverage matrix] After 13 Hexawise Tests
  • Manual test case selection: Now let's compare coverage achieved by manual test case selection. We'll skip the "after 5 tests" and "after 10 tests" views and go directly to "after 13 tests" (i.e., the point at which Hexawise had already achieved 100% coverage of all pairs of values).
  • [Pair-coverage matrix] After 13 Manual Tests. This is worse than it might look at first...
  • In other words, when the Hexawise tests had not only tested each test input but tested each test input in combination with every other test input in the plan at least once...
  • ... there were many, many pairs of values (in red) that the 13 manual tests had not tested together. [Pair-coverage matrix] After 13 Manual Tests
  • Even after the original plan's 126 manually-created tests (almost ten times more test cases than the Hexawise test plan required)...
  • ... there were four pairs of values that were never tested in the much longer original plan... [Pair-coverage matrix with "Never tested" gaps] After 126 Manual Tests
  • ... and there were four pairs of values that were wastefully tested 15 or more times each. [Pair-coverage matrix with pairs tested 15, 22, 22, and 42 times] After 126 Manual Tests
  • These characteristics (wasteful repetition combined with gaps in coverage) are found in almost all manually-selected test cases.
  • Without Hexawise: 126 tests, incomplete coverage, wasteful repetition. With Hexawise: 13 tests, complete coverage, variation instead of repetition.
  • Faster Test Creation. Source: Conservatively interpreted data from several dozen recent pilot projects. Time savings are often significantly larger than 40% and will almost always exceed 30%.
  • More Defects Found / Hour. Source: Empirical study of the average benefits across 10 software testing projects, published in IEEE Computer magazine in 2009: "Combinatorial Software Testing," Rick Kuhn, Raghu Kacker, Yu Lei, Justin Hunter. Results of individual projects will differ.
  • More Defects Found. Source: the same empirical study of 10 software testing projects ("Combinatorial Software Testing," IEEE Computer, 2009). Results of individual projects will differ.
  • Three Final Implications
  • 1. Bad software quality can bring disaster to anyone.
  • 2. Smart, skilled, empowered testers are essential.
  • 3. Pairwise and combinatorial testing helps test systems BOTH more thoroughly AND more quickly.
  • For more info, Google / Bing: 1) Harry Robinson testing; 2) Pairwise testing case studies
  • Thank You! (BTW, did this topic make your "Top 3" list?)