Software Test Patterns: Successes and Challenges


Imbus Software QS-TAG 08. November 8, 2007, Nuremberg, Germany.
Reviews how the pattern approach has been applied to software testing.



  1. Software Test Patterns: Successes and Challenges
     Robert V. Binder
     mVerify™ ("A Million Users in a Box"®)
     Software QS-TAG 08, Nuremberg, November 8, 2007
  2. Overview
     - Why I Used Test Patterns
     - What Have We Learned?
     - Are Patterns Necessary?
     - Test Design Patterns, 2.0
     - Q&A
  3. Test Design Patterns
     Software testing, c. 1995:
     - A large and fragmented body of knowledge
     - Few ideas about testing OO software
     Challenges:
     - Re-interpret the existing wealth of knowledge for OO
     - Address unique OO considerations
     - Systematic presentation
     - Uniform analytical framework
     Patterns looked like a useful schema:
     - Existing templates didn't address unique testing issues
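One of the OO-specific testing concerns the slide alludes to is that a subclass must still honor its superclass's contract, so superclass tests should be rerun against each subclass. A minimal sketch of that idea in Python's unittest (all class names here are invented for illustration; this is not code from the talk):

```python
import unittest

class Stack:
    """Toy class under test."""
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)
    def pop(self):
        return self._items.pop()
    def size(self):
        return len(self._items)

class BoundedStack(Stack):
    """Subclass that must still behave as a Stack within its limit."""
    def __init__(self, limit=10):
        super().__init__()
        self._limit = limit
    def push(self, x):
        if self.size() >= self._limit:
            raise OverflowError("stack full")
        super().push(x)

class StackContractTest(unittest.TestCase):
    """Superclass test suite; subclasses override factory() so the
    same contract checks run against each implementation."""
    def factory(self):
        return Stack()
    def test_push_then_pop_is_lifo(self):
        s = self.factory()
        s.push(1); s.push(2)
        self.assertEqual(s.pop(), 2)
        self.assertEqual(s.pop(), 1)
    def test_push_increments_size(self):
        s = self.factory()
        s.push("a")
        self.assertEqual(s.size(), 1)

class BoundedStackContractTest(StackContractTest):
    def factory(self):
        return BoundedStack(limit=5)

if __name__ == "__main__":
    unittest.main(argv=["contract-demo"], exit=False)
```

Inheriting the test case, not just the production class, is the point: the subclass gets every contract check for free and adds only its own.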
  4. Some Footprints
     - 1995  Design Patterns
     - 1995  Beizer, Black Box Testing
     - 1995  Firesmith PLOOT
     - 1995  McGregor
     - 1999  TOOSMPT
     - 2000  Tutorial Experiment
     - 2001  POST Workshops (4)
     - 2003  Briand's Experiments
     - 2003  Dot Net Test Objects
     - 2003  Microsoft Patterns Group
     - 2004  Java Testing Patterns
     - 2005  JUnit Anti Patterns
     - 2007  Test Object Anti Patterns
     - 2007  Software QS-TAG
  5. Ten Years After ...
     - Many new design patterns for hand-crafted test automation
       - Elaborations of the incremental test framework (e.g. JUnit)
       - Platform-specific or application-specific
       - Narrow scope
     - Few new test design patterns
     - No new oracle patterns
     - Attempts to generate tests from design patterns
     - To date, ~9,500 copies of TOOSMPT
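Since the slide singles out the absence of new oracle patterns, a brief reminder of what an oracle pattern looks like may help: one long-standing example is the round-trip (inverse-function) oracle, which avoids predicting exact expected outputs by checking that a decoding step undoes an encoding step. A sketch in Python, using JSON serialization as the function pair (this is background illustration, not material from the talk):

```python
import json

def round_trip_ok(value):
    """Round-trip oracle: pass if decode(encode(value)) == value.
    No expected output needs to be computed in advance."""
    return json.loads(json.dumps(value)) == value

# Any JSON-representable value should survive the round trip.
cases = [
    {"name": "Robie House", "year": 1908},
    [1, 2, 3],
    "Nuremberg",
    None,
]
assert all(round_trip_ok(c) for c in cases)
```

The oracle is partial by design: it detects corruption in the encode/decode pair, but a pair of functions with matching bugs would still pass, which is the usual trade-off with round-trip checking.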
  6. Limits on Acceptance
     - Proliferation of templates
     - Confusion: process vs. implementation vs. testing
     - "Me-too" forms: idiosyncratic heuristics lacking hooks for conceptual integration
     - Pattern adoption rituals
     - The finger versus the moon
     - "Code-first" (I code, therefore I am)
     - Perceived cost of secondary modeling too high
  7. What Have We Learned?
     - Patterns are effective for articulating insight and practice
       - They require discipline to develop
       - They support research and tool implementation
     - Patterns do not work "out of the box"
       - They require discipline in application
       - Enabling factors matter
     - Patterns are irrelevant to the uninterested and undisciplined
       - Low incremental benefit
       - Readily available substitutes
     - Broadly influential, but not compelling
  8. Innovators and Scribes
     A paradox:
     - Successful innovators don't follow patterns; they create them
     - Without application, there cannot be any patterns to discover
  9. Wright and the Prairie School
     Wright's "style" is a pattern language:
     "A building should appear to grow easily from its site. Design gently sloping roofs: low-pitch, hipped, unbroken ... Keep proportions low. Use suppressed, heavy chimneys. Build sheltering overhangs. Include low terraces. Construct garden walls that reach out. ... Group windows in a rhythmic way. Use casement windows, not double-hung, guillotine-style windows."
     -- Frank Lloyd Wright, In the Cause of Architecture, 1908
 10. Robie House: Chicago, 1908 [photo]
 11. Wright and the Prairie School
     Wright's own pattern language doesn't include a key element of its effectiveness:
     - Prospect and refuge: where one can see without being seen
 12. Heurtley House, Oak Park, 1902 [photo]
 13. Are Patterns Necessary?
     We don't need patterns:
     - for a pattern language
     - for elegant solutions
     We do need patterns:
     - for a trustworthy conceptual map
     - to efficiently identify, share, and use solutions
 14. 21st Century Software Challenges
     - Systems of systems (SOS)
       - Exponential increase in complexity and impact
       - Testing is the only approach that can assure system-scope reliability
       - Realistic model-based test generation and evaluation is the only testing strategy that can scale to SOS
     - Testing depends on software creation
       - Must test the SUT, not models or patterns
       - A second modeling effort is too expensive
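The model-based test generation mentioned above can be sketched in a few lines: describe the SUT as a finite-state model, derive one test per transition (transition coverage), and use the model itself as the step-by-step oracle. A minimal illustration (the media-player model, state names, and `Player` class are all invented; a real SUT would be independent code checked against the model, not driven by it):

```python
from collections import deque

MODEL = {  # (state, event) -> next state
    ("idle", "start"): "running",
    ("running", "pause"): "paused",
    ("paused", "start"): "running",
    ("running", "stop"): "idle",
    ("paused", "stop"): "idle",
}

def shortest_path(model, start, goal):
    """Shortest event sequence from start to goal, via BFS over the model."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, events = queue.popleft()
        if state == goal:
            return events
        for (src, ev), dst in model.items():
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append((dst, events + [ev]))
    return None

def generate_tests(model, initial="idle"):
    """One test per transition: reach the source state, then fire the event."""
    tests = []
    for (src, event), _dst in model.items():
        prefix = shortest_path(model, initial, src)
        if prefix is not None:
            tests.append(prefix + [event])
    return tests

class Player:
    """Stand-in SUT that happens to conform to the model."""
    def __init__(self):
        self.state = "idle"
    def fire(self, event):
        self.state = MODEL[(self.state, event)]

def run_generated_tests():
    for seq in generate_tests(MODEL):
        sut = Player()
        for ev in seq:
            expected = MODEL[(sut.state, ev)]  # oracle: model predicts each step
            sut.fire(ev)
            assert sut.state == expected
    return True
```

The scaling argument in the slide follows from this shape: once the model exists, adding states or transitions regenerates the whole suite mechanically, which hand-crafted test code cannot match.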
 15. The Hardware Strategy
     - Design equals implementation
     - Formal design models with substantial automated support (MATLAB, MatrixX, Cadence ...)
     - Development testing must be (and is) mostly automatic
     - Could a practical analog be found for software?
 16. Test Design Patterns, 2.0
     - Establish a common descriptive framework
       - Choose a template
       - http://www.taxonomywarehouse.com
     - Establish value-creating applications
       - Test automation
       - Design
       - Benchmarking
       - Research
       - What can be learned from classification?
     - Align with Model-Driven Architecture
       - Testable model criteria
       - OMG standards
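A "common descriptive framework" could be as simple as a fixed record structure that every catalog entry must fill in, so patterns can be compared, indexed, and queried uniformly. A sketch of one possible template as a Python dataclass (the field names are an assumption, loosely echoing the kind of template used in TOOSMPT; the entry shown paraphrases the round-trip scenario idea, not verbatim catalog text):

```python
from dataclasses import dataclass, field

@dataclass
class TestPattern:
    """Uniform template for test design pattern catalog entries."""
    name: str
    intent: str
    context: str
    fault_model: str
    strategy: str
    oracle: str
    known_uses: list = field(default_factory=list)

catalog = [
    TestPattern(
        name="Round-trip Scenario Test",
        intent="Exercise each event path in a state model at least once",
        context="System or class with a well-defined state machine",
        fault_model="Missing or incorrect transitions, sneak paths",
        strategy="Generate event sequences covering every transition",
        oracle="Model predicts the resulting state after each event",
        known_uses=["protocol conformance testing"],
    ),
]

# A fixed template enables simple cross-catalog queries, e.g. by fault model:
by_fault = {p.name: p.fault_model for p in catalog}
```

The benchmarking and classification applications listed above depend on exactly this uniformity: entries written to different templates cannot be compared field by field.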
 17. Q&A
