Experiences with Semi-Scripted Exploratory Testing

Presentation for talk at Agile Testing Days 2011, 15 November, Potsdam, Germany

Slide notes
  • These experiences are based on bullets 2 and 3, and, through bullet 1, on spreading the word to other teams.
  • Marionette: following the instructions of the puppeteer. But suppose there is some danger in front of him that the puppeteer doesn't see? He can decide to divert or modify his action, and even when to continue following directions. So, it's not blindly following instructions...
  • Using some pre-defined set-up as an enabler to an exploratory session, i.e. there can be a certain amount of pre-meditated design and thought before you start exploring. The pre-defined set-up and exploration should not exclude observation and feedback. Walking with someone else is valuable (just like testing with someone else adds a new/different dimension).
  • Note, you need a way to determine if/when to carry on. Script usage could also be called script-supported testing.
  • Large organisation. Some features are for trial purposes. Many specifications & protocols. Many features and interactions. Perpetual question: “How much detail is enough?” Prototype in a conservative test team set-up. But the new thinking has followed into end-to-end team set-ups.
  • “Erfinden sie ein Problem” = “they invent a problem”
  • Say which bits will be zoomed-in....
  • Trial, demo & live? A trial might have different performance requirements to live usage. Other aspects not required might be expansion/scalability needs or robustness and negative testing.
  • This is a 'weeding-out' activity also. Input to 'silent evidence' here is: not possible due to tool or time constraints.
  • Some traditional aspects kept; some new aspects/ideas introduced.
  • Note, you can't just make things ambiguous and expect thinking and reflection to happen. The thinking and questioning must be encouraged as a part of the activity. The ambiguity stops ideas getting 'boxed-in'.
  • Don’t let test cases “constrain” (box in) ideas – create them with “enough” ambiguity so that you’re not constrained, boxed-in, nor restricted to a comfort zone, and eventually you’re “thinking outside your comfort zone/box”.
  • Testing has been framed as an investigative activity: try this, what happens?, grey areas? Look closer. Contrast with “this is the spec -> test to that”. Moves away from reliance on only confirmatory testing.
  • By changing the frames of reference the labelling of the activity was avoided. Sometimes labels get in the way. Framing a problem differently is much more subtle -> acceptable.
  • I think of test cases as test idea constraints – because they tend to lock in ideas about what a test case produces -> it takes on much more value than it should... Getting past test case counting. Relating value in test ideas to end-user value. Emphasized with PO/team involvement early. Working with good test story telling is important. Whole team involved in initial analysis. Reduces handovers + creates a feeling of involvement. Prototype early – good toolsmiths are important. Fail fast – throw away ideas before they cost too much!
  • The thinking and reflection about the test ideas, feeding back to improve them -> these are attributes of what I call good testing and attributes that align with many aspects of good agile testing. Big challenge – absence of team set-up, WoW & processes (it was also an aid not to have defined WoW & processes!). Reflect on semi-script & exploratory....
  • Q&A session
  • Transcript

    1. Experiences with Semi-scripted Exploratory Testing – Simon Morley, Agile Testing Days 2011, 15 Nov 2011
    2. About
      • Me
        • I work as a Test Coordinator with many different teams and testers.
        • I jump in as a team leader from time to time
        • And even do some testing!
      • Context
        • Multimedia IP networks
        • To “telecom grade”
    3. Overview
      • Preamble...
      • Case story
        • Background / Problem Description
        • Approaches
        • Observations & Lessons
    4. Preamble
      • My usage of...
        • Scripted
        • Exploratory
        • Semi-Scripted
    5. Semi-Scripted & Exploratory?
    6. What?
      • Experiences with combining the following approaches
        • Scripted Testing
          • Scripted Testing vs Scripted Execution
        • Exploratory Testing
    7. Scripted Execution vs Scripted Testing?
    8. Another Example: Login to PC
      • Step 1: Enter user-name
      • Step 2: Enter password
        • What if the password has expired?
        • What if it was a guest account (no password)?
      • Result: Successfully logged in
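    A minimal sketch of the distinction, assuming a hypothetical login() stand-in for the system under test (the function, its parameters and its return values below are invented purely for illustration, not taken from the slides): scripted execution checks the one expected result the script names, while a semi-scripted session keeps the same steps but also tries the conditions the script never mentions and records what is actually observed.

        # Hypothetical sketch: scripted execution vs. semi-scripted testing of a login.
        # login() and its return values are invented stand-ins, not a real API.

        def login(username, password):
            """Stand-in for the system under test; returns an observation string."""
            if password is None:
                return "guest session, limited rights"
            if password == "expired":
                return "password expired, change required"
            return "logged in"

        def scripted_execution():
            # Step 1: enter user-name; Step 2: enter password; expect exactly one result.
            assert login("alice", "secret") == "logged in"

        def semi_scripted_session():
            # Same starting steps, but vary the conditions the script never mentions
            # (expired password, guest account) and note what happens.
            for user, pwd in [("alice", "secret"), ("alice", "expired"), ("guest", None)]:
                observation = login(user, pwd)
                # Observations become questions and new test ideas, not pass/fail counts.
                print(f"{user!r} / {pwd!r} -> {observation}")

        if __name__ == "__main__":
            scripted_execution()
            semi_scripted_session()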
    9. Semi-Scripted?
      • “Walking in the woods”
        • Using some pre-defined set-up as an enabler to an exploratory session
        • The pre-defined set-up should not exclude observation and feedback.
        • Walking with someone else is valuable
    10.  
    11. Why “Walking in the woods”
      • Seasons change -> set-up/config changes
        • Repeating walks can actually help reveal new information (due to conditions changing)
      • Terrain changes -> systems “change” due to use
        • Prolonged usage of systems builds up different problems/conditions “under the surface”
    12. Case Story
      • Background
      • Feature Walkthroughs
      • Test Ideas & Feasibility
      • Test Execution
      • Some Success Indicators
      • Challenges and Lessons
    13. Background
    14. Experience Context
      • Background factors – challenging...
      • Short deadline
      • Trial Feature
      • Complex Environment
      • Traditional Approach
    15. Environment [diagram: the NE under test (SUT) connected to other NEs and simulated NEs (Sim NE), with parts of the environment marked as unchanged vs. changed]
    16. Result?
    17. Additional terms of reference
      • Catch-up effort
      • Gather information for feature assessment
      • “Free hand”
      • Possibilities...
    18. Approach Possibilities
    19. Initial Feature Analysis
    20. Feature Walkthrough
    21. Test Feasibility
    22. Learning
    23. Feature Walkthrough
      • All were 'up to speed' before the discussion
      • All the team plus external experience
      • Discussion – Q&A – Brainstorming
      • Whiteboard is centre-stage
      • Good toolsmiths make a difference!
    24. Feature Walkthrough -> Brainstorming for risks
    25. Brainstorming #1
    26. Brainstorming #2
    27. First real testing of the feature!
    28. Test Ideas
    29. Test Ideas -> Test Feasibility
    30. Learnings: Gather the right people to look at the problem. Allow time for reflection.
    31. Estimation Notes
      • Main estimation figures were:
        • Who was needed and for how long
        • Equipment need & availability (feedback)
        • Estimation was not based on test case ideas
    32. Test Execution
    33. Semi-Scripted? [diagram: the same input is applied to the NE under test (SUT) and its surrounding NEs / simulated NEs, while configuration and provisioned data change between runs]
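    A minimal sketch of the idea behind this diagram, with invented names (FIXED_INPUT, CONFIG_VARIANTS, PROVISIONED_DATA_VARIANTS and run_session() are all hypothetical, not from the slides): the stimulus sent towards the system under test stays the same from session to session, while the configuration and provisioned data around it are deliberately varied, and the tester explores around whatever each combination reveals.

        # Hypothetical sketch of "same input, changed surroundings": the stimulus is
        # fixed; the configuration and provisioned data are what vary between sessions.
        from itertools import product

        FIXED_INPUT = {"call_setup": "A-party -> B-party", "codec": "AMR"}

        CONFIG_VARIANTS = ["default routing", "alternate routing"]
        PROVISIONED_DATA_VARIANTS = ["subscriber with feature X", "subscriber without feature X"]

        def run_session(stimulus, config, provisioning):
            """Stand-in for driving the SUT; here it only records what was combined."""
            return f"sent {stimulus['call_setup']} with {config} and {provisioning}"

        for config, provisioning in product(CONFIG_VARIANTS, PROVISIONED_DATA_VARIANTS):
            # The script supplies the repeatable part; the tester observes and explores
            # around whatever each combination actually reveals.
            print(run_session(FIXED_INPUT, config, provisioning))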
    34. Test Execution
    35. Test Execution
    36. Test Execution
    37. Test Execution
    38. Test Execution
    39. No “test idea constraints”, combined with tester thinking & reflection from execution, feeds back into better test ideas.
    40. When you're not interested in numbers, then testers are no longer interested in talking about them!
    41. Results from execution
      • Found issues not found elsewhere
      • Results used to declare better information on the feature when going to trial
      • Start of change in mindset
    42. Some Success Indicators
    43. A Success Indicator...
      • A test report changes from
      • “Run X test cases, where Y failed and Z passed”
      • And becomes more like...
    44. “Feature A tested, there are some interaction issues with feature B and, by implication, risks for features C and D. We have not explicitly tested feature C. Recommend some further investigation with feature A in combination with features B and C. As feature D is not to be used in trial, a priority decision is needed on the inclusion of D into scope. If feature D is to be put into general usage, then further test and investigation is recommended.”
    45. Pilot activity spread to end-to-end teams with greater engagement and end-value appreciation. Issues in the product for end-user usage & interactions are found -> adding value to the product reporting.
    46. Observation & Analysis
    47. What's happening here? Testing has been framed as an investigative activity! Scripting is re-framed as a tool rather than a goal for testing!
    48. Note on labels
      • The testing was never labelled as one form or another
      • The frames in which it was presented/discussed changed
        • Questions to trigger the activity changed
        • Move away from valuing test cases
        • Move to relate the value in PO terms
    49. Sometimes labels get in the way! Framing a choice differently -> transition!
    50. Change? How?
      • This is a transition rather than a big bang change. Why? Adaptive, small steps, fail fast & improve!
      • Time was key – pull the team in a direction, but allow them to see the benefit from the change -> then they control the speed of change and so it happens quicker!
      • The team is challenged to think about the activity – and with the right support & encouragement they generally do think more!
    51. Factors particular to this case
      • “Free hand”
      • No expectation on change of mindset
      • First pilot was in a separated test team.
    52. To some external observers...
      • Maybe not so much different
        • A product with some level of testing is delivered
      • But, from the team perspective
        • The value of the testing and ideas was debated & discussed early
        • All in the team & PO value this earlier feedback
        • Ideas formed, tried out, re-evaluated
    53. Challenges
      • Getting past test case counting
      • Relating value in test ideas to end-user value
      • Working with good test story telling is important
      • Whole team involved in initial analysis
      • Prototype early – good toolsmiths are important
    54. Recap
      • Big Challenges
      • Used the challenges as a means to start the transition
      • Part of transition was enabled by using scripting as an enabler
      • But also to trigger thinking / questioning mindset -> throughout (workshop -> test analysis/feasibility -> execution)
    55. Lessons
    56. Lessons #1
      • Feature Walkthrough:
        • First real testing of the feature happens early
        • Gather the right people to look at the problem.
        • Allow time for reflection.
      • Test Ideas:
        • No “test idea constraints”.
        • Scripts are an enabler not the goal.
    57. Lessons #2
      • Execution
        • No “test idea constraints”, combined with tester thinking & reflection from execution, feeds back into better test ideas.
        • When you're not interested in numbers, then testers are no longer interested in talking about them!
    58. Lessons #3
      • General
        • Challenge the team (testers) to think and they generally will think!
        • Allow the team to see the benefits of change (transition) – then they will begin to drive the speed of change.
        • Frame testing differently
        • Frame scripts as tools and enablers rather than fountains of knowledge!
    59. Lesson #4
      • It takes practice to get your testing experience from:
      • To:
    60. Questions?
    61. Thank You! Blog: http://testers-headache.blogspot.com Twitter: @YorkyAbroad
    62. Attributions
      • http://www.flickr.com/photos/kalavinka/
      • http://www.flickr.com/photos/kalavinka/4617897952/
      • http://www.flickr.com/photos/tesla314/2666463779/
      • http://www.flickr.com/photos/txmx-2/6054712319/
      • http://www.flickr.com/photos/28385889@N07/3058090229/in/photostream/
      • http://www.flickr.com/photos/somegeekintn/3810233454/
      • http://www.flickr.com/photos/sashomasho/260136080/
