T21
Test Automation
10/5/17 15:00
The Pothole of Automating Too Much

Presented by:

Paul Holland
Medidata Solutions, Inc.
Brought to you by:

350 Corporate Way, Suite 400, Orange Park, FL 32073
888-268-8770 · 904-278-0524 · info@techwell.com · http://www.starwest.techwell.com/
Paul Holland
Medidata Solutions, Inc.
With more than twenty years' experience in software testing, Paul Holland is a senior director of test engineering at New York City-based Medidata Solutions, Inc. Previously, he spent two years as head of testing at a small consultancy, two years as the principal consultant/owner at Testing Thoughts, and seventeen years at Alcatel-Lucent. Paul specializes in adapting testing methodologies to reduce waste and to increase effectiveness and efficiency: finding ways to document only what needs to be documented, modifying reporting of test activities to provide actionable information to stakeholders, and reducing or eliminating potentially harmful metrics. Paul is one of four instructors of the Rapid Software Testing course, developed by James Bach and Michael Bolton.
The Pothole of Trying to Automate Too Much
Paul Holland, Sr. Director, Test Engineering at Medidata
27 September 2017
My Background
Ø Sr. Director, Test Engineering at Medidata since 2016
Ø S/W testing consultant from 2012 to 2016
Ø 20+ years testing telecommunications equipment and reworking test methodologies at Alcatel-Lucent
Ø 15+ years as a test manager/director
Ø Frequent conference speaker
Ø Teacher of S/W testing for the past 7 years
Ø Teacher of Rapid Software Testing
Ø Military helicopter pilot – Canadian Sea Kings
Disclaimer / Clarification
Ø When I talk about “automation” in this presentation, I am referring to UI-level automation – typically coded by SDETs or testers
Ø I am not talking about unit-level automation – typically created by developers
Ø I am also not talking about integration or API-level automation – typically written by people who wish they had a different job (just kidding)
Automation …
Ø Automation will (likely) NOT
• Increase your coverage
• Decrease your costs
• Save you time
• Allow you to reduce headcount
Ø Automation CAN
• Give you a decent sanity check of your product (see the sketch below)
• Execute in less time than a human performing the same checks
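As a hedged illustration of what a "decent sanity check" can look like at the UI level, here is a minimal pytest/Selenium smoke check. The URL, element id, and page title are hypothetical placeholders, not anything from the talk; the point is only that such checks confirm the product is basically alive, nothing more.

```python
# A minimal UI-level sanity check: critical screens load and respond.
# BASE_URL, the "login-form" id, and the "Dashboard" title are invented.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

BASE_URL = "https://app.example.com"  # hypothetical product URL

@pytest.fixture
def browser():
    driver = webdriver.Chrome()
    yield driver
    driver.quit()

def test_login_page_renders(browser):
    # Sanity check: the login page loads and shows its form.
    browser.get(f"{BASE_URL}/login")
    WebDriverWait(browser, 10).until(
        EC.visibility_of_element_located((By.ID, "login-form"))
    )

def test_dashboard_reachable(browser):
    # Sanity check: the dashboard responds; no assertions about layout or data.
    browser.get(f"{BASE_URL}/dashboard")
    assert "Dashboard" in browser.title
```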
An Experience Report – Nortel Networks (when it was still a thing)
Ø System Test team had raised 1260 bugs over one year – attributed to scripts
Ø Analysis of whether they were found by the script or by a vigilant tester
Ø Findings:
• 70% of the bugs were found by a vigilant tester
• i.e., they probably would NOT have been found by automation
Metaphor: Your Software = USA
Paths through the software = roads in the country
A Lot More Automation
A Big Problem
There are more roads than we can possibly test
Another Experience Report – A Large Insurance Company
Ø A decision was made to write an automated test for every requirement
Ø Why?
• To save time?
• To reduce headcount?
• To catch more bugs?
• To increase coverage?
• To future-proof the product?
• To make management feel better?
Ø Probably ALL of these reasons (initially)
Another Experience Report – A Large Insurance Company
Ø When I analyzed their testing situation:
• They had an EXCELLENT automation framework (modular, dereferenced calls, etc.)
• They had over 2500 scripts testing all written requirements
• Not ALL requirements, however – just the written ones
• Any bug that did not contradict a requirement was not a bug – until a requirement was created; then the bug could be entered
Another Experience Report – A Large Insurance Company
Ø When I analyzed their testing situation:
• 13 outsourced testers in India took 2 weeks to execute the scripts and investigate failures
• About 40% of the scripts failed on each execution
• Failing scripts were re-executed; if they passed, then “no problem”
• If a script still failed, they had to find out why and either update the script or raise a bug
• On average this team found 6 defects a month
• I pair-tested the product for 20 minutes and found 5 bugs
Problem 1
There are more kinds of problems than automation can be programmed to recognize
Vigilant testers can observe and evaluate a very large variety of outputs and also vary the inputs to test new code paths – story about web page rendering
Problem 2
There are more checks to automate than can possibly be written (a back-of-the-envelope illustration follows)
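To make the slide's claim concrete, here is a quick back-of-the-envelope calculation. The form fields and option counts are invented, but the shape of the arithmetic is the point: input combinations multiply, so even a modest feature has more distinct paths than anyone could script.

```python
# Combinatorial explosion of a small, hypothetical checkout form:
# the number of input combinations is the product of each field's options.
from math import prod

field_options = {
    "country": 195,
    "payment_method": 6,
    "currency": 30,
    "account_type": 4,
    "locale": 40,
}
print(prod(field_options.values()))  # 5,616,000 combinations from 5 fields
```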
Problem 3
Some things are too difficult to automate effectively
Complex pass/fail algorithms that a human could perhaps evaluate quickly (a sketch of one such oracle follows)
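A hedged sketch of the kind of pass/fail algorithm the slide means: deciding whether a numeric summary report is "close enough" takes real code, with every tolerance and rule spelled out, while a human can eyeball the same report in seconds. All metric names, values, and tolerances here are invented.

```python
# A pass/fail oracle for a hypothetical numeric summary report.
def summary_looks_right(actual: dict, expected: dict,
                        rel_tol: float = 0.02) -> bool:
    """Return True if every expected metric is present and within tolerance.

    A human glances at the report and says "looks fine"; automation has to
    encode which keys must exist, how much drift is tolerable, and which
    mismatches actually matter.
    """
    # Every expected metric must be present in the actual report.
    if not set(expected) <= set(actual):
        return False
    for name, want in expected.items():
        got = actual[name]
        # Zero expectations require exact zeros; otherwise allow relative drift.
        if want == 0:
            if got != 0:
                return False
        elif abs(got - want) / abs(want) > rel_tol:
            return False
    return True

# Invented report values: totals drift slightly between runs.
print(summary_looks_right({"revenue": 1012.0, "refunds": 49.2},
                          {"revenue": 1000.0, "refunds": 50.0}))  # True
print(summary_looks_right({"revenue": 1100.0, "refunds": 50.0},
                          {"revenue": 1000.0, "refunds": 50.0}))  # False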
Problem 4
Investigating reported failures takes a long time
UI-level automation tends to be fragile and often breaks when code changes (see the locator sketch below)
If investigation does not happen, then why run the tests?
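One common source of that fragility is how checks locate elements on the page. This sketch contrasts a layout-coupled locator with one tied to a stable test hook; the URL, page structure, and data-testid attribute are hypothetical assumptions, not anything prescribed by the talk.

```python
# Why UI-level automation breaks when code changes: the first locator encodes
# the exact DOM layout, so one added wrapper <div> breaks it; the second
# survives markup changes as long as developers keep the test hook stable.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://app.example.com/login")  # hypothetical page

# Fragile: tied to the page's exact structure.
submit = driver.find_element(
    By.XPATH, "/html/body/div[2]/div/form/div[3]/button[1]")

# More resilient: tied to a dedicated attribute the team controls.
submit = driver.find_element(
    By.CSS_SELECTOR, "[data-testid='login-submit']")

submit.click()
driver.quit()
```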
Problem 5
Automation is expensive to build and maintain
High cost-to-value ratio
Sunk cost syndrome
Better Solution
A Strategic Mixed Approach:
Ø Automate critical paths
Ø Automate paths with the highest use
Ø Do NOT write automation for all failures found in the field
Ø Consider the cost of automation vs. benefit (a scoring sketch follows this list)
• Difficulty to create
• Difficulty to maintain (frequency of changes)
• Difficulty to analyze failures
Ø Augment automation by performing “testing” (by actual humans)
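One possible way to operationalize the cost-vs-benefit bullet is to score each candidate check on the three difficulty factors the slide lists, plus its value, and only automate where value outweighs cost. The fields, 1-to-5 scales, weighting, and threshold below are invented for illustration; any real team would tune them.

```python
# A cost-vs-benefit scoring sketch for deciding what to automate.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    usage: int          # 1-5: how heavily the path is used
    criticality: int    # 1-5: damage if the path breaks
    create_cost: int    # 1-5: difficulty to create
    maintain_cost: int  # 1-5: difficulty to maintain (frequency of changes)
    analyze_cost: int   # 1-5: difficulty to analyze failures

    def worth_automating(self, threshold: int = 0) -> bool:
        value = self.usage + self.criticality
        cost = self.create_cost + self.maintain_cost + self.analyze_cost
        # Weight value over cost; the factor of 2 is an arbitrary tuning knob.
        return value * 2 - cost > threshold

checks = [
    Candidate("login happy path", usage=5, criticality=5,
              create_cost=1, maintain_cost=1, analyze_cost=1),
    Candidate("rare admin report layout", usage=1, criticality=2,
              create_cost=4, maintain_cost=5, analyze_cost=4),
]
for c in checks:
    print(c.name, "->", "automate" if c.worth_automating() else "test by hand")
```

Run as-is, this prints "automate" for the critical high-use path and "test by hand" for the rare, fragile one, which matches the slide's advice to automate critical and highest-use paths while leaving the expensive edge cases to actual humans.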
