Tools. Techniques. Trouble?

Why test automation is getting more difficult, and what can be done about it. These slides are from a presentation by Gordon McKeown, Group Director, Product Management at TestPlant, given at the Northern Lights conference in Manchester in April 2016.

  1. Tools. Techniques. Trouble?
     Why test automation is getting more difficult and what can be done about it.
     Gordon McKeown, Group Director, Product Management, TestPlant UK
     Northern Lights, Manchester, 27 April 2016
  2. Why?
  3. Why? (Source: Forrester)
  4. “(Response) time is money”
  5. Creative tension
     Testing: £ $ € costs--, rate of change++, complexity++
  6. The burden of testing
     Are we condemned to be like Sisyphus, to push a rock up a hill for eternity?
     (Image: Museo del Prado, Madrid, Spain)
  7. Historical precedent
     The division of labour and automation are the twin foundations of both a modern economy and effective software testing.
  8. Automation within testing
     • Process & personal productivity.
     • Test automation
       • Driving the System Under Test (SUT).
     • Automating test creation
       • Create scripts or test data.
  9. Business context
     • Two trends are in conflict…
       • Increasing frequency of releases
         • Requires more testing and therefore better test automation.
       • Increasing numbers of suppliers involved with system delivery
         • Technical support for testing.
         • Creating test infrastructure.
     • (The challenges of more complex technology stacks will be examined later.)
  10. Multi-vendor challenges
      • Contracts should explicitly require that suppliers provide technical and logistic support for testing.
      • Testability should be a requirement
        • Testability should be an important technical criterion when choosing technology.
      • Components should be testable
        • Apply this principle to internal development.
        • Request (demand?) testability of third-party components.
  11. Technical challenges
      • “Mashup”
        • Multiple services.
        • Multiple vendors.
        • Multiple technology stacks.
      • Heterogeneous clients and interfaces
        • Desktops (Windows, OS X, Linux).
        • Mobile (iOS, Android and more).
        • Service consumers (many types, many frameworks).
        • IoT, embedded systems.
      • Web technology is getting complicated!
      • Increasing rate of technical change
        • Did I mention business pressures for short release cycles?
  12. The “mashup” paradigm
      • “A mashup, in web development, is a web page, or web application, that uses content from more than one source to create a single new service displayed in a single graphical interface.” (Wikipedia)
      • Originated with public-facing web sites
        • Influencing internal enterprise applications.
      • SOA (Service-Oriented Architecture) and micro-service approaches create similar issues for testing.
  13. Automating “mashup” apps
      • Move up the protocol stack to give a holistic test (“user experience”).
      • Multi-level / multi-protocol testing may also be required
        • Background load (performance testing).
        • Individual services / subset focus.
      • More about this topic anon…
  14. Shifting boundaries: the SUT
      • ‘It means just what I choose it to mean — neither more nor less.’ (Humpty Dumpty in Lewis Carroll, Through the Looking-Glass)
      • Defining the SUT precisely and completely is essential.
      • Get explicit agreement from all stakeholders!
      • You may need to supply missing services
        • Stubs or mocks. http://martinfowler.com/articles/mocksArentStubs.html
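The stub-versus-mock distinction referenced above can be sketched in a few lines. This is a minimal illustration, not the article's or any tool's implementation; the `PaymentGateway` classes and the `checkout` function are hypothetical names invented for the example.

```python
# A minimal sketch of stubs versus mocks when a service the SUT depends on
# is missing. All names here are hypothetical.

class StubPaymentGateway:
    """Stub: returns canned answers so the SUT can run without the real service."""
    def charge(self, amount):
        return {"status": "approved", "txid": "TEST-0001"}  # canned response

class MockPaymentGateway:
    """Mock: records how it was called so the test can verify the interaction."""
    def __init__(self):
        self.calls = []
    def charge(self, amount):
        self.calls.append(("charge", amount))
        return {"status": "approved", "txid": "TEST-0001"}

def checkout(gateway, amount):
    """The code under test: depends on a payment service we may need to fake."""
    result = gateway.charge(amount)
    return result["status"] == "approved"

# State-based check with a stub: we only care about the outcome.
assert checkout(StubPaymentGateway(), 9.99) is True

# Behaviour-based check with a mock: we verify the interaction itself.
mock = MockPaymentGateway()
checkout(mock, 9.99)
assert mock.calls == [("charge", 9.99)]
```

The stub keeps the SUT runnable; the mock additionally lets a test assert that the dependency was exercised as expected, which is the distinction Fowler's article draws.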
  15. Test automation challenges
      • Tester productivity
      • Coverage
      • Script re-use & maintenance overhead across:
        • Time (software releases, technology changes).
        • Device types / platforms.
  16. Heterogeneous clients
      • Public-facing applications
        • Multiple mobile platforms plus desktops. Web services.
        • Range of devices; variable power, screen size & resolution.
        • Native apps plus web site.
      • Internal / enterprise
        • Increased use of mobile, so all of the above can apply.
  17. Adding petrol to the flames
      • Test executions = functional tests × client types × releases
      • 53% of respondents cite “frequency of application functionality changes” as a concern in the 2015/2016 World Quality Report (Capgemini, HP, Sogeti).
        • https://www.uk.capgemini.com/thought-leadership/world-quality-report-2015-16
  18. GUI-level automation may help
      • High-level GUI automation
        • Test across services and components.
        • User experience, “end to end”.
      • Hardware costs declining & more flexible resource management through VMs, containers, cloud.
      • Is becoming more relevant for load testing.
      • Not all SUTs can be tested via a GUI!
      • Multi-paradigm testing for complex systems
        • E.g. web services, APIs and GUIs across devices.
  19. Intelligent image recognition and OCR
      User-centric GUI test automation
  20. Objects versus images
      • For discussion: potential advantages of image-based (+ OCR) versus object-based:
        • Total device control, including reboot.
        • Test the whole GUI, not just the app.
        • Minimal intrusion.
        • One script, many devices (vary images or use image collections).
        • Images may be less volatile than objects as the system evolves.
      • P.S. Assume I’m biased!
  21. Multi-user functional testing
      • Today very little software runs in glorious isolation.
      • Most functional testing is single-path and misses whole classes of errors.
        • Errors are often exposed by low-concurrency load tests intended to debug scripts. This confirms that there is a systematic problem.
      • Most load testing covers a small subset of functionality.
      • We need low-concurrency (compared to load testing) parallel execution of functional tests.
        • Shared components, servers and networks should be included in detailed functional testing.
      • Multi-user functional testing is the “missing link”.
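The idea of running the same functional test as a handful of parallel users against a shared component can be sketched as follows. The in-memory `InventoryService` is a hypothetical stand-in for a shared server-side component; a real run would drive the SUT itself.

```python
# Sketch of low-concurrency parallel execution of a functional test.
# The "inventory service" is a hypothetical in-memory stand-in for a
# shared component exercised by every concurrent test instance.
import threading
from concurrent.futures import ThreadPoolExecutor

class InventoryService:
    """Shared component touched by all parallel test users."""
    def __init__(self, stock):
        self._stock = stock
        self._lock = threading.Lock()
    def reserve(self):
        with self._lock:          # without this lock, parallel tests may
            if self._stock > 0:   # expose an oversell race that a single
                self._stock -= 1  # single-path test would never hit
                return True
            return False

def functional_test(service):
    """A single-path functional test: one user reserves one item."""
    return service.reserve()

service = InventoryService(stock=10)
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(lambda _: functional_test(service), range(10)))

assert all(results)          # every one of the 10 parallel users succeeded
assert service._stock == 0   # and the shared state is consistent afterwards
```

The point is the shape of the run, not the toy service: the same assertions that pass single-user can fail under even modest concurrency, which is exactly the class of error the slide says single-path testing misses.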
  22. Network behaviour: in scope or out?
      • The network is (back) on the application testing agenda.
        • Twenty years ago the network was often in scope.
        • The last decade: fast intranets + relatively simple web applications meant the network was out of scope for much testing. However, it could be argued that this was sometimes the wrong decision!
      • The rise of mobile devices and the imminent arrival of the IoT mean that how software reacts to network characteristics should be an important concern.
  23. Network emulation
      • Provides real-world network behaviour when the actual test environment has high bandwidth and low latency.
      • Using (a sample of) real networks is expensive and difficult to control.
      • Relevant for both functional and load testing.
  24. Why network emulation? (Test environment vs. real world)
  25. Why network emulation? (Test environment vs. real world)
      A 64 MB file takes 5 s to transfer on a LAN. On a FAST network from London to New York the latency is just 90 ms, yet the file takes 440 s to transfer! There is nothing “bandwidth” can do about this!
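The arithmetic behind this slide can be sketched with the standard latency-bound model: TCP throughput is capped at roughly window size divided by round-trip time, so extra bandwidth does not help. The 64 KB window below is an assumption for illustration; a smaller effective window (or per-request overheads) would push the time toward the slide's 440 s figure.

```python
# Back-of-envelope model: once latency dominates, throughput is bounded by
# window_size / round_trip_time regardless of link bandwidth. The 64 KB
# window is an assumed value, chosen only to illustrate the principle.

FILE_SIZE = 64 * 1024 * 1024      # 64 MB file from the slide
RTT = 0.090                       # 90 ms London-New York round trip
WINDOW = 64 * 1024                # assumed TCP receive window (bytes)

max_throughput = WINDOW / RTT                  # bytes per second
transfer_time = FILE_SIZE / max_throughput     # seconds, latency-bound

print(f"throughput cap: {max_throughput / 1024:.0f} KB/s")
print(f"transfer time:  {transfer_time:.0f} s")   # ~92 s with these numbers
```

Even under this generous assumption the LAN's 5 s becomes a minute and a half across the Atlantic, which is the slide's point: the slowdown comes from latency, not bandwidth.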
  26. Load testing challenges
      • All the issues discussed so far apply to both functional and load testing.
      • They are more acute for load testing.
      • The changing nature of Web technology is particularly challenging…
  27. Load testing and Web evolution
      • Load testing of Web servers has traditionally been based on “replaying” or constructing HTTP traffic.
      • This is done more or less intelligently…
      • The evolution of Web technology is disrupting this approach.
  28. HTTP traffic generation approaches
      • Verbatim replay of N hours’ worth of network traffic
        • This is a niche approach and is only employed by network-oriented test tools (often with specialist hardware). Problems with system state, clocks etc.
      • Specify HTTP requests and the target throughput, and unleash worker threads across injector machines.
        • OK for basic throughput testing and where HTTP requests are independent. Problematic when the application is based around conversation state.
      • Virtual Users that model real users.
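The second approach above, independent requests fired by worker threads at a target rate, can be sketched like this. The `send_request` function is a stub standing in for real HTTP I/O, and the URL mix and pacing are invented for the example; note there is no conversation state anywhere, which is exactly this approach's limitation.

```python
# Sketch of throughput-oriented load generation: a fixed request mix is
# handed to a pool of worker threads. send_request is a stub for real
# HTTP I/O; the per-worker sleep is a crude pacing mechanism.
import time
from concurrent.futures import ThreadPoolExecutor

REQUESTS = ["/home", "/search?q=x", "/about"] * 10   # hypothetical URL mix
TARGET_RPS = 300                                     # target throughput

results = []

def send_request(path):
    # A real injector would issue the HTTP request and record status/timing.
    results.append((path, 200))        # list.append is thread-safe in CPython
    time.sleep(1.0 / TARGET_RPS)       # crude per-worker pacing

with ThreadPoolExecutor(max_workers=5) as pool:
    list(pool.map(send_request, REQUESTS))

assert len(results) == 30   # every independent request was issued
```

Because each request is self-contained, this scales easily across injector machines, but it cannot model a logged-in session, which is what motivates the Virtual User approach on the next slide.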
  29. The Virtual User advantage
      • Mimics user activity (the “user” may be a software agent).
      • Maintains conversation state.
        • Sessions, multi-step transactions, security authentication and authorisation.
      • More natural software model.
      • Variable test data, timings etc.
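The Virtual User model can be sketched as a small class: each instance carries its own test data and conversation state across a multi-step transaction. The HTTP layer is stubbed out, and the `login`/`browse` step names are hypothetical, not any tool's API.

```python
# Sketch of a Virtual User: one instance = one simulated user, holding its
# own conversation state and test data. Network I/O is stubbed.

class VirtualUser:
    def __init__(self, username):
        self.username = username      # variable test data per VU
        self.session_id = None        # conversation state

    def login(self):
        # A real VU would POST credentials and parse the sessionID
        # out of the server's response.
        self.session_id = f"session-{self.username}"

    def browse(self, page):
        assert self.session_id is not None, "must log in first"
        # A real VU would issue a GET carrying the stored session cookie.
        return (page, self.session_id)

vu = VirtualUser("t.atkins")
vu.login()
page, sid = vu.browse("/orders")
assert sid == "session-t.atkins"   # state carried across the steps
```

Running hundreds of such instances, each with different test data and timings, is what distinguishes VU-based load generation from the stateless worker-thread approach.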
  30. Protocol-level load testing (emulation) (diagram: load testing tool)
  31. Application-level load testing (diagram: load testing tool)
  32. Application level versus emulation
      • Application level
        • The VU instance drives the real client-side technology.
        • E.g. a web browser, an application GUI, or client-side non-GUI application code such as a Java remote service stub.
      • Emulation
        • The tool emulates client-side behaviour.
        • For Web testing the more sophisticated tools will emulate some browser features, such as security, redirects, cookies and data caching, “out of the box”.
        • The rest of the emulation will be based on the recorded HTTP traffic, supported by an internal API.
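One of the "out of the box" browser behaviours an emulating tool must reproduce is cookie handling, which can be sketched as a minimal cookie jar: store `Set-Cookie` values from a response and replay them on subsequent requests. This covers cookies only; real tools also emulate redirects, caching and security, and a production jar would honour attributes like `Path` and `Expires` that this sketch ignores.

```python
# Minimal sketch of cookie emulation: absorb Set-Cookie headers from a
# response, then build the Cookie header for the next request.
# Attributes such as Path/Expires are deliberately ignored here.

class CookieJar:
    def __init__(self):
        self.cookies = {}

    def absorb(self, response_headers):
        """Record cookies from a server response's header list."""
        for header, value in response_headers:
            if header.lower() == "set-cookie":
                name, _, val = value.partition("=")
                self.cookies[name.strip()] = val.split(";")[0].strip()

    def header(self):
        """Build the Cookie header value for the next request."""
        return "; ".join(f"{k}={v}" for k, v in self.cookies.items())

jar = CookieJar()
jar.absorb([("Set-Cookie", "sessionID=57341abc; Path=/"),
            ("Set-Cookie", "theme=dark")])
assert jar.header() == "sessionID=57341abc; theme=dark"
```

An application-level VU gets this behaviour for free from the real browser; an emulating VU has to implement it, which is the trade-off the slide is describing.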
  33. Web scripts from network recordings
      • The “traditional” approach for high-concurrency testing.
      • Simple replay only works for simple sites.
      • Key challenges:
        • Parameterisation.
        • Conversation state
          • Dynamic data correlation originating from the server.
        • Dynamic data originating from client-side scripting.
  34. The Web technology new wave
      • Ajax.
      • HTTP/2, SPDY.
      • WebSocket.
      • Binary components redux.
      • HTTP may be mixed with other protocols.
      • Expanded data formats, including binary data.
  35. Ajax asynchronous requests (diagram © NetBeans)
  36–40. Dynamic data (progressive build across five slides)
      Recorded traffic trace:
        HTTP POST user= T. Atkins → 200 OK
        HTTP 200 OK sessionID= 57341abc
        HTTP GET ..57341abc..
        HTTP POST sessionID= 57341abc .. token= k2fg54 → 200 OK
        (further GET / 200 OK exchanges)
      Annotations added step by step:
        • Input by user: make into a parameter; vary the value during test execution.
        • From server: copy into a variable.
        • Return to server: set from the variable.
        • Created by client: the test tool must emulate.
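The "from server: copy into variable / return to server: set from variable" annotations can be sketched as the script logic a correlation step generates. The strings mirror the slide's trace; the extraction regex and template are illustrative, not a particular tool's syntax.

```python
# Sketch of dynamic data correlation: extract the server-issued sessionID
# from a recorded response, then substitute it into the next request.
import re

recorded_response = "HTTP 200 OK sessionID= 57341abc"

# From server: copy into a variable.
match = re.search(r"sessionID=\s*(\w+)", recorded_response)
session_id = match.group(1)

# Return to server: set from the variable (template taken from the recording).
next_request = "HTTP POST sessionID= {sid} .. token= k2fg54"
replayed = next_request.format(sid=session_id)

assert session_id == "57341abc"
assert replayed == "HTTP POST sessionID= 57341abc .. token= k2fg54"
```

Without this step, a replayed script would resend the recorded `57341abc` verbatim and the server would reject the stale session, which is why simple replay only works for simple sites.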
  41. Example: humble date/time values
      • Formats
        • Various strings; milliseconds since the Unix epoch (1 Jan 1970) or some other time. Time zones.
      • Originated by server or client?
      • Meaning when originating from the client:
        • Now.
        • Now + offset.
        • Fixed.
        • Calculated from other data, including time values.
        • End of current hour / day / week / month / year.
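A few of the cases on this slide, string versus epoch-millisecond formats and "end of current hour/day" values derived from "now", can be sketched directly. The fixed timestamp and the ISO-style string format are choices made for the example.

```python
# Sketch of the date/time cases a replay tool must emulate: the same
# instant as a string and as milliseconds since the Unix epoch, plus
# "end of current hour/day" values computed from "now".
from datetime import datetime, timedelta, timezone

# Fixed "now" so the example is deterministic; a tool would use the clock.
now = datetime(2016, 4, 27, 10, 15, 30, tzinfo=timezone.utc)

epoch_ms = int(now.timestamp() * 1000)          # ms since 1 Jan 1970
as_string = now.strftime("%Y-%m-%dT%H:%M:%SZ")  # one of many string formats

end_of_hour = now.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
end_of_day = now.replace(hour=0, minute=0, second=0, microsecond=0) + timedelta(days=1)

assert as_string == "2016-04-27T10:15:30Z"
assert end_of_hour.hour == 11 and end_of_day.day == 28
```

The slide's point is that a recorded timestamp is ambiguous: a tool replaying it must decide which of these meanings the client intended, since resending the recorded literal value is usually wrong.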
  42. Responding to the challenge
      • Improve protocol-level scripting
        • More intelligent emulation and script generation.
      • Move automation up the software stack
        • Target automation functionality provided by framework developers (vendors and open-source projects).
        • Virtual Users based around GUI automation.
  43. Improving protocol-level scripting
      • Cleverer dynamic data correlation
        • Rule-based script generation.
        • Heuristics and guided rule creation.
      • One recording -> multiple scripts
        • Background and foreground scripts, workflow fragments.
      • Tool extensibility and plug-ins
        • Add to and modify standard behaviour.
        • Incorporate external code.
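Rule-based script generation can be sketched as a table of correlation rules applied to a recording: each rule maps a pattern in the traffic to a named script variable. The rule set, the `extract` pseudo-call and the output format are all invented for illustration, not any tool's syntax.

```python
# Sketch of rule-based script generation: correlation rules (pattern ->
# variable name) are applied to recorded traffic to emit extraction steps.
import re

RULES = [
    (re.compile(r"sessionID=\s*(\w+)"), "session_id"),
    (re.compile(r"token=\s*(\w+)"), "csrf_token"),
]

def generate_extractions(recorded_text):
    """Emit one extraction line per rule that matches the recording."""
    lines = []
    for pattern, var in RULES:
        m = pattern.search(recorded_text)
        if m:
            # The generated line is pseudo-script, not real tool syntax.
            lines.append(f"{var} = extract(r'{pattern.pattern}')  # saw {m.group(1)}")
    return lines

script = generate_extractions("HTTP 200 OK sessionID= 57341abc token= k2fg54")
assert script[0].startswith("session_id = extract")
assert "k2fg54" in script[1]
```

The attraction is that one rule table handles every recording of the same application, which is what turns per-script hand-correlation into a repeatable generation step.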
  44. Moving up the stack?
      • Non-GUI client-side APIs
        • Producer support needed to avoid reverse engineering.
        • Good if documented.
        • Better if explicit testing support, such as tracing and ease of deployment, is provided.
        • Not truly “end to end”.
      • GUI automation
        • Potential re-use of functional test scripts.
        • Management & resource overhead of real or virtual application client instances.
        • Hardware costs continue to fall.
        • “End to end”.
  45. Virtual user overhead ≃ scalability
      • Application GUI
      • Web browser (Selenium)
      • HTMLUnit or PhantomJS (Selenium)
      • Web VU (HTTP)
  46. The immediate future
      • Increasing scalability for GUI automation
        • VMs & cloud.
        • Device management.
        • Tool resilience & redundancy.
      • Low-level automation continues
        • Will continue to be necessary for high-concurrency tests.
        • Needed for sub-system testing.
  47. Combined (multi-level) load testing
      (Diagram: a load testing tool combining protocol-level load injection with 10k virtual users and application-level testing with 5–50 VUs.)
  48. Some conclusions
      • Get the contracts right: explicit support for testing.
      • Careful consideration and precision when defining the SUT. Explicit agreement with stakeholders.
      • Tools and technology must co-evolve.
      • Cross-platform and cross-device testing are critical.
      • Mashup designs require testing at the UI.
      • Load testing automation is moving up the software stack, but low-level interfaces will remain important.
      • Combining automation approaches is often valuable.
  49. Thank you! Questions?
