'Pushing The Boundaries Of User Experience Test Automation' by Julian Harty



One of my current responsibilities is to find ways to automate as much as practical of the ‘testing’ of the user experience (UX) of complex web-based applications. In my view, full test automation of UX is impractical and probably unwise; however, we can use automation to find potential problems in the UX of even rich, complex applications.

I, and others, are working to find ways to use automation to discover various types of these potential problems. Here’s an overview of some of the points I made. The work is ongoing and is likely to have demonstrated significant business benefits by the time of the EuroStar conference. In my experience, heuristics are useful in helping to identify potential issues, and various people have managed to create test automation that essentially automates those heuristics. All the examples I’ve provided are available as free opensource software (FOSS). I’ve learnt to value opensource because it reduces the cost of experimentation and allows us to extend and modify the code, e.g. to add new heuristics, relatively easily (you still need to be able to write code, but the code is freely and immediately available).

Automation is (often) necessary, but not sufficient

Automation and automated tests can be beguiling, and paradoxically increase the chances of missing critical problems if we choose to rely mainly, or even solely, on the automated tests. Even with state-of-the-art automated tests (the best we can do across the industry), I still believe we need to ask additional questions about the software being tested. Sadly, in my experience, most automated tests are poorly designed and implemented, which increases the likelihood of problems eluding them.

  • “Could do better…”
  • Delivery includes automated builds, testing, etc.
  • Examples of vast differences in the performance test results from various commercial performance-testing products

    1. Pushing the boundaries of User Experience Test Automation
       @EuroStar 2011
       Julian Harty
       15th October 2011
    2. What we want to achieve
       To automatically detect (some) issues that may adversely affect the User Experience
    3. So what are the business problems?
       • Quality-In-Use:
         – UX, cross-browser & accessibility issues on the live site
       • Engineering Productivity:
         – Time taken from feature-complete to live site
       • Quality Engineering:
         – Over-reliance on manual testing
         – Existing test automation under-performing
    4. And some UX Test Automation problems…
       • Minimal existing work in industry
         – Existing tools tend to test static aspects
       • Academic work appears to be unsuitable
       • How to automatically measure and test UX at all!
    5. UX = User Experience
       • Dynamic, based on using the system
       • We focus on the Human + Computer interactions
    6. Using heuristics*
       1. When I navigate by tabbing I should return to where I started in a reasonable number of key-presses
       2. Flow should be consistent through web forms
       3. Look-alike & work-alike across web browsers
       4. I should not be able to bypass the security of the site from links in the UI
       * Fallible, but useful, guide/approach/practice
    7. Equivalence of input paths
       • Mouse: clicks… New → Dropdown → Select → Send
       • Keyboard: keystrokes… Tab, Space, Down-arrow, Enter, Down-arrow, Down-arrow, Down-arrow, Down-arrow, Tab, Enter
       (Designing and Engineering Time by Steven Seow)
    8. A real example: Information for Speakers
       • Mouse: mouseover + click = 2 steps
       • Keyboard: 13 Tabs + 22 Tabs = 35 steps
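The step-counting heuristic on slides 6–8 can be sketched in a few lines of Python. This is a minimal illustration, not part of any existing tool: the event names, the `max_ratio` threshold, and both function names are assumptions made up for the example.

```python
# Sketch of the heuristic: compare the cost of reaching the same control
# by mouse versus by keyboard, and flag a large imbalance as a potential
# UX problem. Event names and the threshold are illustrative assumptions.

def input_path_cost(events):
    """Count the discrete user actions (clicks or key-presses) in a path."""
    return len(events)

def check_keyboard_equivalence(mouse_path, keyboard_path, max_ratio=5):
    """Flag a potential problem when the keyboard path needs far more
    steps than the equivalent mouse path (e.g. 35 Tabs versus 2 clicks)."""
    mouse_cost = input_path_cost(mouse_path)
    keyboard_cost = input_path_cost(keyboard_path)
    ok = keyboard_cost <= mouse_cost * max_ratio
    return {"mouse": mouse_cost, "keyboard": keyboard_cost, "ok": ok}

# The 'Information for Speakers' example from slide 8:
result = check_keyboard_equivalence(
    ["mouseover", "click"],      # 2 mouse steps
    ["Tab"] * 13 + ["Tab"] * 22  # 35 keyboard steps
)
print(result)  # keyboard needs 35 steps for a 2-step mouse task -> not ok
```

In a real test the two paths would be recorded while driving the browser; the comparison logic stays the same.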
    9. Automated exploration and discovery*
       • Automating heuristic tests
         – Web accessibility testing
         – Fighting Layout Bugs
         – BiDiChecker
       • Using crawling tools
         – Crawljax
       * We’re working on complementary work on interactive exploration and discovery
    10. Man & Machine
        • Fully automated execution, human inspection
        • Interactive testing, aided by tools
    11. The moving parts…
        [Architecture diagram: our end-to-end tests drive WebDriver through a stack of decorators [1] — Crawljax, Fighting Layout Bugs, web-accessibility-testing, the W3C validator, security checkers, and cross-browser checking.]
        [1] http://en.wikipedia.org/wiki/Decorator_pattern
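The decorator arrangement on slide 11 can be sketched as follows. To keep the example self-contained, a stub stands in for a real WebDriver; the class names and the toy check are assumptions invented for this sketch. A real setup would wrap Selenium's WebDriver the same way: each decorator exposes the driver interface, forwards every call, and runs its checks around the delegated call.

```python
# Decorator-pattern sketch: layered checks run transparently around a
# driver. StubDriver and no_todo_markers are illustrative stand-ins.

class StubDriver:
    """Stand-in for WebDriver; just records the pages it visits."""
    def __init__(self):
        self.visited = []
    def get(self, url):
        self.visited.append(url)
        return "<html><body>page at %s</body></html>" % url

class CheckingDriverDecorator:
    """Wraps any driver and runs a list of check functions on each page."""
    def __init__(self, driver, checks):
        self.driver = driver
        self.checks = checks
        self.problems = []
    def get(self, url):
        page = self.driver.get(url)   # delegate to the wrapped driver
        for check in self.checks:     # then run every layered check
            self.problems.extend(check(page))
        return page

def no_todo_markers(page):
    """Toy check: flag pages that still contain TODO text."""
    if "TODO" in page:
        return [{"testName": "noTodoMarkers", "severity": "warning"}]
    return []

driver = CheckingDriverDecorator(StubDriver(), [no_todo_markers])
driver.get("http://example.com/")
print(driver.problems)  # [] -- no problems found on this page
```

Because decorators share the driver interface, they can be stacked in any order, which is how the tools on slide 11 combine without knowing about each other.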
    12. DEMO
    13. Sample screenshots
        http://adsense.google.com
        http://www.google.co.in
        http://www.eurostarconferences.com/Registration.aspx?type=delegate&id=4
    14. And what about Static Analysis?
        • Don’t overlook the value of static analysis (but don’t rely on it either…)
        • Find ways to filter the reported results
        • We’ve created automated Accessibility Tests, which check the contents of pages using WCAG guidelines.
    15. Problems detected:
        {"testName": "checkMouseEventHaveKeyboardEvent",
         "description": "Check that elements that have mouse actions also have keyboard actions.",
         "severity": "error",
         "elementCode": "<a href=\"#\" onclick=\"window.open('/Content/Terms-and-conditions.aspx', 'termsAndConditions', 'width=1000,height=600,resizable=yes,toolbars=0,menubar=no,scrollbars=yes,status=no'); return false;\">Click here to read terms and conditions</a>"}

        “checkTitleIsNotEmpty” for the entire page

        {"testName": "checkAltTextOnImage",
         "description": "Check that visible images have alt text",
         "severity": "error",
         "elementCode": "<img src=\"/themes/Eurostar/images/ico-linkedin.png\" />"}
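A check like the `checkAltTextOnImage` finding on slide 15 can be re-implemented in miniature with only the standard library. The real project inspects the live DOM through WebDriver; here a plain HTML parser is assumed instead, and the class name is made up for the sketch. Only the problem-record format follows the slide.

```python
# Minimal sketch of an accessibility check: collect a problem record,
# in the slide's JSON shape, for every <img> without a non-empty alt.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if not alt:  # missing or empty alt attribute
                self.problems.append({
                    "testName": "checkAltTextOnImage",
                    "description": "Check that visible images have alt text",
                    "severity": "error",
                    "elementCode": self.get_starttag_text(),
                })

checker = AltTextChecker()
checker.feed('<img src="/themes/Eurostar/images/ico-linkedin.png" />'
             '<img src="jeans.png" alt="Mens Black Levi 501 Jeans" />')
print(len(checker.problems))  # 1 -- only the first image lacks alt text
```

Driving the same logic against `driver.page_source` after each navigation is what turns a one-off script into an automated heuristic.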
    16. Now what?
        “Sure, Accessibility is important: file it with the rest of the things we should do...”
        What should be done seldom gets done...
    17. 3 improvements for the price of 1
        • Accessibility + Testability
          – Both need to interpret the contents in the browser
          – Improving one often improves the other
        • Accessibility + SEO*
          – Both find value in descriptive labels
        [*] SEO: Search Engine Optimization
        Example alt text for a picture of jeans: alt="" vs. alt="picture" vs. alt="Mens Black Levi 501 Jeans 32\" Waist / 32\" Leg Excellent Condition"
    18. Happiness
        • Improved Accessibility: users are happier
        • Improved Testability: we’re happier
        • Improved SEO: business is happier
    19. Beware of (over-)trusting test automation
        • Automation as servant, not master, of our software delivery
          – Inaccurate results
          – “Beware of Automation Bias” by M.L. Cummings [1]
        • Automated tests and checks miss more than they find
        • Make behaviour easier to assess
          – Screenshots and contents of the DOM to verify after tests have run
          – Automated video recordings
        [1] http://citeseerx.ist.psu.edu/viewdoc/download?doi=
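The "make behaviour easier to assess" point on slide 19 amounts to capturing evidence after each run. A minimal sketch, with a stub in place of a real driver: with Selenium the equivalent calls would be `get_screenshot_as_png()` and the `page_source` property, but the stub class and function names here are illustrative assumptions.

```python
# Sketch: after an automated test, bundle whatever a human reviewer
# would need to judge what the test actually saw.

class EvidenceStubDriver:
    """Stand-in for a real WebDriver session."""
    page_source = "<html><body>checkout page</body></html>"
    def screenshot(self):
        # Placeholder bytes standing in for a real PNG capture.
        return b"\x89PNG..."

def capture_evidence(driver, test_name):
    """Bundle a screenshot and the DOM so a person can inspect the run."""
    return {
        "test": test_name,
        "screenshot": driver.screenshot(),
        "dom": driver.page_source,
    }

evidence = capture_evidence(EvidenceStubDriver(), "checkout_flow")
print(sorted(evidence))  # ['dom', 'screenshot', 'test']
```

In practice the bundle would be written to disk (or attached to the test report) in a teardown hook, so evidence survives even when the test passes.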
    20. What we think we’re testing…
    21. What our automated test actually interacts with…
    22. (Some of the) things our automated tests may miss…
        • Promotions
        • Navigation
        • History
        • Layout, rendering & formatting problems
        • JavaScript
        • CSS and HTML
    23. Beware of poor-quality automated tests
        • AssertTrue(true);
        • No / inadequate / too many asserts
        • Poor code design
        • Falsehoods (false positives / negatives)
        • Focus on:
          – Improving the quality of your automated tests
          – Finding ways to improve the quality, and testability, of the code being tested
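Slide 23's `AssertTrue(true)` anti-pattern is easiest to see side by side with a check that can actually fail. The discount function and both test names are made-up examples for illustration.

```python
# A weak test that can never fail, versus a potent one that pins down
# the behaviour it claims to verify.

def discounted_price(price, percent_off):
    """Toy function under test: apply a percentage discount."""
    return round(price * (100 - percent_off) / 100, 2)

def weak_test():
    discounted_price(100, 10)
    assert True  # always passes: verifies nothing about the result

def potent_test():
    assert discounted_price(100, 10) == 90.0    # fails if the logic breaks
    assert discounted_price(19.99, 0) == 19.99  # boundary: no discount

weak_test()
potent_test()
print("both ran; only the second could have caught a bug")
```

The weak version inflates the pass count while adding nothing; as the slide says, the fix is to improve the tests, not to accumulate more of them.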
    24. Increasing the signal-to-noise of test results
        • Suppress unwanted ‘warnings’
          – Cf. static analysis tools
        • Increase potency of tests
        • Consider dumping ineffective tests
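Slide 24's suppression idea can be sketched as a filter over the problem records, the way static-analysis tools honour a suppression list. The record shape follows the JSON on slide 15; the function name, severity ordering, and suppression list are assumptions made for this sketch.

```python
# Sketch: raise the signal-to-noise of a findings report by dropping
# low-severity items and explicitly accepted ('suppressed') findings.

SUPPRESSED = {("checkAltTextOnImage", "warning")}  # accepted for now

def filter_report(problems, min_severity="warning", suppressed=SUPPRESSED):
    order = {"info": 0, "warning": 1, "error": 2}
    keep = []
    for p in problems:
        if order[p["severity"]] < order[min_severity]:
            continue  # below the severity floor
        if (p["testName"], p["severity"]) in suppressed:
            continue  # on the suppression list
        keep.append(p)
    return keep

report = filter_report([
    {"testName": "checkAltTextOnImage", "severity": "warning"},
    {"testName": "checkTitleIsNotEmpty", "severity": "error"},
    {"testName": "checkColourHint", "severity": "info"},
])
print([p["testName"] for p in report])  # ['checkTitleIsNotEmpty']
```

Keeping the suppression list in version control makes each accepted finding a deliberate, reviewable decision rather than silent noise.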
    25. Further reading and research
        The opensource project: http://code.google.com/p/web-accessibility-testing
        Finding Usability Bugs with Automated Tests: http://queue.acm.org/detail.cfm?id=1925091
        Fighting Layout Bugs: http://code.google.com/p/fighting-layout-bugs/
        Experiences Using Static Analysis to Find Bugs: http://www.google.com/research/pubs/pub34339.html
        My blog: http://blog.bettersoftwaretesting.com/
        “Beware of Automation Bias” by M.L. Cummings: http://citeseerx.ist.psu.edu/viewdoc/download?doi=
        Designing and Engineering Time by Steven Seow, ISBN 978-0-321-50918-5
    26. Questions now? Questions later…
        julianharty@gmail.com
        jharty@ebay.com
        Acknowledgements: Thanks to Bart Knaak and Jacob Hooghiemstra at StarEast 2011, who helped me improve the material with their comments and ideas, and to Damien Allison and Jonas Klink, my colleagues on the opensource project.