Viking methodology

A hands-on discussion of web accessibility testing techniques

  1. Viking Accessibility: The Warrior's Approach to Hands-on Testing
  2. Karl Groves, karl@simplyaccessible.com, @karlgroves
  3. Goals and Objectives: • Understand accessibility testing techniques • Understand common challenges by content type • Gain hands-on knowledge
  4. Resources: http://examples.simplyaccessible.com/vikinghandson/
  5. Review - Understanding Disability
  6. Visual Impairment: • Blindness • Partially sighted • Low vision • Colorblindness. What types of challenges will they have on the web?
  7. Hearing Impairment: • Deafness (one or both ears) • Hard of hearing • High/low frequency hearing loss. What types of challenges will they have on the web?
  8. Motor Impairment: • Loss of limbs, digits • Palsy disorders • Repetitive stress injuries • Arthritis • Spinal cord injuries • more. What types of challenges will they have on the web?
  9. Cognitive Impairment: • Autism • Brain injury • Parkinson's • Dyslexia • Alzheimer's • more. What types of challenges will they have on the web?
  10. Speech Impairment: • Stuttering • Muteness • Dysarthria (resulting from motor control disorders) • Articulation & phonemic disorders. What types of challenges will they have on the web?
  11. Principles of Accessibility
  12. POUR - Perceivable, Operable, Understandable, Robust. Focuses on user needs, not technology.
  13. Assistive Technologies
  14. Screen Readers: • Intercept what is sent to standard output • Object info & content rendered in text-to-speech
  15. Screen Readers - user needs: • Keyboard access • Text alternatives • Headings • Logical/sequential ordering • Proper labels
  16. Screen Magnification: • Enlarges on-screen content • Different magnification modes • Contrast modes • Cursor, pointer enhancement (photo: http://flickr.com/photos/justinstravels/322408478/)
  17. Screen Magnification - user needs: • Text alternatives • Resizable layouts • Flexible content
  18. Voice Recognition: Accepts user commands to activate controls and interact with the system
  19. Voice Recognition - user needs: • Device independence • Accurate text alternatives • Accurate labels
  20. Hardware: As diverse as the array of possible disabilities and severities thereof
  21. Often Combined (photo: http://flickr.com/photos/kazuhito/132436943/)
  22. Approaches to Testing
  23. Automated Testing. What is it? Use of a tool to access a web document and subject it to predetermined heuristic checks. • Plugins/toolbars • Desktop apps • Web apps
  24. Automated Testing. Pros: • Unprecedented efficiency (cost per issue) • Some issues don't require humans. Cons: • Incomplete coverage • False positives • Subjectivity in a11y • DOM testing • User interaction
  25. Manual Testing. What is it? Use of hands-on techniques to inspect for potential failures, possibly by emulating disabled user scenarios. • Code inspection • Hardware manipulation • Software/settings manipulation • AT testing
  26. Manual Testing. Pros: • Accuracy • Reliability • Judgment. Cons: • Time • Reliant on the tester's skill
  27. Use Case Testing. What is it? Analysis of system behavior by subjecting it to scenarios that touch on functional requirements, in this case doing so with assistive technologies
  28. Use Case Testing. Pros: • Can happen concurrently with other testing • Gives a glimpse of real-world issues faced by PWD. Cons: • Time • The tester must know the AT • Success with one AT !== success with all
  29. Usability Testing. What is it? Observation of test participants performing core user tasks, measuring efficiency, accuracy, recall, and emotional response.
  30. Usability Testing. Pros: Most closely represents the user's actual experience. Cons: • Expensive • Time-consuming • Results may be skewed by high-impact issues
  31. Tools. "It's a poor mechanic who blames his tools" - Old Man Brian
  32. Tools. Free: • Browser toolbars/plugins • Online evaluators. Non-free: • Enterprise testing suites
  33. Browser Toolbars/Plugins. Examples: • WAVE • Favelets Bar • Web Accessibility Toolbar • Web Developer Plugin • Fangs
  34. WAVE from WebAIM: http://wave.webaim.org
  35. Accessibility Evaluation Toolbar
  36. Web Developer Toolbar
  37. Favelets Bar
  38. Fangs
  39. Online Evaluators. Examples: • aChecker • FAE • Cynthia Says • WAVE
  40. WAVE
  41. FAE
  42. aChecker
  43. Enterprise Tools: • AMP - SSB BART Group • Compliance Sheriff - HiSoftware • Worldspace - Deque • Rational Studio - IBM • Compliance Guardian - AvePoint
  44. Viking Methodology. "Failure to plan is planning to fail" - Zig Ziglar
  45. Principles: • Utility • Accuracy • Efficiency • Reliability • Repeatability
  46. Driving Factors: • Modern websites are not composed of static content • Certain types of issues occur more often than others • Certain types of content have more issues than others • Certain types of issues are more impactful than others
  47. Audits vs. QA. Audits: • Should maximize utility • Focus on UI component types • Priority given to high-use, high-risk features and components. QA: • A11y should be part of the QA process • Deliver fast, accurate results & guidance • Focus only on in-scope work (i.e. user stories & features under development)
  48. Testing Web Content
  49. Markup and A11y: • All content must be marked up using the most appropriate elements & attributes for the job • All scripted controls must operate like the native controls they mimic
  50. Markup and A11y. Page structure: What is it? How are users impacted?
  51. Markup and A11y. Page structure requirements: • Valid, semantic markup • Page titles: unique, terse, clear, informative. How do we test this? (See the page-structure sketch after the slide list.)
  52. Page Structure
  53. Keyboard Access/Focus Control: What is it? How are users impacted?
  54. Keyboard Access/Focus Control requirements: • Focus order matches the expected interaction order • Items that should get focus do; items that should not get focus don't. How do we test this? (See the focus-order sketch after the slide list.)
  55. Keyboard Access and Focus Control
  56. CSS. Cascading Style Sheets: What is it? How are users impacted?
  57. CSS requirements: • Content must remain readable and operable • Visual indications must also be represented programmatically • Color contrast. How do we test this? (See the contrast-ratio sketch after the slide list.)
  58. CSS
  59. Forms: What are they? How are users impacted?
  60. Forms requirements: • All fields labeled tersely and clearly • Constraints identified • All fields operable via keyboard • Errors prevented • Error recovery facilitated. How do we test this? (See the form-labeling sketch after the slide list.)
  61. Forms
  62. Frames: What are they? How are users impacted?
  63. Frames requirements: Frames given clear, terse, informative titles. How do we test this? (See the frame-title sketch after the slide list.)
  64. Frames
  65. Images: What are they? How are users impacted?
  66. Images requirements: • Images not used to replace text • All images have a text alternative • All text alternatives sufficiently clear and informative • Background images and sprites not used for actionable items or content. How do we test this? (See the image-alternative sketch after the slide list.)
  67. Images
  68. Media: What is it? How are users impacted?
  69. Media requirements: • Captions • Transcripts • Audio description • Access to controls. How do we test this? (See the media sketch after the slide list.)
  70. Media
  71. Navigation: What is it? How are users impacted?
  72. Navigation requirements: • "Links" are actual links and use a valid href • Link text is unique, terse, clear, informative. How do we test this? (See the navigation sketch after the slide list.)
  73. Navigation
  74. Tables: What are they? How are users impacted?
  75. Tables requirements: • No tables for layout • Headers identified • Header relationships identified • Good structure. How do we test this? (See the table sketch after the slide list.)
  76. Tables
  77. Text Content: What is it? How are users impacted?
  78. Text Content requirements: • Proper use of headings • Headings are unique, terse, clear, informative • Proper use and structure of lists & sub-lists. How do we test this? (See the heading-outline sketch after the slide list.)
  79. Text Content
  80. JavaScript-driven Content: What is it? How are users impacted?
  81. JavaScript-driven Content requirements: • Device independence • Keyboard access/focus control • Name, State, Role, Value. How do we test this? (See the name/state/role/value sketch after the slide list.)
  82. JavaScript-driven Content. Name: What do we call this thing? State: What is it doing? (Implicitly) What else can it do? Role: What type of object is it? Value: What is its value (if it can have one)? Using standard controls in standard ways gives us this for free.
  83. JavaScript-driven Content
  84. simplyaccessible.com - Accessibility consulting, strategy and assessments; accessible development and remediation services; training courses, workshops and conferences. Karl Groves, karl@simplyaccessible.com, @karlgroves, +1 443.875.7343
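
The sketches below are referenced from the "How do we test this?" slides above. None of them appear in the original deck; they are minimal console snippets in the spirit of the favelets the deck mentions, and every selector, threshold, and message in them is an assumption added for illustration.

Page structure: a quick check that the page has a terse, informative title and some semantic structure to inspect further by hand.

```js
// Hypothetical console sketch: quick page-structure checks.
// Run in the browser console on the page under test.
(() => {
  const title = document.title.trim();
  console.log(title ? `Page title: "${title}"` : 'FAIL: missing or empty <title>');
  // "Terse" is subjective; ~70 characters is an assumed, adjustable threshold.
  if (title.length > 70) console.warn('Title may be too long to stay terse:', title.length, 'chars');
  // Rough overview of the semantic structure exposed to assistive technology.
  document.querySelectorAll('main, nav, header, footer, h1').forEach((el) =>
    console.log('Structure:', el.tagName.toLowerCase()));
})();
```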
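Keyboard access and focus control: a sketch that logs each element as it receives focus, so tabbing through the page produces a trace you can compare against the expected interaction order.

```js
// Hypothetical console sketch: log the focus order while tabbing through the page.
document.addEventListener('focusin', (event) => {
  const el = event.target;
  console.log(
    'Focused:',
    el.tagName.toLowerCase(),
    el.id ? `#${el.id}` : '',
    el.hasAttribute('tabindex') ? `tabindex=${el.getAttribute('tabindex')}` : ''
  );
});
// Also confirm by hand that focus is visible and never trapped.
```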
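CSS and color contrast: a sketch of the WCAG 2.x contrast-ratio calculation (the formula is WCAG's; the example colors are added for illustration).

```js
// Hypothetical sketch: WCAG 2.x contrast ratio for two sRGB colors given as [r, g, b] (0-255).
// WCAG 2.0 AA expects at least 4.5:1 for normal text and 3:1 for large text.
function relativeLuminance([r, g, b]) {
  const channel = (c) => {
    c /= 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

function contrastRatio(foreground, background) {
  const [lighter, darker] = [relativeLuminance(foreground), relativeLuminance(background)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: #777777 text on a white background is roughly 4.48:1, just under the 4.5:1 AA threshold.
console.log(contrastRatio([119, 119, 119], [255, 255, 255]).toFixed(2));
```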
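Form labeling: a sketch that flags fields with no programmatic label. It only covers the labeling requirement; constraints, keyboard operation, and error recovery still need hands-on testing.

```js
// Hypothetical console sketch: flag form fields with no programmatic label
// (no <label for>, wrapping <label>, aria-label, or aria-labelledby).
document
  .querySelectorAll('input:not([type=hidden]):not([type=submit]):not([type=button]), select, textarea')
  .forEach((field) => {
    const labelled =
      (field.id && document.querySelector(`label[for="${field.id}"]`)) ||
      field.closest('label') ||
      field.getAttribute('aria-label') ||
      field.getAttribute('aria-labelledby');
    if (!labelled) console.warn('Unlabeled field:', field);
  });
```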
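Frame titles: a sketch that lists frames and iframes missing a title attribute; whether each existing title is clear and informative still takes human judgment.

```js
// Hypothetical console sketch: frames with no title attribute.
document.querySelectorAll('iframe, frame').forEach((frame) => {
  const title = (frame.getAttribute('title') || '').trim();
  if (!title) console.warn('Frame without a title:', frame.src || frame);
  else console.log('Frame title:', title);
});
```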
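Image alternatives: a sketch that surfaces missing or suspicious alt text. The "suspicious" patterns (filenames, generic words) are assumed heuristics; actionable background images and sprites have to be found by inspection.

```js
// Hypothetical console sketch: missing or questionable alt text.
document.querySelectorAll('img').forEach((img) => {
  if (!img.hasAttribute('alt')) {
    console.warn('Missing alt attribute:', img.src);
  } else if (/\.(png|jpe?g|gif|svg)\s*$/i.test(img.alt) || /^(image|graphic|photo|picture)$/i.test(img.alt.trim())) {
    // A filename or a generic word is rarely a clear, informative alternative.
    console.warn('Questionable alt text:', JSON.stringify(img.alt), img.src);
  }
});
```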
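Media: a sketch limited to native <video> elements; it checks for caption/subtitle tracks and native controls. Custom players and hosted embeds will not be caught and need use-case testing with the actual assistive technology.

```js
// Hypothetical console sketch: caption tracks and controls on native <video> elements.
document.querySelectorAll('video').forEach((video) => {
  const captioned = video.querySelector('track[kind="captions"], track[kind="subtitles"]');
  if (!captioned) console.warn('Video with no caption/subtitle track:', video.currentSrc || video);
  if (!video.controls) console.warn('Video without native controls; verify the custom controls are keyboard accessible:', video);
});
```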
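Navigation: a sketch that flags anchors without a usable href. Fake links built from <span> or <div> plus a click handler cannot be detected reliably from the console and need code inspection.

```js
// Hypothetical console sketch: anchors that are not real links.
document.querySelectorAll('a').forEach((a) => {
  const href = a.getAttribute('href');
  if (!href || href === '#' || href.toLowerCase().startsWith('javascript:')) {
    console.warn('Anchor without a valid href:', a.textContent.trim() || a);
  }
});
```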
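Tables: a sketch that separates declared layout tables from data tables and flags data tables that expose no header markup; whether the header relationships are correct still requires reading the markup.

```js
// Hypothetical console sketch: data tables with no header cells or header associations.
document.querySelectorAll('table').forEach((table) => {
  if (table.getAttribute('role') === 'presentation') {
    console.log('Declared layout table:', table);
    return;
  }
  const headers = table.querySelectorAll('th, [scope], td[headers]');
  if (headers.length === 0) console.warn('Table with no header markup:', table);
});
```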
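Heading outline: a sketch that prints the heading structure and marks skipped levels, a quick proxy for proper, unique, informative headings (tools like WAVE and Fangs expose a similar outline).

```js
// Hypothetical console sketch: heading outline with skipped levels marked.
let previousLevel = 0;
document.querySelectorAll('h1, h2, h3, h4, h5, h6').forEach((heading) => {
  const level = Number(heading.tagName[1]);
  const skipped = previousLevel > 0 && level > previousLevel + 1;
  console.log(
    `${'  '.repeat(level - 1)}h${level}: ${heading.textContent.trim()}${skipped ? '  <-- skipped level' : ''}`
  );
  previousLevel = level;
});
```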
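Name/state/role/value: a sketch that spot-checks scripted controls exposed through a few common ARIA roles for an accessible name and keyboard reachability. The role list is an assumption; state and value (aria-pressed, aria-checked, aria-valuenow, and so on) have to be verified while actually operating each control, ideally with a screen reader.

```js
// Hypothetical console sketch: custom controls missing a name or tab stop.
const roles = ['button', 'checkbox', 'tab', 'slider', 'menuitem'];
document.querySelectorAll(roles.map((r) => `[role="${r}"]`).join(', ')).forEach((el) => {
  const name =
    el.getAttribute('aria-label') ||
    el.getAttribute('aria-labelledby') ||
    el.textContent.trim();
  if (!name) console.warn('Custom control without an accessible name:', el);
  const nativelyFocusable = ['a', 'button', 'input', 'select', 'textarea'].includes(el.tagName.toLowerCase());
  if (!nativelyFocusable && !el.hasAttribute('tabindex')) {
    console.warn('Custom control not reachable from the keyboard:', el);
  }
});
```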
