On Web Accessibility Environments

The presentation of our work on comparing Web accessibility evaluation environments, and why evaluating in the browser is better.

    1. On Web Accessibility Evaluation Environments
       W4A 2011
       Nádia Fernandes, Rui Lopes, Luís Carriço
       {nadia.fernandes,rlopes,lmc}@di.fc.ul.pt
    2. Introduction
       • Web accessibility is tripartite: page semantics, assistive technologies, browser capabilities
       • Modern Web development transcends static HTML
       • Evaluation tools (typically) stay outside the browser
       • Goal: study the impact of evaluating accessibility in the browser
    3. Web Browsing
       • Web page: the first HTTP response
       • Resources and ancillary transformations (CSS, JavaScript)
       • AJAX and browsing-time transformations, as sketched below
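    To make the browsing-time transformations concrete, here is a minimal sketch (not from the paper; the element id is hypothetical) of a script that rewrites part of a page after the first HTTP response has been parsed, so the DOM the user experiences differs from the HTML an out-of-browser tool fetches:

    ```javascript
    // Illustrative browsing-time transformation (not from the paper;
    // the 'menu' id is hypothetical): content is rewritten only after
    // the first HTTP response has been parsed by the browser.
    document.addEventListener('DOMContentLoaded', function () {
      var menu = document.getElementById('menu');
      if (menu) {
        menu.innerHTML = '<a href="/home">Home</a> <a href="/news">News</a>';
      }
    });
    ```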
    4. Hypothesis
       Evaluating Web content in the browser provides more accurate and more in-depth analysis of its accessibility.
       • Need to understand the differences and limitations of evaluation environments
       • The evaluation process (i.e., its implementation) must be the same in both environments, for comparison purposes
    5. Web Accessibility Evaluation Environments
    6. Implementation
       • JavaScript, same techniques (18) in both environments
       • Results transformed into EARL & CSV
       • Execution in each environment at its respective timing: command line after the HTTP GET, browser via a bookmarklet (a sketch of such an in-browser check follows below)
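    As a rough sketch of what an in-browser check could look like (illustrative only; this is not one of the paper's 18 techniques verbatim, and the CSV row format is our assumption), a simple alt-text technique run over the live DOM might be:

    ```javascript
    // Illustrative in-browser check (not one of the paper's 18
    // techniques verbatim; the CSV row format is our assumption):
    // it runs over the live DOM, after scripts have executed,
    // and emits one CSV row per image element.
    (function () {
      var rows = ['element,outcome'];
      var imgs = document.getElementsByTagName('img');
      for (var i = 0; i < imgs.length; i++) {
        rows.push('img#' + i + ',' +
          (imgs[i].hasAttribute('alt') ? 'pass' : 'fail'));
      }
      console.log(rows.join('\n')); // the paper also serializes to EARL
    })();
    ```

    Minified into a single javascript: URL, the same function can be installed as a bookmarklet, matching the browser-side execution timing described on the slide.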
    7. Testability & Validation
       • Testbed with 102 HTML documents
       • The implementation returns the same results in both environments for the same HTML document
    8. Experimental Study
       • Analysis of evaluation results in both environments
       • 82 homepages from the Alexa Top 100 Web sites
    9. Data Acquisition and Processing
       • Time between HTTP GETs: 89.72 s (σ ≈ 70 s)
       • Average document size:
         • Command line: 70 KB (σ ≈ 95 KB)
         • Browser: 81 KB (σ ≈ 127 KB)
       • Average HTML element count per document:
         • Command line: 915 elements
         • Browser: 1154 elements
    10. Differences in the same HTML document across the two environments were observed, and they are significant.
    11. Results: Average Outcomes
        Successes
        • Command line: 9.67 elements (σ ≈ 19.12)
        • Browser: 272.78 elements (σ ≈ 297.10)
        Failures
        • Command line: 47.44 elements (σ ≈ 70.82)
        • Browser: 90.10 elements (σ ≈ 125.93)
        Warnings
        • Command line: 425.02 elements (σ ≈ 682.53)
        • Browser: 685.21 elements (σ ≈ 1078.10)
    12. Results: Incorrect Outcomes
        • False positives and false negatives were found in the command-line evaluation
        • 67% of the criteria yielded false negatives
    13. Results: Criterion 1.1.1
        • Availability of alternative text content
        • We detected a sharp increase in scripted image injection, as illustrated in the sketch below
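    The pattern behind this finding can be shown with a minimal sketch (the file name is hypothetical): an image injected by script never appears in the HTML fetched by a command-line tool, so its missing alternative text goes untested there:

    ```javascript
    // Illustrative scripted image injection (file name is hypothetical):
    // the <img> exists only in the browser's DOM, so only the in-browser
    // evaluation can test it against criterion 1.1.1.
    var promo = document.createElement('img');
    promo.src = 'promo.png';
    // No alt attribute is set: the browser-side evaluation reports a
    // failure that the command-line evaluation cannot observe.
    document.body.appendChild(promo);
    ```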
    14. Results: Other Criteria
        • Criterion 1.2.3 (Media alternatives): almost all occurrences appear in the browser
        • Criterion 2.4.4 (Link purpose): one case of false-positive detection
        • Criterion 3.2.2 (On submit buttons): some cases where buttons were injected
    15. Conclusions
        • Accessibility evaluation study of 82 homepages from the Alexa Top 100 Web sites
        • Comparison of evaluation environments
        • Scripts alter Web pages in a significant way
        • Accessibility evaluation is affected: the command-line environment yields both incorrect and incomplete results
    16. Limitations
        • Possible artifacts introduced between requests for the same Web page
        • Analysis restricted to the HTML DOM tree
        • No way to check whether a given individual result occurs in both environments
        • Automated evaluation yields limited results
    17. Ongoing Work
        • Implementation of post-cascading and post-content-flow CSS-aware techniques
        • Continuous monitoring of DOM manipulation (e.g., to detect live regions); one possible mechanism is sketched below
        • Detection of differences between DOM trees, to pinpoint results
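    One way to implement the continuous monitoring item today is the standard MutationObserver API; the slides do not name a mechanism, so this is only a sketch of the idea:

    ```javascript
    // Sketch of continuous DOM monitoring with MutationObserver (our
    // choice of mechanism; the slides do not name one): log every node
    // insertion and attribute change, e.g. to detect live regions.
    var observer = new MutationObserver(function (mutations) {
      mutations.forEach(function (m) {
        console.log(m.type, m.target.nodeName); // hook a re-evaluation here
      });
    });
    observer.observe(document.body, {
      childList: true,   // nodes added or removed
      attributes: true,  // e.g. aria-live attribute changes
      subtree: true      // watch the whole document body
    });
    ```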
    18. Thank you
        rlopes@di.fc.ul.pt
