1. On Web Accessibility
Evaluation Environments
W4A 2011
Nádia Fernandes, Rui Lopes, Luís Carriço
{nadia.fernandes,rlopes,lmc}@di.fc.ul.pt
2. Introduction
• Modern Web development transcends static HTML
• Evaluation tools (typically) stay outside the browser
• Goal: to study the impact of evaluating accessibility in the browser
3. Web Browsing
• Web page, first HTTP response
• Resources, ancillary transformations (CSS, JavaScript)
• AJAX, browsing-time transformations
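As an illustration of such browsing-time transformations (our sketch, not from the paper), a page script can inject content that never appears in the first HTTP response, so an evaluator parsing only the raw HTML misses it:

  // Hypothetical transformation: an image added after the page loads.
  // Tools that parse only the first HTTP response never see this element.
  window.addEventListener('load', function () {
    var img = document.createElement('img');
    img.src = 'banner.png'; // note: no alt attribute
    document.body.appendChild(img);
  });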
4. Hypothesis
Evaluating Web content in the browser provides a more accurate and more in-depth analysis of its accessibility.
• Need to understand the differences and limitations of evaluation environments
• The evaluation process (i.e., its implementation) must be the same in both environments, for comparison purposes
6. Implementation
• JavaScript, same techniques (18) in both environments
• Results transformed into EARL & CSV
• Execution in each environment at its respective timing:
• Command line, after HTTP GET
• Browser, via bookmarklet
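A bookmarklet for this purpose can be as small as the following sketch; the script URL is hypothetical, not the authors':

  javascript:(function () {
    // Load the evaluator into the live page, so the same 18 techniques
    // run against the browser-side DOM.
    var s = document.createElement('script');
    s.src = 'http://example.org/evaluator.js'; // hypothetical URL
    document.getElementsByTagName('head')[0].appendChild(s);
  })();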
7. Testability & Validation
• Testbed with 102 HTML documents
• Implementation returns the same results in both environments, for the same HTML document
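One way to automate this validation (directory layout and file names are our assumptions) is a Node.js harness that compares the serialized results per document:

  // Hypothetical harness: results for each testbed document must be
  // identical across environments.
  var fs = require('fs');
  fs.readdirSync('results/cli').forEach(function (file) {
    var cli = fs.readFileSync('results/cli/' + file, 'utf8');
    var browser = fs.readFileSync('results/browser/' + file, 'utf8');
    if (cli !== browser) {
      console.error('Mismatch for ' + file);
    }
  });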
9. Data Acquisition and Processing
• Time between HTTP GETs, 89.72s (σ ≈ 70s)
• Document size average:
• Command line, 70KB (σ ≈ 95KB)
• Browser, 81KB (σ ≈ 127KB)
• Document HTML element count average:
• Command line, 915 elements (σ ≈ 95)
• Browser, 1154 elements (σ ≈ 95)
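Both measurements can be taken directly from the live DOM in the browser; a minimal sketch (variable names are ours):

  // Count all elements and the size of the serialized document.
  var elementCount = document.getElementsByTagName('*').length;
  // outerHTML.length counts characters (≈ bytes for ASCII-only pages).
  var serializedSize = document.documentElement.outerHTML.length;
  console.log(elementCount + ' elements, ~' + serializedSize + ' bytes');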
10. Significant differences were observed between the HTML document obtained in the two environments.
11. Results: Average Outcomes
Successes
• Command line, 9.67 elements (σ ≈ 19.12)
• Browser, 272.78 elements (σ ≈ 297.10)
Failures
• Command line, 47.44 elements (σ ≈ 70.82)
• Browser, 90.10 elements (σ ≈ 125.93)
Warnings
• Command line, 425.02 elements (σ ≈ 682.53)
• Browser, 685.21 elements (σ ≈ 1078.10)
12. Results: Incorrect Outcomes
• False positives and false negatives were found in the applicability of Command line evaluation
• 67% of the criteria yielded false negatives
13. Results: Criterion 1.1.1
• Availability of alternative text content
• We detected a large increase in script-injected images
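A minimal check in the spirit of Criterion 1.1.1 (our sketch, not the paper's implementation) shows why the live DOM matters: run in the browser, it also flags script-injected images:

  // Flag <img> elements that lack an alt attribute entirely.
  var missingAlt = document.querySelectorAll('img:not([alt])');
  console.log(missingAlt.length + ' image(s) without alternative text');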
14. Results: Other Criteria
Criterion 1.2.3 (Media alternatives)
Almost all occurrences found in the browser
Criterion 2.4.4 (Link purpose)
One detected case of false positives
Criterion 3.2.2 (On Input)
Some cases where submit buttons were injected
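A hypothetical instance of the 3.2.2 cases: a page script adds a submit button after load, so raw-HTML evaluation never examines it:

  // Inject a submit button into the first form, if one exists.
  var form = document.forms[0];
  if (form) {
    var btn = document.createElement('input');
    btn.type = 'submit';
    form.appendChild(btn);
  }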
15. Conclusions
• Accessibility evaluation study of 82 homepages from the Alexa Top 100 Web sites
• Comparison of evaluation environments
• Scripts alter Web pages in a significant way
• Accessibility evaluation is affected: the Command line environment yields both incorrect and incomplete results
16. Limitations
• Possibility of artifacts introduced between requests for the same Web page
• Analysis limited to the HTML DOM tree
• Impossibility of checking whether a given individual result occurs in both environments
• Automated evaluation yields limited results
17. Ongoing Work
• Implementation of post-cascading and post-content-flow CSS-aware techniques
• Continuous monitoring of DOM manipulation (to, e.g., detect live regions; see the sketch after this list)
• Detection of differences between DOM trees, to pinpoint results
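Continuous monitoring could build on the standard MutationObserver API; a minimal sketch:

  // Watch the whole document for subtree changes, e.g. regions that
  // update at browsing time (candidate live regions).
  var observer = new MutationObserver(function (mutations) {
    mutations.forEach(function (m) {
      console.log(m.addedNodes.length + ' node(s) added under ' + m.target.nodeName);
    });
  });
  observer.observe(document.documentElement, { childList: true, subtree: true });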