#W4A2011 - C. Bailey



Development and Trial of an Educational Tool to Support the Accessibility Evaluation Process - Presented at w4a11, W4A 2011.



  1. Development and Trial of an Educational Tool to Support the Accessibility Evaluation Process
     Christopher Bailey, Dr. Elaine Pearson, Teesside University, [email_address]
  2. Related Work
     - Manual evaluation plays an important role in accessibility evaluation (WCAG 2.0, UWEM, BW).
     - The expertise level of the evaluator is significant.
     - Experts take less time, are more productive, are more confident in their judgements, and can rate pages against different disability types (Yesilada et al., 2009 – Barrier Walkthrough).
     - Experts produce fewer false positives and false negatives than novices (Brajnik, 2010 – WCAG 2.0).
     - Only 8 of 25 level-A success criteria are reliably testable by novices (Alonso et al., 2010).
     - Important factors are Comprehension, Knowledge and Effort.
  3. Context
     - How can novices develop manual evaluation skills?
     - Evaluation reports have positive educational and motivational aspects.
     - Having novices conduct evaluations has strong potential pedagogical value.
     - Goal: produce a method that guides novices through an evaluation, mimicking an expert process.
     - Students in our institution need support with accessibility.
     - We developed the Accessibility Evaluation Assistant (AEA) as a solution.
  4. AEA Development
     - The tool lists accessibility checks based on three evaluation contexts:
       - User Group
       - Site Features
       - Check Categories
     - The evaluation is focussed on Check Categories:
       - Design Checks: concerned with aspects of visual presentation.
       - User Checks: the auditor must interact with the website; results are often subjective.
       - Structural Checks: structural and semantic information.
       - Technical Checks: metadata and valid code.
       - Core Checks: checks which refer to specific content or functionality, or which apply to the entire website.
  5. Structured Walkthrough Method
     - Based on the Barrier Walkthrough (Brajnik).
     - A heuristic evaluation technique based on checking for potential accessibility barriers.
     - Information provided for each check:
       - The accessibility principle being checked.
       - The user group(s) affected.
       - The nature of the barrier or problem caused.
       - A procedure for checking and verifying the issue.
       - An example video tutorial.
     - The procedure may be manual, automatic, or both.
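The per-check information listed above could be modelled as a small record type. This is an illustrative sketch only, not the AEA's actual data model — all class and field names here are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class Procedure(Enum):
    """How a check is verified, per the walkthrough method."""
    MANUAL = "manual"
    AUTOMATIC = "automatic"
    BOTH = "both"

@dataclass
class AccessibilityCheck:
    """One walkthrough check, mirroring the information each check carries."""
    principle: str            # accessibility principle being checked
    user_groups: list[str]    # user group(s) affected
    barrier: str              # nature of the barrier or problem caused
    procedure: Procedure      # manual, automatic, or both
    tutorial_url: str = ""    # example video tutorial

# A made-up example check for illustration.
check = AccessibilityCheck(
    principle="Text alternatives for images",
    user_groups=["Blind", "Low vision"],
    barrier="Screen-reader users cannot perceive informative images",
    procedure=Procedure.BOTH,
)
```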
  6. Trial Methodology
     - Aimed to test the reliability (different evaluators reach the same decision) and validity (true accessibility barriers are identified) of the AEA.
     - Trialled with 38 undergraduate Computing students.
     - Each participant was assigned 1 website (from 4) and had to evaluate 3 pages, the Home Page being compulsory.
     - For each check, the participant had to decide whether the check was Met, Not Met, Partly Met or Not Applicable.
     - Participants had to explain and justify their decisions.
  7. Consensus Analysis
     - The W3C states that a check is reliably human testable if 80% of knowledgeable evaluators reach the same conclusion.
     - Proportion of the 48 checks on which evaluator consensus exceeded each threshold (Home Page):

     | Home Page          | >50%        | >60%        | >70%        |
     |--------------------|-------------|-------------|-------------|
     | Vancouver Olympics | 96% (46/48) | 79% (38/48) | 60% (29/48) |
     | CNN                | 83% (40/48) | 66% (32/48) | 50% (24/48) |
     | Premier League     | 71% (34/48) | 58% (32/48) | 45% (22/48) |
     | WalMart            | 79% (38/48) | 67% (32/48) | 56% (27/48) |
     | Combined Results   | 82%         | 68%         | 53%         |
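Consensus figures of this kind can be computed mechanically: for each check, take the fraction of evaluators who gave the modal decision, then count how many checks exceed each threshold. A minimal sketch with toy data — the function names and sample decisions below are invented for illustration, not taken from the trial:

```python
from collections import Counter

def consensus_level(decisions: list[str]) -> float:
    """Fraction of evaluators who gave the most common decision for one check."""
    counts = Counter(decisions)
    return counts.most_common(1)[0][1] / len(decisions)

def checks_above(threshold: float, all_decisions: list[list[str]]) -> float:
    """Proportion of checks whose evaluator consensus exceeds `threshold`."""
    hits = sum(1 for d in all_decisions if consensus_level(d) > threshold)
    return hits / len(all_decisions)

# Toy data: 3 checks, each judged by 10 novices.
decisions = [
    ["Met"] * 9 + ["Not Met"],         # 90% consensus
    ["Met"] * 6 + ["Partly Met"] * 4,  # 60% consensus
    ["Not Met"] * 5 + ["Met"] * 5,     # 50% consensus
]
print(checks_above(0.5, decisions))  # 2 of 3 checks exceed 50% consensus
```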
  8. Expert Agreement by Website
     - If we take the expert decision as ‘correct’, we can draw some initial conclusions about the reliability of the AEA method.

     | Home Page          | Decisions Matching Expert (%) | Checks with majority expert agreement |
     |--------------------|-------------------------------|---------------------------------------|
     | Vancouver Olympics | 65%                           | 36/48                                 |
     | CNN                | 60%                           | 31/48                                 |
     | Premier League     | 56%                           | 25/48                                 |
     | WalMart            | 58%                           | 29/48                                 |
     | Combined Results   | 60%                           | 124/192                               |
  9. Expert Agreement by Category
     - We examined which checks were most likely to be correctly evaluated by the novices, and thus be ‘valid’.
     - The example refers to the Vancouver website.

     | Check Category | Novice agreement with expert |
     |----------------|------------------------------|
     | Design         | 59%                          |
     | User           | 49%                          |
     | Structural     | 68%                          |
     | Technical      | 76%                          |
     | Global         | 85%                          |
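Per-category agreement figures reduce to grouping novice/expert decision pairs by check category and computing the match rate in each group. A hedged sketch with made-up data (the helper name and sample triples are hypothetical):

```python
from collections import defaultdict

def agreement_by_category(results: list[tuple[str, str, str]]) -> dict[str, float]:
    """results: (category, novice_decision, expert_decision) triples.
    Returns, per category, the fraction of novice decisions matching the expert."""
    matched: dict[str, int] = defaultdict(int)
    total: dict[str, int] = defaultdict(int)
    for category, novice, expert in results:
        total[category] += 1
        matched[category] += (novice == expert)
    return {c: matched[c] / total[c] for c in total}

# Toy data: four decisions across two categories.
results = [
    ("Technical", "Met", "Met"),
    ("Technical", "Not Met", "Met"),
    ("Design", "Partly Met", "Partly Met"),
    ("Design", "Met", "Met"),
]
print(agreement_by_category(results))  # {'Technical': 0.5, 'Design': 1.0}
```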
  10. Implications
      - Given that the auditors were conducting their first evaluation, we consider these figures promising.
      - Validity would have been higher without the ‘Partly Met’ option; novices had 4 choices compared to 3.
      - Novices using the AEA tool are able to identify accessibility barriers, so the method is partially effective.
      - Its structured nature makes it usable: it can be applied, repeated, learned and remembered.
      - Checks that require subjective judgement still have the lowest level of accuracy.
      - We can provide a method, but we cannot control how well the novices use it, or how much effort they put into using it.
  11. Student Experience
      - As an evaluation methodology there are issues, but there is potential for the AEA as an educational tool.
      - Even if students made incorrect decisions, they were made aware of the range of issues and their implications.
      - “I did not know there were so many considerations when developing a website – the AEA made me realise there is more than just producing valid code.”
      - “Carrying out an accessibility evaluation really helped me understand the importance of accessibility. Before I carried out the evaluation I didn’t know what checks to carry out.”
  12. Future Work
      - Improve the clarity and accuracy of checks by removing ambiguities.
      - Provide clearer guidance on what conditions are required for each decision.
      - Improve the interface, group related checks and integrate automated tools.
      - Further analyse the results to identify overall figures for false positives and false negatives.
      - Identify the extent to which the novices identified the accessibility barriers present on the site.
  13. Questions?
      Christopher Bailey, Dr. Elaine Pearson, Teesside University, [email_address]
      Accessibility Evaluation Assistant: http://arc.tees.ac.uk/aea