This document discusses finding a balance between automated and manual accessibility testing. It notes that automated testing is not a substitute for manual testing but should be part of an accessibility program. Both automated and manual testing have advantages and disadvantages. The document emphasizes that bringing together automated testing, manual testing, and an engaged team with the right skills and responsibilities is key to a successful accessibility strategy. It provides tips on how to divide responsibilities and set up a testing and training program to balance automated and manual approaches.
1. Walking a Tightrope
Finding a Balance of Automated and Manual Testing
7. + Big Picture
+ Saves time
+ Cost effective
+ Scalable
+ Consistent
+ Can schedule scans
- Questionable accuracy
- Literal interpretations
- Data overload
- Can be expensive
- Some success criteria not checked at all
- Assumption that auto testing is enough
Automated
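The "literal interpretations" limitation above can be illustrated with a minimal sketch of an automated check. This is a hypothetical example, not any particular tool's rule: a scanner can reliably flag an img tag with no alt attribute, but it cannot judge whether alt text that is present actually describes the image.

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Minimal automated check: flag <img> tags that have no alt attribute.

    Note what it cannot do: alt="image" passes this check even though a
    human reviewer would flag it as meaningless -- a literal interpretation.
    """
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                self.issues.append("missing alt: %s" % attrs.get("src", "?"))

checker = ImgAltChecker()
checker.feed('<img src="a.png"><img src="b.png" alt="image">')
print(checker.issues)  # only a.png is flagged; b.png's useless alt passes
```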
8. + More likely to find ‘real’ user issues
+ The human element
+ More flexible
+ User testing and design results
+ Allows tester judgement calls
- The human element
- Tester burnout
- Some tests are difficult to perform
- Can’t reuse manual tests
- Less thorough
Manual
10. False Positives?
Everyone has a process.
Everyone has their own perceptions.
Flagging an item for review isn't a false positive, but an efficiency.
Technology is advancing and testing is now more complex.
17. Team Members
(And the awful stereotypes associated with them!)
Content Managers
Designers
Developers
18. Content Managers
Administrative Assistant
AKA PDF Expert
22. Designer
AKA “Single Source Organic Coffee Expert”
Print Designer who has not yet grasped Web Design
Pixel Pusher
Web Designer who insists on hand coding EVERYTHING
26. Training and Skills Check
• Basic Web skills
• CMS user only
• Code/no code, copy/paste
• Handwritten instructions to add info to site (see admin assistant)
• Assistive Tech
• ChromeVox and/or NVDA
• User task-based testing
• PwD Testers
27. Accessibility Responsibility Breakdown
• By offering role-based issue sorting, you can easily assign issues based on each team member's responsibilities.
• Manual testing is typically managed in one area.
• Manual testing can also be divided among the team based on their responsibilities.
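A role-based breakdown like this can be sketched as a simple triage table. The issue categories and role names below are illustrative placeholders, not any specific tool's taxonomy:

```python
# Map issue categories to the role responsible for fixing them.
# Categories and role names are illustrative, not from any real tool.
ROLE_FOR_CATEGORY = {
    "alt-text": "content",
    "heading-structure": "content",
    "color-contrast": "design",
    "keyboard-focus": "dev",
    "aria-markup": "dev",
}

def assign(issues):
    """Group a flat list of (category, page) issues by responsible role."""
    queues = {}
    for category, page in issues:
        # Anything we can't classify goes to a manual triage queue.
        role = ROLE_FOR_CATEGORY.get(category, "triage")
        queues.setdefault(role, []).append((category, page))
    return queues

queues = assign([
    ("alt-text", "/news"),
    ("color-contrast", "/home"),
    ("aria-markup", "/login"),
    ("something-new", "/beta"),
])
for role, items in sorted(queues.items()):
    print(role, items)
```

The point is the shape, not the mapping: once issues carry a category, routing them to content people, designers, or developers is mechanical.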
33. Find the Automated Tool That Complements Your Team
• Scalable
 • Single Page
 • Site Wide
• Configurable
 • Runs on a Schedule
 • Can integrate into CMS
• Cost Effective
• Robust
• Matches the team's skills
 • Dev tools
 • API
34. Divide and Conquer
Divide the work based on the roles
o Content Issues to content people
o Dev issues to developers/webmasters
35. Automated Testing
• Site templates
• Representational content pages
• Dynamic content pages
• Dialog modals and alerts
• Key entry and exit pages (including account login and recovery pages)
• Help and assistance pages
• Interactive forms
36. Manual Testing
• Page zoom
• Form Elements and Form Validation
• Visible Focus
• Multimedia and Media Control
• Dynamic elements
• Modals and dialog boxes
 • Modal Receives Focus
 • No Keyboard Trap
 • Close Modal
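The modal checks above — focus moves into the modal, Tab never escapes it, and closing returns focus to the trigger — boil down to focus-order logic. The following is an abstract sketch of that correct behavior in plain Python, not browser code; the element names are made up for illustration:

```python
class ModalFocus:
    """Sketch of correct modal focus behavior: the modal receives focus
    on open, Tab wraps within the modal (no escape, and equally no trap
    you can't leave by closing), and closing restores focus to the
    element that opened the modal."""

    def __init__(self, focusables, opener):
        self.focusables = focusables   # tab order inside the modal
        self.opener = opener           # element that opened the modal
        self.index = 0                 # modal receives focus on open

    def tab(self):
        # Wrap around instead of escaping into the page behind the modal.
        self.index = (self.index + 1) % len(self.focusables)
        return self.focusables[self.index]

    def close(self):
        return self.opener             # focus returns to the trigger

m = ModalFocus(["close-button", "ok-button", "cancel-button"], opener="open-link")
print(m.tab(), m.tab(), m.tab())  # ok-button cancel-button close-button
print(m.close())                  # open-link
```

This is exactly the behavior a manual keyboard test verifies: a tester Tabs through the open modal and confirms focus never lands behind it.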
37. Training Program
• General Accessibility Testing Knowledge
• HTML Coding / CMS
• AT Tester Training
• Document Accessibility Training
• Manual Tester Training
• Automated Tool Training
Leveraging the power of automated testing tools while performing the right balance of manual tests can be a bit of a balancing act. We are going to discuss some of the factors and considerations involved in achieving that balance in a comprehensive accessibility compliance program.
The background of this slide, and of the next few slides, is a picture of Philippe Petit's famous 1974 tightrope walk between the World Trade Center towers.
But first, there are a few stipulations…
The issue of automated OR manual accessibility testing isn't the subject today; I'd be surprised if anyone in the room would argue for one or the other at this point. We are going to talk about why automated testing is an important part of a balanced breakfast, as they say.
Also not part of the talk—the number of success criteria that can be successfully checked. Numbers vary wildly—and we get that.
Yes, I represent an automated tool company, but this isn't a sales pitch. These are points we work through directly with our customers each and every day, and we hope to offer some insights from those conversations.
We are going to address three main points today. First, a quick compare and contrast of the two testing approaches; second, your team and how its members may be used and relied upon for different role-driven tasks; and finally, I'll take those points and bring them together.
Balance Point #1: automated vs manual. Let’s start with a quick list of Pros and cons: we’ll start with Automated:
Pros are the first list with the plus sign:
Cons are the second list with the minus sign.
The background of the slide is a picture of Batman vs Superman, and I didn’t realize until later that this picture can also be a bit of a metaphor: Batman is automated—he has all the cool toys, and Superman is manual—he doesn’t have all the toys like Batman, he does it all manually by sheer strength.
Manual Pros in the first list with the Plus
Cons in the second list with the minus
This slide features a picture of a little forest creature—I believe it's a bush baby—anyway, its eyes are huge and it has a look of shocked awe on its face all the time.
TMI—too much information. What do you consider too much information? An explanation of the issue? Having the issue pointed out in the code? Line numbers of the code? Links to resources to fix the issue?
Speaking as a tester, we want to give the best possible information in our accessibility reports, while at the same time taking care not to bog down the report with unnecessary information.
We don't always have a way to know what pieces of information are relevant and which are irrelevant, so it can feel like a gamble deciding how much to provide.
Speaking as a developer, the information offered in an accessibility report is crucial to how valuable we perceive the report to be. The more relevant information you are able to provide, the better and faster we can find and fix those issues you worked so hard to tell us about.
Relevant information provided literally translates directly into value.
And that, really, is my answer. In my opinion, there is no such thing as too much information in an accessibility report, as long as all of that information is relevant.
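One way to frame "relevant information" concretely is to ask what fields make a single report entry actionable for a developer. The field names below are an illustration I'm supplying, not any tool's actual report format:

```python
from dataclasses import dataclass, asdict

@dataclass
class Issue:
    """The pieces of information that make a report entry actionable:
    what is wrong, where it is, and how to fix it."""
    description: str    # explanation of the issue
    snippet: str        # the issue pointed out directly in the code
    line: int           # line number of the code
    help_url: str       # link to a resource explaining the fix

issue = Issue(
    description="Image is missing a text alternative",
    snippet='<img src="hero.png">',
    line=42,
    help_url="https://www.w3.org/WAI/tutorials/images/",
)
print(asdict(issue))
```

Every field here is relevant to finding and fixing the issue, so by the argument above, none of it is "too much information."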
Ah yes; the elephant in the room—false positives. I don't like that term. Why? Because I think it's misunderstood and overused. Yes, false positives occur, but the developers who build these tools are very careful in writing their tests. Let's face it: Web browsers loosely interpret code and will render even crappy page markup. Why is this important? Because no developer alive can anticipate all the crap that gets written and rendered as a web page.
About the time they have every angle figured out on a comprehensive test, someone comes up with a new curveball. For example: <h8 font-size: large>
Just because you don’t agree with the results of the test doesn’t mean it’s a false positive.
In many cases, the test is written knowing that the check can't be determined correctly without a human decision. But flagging that item for review is NOT a false positive—it's an efficiency, especially if you are searching for page elements in a large site with thousands of pages.
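A test written this way returns three outcomes rather than two—pass, fail, or "needs human review"—so flagging for review is a distinct result, not a failed guess. Here is a minimal sketch; the outcome names and the example rule are illustrative, not from any specific tool:

```python
from enum import Enum

class Result(Enum):
    PASS = "pass"
    FAIL = "fail"
    NEEDS_REVIEW = "needs review"   # deferred to a human decision, not a false positive

def check_img_alt(has_alt, alt_text=""):
    """Sketch of a check that defers to a human when it can't decide."""
    if not has_alt:
        return Result.FAIL          # definitively wrong: no alt at all
    if alt_text.strip() == "":
        # alt="" is valid for decorative images, but only a human can
        # confirm the image really is decorative.
        return Result.NEEDS_REVIEW
    # Alt text is present; a stricter rule might still queue it for a
    # human to judge whether the text is meaningful.
    return Result.PASS

print(check_img_alt(False))                  # Result.FAIL
print(check_img_alt(True, ""))               # Result.NEEDS_REVIEW
print(check_img_alt(True, "Company logo"))   # Result.PASS
```

The NEEDS_REVIEW queue is what makes automation efficient at scale: it narrows thousands of pages down to the handful of items that genuinely need a human call.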
There are tremendous advances in technology that are having a positive influence on Persons with Disabilities.
Think about the advances in quality with Siri; The ability of Alexa to turn your home into a smart home.
And IBM continues to challenge Watson and Watson is up for the challenge!
Keep an eye on cognitive computing and advanced algorithms, and the advancements that can be directly applied to automated accessibility testing. For instance, image identification by Facebook and auto captioning by YouTube continue to advance, and that type of technology will roll into accessibility testing…eventually.
In the meantime, there are some groups working on advancing the technology and capabilities, and on standardizing automated accessibility testing:
Wilco Fiers is leading the Auto WCAG Monitoring group. The objective of this community is to create and maintain tests that can be implemented in large scale monitoring tools for web accessibility.
Testing is a bedrock element of any accessibility program. Without test results, it isn’t possible to accurately gauge your current status, or track progress.
The workshops, papers, and panels to be presented this year further our collective goals of better, more purposeful implementation and management of accessibility testing programs in organizations, and of improving the efficiency, repeatability, and quality of manual and automated test processes and tools.
There are some great tools available, and not every tool is going to be the best fit for your organization. Look around and consider whether a developer-based, end-user-based, or browser-based approach fits best. You may consider a combination of tools as well—a good craftsman has multiple tools in their toolbox.
Look, the cost of these tools, while always a consideration, should not be a barrier to a successful accessibility program.
While they can't cover large-scale, high-volume sites, browser-based accessibility tools may be used quite successfully to supplement your manual testing.
Balance Point #2 – The Team
There are lots of considerations with automated and manual tests once we bring the whole team into it.
First, are you a one-man shop or part of a multidisciplinary team?
Do you use Centralized or Decentralized Publishing? How much oversight do your editors/content people have?
Let’s take a few minutes and talk about some team members—content people, designers, and developers. These are big buckets, and you may or may not fit into one of them, but I’m talking in generalizations today, which allows us to have a little fun and tackle some stereotypes. The stereotypes presented are terrible. They are not meant to reflect anyone, living or dead, so just relax – and see if you can relate.
How often is this person responsible for website updates?
Find the folded up, hand-written notebook paper.
1. Open the Internet (big blue E shortcut)
2. Log in
3. Find login written on the yellow sticky note on the left side of my computer (Notice I didn’t say monitor)
4. Log in
5. Change password cause this one expired
6. Log in
7. Contact support because I'm locked out
8. Log in
9. Find instruction sheet to save pictures from the camera.
10. Add picture from picnic. Drag the corner to make it smaller cause it's 10x the size of the screen
You get the idea…
This broad generalization may include:
Administrative assistants
Teachers/Instructors
Students/Interns/Grad Assistants
Volunteers
This is the person who can manage “alt” attributes on images, headings, link text—things like that.
Oh yes—don’t you know it’s much, much easier to ‘print’ a PDF and post it instead of trying to make a new page…
Conversion rates
Funnels
Click throughs
Silos
The magic that makes up a marketing team. You don’t have to understand what these terms mean, just know that they need to be used in meetings.
In some cases, the individual could be lumped in with our first group, the admin assistants, but they are usually a little more tech savvy, and may have even more responsibility and access to content, and the CMS.