Welcome to the Wolflands! I’m Lisa Fiedor, Web Accessibility [title]. And I’m Beth Shepherd, Instructional Technologist for DELTA. (Images of Beth and Lisa in real life)
When we learned that this year’s UNC TLT-C would again be held online, and this year completely in Second Life, our minds immediately began racing: how could we talk about accessibility and keep it in the framework of the environment we were meeting in? And so that’s how we ended up focusing on accessibility in synchronous learning environments. Today we’re going to focus on accessibility issues in three of the most common synchronous learning technologies used in the UNC system: Adobe Connect, Elluminate Live!, and Second Life. We’re going to focus on accessibility from a participant’s point of view, because delving into moderating and development is a whole ’nother issue! We’ll talk about challenges to the disabled participant in these environments, as well as how the technologies are addressing accessibility issues. The great thing is that even in Second Life, the platform with some of the most prevalent accessibility issues, there are lots of great things happening to embrace those with physical impairments.
So what do we mean when we say “accessibility?” (Ask Lisa for accessibility definition.) What kinds of impairments do you think would limit someone from using these synchronous environments? (Answer in local chat.) We’ll be talking about the three most common types of physical disability today: visual, auditory, and mobility impairments.
Visual:
- Blindness
- Low vision/legally blind
- Color blindness
How are visually impaired users affected? They cannot use a mouse, probably use a screen reader, and might use a screen magnifier to enlarge text.
Auditory:
- Deafness
- Hard of hearing
How are auditory-impaired users affected? They cannot hear speech, sound alerts, or other audible indicators.
Mobility:
- Neurological disorders resulting in partial or full paralysis
- Repetitive stress injuries
- Degenerative diseases and conditions such as arthritis
How are mobility-impaired users affected? They are limited in their ability to use a mouse, and might use alternate input devices, such as voice-dictation software, to interact with their computer.
For those of you who haven’t used it, Adobe Connect is a very nice-looking web conferencing software that allows you to use a whiteboard, share files such as PowerPoint and Word docs, share your desktop, converse audibly, and use instant messaging, among other features. What do you think the issues might be in Adobe Connect for users with the three impairments we’ve discussed? (Discuss in local chat.)
Adobe Connect for the visually impaired (screen capture): Adobe Connect doesn’t work with screen readers at all.
Adobe Connect for the auditory impaired (screen capture): Adobe Connect has closed captioning ability, although you have to hire a captioner to feed the captions in remotely. Without closed captioning, you’d need to limit use to chat and whiteboard/app share/file share (no microphone).
Adobe Connect for the mobility impaired (screen capture): If you know the keyboard shortcuts, you can raise your hand, use the mic, talk in the chat, and navigate the menus. You would not be able to contribute to the whiteboard, and it’s very hard to move between pods or access the many features that don’t have keyboard shortcuts.
Elluminate Live! is another web conferencing software with features similar to Adobe Connect’s. If you’ve used it, what do you think the issues might be in Elluminate for users with the three impairments we’ve discussed? (Discuss in local chat.)
A screen reader can read the chat and participant list, but cannot read the whiteboard. However, if you import your PowerPoint speaker notes to the Notes section, the screen reader can read that, although it’s a bit clunky.
If you hire a closed captioner, there is full use for participants with audio impairments. Without a closed captioner, you’d need to limit use to the chat and whiteboard/app share/etc (no microphone). The cool thing about Elluminate is you can assign anyone in the room to be a closed captioner – it’s not a remote feed. Of course, the person has to be experienced to do a good job.
Many sections are keyboard accessible, although you might have to go slower to allow users time to navigate through features. Web tour and app share are viewable, but the participant would not be able to interact. If you use the whiteboard for interaction, you would have to make an exception for the mobility-impaired user (e.g., they type in chat instead of on the board, or answer audibly).
Second Life for the visually impaired looks a little like what we see here: we have vague landmarks and some keyboard commands available, and details of what is around us are vague and sometimes impossible to determine. For some users, simply increasing the text size will help in interacting with the interface, chat, and other avatars. Others may choose to use voice chat or voice-to-text software such as Dragon NaturallySpeaking; there is also Voice Buddy, a voice recognition package that works with Second Life. Other options are MovableLife, a plain-text Second Life interface that runs in just a web browser, or SLeek, an open-source low-tech client that would be useful for students in low-tech environments (e.g., older computers with dial-up connections).
Max, the virtual guide dog, helps people with blindness or visual impairments travel easily and identify avatars and objects around them. Some objects allow keyboard interaction and alert users of their interactivity (see the inset of the IBM site in Second Life). In addition, much like alt text for images on the Web, objects can be tagged with attributes that provide metadata for the interested user (see the inset of the Reuters building doorway’s attributes). Max debuted last year at Helen Keller Day in Second Life, June 27, 2009.
When I come to a place, Max can tell me if there are any avatars around and what objects are nearby. Here in the upper-right screenshot, he tells me there are no other avatars and that there is a sign northeast of me. He also tells me there are two cameras nearby, and their direction. I also came to the IBM island to find out about the Kestrel project, a thin client that runs in Firefox and gives blind users access to virtual world environments. A key aspect of the project allows sighted users to describe objects and locations for the blind: the sighted user can enter accessible attributes, such as custom names and short and long descriptions. They even provide the capability for users to record and upload a verbal description to the database. http://services.alphaworks.ibm.com/virtualworlds/
For the hearing impaired, Second Life transcribes events such as podcast interviews and Second Life’s 5th birthday keynote. Those who want to transcribe their voice chat can record it and use the Amazon Mechanical Turk to get affordable transcription; if you’ve heard of CastingWords, they use MTurk as an intermediary. When presentations like ours are given in voice, those who are deaf or hearing impaired, or whose native language is not that of the speaker, may not understand what is being discussed. The folks on Virtual Ability Island provide voice-to-text transcription services within Second Life, which give the audience meaning-for-meaning real-time text to follow. Similarly, those who are dyslexic or who have low vision may have trouble following the speaker’s textual presentation; text-to-voice transcriptionists read aloud text offered by the presenter. The Virtual Ability transcriptionists provide weekly transcription services for the Metanomics show, as well as for other events in Second Life. On Virtual Ability Island, those with deafness can join the Deaf Chat Coffee Group and hang out in the Coffee House.
Virtual Ability Island is also where you can pick up Max, as well as have an in-depth orientation to Second Life that is useful for anyone! They also have dances on their dance floor, and virtual field trips to walk in the woods, climb mountains, and go virtual skydiving. The SecondAbility Mentor group helps new users when needed, and the island has two small classrooms and a large accessible auditorium for presentations and meetings.
For those with mobility impairments, Second Life can be a freeing experience. While often not able to interact with people on a daily basis, those who can use the extensive keyboard commands, a modified mouse, or even a head-pointing device can experience Second Life and socially interact with others. The Wheelies nightclub on the SecondAbility sim is a place for people with disabilities to hang out and dance, and it gets a couple hundred visitors per week. Here you can see Simon Walsh, the owner of Wheelies (bottom); the custom wheelchair of duilioCimino at Wheelies (above); and the wheelchairs available for use at Wheelies (lower right).
At Keio University in Japan, they are even researching the ability to control your avatar in Second Life via brainwaves. Interacting in Second Life this way would be an amazing accomplishment for someone with a profound disability. It’s wonderful what technology can do for those with disabilities, even in Second Life.
We’re not sure what’s in the works for Adobe Acrobat Connect, because they are not working with any outside accessibility groups. However, the developments with Elluminate are really exciting. Lisa and I are both members of the Elluminate Accessibility Taskforce, and while we’re sworn to secrecy about the next release, we can say that the focus of version 10 is an accessibility overhaul. There are many great changes in the works to make more of Elluminate truly accessible. Members of the accessibility task force include Elluminate developers as well as outside testers who are Elluminate users with disabilities. Not all of the necessary changes will make it into the first release of version 10, but they hope to fix any outstanding issues with future patches.

Second Life unfortunately took a big step back with accessibility in the new viewer (their words), but they are aware of the issues and looking to improve. They especially acknowledge the need to reinforce keyboard accessibility, and even have bug submissions to make Second Life Section 508 compatible. At a recent Second Life User Experience Interest Group meeting, on March 18th of this year, two Lindens were present to hear ideas and give updates about accessibility issues. They acknowledged the desire to have Second Life work with various screen readers, the need for object and land owners to tag their objects with descriptive attributes, the limitations caused by the mouse-heavy interface for media controls, and the desire for keyboard controls that can be mapped by the user. Esbee Linden said that they will be updating some general tab-control and keyboard-shortcut components, and are working on a roadmap for accessibility.
What questions do you have?
UNC TLT-C 2010 Embracing Change for All: Addressing Accessibility in Online Synchronous Teaching Technologies