CSUN 2014 talk by Professor Jonathan Hassell describing how Hassell Inclusion, Gamelab UK, and Reflex Arc are using Natural User Interface technologies such as Microsoft Kinect to create a whole new generation of assistive technologies based around the movements, gestures, and signs that different groups of disabled people make.
Two projects are described:
Nepalese Necklace: movement games that encourage blind and partially-sighted children to engage more readily with their early mobility training by making the body- and spatial-awareness exercises they have to perform the controls for motivational 3D audio games;
uKinect: sign-language eLearning games that help sign-language users transition more easily into employment by enabling them to learn workplace-specific sign vocabularies through instructional video and our innovative Kinect sign-language recognition system.
NB. All videos in my CSUN presentation had captions, but it is not currently possible to caption the embedded videos in this SlideShare. If you need access to the captioned videos, email firstname.lastname@example.org