An HCI Principles based Framework to Support Deaf Community
Alexander Wolff & James Aymer
Abdelrahmane Bray, supported by James Aymer
UROP Poster
Starting Point
The starting point for this research was a Kinect-based game developed
as part of the PURR (Prescription Software for Use in Recovery and
Rehabilitation) project, being undertaken with the Royal Berkshire
Hospital and Headway Brain Injury Trust to aid motor rehabilitation of
patients.
Fig. 1 Skeletal Position Tracking
Fig. 2 Signal Officer Game
Speech for Language Development & Rehabilitation
The Signal Officer game was developed in C# using Unity 3D. It senses and
tracks a patient's movement using skeletal positioning in order to promote
physical rehabilitation. The focus of the UROP project work was to
investigate:
1. How audio and speech can be integrated into Kinect-based games for
speech and language development or therapy, such that the games are
intuitive to set up and use, engaging to play, and able to provide
useful data for therapists.
2. The differing capabilities of the Microsoft Kinect-1 (released 2010)
and Kinect-2 (released 2013) sensors, in order to understand their
respective capabilities in relation to the specified problem; to help
ascertain the technical challenges associated with developing audio-
and speech-based games; and to identify the most appropriate
languages, APIs (Application Programming Interfaces) and SDKs
(Software Development Kits) to use with the Kinect sensors for the
development of speech-based games.
3. To report these findings so that they can be used in the development
of games targeted at speech development and therapy.
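Point 1 above can be illustrated with a small sketch. The names below (`score_attempt`, `TARGET_WORDS`, `CONFIDENCE_THRESHOLD`) are hypothetical, not taken from the Signal Officer code base; the sketch only shows the general pattern of accepting a recognised word when its confidence is high enough, and logging every attempt so a therapist can review the session afterwards.

```python
# Hypothetical sketch of a speech-game scoring loop; the names are
# illustrative assumptions, not part of the project's actual code.
TARGET_WORDS = {"left", "right", "up", "down"}
CONFIDENCE_THRESHOLD = 0.6  # reject low-confidence recognitions

def score_attempt(recognized_word, confidence, session_log):
    """Record one recognition result and return whether it counts as a hit."""
    if confidence < CONFIDENCE_THRESHOLD:
        session_log.append((recognized_word, confidence, "rejected"))
        return False
    hit = recognized_word in TARGET_WORDS
    session_log.append((recognized_word, confidence, "hit" if hit else "miss"))
    return hit

log = []
print(score_attempt("left", 0.9, log))    # True: confident and on target
print(score_attempt("left", 0.3, log))    # False: below the threshold
print(score_attempt("banana", 0.8, log))  # False: not a target word
print(len(log))                           # 3 attempts logged for review
```

In a real Kinect application the recognised word and its confidence would come from the speech recognition engine's result events; the logged tuples are the kind of per-session data a therapist could inspect.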
Way Forward
The results of this work are being fed into three research proposals,
currently under preparation by the School of Systems Engineering, the
School of English Language and Applied Linguistics, the School of
Psychology and Clinical Languages, the Institute of Education, the Royal
Berkshire Hospital and the Headway Brain Injury Trust, relating to speech
and language development and rehabilitation.
Acknowledgements
• Professor Rachel McCrindle, School of Systems Engineering, Project Supervisor
• School of English Language and Applied Linguistics, School of Psychology and Clinical Languages, Institute of
Education, Royal Berkshire Hospital and Headway Brain Injury Trust, who are collaborating on this work
Introduction
Since its introduction in 2010 by Microsoft Corporation, the Microsoft
Kinect has become well known for its versatility as a motion-sensing
input device. The Kinect is an immersive product, capable of detecting
and tracking an end user's movement via skeletal positioning, whilst
also offering high-quality voice and facial recognition capabilities.
The Kinect captures a user's speech using its in-built microphone
array, which offers more accurate sound capture than a single
microphone. Speech recognition is one of the key functionalities of the
Kinect's natural user interface, and offers developers an affordable
platform for developing speech therapy and speech development
applications.
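The benefit of a microphone array over a single microphone can be sketched numerically. The toy example below is plain Python, not Kinect SDK code; it assumes four microphones hearing the same signal with independent noise and zero relative delay, so that averaging the channels (the simplest delay-and-sum beamformer) reduces the noise power roughly in proportion to the number of microphones.

```python
import math
import random

random.seed(0)
N_MICS = 4
SAMPLES = 1000

# Clean reference signal: a sine wave with a 100-sample period.
clean = [math.sin(2 * math.pi * t / 100) for t in range(SAMPLES)]

# Each microphone hears the same signal plus its own independent noise.
mics = [[s + random.gauss(0, 0.5) for s in clean] for _ in range(N_MICS)]

# Delay-and-sum with zero delays: averaging keeps the common signal
# while the uncorrelated noise partially cancels out.
beamformed = [sum(ch[t] for ch in mics) / N_MICS for t in range(SAMPLES)]

def noise_power(estimate):
    """Mean squared error against the clean signal."""
    return sum((e - s) ** 2 for e, s in zip(estimate, clean)) / SAMPLES

print(noise_power(mics[0]))     # single microphone
print(noise_power(beamformed))  # array: roughly 1/N_MICS of the noise
```

In practice the Kinect's array also applies per-channel delays to steer towards the speaker, but the averaging step above is the core reason a multi-microphone array yields cleaner audio.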
Rationale
There is a wealth of empirical research corroborating the necessity of
being able to communicate effectively using language, and how central it
is to human development. Within the UK, 2.5 million people live with
some form of speech difficulty that requires continued treatment with
the aid of speech and language therapists (RCSLT; NIDCD). It is also
reported that, nationally, one in six children has difficulty in
learning to talk and to understand speech patterns (RCSLT).
The aim of this project was to capitalize on the low-cost development
platform that the Kinect offers, and to investigate how Kinect-based
games can be used to promote speech development in both therapeutic and
educational contexts.
Using the Kinect's microphone array and speech recognition
capabilities, Kinect-based games can be created to help young children
develop and enhance their language and communication skills, as well as
to help patients regain their communication and cognitive skills whilst
undergoing rehabilitation following a stroke or other brain trauma. By
blurring the distinction between play and therapy, the ludic nature of
the games will help people with speech and language deficits to better
organise, interpret and use linguistic knowledge, and in doing so
enhance their social well-being and integration.
Collaboration with experts in the fields of Linguistics and Phonetics,
Speech Therapy and Education ensures that the games produced are
both engaging and fit for purpose.
Microsoft Kinect Capabilities
Fig. 1 Kinect-1 Sensor
Fig. 2 Kinect Sound File
Communication and Speech Development for
Educational and Therapeutic Domains Utilizing the
Kinect
References
1. Microsoft, Speech, [Online] Accessed November 10th 2014, Available:
http://www.msdn.microsoft.com/en-us/library/jj131034.aspx
2. Microsoft, Kinect For Windows, [Online] Accessed November 10th 2014, Available:
http://www.microsoft.com/en-us/kinectforwindows
3. RCSLT, Royal College of Speech and Language Therapists, The All Party Parliamentary Group on Speech and
Language Difficulties, [Online] Accessed November 10th 2014, Available:
http://www.rcslt.org/about/parliamentary_work/appg_sld_history
The Kinect offers:
• Skeletal motion analysis (20 joints per user with Kinect-1; 25 with Kinect-2)
• Multi-array microphone
• Facial recognition
• Voice recognition
• Depth sensor
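The skeletal joint positions above are reported as 3-D coordinates, so simple geometry can turn them into rehabilitation metrics. The sketch below is plain Python (`joint_angle` is a hypothetical helper, not a Kinect SDK call); it computes the angle at a middle joint, e.g. the elbow angle from shoulder, elbow and wrist positions, which is the kind of measure a therapist might track across sessions.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by 3-D points a-b-c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = math.dist(a, b) * math.dist(c, b)  # product of segment lengths
    return math.degrees(math.acos(dot / norm))

# Shoulder above the elbow, wrist out to the side: a right angle.
print(joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0)))  # 90.0
```

In a Kinect application the three points would come from the tracked skeleton's joint positions; comparing the measured angle against a target range is one straightforward way to score a rehabilitation exercise.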