Assistive technologies: experiences from AAL for the blind and visually impaired within the ALICE project

This talk was given as a keynote at the International Conference on Innovation in Medicine and Healthcare 2013 (http://inmed13.innovationkt.org/), within the Ambient TeleCare invited session (http://inmed13.innovationkt.org/cmsISdisplay.php).
This presentation gives an overview of technologies assisting visually impaired persons and describes the progress made so far within the ALICE project (http://www.alice-project.eu/).

The event brought together a multi-disciplinary audience of researchers, engineers, managers, students and practitioners from the medical arena, for discussions on how innovation, knowledge exchange and enterprise can be applied to medicine, healthcare and the challenges of an ageing population.


  1. 1. Assistive technologies: experiences from AAL for the blind and visually impaired within the ALICE project Andrei BURSUC, Prof. Titus ZAHARIA Institut Mines-Télécom; Télécom SudParis firstname.lastname@telecom-sudparis.eu Invited talk by DemaCare FP7 Project
  2. 2. • Context and objectives • The ALICE project and AAL • State-of-the-art • User requirements • System prototype • Obstacle detection • Navigation assistant • Human-Machine interface • Conclusion and perspectives 2 Outline Experiences from the ALICE project
  3. 3. • VI persons face many problems every day: – overall contextual understanding of space semantics – interaction with surrounding objects – planning, orientation, communication, navigation • 285M registered visually impaired people: 39M blind, 246M with low vision (WHO report) • The prevalence of visual impairment increases with an ageing population 3 Context and objectives Nowadays Experiences from the ALICE project
  4. 4. • Provide a navigational assistive device with cognitive capabilities for the elderly blind: – Positioning – Obstacle detection/alerting – Landmark/object recognition • Offer VI users a cognitive description based on a fusion of perceptions gathered from multiple sensors • Personal benefits: – Enable the independence of blind and partially sighted people – Save end-users stress and time – Improve individual self-esteem 4 Context and objectives Objectives Experiences from the ALICE project
  5. 5. • 7 partners (academics, SMEs, associations of VI persons) • 4 European countries (ES, FR, SI, UK) • Duration: June 2012 – November 2014 • Final product: a device consisting of a smartphone with additional sensors, wirelessly connected to a local processing unit The project ZVEZA SLEPIH 5 Experiences from the ALICE project
  6. 6. • Ambient Assisted Living - a funding activity that aims: – to create better living conditions for older adults – to strengthen industrial opportunities in Europe through the use of ICT • Funds cross-national projects involving SMEs, research bodies and users’ organizations • Time-to-market perspective of at most 2-3 years after the end of the project • Project total budget: 1-7 M€ (funding of 3 M€ at most) AAL Joint Programme 6 Experiences from the ALICE project
  7. 7. Experiences from the ALICE project 7 What’s possible? State of the art
  8. 8. How do VI persons orient themselves? • With the help of a guide (another person) • Using a white cane or guide dog • Using electronic devices, GPS • By listening to familiar sounds • By looking for something familiar (edges of pavements, curves, crossroads, very large inscriptions) • Underfoot textures, different surfaces • Sun, wind direction, smell • Road signs 8 State of the Art Experiences from the ALICE project
  9. 9. Experiences from the ALICE project How do VI persons orient themselves? • Current techniques are still not very advanced 9 State of the Art
  10. 10. Experiences from the ALICE project How do VI persons orient themselves? • Canes and dogs are still kings! 10 State of the Art
  11. 11. How could VI persons orient themselves? • Navigation systems: – GPS + computer vision (clear path, landmark recognition) • Object recognition systems: – Grocery shopping assistant – RFID tags on objects – OCR (Optical Character Recognition) – Detectors: crosswalks, walk lights, staircases, street signs, pedestrians • Obstacle avoidance systems: – Integrating depth information – Step and curb detection 11 State of the Art Experiences from the ALICE project
  12. 12. • Conclusions: – Few systems work in real time – Many approaches require heavy equipment – Some systems need tags – The research field should get a new boost with the advent of Google Glass How could VI persons orient themselves? 12 State of the Art [Lee, 2012] [Marduchi, 2012] [Pradeep, 2010] Experiences from the ALICE project
  13. 13. • Limited computational resources: lightweight, low-power wearable devices • Real-time responsiveness • Reliability, with no false positives • Adequate, non-overwhelming communication with the user (alerts, indications) 13 State of the Art Challenges Experiences from the ALICE project
  14. 14. 24 July 2013 14 Setting up the path User feedback and requirements
  15. 15. Experiences from the ALICE project • Participants’ profile: – Age: 55-75 – Countries: Slovenia, UK – Degree of visual impairment: blind and partially sighted – Total: 40 participants (20 from each country) Questionnaire for end-users 15 User requirements
  16. 16. Questionnaire conclusions • 50% of participants use only familiar routes • Most participants need someone to guide them to certain places • Some of them need a guide every time – they often depend on the time and willingness of others • It is important for them to know where they are, how far the destination is, and what lies in the vicinity of the route 16 User requirements Experiences from the ALICE project
  17. 17. Questionnaire conclusions - Device • Little confidence is placed in electronic navigation systems (gained only after several successful tests) • Training and information about electronic devices are necessary • Half of the users use speech synthesis • Willingness to use headphones, but hearing shouldn’t be obstructed • “Turn by turn” functionality should not give too much info 17 User requirements Experiences from the ALICE project
  18. 18. Questionnaire conclusions - Indoor • 85% of respondents have difficulties orienting themselves inside public institutions. • Difficulties the users face in indoor environments: – the size of the room – glittering surfaces – room darkness – no orientation points to navigate by with a white cane – difficulty recognizing landmarks – background music. 18 User requirements Experiences from the ALICE project
  19. 19. Questionnaire conclusions - Obstacles • Obstacles that users want to be warned about: – pillars – curves – overhanging branches – edge of pavements – street furniture – steps – down slopes – ramps – holes – bumps 19 User requirements Experiences from the ALICE project
  20. 20. Experiences from the ALICE project User expectations • The device should be accurate: – Exact info about the obstacles – Find safe corridors for walking – Warn the user when it is safe to cross the road, when the green light is on, and if traffic is coming (especially bikes, electric cars) • The device should be small, portable, phone sized. 20 User requirements
  21. 21. User expectations • Other features: – Give the distance to a building – Find the right bus stop or post box – Text-to-speech for: letters, journey instructions, street inscriptions, shop names – Tell the weather, temperature, local taxi availability – Recognize faces and tell the person’s name 21 User requirements Experiences from the ALICE project
  22. 22. Experiences from the ALICE project 22 First tests and experiments System prototype
  23. 23. Sensor evaluation • Evaluation of multiple sensors: camera (ToF, stereo, web), compass, gyroscope, ultrasonic ranger, GPS, pedometer • Samsung Galaxy S3 used as baseline 23 System prototype (Diagram labels: image, communication, sound commands, tactile communication, orientation, positioning, light sensor, inclination) Experiences from the ALICE project
  24. 24. Sensor evaluation • Sensors have different sampling speeds 24 System prototype Experiences from the ALICE project
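  Because the sensors sample at different speeds, any fusion step first has to bring their streams onto a common timeline. A minimal illustrative sketch in Python (not the ALICE implementation; all rates and readings below are made up for the example):

      import numpy as np

      def resample(timestamps, values, target_times):
          # Linearly interpolate one sensor stream onto target timestamps.
          return np.interp(target_times, timestamps, values)

      # Hypothetical streams: a 100 Hz gyroscope axis and a 1 Hz GPS heading.
      gyro_t = np.arange(0.0, 10.0, 0.01)        # 100 Hz timestamps
      gyro_v = np.sin(gyro_t)                    # placeholder readings
      gps_t = np.arange(0.0, 10.0, 1.0)          # 1 Hz timestamps

      # Fuse on the slower clock: bring the gyroscope onto the GPS timestamps.
      gyro_at_gps = resample(gyro_t, gyro_v, gps_t)
      print(gyro_at_gps.shape)                   # (10,)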
  25. 25. Sensor evaluation - Conclusions • All sensors in the Samsung S3 are superior to the external ones tested (except GPS) • The external GPS has better reception thanks to its antenna – but in areas with a strong multipath effect the advantage is reduced • GPS accuracy: 10-40 meters in urban areas • An ultrasonic ranger would be useful for obstacles directly in front of the user 25 System prototype Experiences from the ALICE project
  26. 26. Possible camera positions 26 System prototype Experiences from the ALICE project
  27. 27. Possible camera positions 27 System prototype Experiences from the ALICE project
  28. 28. Possible camera positions • Setting used for video recording 28 System prototype Experiences from the ALICE project
  29. 29. Headphones • Bone conduction headphones: – Effective even in very loud environments (city traffic) – Do not obscure sounds from the environment – Very high frequencies not as good as in normal headphones 29 System prototype Experiences from the ALICE project
  30. 30. 30 Platform configuration System prototype Experiences from the ALICE project
  31. 31. 24 July 2013 31 Parsing the visual domain Obstacle detection
  32. 32. 32 Input video stream Method overview Obstacle detection Experiences from the ALICE project
  33. 33. 33 Input video stream Interest points extraction Grid of points regularly spread in a frame Method overview Obstacle detection Experiences from the ALICE project
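  The first stage, a grid of points regularly spread in a frame, is simple to sketch; the frame size and grid step below are illustrative values, not project parameters.

      import numpy as np

      def grid_points(width, height, step=20):
          # Regular grid of interest points, offset half a step from the border.
          xs, ys = np.meshgrid(np.arange(step // 2, width, step),
                               np.arange(step // 2, height, step))
          pts = np.stack([xs.ravel(), ys.ravel()], axis=1)
          # OpenCV's optical flow expects float32 points of shape (N, 1, 2).
          return pts.astype(np.float32).reshape(-1, 1, 2)

      pts = grid_points(640, 480)
      print(pts.shape)   # (768, 1, 2) for a 640x480 frame with a 20-pixel step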
  34. 34. 34 Input video stream Interest points extraction Grid of points regularly spread in a frame Interest points matching and tracking Multiscale Lucas-Kanade algorithm Method overview Obstacle detection Experiences from the ALICE project
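  The multiscale Lucas-Kanade step maps naturally onto OpenCV's pyramidal implementation. A hedged sketch, reusing the hypothetical grid_points helper from the previous snippet; the window size and pyramid depth are illustrative choices, not the project's parameters.

      import cv2

      def track(prev_gray, curr_gray, prev_pts):
          # Pyramidal (multiscale) Lucas-Kanade; maxLevel=3 means 4 pyramid levels.
          curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
              prev_gray, curr_gray, prev_pts, None,
              winSize=(21, 21), maxLevel=3)
          ok = status.ravel() == 1               # keep successfully tracked points
          return prev_pts[ok], curr_pts[ok]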
  35. 35. 35 Input video stream Interest points extraction Interest points matching and tracking Multiscale Lucas-Kanade algorithm Background / Camera motion estimation Global geometric transform – RANSAC algorithm Method overview Obstacle detection Experiences from the ALICE project
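  Consistent with the homographic motion model named in the timing table on slide 41, the background/camera motion stage can be sketched with OpenCV's RANSAC homography fit: inlier points follow the global camera motion, outliers become obstacle candidates. The reprojection threshold is an illustrative value.

      import cv2

      def camera_motion(prev_pts, curr_pts, ransac_thresh=3.0):
          # Fit one global homography; RANSAC separates background inliers
          # from points whose motion the camera model cannot explain.
          H, inlier_mask = cv2.findHomography(prev_pts, curr_pts,
                                              cv2.RANSAC, ransac_thresh)
          outliers = inlier_mask.ravel() == 0
          return H, curr_pts[outliers]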
  36. 36. 36 Input video stream Interest points extraction Interest points matching and tracking Background / Camera motion estimation Global geometric transform – RANSAC algorithm Static / Dynamic obstacle motion estimation Agglomerative clustering based on proximity computation Method overview Obstacle detection Experiences from the ALICE project
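  The obstacle motion stage groups the residual (non-background) points by spatial proximity. A minimal agglomerative-clustering sketch with SciPy; the 40-pixel cut distance is an assumption for illustration.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      def cluster_points(points, cut_dist=40.0):
          # points: (N, 2) pixel coordinates; returns one cluster label per point.
          if len(points) < 2:
              return np.ones(len(points), dtype=int)
          Z = linkage(points, method='single')   # merge nearest clusters first
          return fcluster(Z, t=cut_dist, criterion='distance')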
  37. 37. 37 Input video stream Interest points extraction Interest points matching and tracking Background / Camera motion estimation Static / Dynamic obstacle motion estimation Agglomerative clustering based on proximity computation Interest points refinement K-NN algorithm and small clusters removal Method overview Obstacle detection Experiences from the ALICE project
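  The refinement stage, as described, drops small clusters and smooths the remaining labels with a K-NN vote. A rough sketch; min_size and k are illustrative parameters, not the project's values.

      import numpy as np
      from collections import Counter

      def refine(points, labels, min_size=5, k=3):
          labels = np.asarray(labels)
          counts = Counter(labels.tolist())
          keep = np.array([counts[l] >= min_size for l in labels])
          points, labels = points[keep], labels[keep]    # small clusters removed
          if len(points) <= k:
              return points, labels
          smoothed = labels.copy()
          for i, p in enumerate(points):
              d = np.linalg.norm(points - p, axis=1)
              nn = np.argsort(d)[1:k + 1]                # skip the point itself
              # Majority vote over the k nearest neighbours smooths stragglers.
              smoothed[i] = Counter(labels[nn].tolist()).most_common(1)[0][0]
          return points, smoothed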
  38. 38. 38 Input video stream Interest points extraction Interest points matching and tracking Background / Camera motion estimation Static / Dynamic obstacle motion estimation Interest points refinement Obstacle classification K-NN algorithm and small clusters removal Method overview Obstacle detection Experiences from the ALICE project
  39. 39. Experiences from the ALICE project 39 Input video stream Interest points extraction Interest points matching and tracking Background / Camera motion estimation Static / Dynamic obstacle motion estimation Interest points refinement Obstacle classification Obstacle classification based on position and direction relative to the video camera Experimental results Method overview Obstacle detection
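  One plausible way the final classification could work, using a looming cue (an approaching obstacle's points spread apart between frames) and the image-centre position for urgency; both thresholds are assumptions for illustration, not the project's actual rules.

      import numpy as np

      def classify(prev_cluster, curr_cluster, frame_w=640, centre_frac=0.3):
          # Mean spread of a cluster around its centroid (a simple looming cue).
          spread = lambda c: np.mean(np.linalg.norm(c - c.mean(axis=0), axis=1))
          approaching = spread(curr_cluster) > spread(prev_cluster)
          cx = curr_cluster[:, 0].mean()
          central = abs(cx - frame_w / 2) < centre_frac * frame_w / 2
          if approaching and central:
              return "urgent"                    # on a likely collision course
          return "approaching" if approaching else "departing"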
  40. 40. 40 Experimental results Obstacle detection Experiences from the ALICE project
  41. 41. Computational time Obstacle detection The algorithms were run on an Intel Xeon machine (3.6 GHz, 16 GB RAM) with an NVIDIA Quadro 4000 video board (256 CUDA cores, 256-bit external memory interface, 9945 MB graphics memory), under Windows 7 (desktop).
      Preprocessing step: time without GPU / time with GPU, in msec (a single value means the step does not use the GPU):
      – Interest points detection (image grid): 0.05-0.5
      – Interest points matching and tracking (unidirectional Lucas-Kanade optical flow): 22-23 / 10-11
      – Background / camera motion estimation (unidirectional homographic motion model, RANSAC): 6.5-8.0
      – Object / obstacle motion estimation (agglomerative clustering): 0.05-0.15
      – Interest points refinement (K-NN algorithm): 0.05-0.1
      – Obstacle classification (approaching/departing and urgent/normal): 0.05-0.1
      – Saving results (video): 1.5-2.05
      – TOTAL TIME / FRAME (average): 31 ms / 20 ms
      Experiences from the ALICE project
  42. 42. Taking the path Navigation assistant
  43. 43. Accessible Maps • Crowd-sourced application for map annotation • Routes are entered, edited and shared with Google Maps • OpenStreetMap used as a repository and for online access to information about points of interest. 43 Navigation assistant Experiences from the ALICE project
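  As an illustration of the OpenStreetMap side, points of interest around a position can be pulled through the public Overpass API; this hedged sketch (arbitrary coordinates, 100 m radius) is not the project's actual data pipeline.

      import requests

      # Find pedestrian crossings within 100 m of an (illustrative) position.
      query = """
      [out:json];
      node(around:100, 46.0569, 14.5058)[highway=crossing];
      out;
      """
      resp = requests.post("https://overpass-api.de/api/interpreter",
                           data={"data": query}, timeout=30)
      for node in resp.json()["elements"]:
          print(node["lat"], node["lon"], node.get("tags", {}))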
  44. 44. Accessible Maps • Waypoint annotations: – WHAT: presence of a crosswalk, traffic lights in an intersection, type of intersection, walk buttons, stop signs, median strips. – WHERE: information in absolute geographic form (Lat, Long) 44 Navigation assistant Experiences from the ALICE project
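  A minimal sketch of how such an annotation could be represented, pairing the WHAT attributes with the WHERE coordinates; the field names are hypothetical, not the project's schema.

      from dataclasses import dataclass

      @dataclass
      class Waypoint:
          lat: float                      # WHERE: absolute geographic position
          lon: float
          crosswalk: bool = False         # WHAT: intersection attributes
          traffic_lights: bool = False
          walk_button: bool = False
          stop_sign: bool = False
          intersection_type: str = ""

      wp = Waypoint(lat=46.0569, lon=14.5058, crosswalk=True, traffic_lights=True)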
  45. 45. Experiences from the ALICE project Assistance • Crossing ahead: • Turn left and then cross: 45 Navigation assistant
  46. 46. Assistance • Demo: 46 Navigation assistant Experiences from the ALICE project
  47. 47. Making the connection Human-Machine interface
  48. 48. Objectives Human-Machine interface • Create a communication/presentation system: – Highly adapted to user needs – Enabling the VI to perceive and interact with the surrounding environment • Instructions for navigation have to acknowledge that user perception is similar to moving blindfolded in a maze: – Verbalization: for the description of surrounding objects – Enactive methods: for presenting orientation, distance, motion and position of moving objects 48 Experiences from the ALICE project
  49. 49. Methods Human-Machine interface • 2 separate groups of users according to: – Level of visual impairment – Other criteria (age, education, etc.) • Interface modalities: – Audio semantics using sound, music and synthesized voice – Text-to-speech synthesis using headphones • Input modalities: screen, tapping, gestures, voice • Output modalities: audio, haptic, tactile 49 Experiences from the ALICE project
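  The text-to-speech modality could be prototyped with the cross-platform pyttsx3 library; this is a sketch of one option, not necessarily the engine used in the project, with an illustrative speech rate and a message reused from the assistance examples on slide 45.

      import pyttsx3

      engine = pyttsx3.init()
      engine.setProperty('rate', 150)    # words per minute, slow enough to follow
      engine.say("Crossing ahead. Turn left and then cross.")
      engine.runAndWait()                # blocks until the utterance finishes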
  50. 50. Enactive methods Human-Machine interface • Communication with the user: what, when, how – Not just how to transfer information between the system and the user, but what information and when – Timely delivery of the right information avoids information overload – Translate sensory impressions of the surroundings into tactile or sound information (faster and easier to comprehend than verbalization) 50 Experiences from the ALICE project
  51. 51. User warning • Directional warnings: earcons • Positional warnings: – alerting a user must give them enough time to prepare (2-3 sec for a voice message) – acoustic signal (sequence of beeps) with varying frequencies – vibrations in the bone conduction headphones 51 Human-Machine interface Experiences from the ALICE project
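  A minimal sketch of such an acoustic warning: a short beep sequence whose pitch rises as the obstacle gets closer. Only sample synthesis is shown; the playback backend and the exact distance-to-frequency mapping are assumptions, not the project's design.

      import numpy as np

      def beep(freq_hz, dur_s=0.15, sr=44100):
          t = np.linspace(0.0, dur_s, int(sr * dur_s), endpoint=False)
          return 0.5 * np.sin(2 * np.pi * freq_hz * t)

      def warning(distance_m, sr=44100):
          # Closer obstacle -> higher pitch, clamped to an audible range.
          freq = float(np.clip(2000.0 - 300.0 * distance_m, 400.0, 2000.0))
          gap = np.zeros(int(0.1 * sr))
          return np.concatenate([beep(freq), gap, beep(freq), gap, beep(freq)])

      samples = warning(distance_m=2.0)  # hand these to any audio backend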
  52. 52. Menu • Hierarchical menu 52 Human-Machine interface Experiences from the ALICE project
  53. 53. Georgie prototype • Sample user-interface 53 Human-Machine interface Experiences from the ALICE project
  54. 54. 24 July 2013 54 Next steps Conclusion and Perspectives
  55. 55. Conclusion • Encouraging first achievements within the ALICE project • Human-Machine interfacing is a difficult challenge • User feedback is essential • Still plenty of things left to improve 55 Conclusion and perspectives Experiences from the ALICE project
  56. 56. Perspectives • Learning and recognizing user-defined landmarks and objects of interest • Obstacle classification according to the degree of risk to the user, and generation of adequate alerts • Improved navigation and recognition at key points of the trip (start and finish) • Navigation and obstacle recognition modules integrated into a single application 56 Conclusion and perspectives Experiences from the ALICE project
  57. 57. ALICE benefits in day-to-day life? • Jean: – is partially sighted – works at UBPS – travels the same route to his office every day 57 Conclusion and perspectives Experiences from the ALICE project
  58. 58. ALICE benefits in day-to-day life? • Jean: – knows the route – with his white cane he manages to travel safely from the bus stop to the building. 58 Conclusion and perspectives Experiences from the ALICE project
  59. 59. ALICE benefits in day-to-day life? • Paul: – is blind – goes to UBPS once a week – uses a different route (he doesn’t feel safe enough) 59 Conclusion and perspectives Experiences from the ALICE project
  60. 60. ALICE benefits in day-to-day life? • Paul: – Paul’s route 60 Conclusion and perspectives Experiences from the ALICE project
  61. 61. Experiences from the ALICE project ALICE benefits in day-to-day life? • Paul and some other blind people usually need to take longer routes (more than 400 m) 61 Conclusion and perspectives Jean’s route / Paul’s route
  62. 62. How can ALICE bring benefits? 24 July 2013 62 Conclusion and perspectives Thank you! Find out more at www.alice-project.eu
  63. 63. Experiences from the ALICE project • Slide 2: http://www.flickr.com/photos/gullevek/3240421172/ • Slide 7: http://www.flickr.com/photos/pointshoot/3590816656/ • Slide 10: http://blog.grdodge.org/wp-content/uploads/2011/08/Morris-and-Buddy-1.jpg http://www.iowablindhistory.org/sites/default/files/image/History%20Site%20Images%20and%20Audio%20/Pic%20of%20Jernigan.jpg http://www.flickr.com/photos/library_of_congress/8190452507/ http://www.globalride-sf.org/images/0608/images/2_PedInfra_TactileWarnings.jpg http://images.ookaboo.com/photo/m/Geleidehond_testparcours_m.jpg http://www.robertschroeder.com/wordpress/wp-content/uploads/2011/01/GuidedWalkSchroeder.jpg http://abramsonscorner.files.wordpress.com/2011/06/img_9072-13-of-54-version-2-1-of-1.jpg • Slide 14: http://farm4.staticflickr.com/3459/3188288778_3d44b943b4_b.jpg • Slide 15: http://blockingfortheblind.org/wp-content/uploads/2013/02/peoplewithcanes.jpg • Slide 20: http://i.huffpost.com/gen/819993/thumbs/r-BLIND-MAN-TASERED-large570.jpg • Slide 31: http://www.flickr.com/photos/swiiffer/4593608484/ • Slide 42: http://upload.wikimedia.org/wikipedia/commons/thumb/a/af/Blind_Leading_the_Blind_by_Lee_Mclaughlin.jpg/1024px-Blind_Leading_the_Blind_by_Lee_Mclaughlin.jpg • Slide 47: http://i.imgur.com/f3fqnEY.jpg • Slide 54: http://www.flickr.com/photos/84681882@N00/5467879589 • Slide 62: http://www.austindowntownlions.org/Resources/Pictures/Gucci%20looking%20forward%20and%20canes.jpg 63 Photo credits
