Assessment of Semantic Taxonomies for Blind Indoor Navigation Based on a Shopping Center Use Case

#w4a2017, Perth, Western Australia

  1. Assessment of Semantic Taxonomies for Blind Indoor Navigation Based on a Shopping Center Use Case
     J. Eduardo Pérez (a, b), Myriam Arrue (b), Masatomo Kobayashi (a), Hironobu Takagi (a) & Chieko Asakawa (a)
     a. IBM Research – Tokyo
     b. EGOKITUZ: Laboratory of HCI for Special Needs, University of the Basque Country (UPV/EHU)
     Contact: www.jeduardoperez.info | @j_eduardoperez | juaneduardo.perez@ehu.eus
     April 4th 2017, Perth (Australia). Session 6: Evaluating and Measuring Accessibility. The 14th Web for All Conference (W4A 2017)
  2. Motivation
     - Many location-based services (LBS) are available thanks to the ubiquity of smartphones.
     - LBS provide personalized assistance, in any location and for a huge variety of applications (e.g., providing visually impaired people with turn-by-turn navigation support through unknown environments by using vocal instructions based on accurate localization).
     - Despite this, we know little about WHAT environmental elements and features are most useful for improving navigational assistance for people with visual impairments.
     [Figure: tactile cues that support navigation of visually impaired people, such as tactile paving and Braille buttons]
  3. Motivation (continued)
     ① Survey real-world semantic taxonomies that characterize indoor environments
     ② Create a set of environmental information for a shopping center
     ③ Perform an indoor navigation experiment with visually impaired participants, in order to evaluate (by means of subjective assessments) the usefulness of the navigation assistance
     > A smartphone-based navigational assistant (NavCog), with vocal instructions enriched with environmental information, was used by participants during the experiment <
  4. Surveyed works
     > Few specifications exist despite the many location-based services available
     > Coverage of indoor environments is very limited (working drafts or proposals)
     OpenStreetMap [1]
     - Popular collaborative community
     - Aims at a free & editable map of the world (no specific application or population targeted, but considers a11y issues)
     - Mainly for outdoor areas
     - Surveyed the latest proposals for indoor environments
     Wayfindr [2]
     - Non-profit organization from the UK
     - 1st working draft of an open standard for audio-based wayfinding assistance
     - Purpose: assist the navigation of people with visual impairments within built environments by means of audio instructions
     Japanese MLITT [3]
     - Working draft of a data specification for modeling outdoor pedestrian spaces
     - Purpose: several applications (including navigation assistance services for different groups of people who encounter barriers)
     [1] OpenStreetMap: wiki.openstreetmap.org
     [2] Wayfindr: www.wayfindr.net/wp-content/uploads/2016/07/Wayfindr-Open-Standard-Working-Draft-1.0.pdf
     [3] Japanese MLITT: www.mlit.go.jp/common/000124059.pdf
  5. Survey > Environmental information for indoor areas
     Pathways
     - OpenStreetMap: type of pathway; width; access restrictions; tactile paving availability; slope (wheelchair access)
     - Wayfindr: type of pathway; length; tactile paving availability; junctions, significant curves, type of tactile paving
     - Japanese MLITT: type of pathway; width, length; access restrictions; tactile paving availability; slope (gradient, wheelchair access); surface condition, direction of travel, opening hours, name
     Doorways
     - OpenStreetMap: type of doorway; width; wheelchair accessible; step count; entrance name; handle type, opening direction, ramp, handrail, access restrictions, level
     - Wayfindr: type of doorway; venues connected; opening button (door side and height)
     - Japanese MLITT: type of doorway; width; step height (only one); entrance name
     Elevators
     - OpenStreetMap: tactile/braille support; levels connected; wheelchair accessible; access restrictions, opening hours
     - Wayfindr: audible announcements; tactile/braille support; levels connected; call button location (side and height); side on which doors open (if more than 1 door)
     - Japanese MLITT: defined as a type of pathway; audible announcements; braille support; wheelchair accessible
     Escalators
     - OpenStreetMap: direction of travel; tactile paving availability; width, incline, lanes, access restrictions
     - Wayfindr: direction of travel (may change at peak hours); tactile paving availability; handrail location, side to stand on during travel
     - Japanese MLITT: defined as a type of pathway; direction of travel (pathway feature); tactile paving availability
     Stairs
     - OpenStreetMap: number of steps; handrail location; levels connected; tactile paving availability; width, incline, ramp (for wheelchairs), name
     - Wayfindr: number of steps; handrail location; levels connected; tactile paving availability; type of stairs, landing/flight of stairs
     - Japanese MLITT: defined as a type of pathway; number of steps; handrail location; tactile paving availability; assistive mechanism available
     Public toilets
     - OpenStreetMap: wheelchair accessible; gender; opening hours; access restrictions, diaper changing table, drinking water, hand washing, paper supply
     - Wayfindr: NOT INCLUDED
     - Japanese MLITT: accessibility level (wheelchair accessible and colostomy support); gender; opening hours; crib
     Rooms/venues
     - OpenStreetMap: name; purpose; level
     - Wayfindr: name; purpose
     - Japanese MLITT: NOT INCLUDED
     Buildings/facilities
     - OpenStreetMap: name; address; purpose, levels, entrances, access restrictions
     - Wayfindr: NOT INCLUDED
     - Japanese MLITT: name; address; phone number, opening hours, toilet accessibility level
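     To make the surveyed attribute sets concrete, here is a minimal Python sketch of how an indoor element carrying this kind of semantic information could be represented. It is an illustrative assumption only: the class and field names are invented for this example and are not taken from OpenStreetMap, Wayfindr, the Japanese MLITT draft, or NavCog.

        # Illustrative sketch (not any taxonomy's actual data model): one way to
        # encode the kinds of attributes the three surveyed taxonomies attach
        # to indoor elements.
        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class IndoorElement:
            """A generic indoor element annotated with accessibility-relevant attributes."""
            element_type: str                       # e.g. "pathway", "doorway", "elevator", "stairs"
            tactile_paving: Optional[bool] = None   # tactile paving availability
            braille_support: Optional[bool] = None  # braille on buttons/signage (elevators)
            wheelchair_accessible: Optional[bool] = None
            levels_connected: list[str] = field(default_factory=list)  # elevators, stairs
            extra: dict[str, str] = field(default_factory=dict)        # taxonomy-specific fields

        # Example: an elevator described with attributes common to the surveyed taxonomies.
        elevator = IndoorElement(
            element_type="elevator",
            braille_support=True,
            wheelchair_accessible=True,
            levels_connected=["B1", "1F", "2F", "3F"],
            extra={"call_button_location": "right side, 1.0 m height"},  # Wayfindr-style detail
        )

     An entry like this covers the attributes shared by the three taxonomies for elevators (braille support, levels connected, wheelchair access), while the extra dictionary leaves room for taxonomy-specific fields such as Wayfindr's call-button location.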
  6. Shopping Center Use Case > Set of environmental information
     Elements and features were included based on:
     - the needs of people with visual impairments
     - the environmental information present in the shopping center along the experimental routes
     Pathways: type of pathway [corridor/elevator]; length, width; tactile paving availability
     - Locate tactile paving: "proceed 9 meters on Braille blocks, and turn right"
     Doorways: type of doorway
     - "Coredo Muromachi 2 underground level main entrance… to access there are 2 automatic doors"
     Elevators: outside & inside button locations (door side & height); buttons with Braille support
     - Travel by elevator: "elevator is on your left; go down to the 1st floor" / "after getting off the elevator turn right"
     - Find elevator buttons (outside & inside): "call button with Braille is right side of the elevator door" / "Go to the 1st floor; control buttons with Braille are right side of the exit"
     Venues (stores): name; doorway entrance
     - Recognize nearby stores: "Coffee Rin is on your left"
     Obstacles: heading; angle
     - Predict nearby obstacles: "proceed 20 meters, there are obstacles in both sides…"
  7. Shopping Center Use Case > Network route editor
     For each floor of the shopping center:
     ① Define fixed positions for the different environmental elements (pathways, doorways, elevators, stores & obstacles)
     ② Include semantic information for each element
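     Slide 7 describes two annotation steps per floor: fixing element positions and attaching semantic information. A minimal Python sketch of what such an annotated per-floor network might look like is given below; the structure and field names are hypothetical illustrations, not the actual route editor or NavCog data format.

        # Hypothetical per-floor route network: fixed positions plus semantic annotations.
        # Field names are illustrative only, not the actual editor/NavCog format.
        floor_b1 = {
            "floor": "B1",
            "nodes": [
                {"id": "n1", "x": 12.5, "y": 40.2, "element": "doorway",
                 "info": {"type": "automatic", "doors": 2}},
                {"id": "n2", "x": 21.5, "y": 40.2, "element": "elevator",
                 "info": {"braille_buttons": True, "call_button_side": "right"}},
                {"id": "n3", "x": 21.5, "y": 52.0, "element": "store",
                 "info": {"name": "Coffee Rin"}},
            ],
            "edges": [
                {"from": "n1", "to": "n2", "element": "pathway",
                 "info": {"length_m": 9, "width_m": 2.4, "tactile_paving": True}},
            ],
        }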
  8. Shopping Center Use Case > Vocal instructions to assist navigation
     The same features and example messages as in slide 6, presented as the vocal instructions given during navigation: locating tactile paving, doorway descriptions, travelling by elevator and finding its buttons, recognizing nearby stores, and predicting nearby obstacles.
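     As a rough illustration of how vocal instructions like those in slides 6 and 8 could be assembled from annotated pathway features, here is a small Python sketch. The function, its parameters, and the template wording (modelled on the slide's sample messages) are assumptions for illustration, not the actual NavCog implementation.

        # Minimal, hypothetical template for pathway instructions like the examples above.
        # Not the actual NavCog logic; the wording only mirrors the slide's sample messages.
        from typing import Optional

        def pathway_instruction(length_m: float, tactile_paving: bool,
                                turn: Optional[str] = None) -> str:
            surface = " on Braille blocks" if tactile_paving else ""
            message = f"proceed {length_m:g} meters{surface}"
            if turn:
                message += f", and turn {turn}"
            return message

        print(pathway_instruction(9, True, "right"))
        # -> proceed 9 meters on Braille blocks, and turn right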
  9. Experiment > Shopping center outline
     Large-scale shopping center:
     - 3 adjacent towers
     - 14 floors in total (basement to 4th floor)
     - 98 stores (restaurants, fashion stores, cinemas, …)
     - elevators with Braille support
     The basement floor connects the shopping center to a metro station access via an indoor open area between the 3 towers, which includes tactile paving support.
  10. Experiment > Navigational tasks
     Navigation through the shopping center was divided into 3 different routes:
     - total distance of 429 meters
     - the routes included representative indoor areas (indoor open space, entrance, corridors with different widths, junction complexity and number of obstacles, tactile paving support)
     - each route included travel between floors by elevator
     [Route map figure; labels: 1st route, 2nd route, 3rd route; metro station access, cinema ticket counters, food store; basement, 1st floor, 3rd floor; 183 meters]
  11. Experiment > Participants & subjective ratings
     9 subjects with visual impairments:
     - Visual condition: (5) totally blind, (4) residual vision
     - Mobility aid: (8) white cane, (1) guide dog
     - Smartphone experience: (5) yes, (4) no
     - Voice navigation app experience: (3) yes, (6) no
     5 categories to rate the usefulness of the vocal messages (7-point Likert scale):
     ① locate tactile paving
     ② find elevator buttons
     ③ travel by elevator
     ④ predict nearby obstacles
     ⑤ recognize nearby stores
  12. Results > All participants
     Ratings per message category, all participants (N=9), scale 7 = strongly positive to 1 = strongly negative:
     - tactile paving: M = 5.22, SD = 1.99
     - elevator buttons: M = 5.67, SD = 1.41
     - elevator navigation: M = 5.89, SD = 1.27
     - obstacles: M = 4.33, SD = 1.94
     - stores: M = 5.33, SD = 1.87
     Observations:
     - Overall, positive assessments of the usefulness of the different vocal messages: 71.1% positive / 13.3% neutral / 15.6% negative
     - Preferred messages: "elevator navigation", closely followed by "elevator buttons"
     - Least preferred messages: "obstacles"
     - High standard deviation (SD) values indicate opposing opinions
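     For reference, the Python snippet below shows the arithmetic behind this kind of summary. The ratings list is hypothetical, since the slides do not include raw per-participant data, and the positive/neutral/negative cutoffs (5-7 / 4 / 1-3 on the 7-point scale) are an assumption.

        # Worked example of the M / SD summary on this slide, using HYPOTHETICAL ratings
        # (not the study's raw data). SD here is the sample standard deviation.
        from statistics import mean, stdev

        ratings = [7, 6, 5, 6, 4, 7, 5, 3, 6]  # hypothetical 7-point Likert ratings from 9 participants

        print(f"M = {mean(ratings):.2f}, SD = {stdev(ratings):.2f}")

        # Assumed split: 5-7 positive, 4 neutral, 1-3 negative (cutoffs are an assumption).
        positive = sum(r >= 5 for r in ratings) / len(ratings)
        neutral = sum(r == 4 for r in ratings) / len(ratings)
        negative = sum(r <= 3 for r in ratings) / len(ratings)
        print(f"{positive:.1%} positive / {neutral:.1%} neutral / {negative:.1%} negative")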
  13. Results grouped by > Mobility aid & visual condition
     Ratings (M / SD) per message category [tactile paving, elevator buttons, elevator navigation, obstacles, stores]:
     - White cane users (N=8): M 5.75, 5.5, 5.75, 4.75, 5.13 / SD 1.28, 1.41, 1.28, 1.58, 1.89
     - Totally blind (N=4): M 6, 5.25, 5.5, 4.75, 5.25 / SD 1.41, 0.96, 1.29, 1.26, 1.5
     - Residual vision (N=4): M 5.5, 5.75, 6, 4.75, 5 / SD 1.29, 1.89, 1.41, 2.06, 2.45
     - Guide dog user (N=1): 1, 7, 7, 1, 7
     Observations:
     - White cane users: preferred messages were "elevator navigation" and "tactile paving"; the least preferred were "obstacles"
     - Similar ratings for the different vocal messages among white cane users who were totally blind and those with residual vision
     - Guide dog user: lowest ratings for messages about "tactile paving" and "obstacles" (unnecessary and confusing for her) [in contrast to the average assessments]; highest scores for the different messages about elevators and those announcing stores
  14. Results grouped by > Experience with technology
     Ratings (M / SD) per message category [tactile paving, elevator buttons, elevator navigation, obstacles, stores]:
     - Experience with smartphones (N=4): M 6, 6.5, 6.75, 4.75, 5.5 / SD 1.41, 0.58, 0.5, 2.06, 2.38
     - No experience with smartphones (N=4): M 5.5, 4.5, 4.75, 4.75, 4.75 / SD 1.29, 1.29, 0.96, 1.26, 1.5
     - Experience with voice navigation apps (N=3): M 6.67, 5.67, 6, 4, 4.67 / SD 0.58, 0.58, 1, 1.73, 2.31
     - No experience with voice navigation apps (N=5): M 5.2, 5.4, 5.6, 5.2, 5.4 / SD 1.3, 1.82, 1.52, 1.48, 1.82
     Observations:
     - On average, participants experienced with smartphones and voice navigation apps gave higher ratings than participants without previous experience; this was especially noticeable for messages about "elevator navigation", "elevator buttons" and "tactile paving" (up to 2 points more), together with lower SD values.
     - It was less noticeable for messages about "obstacles" and "stores": the difficulty of defining a fixed position for some obstacles (e.g. chairs) resulted in localization issues, and the information about stores was reduced (name only).
     - Lower ratings by participants inexperienced with the experimental technologies may be due to issues caused by inexperience in using the technologies, or a certain reluctance to use new technologies, as well as inaccurate localization by the navigation system [which also affected the assessments of other participants].
  15. Conclusions
     - The overall positive assessment of the vocal messages shows a high level of acceptance of audio-based navigation assistance.
     - The environmental information presented was sometimes unnecessary (obstacles and tactile paving for the guide dog user), too brief (some participants suggested more detailed information than only store names), or inaccurate (localization issues with non-fixed obstacles).
     - Other technical approaches should be studied to achieve effective usage during navigation guidance (e.g., image recognition for accurate obstacle prediction).
     Future work:
     - Analyze participants' behaviors during the navigational tasks (system logs and video recordings)
  16. Thank you for your attention. Questions?
     Assessment of Semantic Taxonomies for Blind Indoor Navigation Based on a Shopping Center Use Case
     www.jeduardoperez.info | @j_eduardoperez | juaneduardo.perez@ehu.eus
     Laboratory of HCI for Special Needs / IBM Research – Tokyo
     April 4th 2017, Perth (Australia). Session 6: Evaluating and Measuring Accessibility. The 14th Web for All Conference (W4A 2017)
