Neo4j Integration with the Leap Motion as a Gesture Recognition System - Slater Victoroff @ GraphConnect Boston 2013

The Leap Motion is a small hand-position measurement device that promises to revolutionize the way we interact with computers, allowing for sub-millimeter fingertip position accuracy. Unfortunately, the design of the device makes it extremely sensitive to occlusion, greatly hindering its use as an input device. We used Neo4j to model a novel approach to gesture detection, using nodes as relative positions and edges as the entries in a Markov chain. This allows us to treat each individual gesture as a path on this graph, eliminating the need for constant fingertip tracking. We used Neo4j's RESTful API in conjunction with Unity 3D's WWW module and an OSC server that integrates the Leap Motion with the free version of Unity, leading to a full integration between Neo4j and the Leap Motion. Research is ongoing in the greater Boston area on the efficacy of this system for enhancing human-computer interaction.
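To make the graph model concrete, here is a minimal sketch of what "nodes as relative positions, edges as Markov-chain entries, gestures as paths" could look like against a Neo4j server. This is not the code used in the talk: the talk drove Neo4j from Unity 3D's WWW module, while the sketch below uses Python with the requests library, the legacy /db/data/cypher REST endpoint of that era (newer servers expose a transactional HTTP endpoint and $param syntax instead), and a made-up Position label and TRANSITION relationship type.

```python
import requests  # assumption: Python + requests, not the Unity WWW module used in the talk

# Each discretized relative hand position becomes a node, each observed
# transition becomes a counted edge (the Markov-chain entry), and a gesture is
# the path it traces through the graph. Endpoint, label, and relationship type
# are illustrative assumptions, not the talk's actual schema.
CYPHER_URL = "http://localhost:7474/db/data/cypher"  # legacy Neo4j REST Cypher endpoint

def record_transition(from_id, to_id):
    """Upsert two position nodes and bump the transition count between them."""
    query = (
        "MERGE (a:Position {id: {from_id}}) "   # 2013-era {param} syntax; modern Cypher uses $from_id
        "MERGE (b:Position {id: {to_id}}) "
        "MERGE (a)-[t:TRANSITION]->(b) "
        "ON CREATE SET t.count = 1 "
        "ON MATCH SET t.count = t.count + 1"
    )
    resp = requests.post(CYPHER_URL, json={"query": query,
                                           "params": {"from_id": from_id, "to_id": to_id}})
    resp.raise_for_status()

# A swipe observed as a short sequence of discretized relative positions:
swipe = ["x0_y0", "x1_y0", "x2_y0", "x3_y0"]
for a, b in zip(swipe, swipe[1:]):
    record_transition(a, b)  # the gesture is now a path on the graph
```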

  1. Neo4j in the Future of Interaction Design: A novel approach to gesture recognition integrating Neo4j with the Leap Motion
  2. This Talk
     - Introduction
     - Interaction Design
     - The Tyranny of Finger-On-Glass
     - The Leap Motion
     - Promises and Limitations
     - Gesture Recognition
     - Current State-of-the-Art
     - Building a New Strategy for the Leap
     - Conclusions
  3. Who am I?
     - Education
     - Work
  4. My Collaborators
  5. The Leap Motion (video: http://youtu.be/3b4w749Tud8)
  6. A Brief History of Interaction Design
  7. Basic Technology and Indirect Mappings
  8. Higher Layers of Abstraction
  9. Fingers On Glass
  10. Why is This Bad?
  11. Enter The Leap Motion
  12. Here are Some Live Demos
  13. There Are Even Simple Gestures Included
  14. But Something is Rotten in Denmark
     - Complex motions are infeasible
     - Self-obfuscation is a huge problem
     - Interface is surprisingly exhausting
     - Drivers are proprietary and imperfect
     - Bounding box is small
     - Data is fundamentally inconsistent
  15. The Real Faceoff (Vs.)
  16. Developers?
  17. Gesture Recognition
  18. Problems with Classical Approaches to Gestures
     - Geared towards easily benchmarked, previously studied problems
     - Primarily developed for narrowly defined industry applications
  19. Hidden Markov Models
  20. Problems With HMMs
     - The next state depends only on the current state, but intuitive hand gestures are inherently hysteretic
     - Depends on discrete gesture identification, with no sense of “variations on a theme”
     - Storage space grows exponentially when faced with inconsistent data streams
     - NOT built for the Leap
  21. Size?
     - Minimum 6 DoF per finger + 7 for the palm
     - 2 hands, even assuming only two modes of motion: 1.9 × 10^22
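A quick check on where that figure comes from, under the editor's assumption (not stated on the slide) that "two modes of motion" means two distinguishable states per degree of freedom: 5 fingers × 6 DoF + 7 for the palm is 37 DoF per hand, 74 for two hands, and 2^74 ≈ 1.9 × 10^22 configurations.

```python
# Back-of-the-envelope check of the state count on the "Size?" slide,
# assuming two distinguishable modes per degree of freedom.
dof_per_hand = 5 * 6 + 7            # 6 DoF per finger, 7 for the palm
dof_both_hands = 2 * dof_per_hand   # 74
states = 2 ** dof_both_hands        # two modes per DoF
print(f"{dof_both_hands} DoF -> {states:.2e} states")  # 74 DoF -> 1.89e+22 states
```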
  22. Motion as a Graph
  23. Pros
     - Basic mathematics is close enough to that of HMMs that much of the established infrastructure can be leveraged
     - Path similarity doesn't rely on consistent data streams and allows for regression testing (see the sketch after this list)
     - Database can easily be trimmed to reduce size concerns
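The "path similarity tolerates inconsistent data streams" point is the part a short example helps with. The sketch below compares two recordings of the same swipe captured at different frame rates, treating each gesture as a sequence of discretized position-node IDs and scoring them with a normalized edit distance; the metric and node IDs are illustrative stand-ins, not the measure actually used in the talk.

```python
# Illustrative path comparison: the same swipe sampled at 60 fps and 30 fps
# maps to nearly the same node sequence, so duplicated or dropped frames cost little.
def edit_distance(a, b):
    """Plain Levenshtein distance between two sequences."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

def path_similarity(p, q):
    return 1.0 - edit_distance(p, q) / max(len(p), len(q))

swipe_60fps = ["n3", "n3", "n7", "n7", "n12", "n12", "n18"]
swipe_30fps = ["n3", "n7", "n12", "n18"]  # same motion, half the samples
print(round(path_similarity(swipe_60fps, swipe_30fps), 2))  # 0.57: only the duplicate frames are penalized
```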
  24. Cons
     - The Leap is very fast, and subgraph comparisons are computationally intensive
     - Lots of data that isn't hugely useful to us
     - Continuous data ends up being very sensitive to slight perturbations in paths
     - A few orders of magnitude down, but just a few
  25. Karger’s Algorithm
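The slides don't spell out how Karger's algorithm is applied here; presumably it is used to contract the position graph into a much smaller set of super-nodes, which would account for the "almost 7 full orders of magnitude" reduction claimed on the next slide. For reference, below is a minimal textbook sketch of Karger's randomized min-cut, contracting random edges until two super-nodes remain, run on a toy edge list rather than the actual gesture graph.

```python
import random

# Textbook sketch of Karger's randomized min-cut: repeatedly contract a random
# edge until only two super-nodes remain; the edges still crossing between them
# form a (probably minimal) cut. Toy data, not the talk's gesture graph.
def karger_min_cut(edges):
    edges = list(edges)
    parent = {v: v for e in edges for v in e}  # union-find over the vertices

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    components = len({find(v) for v in parent})
    while components > 2:
        u, v = random.choice(edges)
        ru, rv = find(u), find(v)
        if ru != rv:          # contract the edge: merge its two super-nodes
            parent[ru] = rv
            components -= 1
    # Edges whose endpoints ended up in different super-nodes cross the cut.
    return [(u, v) for u, v in edges if find(u) != find(v)]

toy_graph = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e"), ("c", "e")]
best_cut = min((karger_min_cut(toy_graph) for _ in range(50)), key=len)  # repeat trials, keep smallest cut
print(best_cut)  # a 2-edge cut, e.g. [('c', 'd'), ('c', 'e')]
```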
  26. Is That Really a Big Difference Though?
     - Syncs up well with our natural perception of gestures
     - Reduction of almost 7 full orders of magnitude for comprehensive gesture coverage
     - Diffs from node epicenters are more robust and improve regression results
     - Greatly reduces the number of calls made to the REST API
  27. Preliminary Results
     - Constrained digit recognition benchmarked at 93.4%
     - Maximum latency for immersion is ~120 ms
     - Learning rates for the gesture-based interface are about 40% faster than for gesture-free interfaces
     - Partnership with zSpace
     - Continued mentoring from SolidWorks and Belmont Labs founder Scott Harris
  28. Probing the Future of Human-Interface Design
  29. What’s Coming Next?
  30. Any Questions? Slater.r.victoroff@gmail.com
