
Neo4j Integration with the Leap Motion as a Gesture Recognition System - Slater Victoroff @ GraphConnect Boston 2013


The Leap Motion is a small hand-position measurement device that promises to revolutionize the way we interact with computers, allowing for sub-millimeter fingertip position accuracy. Unfortunately, the design of the device makes it extremely sensitive to occlusion issues, greatly hindering its use as an input device. We used Neo4j to model a novel approach to gesture detection, using nodes as relative positions and edges as the entries in a Markov chain. This allows us to consider each individual gesture as a path on this graph, eliminating the need for constant fingertip tracking. We used Neo4j's RESTful API in conjunction with Unity 3D's WWW module and an OSC server to integrate the Leap Motion with the free version of Unity, yielding a full integration between Neo4j and the Leap Motion. Ongoing research in the greater Boston area is examining the efficacy of this system for enhancing human-computer interaction.
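The data model described above can be sketched in plain Python. Nothing here is the talk's actual code: `GestureGraph`, `record`, and `path_likelihood` are hypothetical names, and the real system persists these nodes and relationships in Neo4j through its RESTful API rather than in memory. The sketch only mirrors the idea of relative-position nodes, transition-count edges, and gestures as paths.

```python
from collections import defaultdict

class GestureGraph:
    """In-memory sketch of the gesture graph: nodes are quantized
    relative hand positions, edge weights are Markov-chain transition
    counts, and a gesture is a path of nodes."""

    def __init__(self):
        # transitions[a][b] = number of times position b followed position a
        self.transitions = defaultdict(lambda: defaultdict(int))

    def record(self, path):
        """Add one observed gesture (a sequence of position nodes)."""
        for a, b in zip(path, path[1:]):
            self.transitions[a][b] += 1

    def path_likelihood(self, path):
        """Product of empirical transition probabilities along the path;
        0.0 if the path leaves the observed graph entirely."""
        p = 1.0
        for a, b in zip(path, path[1:]):
            total = sum(self.transitions[a].values())
            if total == 0:
                return 0.0
            p *= self.transitions[a][b] / total
        return p
```

In the talk's actual pipeline these counts would be stored as relationship properties in Neo4j and queried over REST from Unity's WWW module; the in-memory dictionary is only a stand-in for that store.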


Transcript

  • 1. Neo4j in the Future of Interaction Design: A novel approach to gesture recognition integrating Neo4j with the Leap Motion
  • 2. This Talk: Introduction; Interaction Design; The Tyranny of Finger-On-Glass; The Leap Motion; Promises and Limitations; Gesture Recognition; Current State-of-the-Art; Building a New Strategy for the Leap; Conclusions
  • 3. Who am I? Education; Work
  • 4. My Collaborators
  • 5. The Leap Motion: http://youtu.be/3b4w749Tud8
  • 6. A Brief History of Interaction Design
  • 7. Basic Technology and Indirect Mappings
  • 8. Higher Layers of Abstraction
  • 9. Fingers On Glass
  • 10. Why is This Bad?
  • 11. Enter The Leap Motion
  • 12. Here are Some Live Demos
  • 13. There Are Even Simple Gestures Included
  • 14. But Something is Rotten in Denmark: Complex motions are infeasible; self-obfuscation is a huge problem; the interface is surprisingly exhausting; drivers are proprietary and imperfect; the bounding box is small; data is fundamentally inconsistent
  • 15. The Real Faceoff (Vs.)
  • 16. Developers?
  • 17. Gesture Recognition
  • 18. Problems with Classical Approaches to Gestures: Geared towards easily benchmarked, previously studied problems; primarily developed for narrowly defined industry applications
  • 19. Hidden Markov Models
  • 20. Problems With HMMs: The next state depends only on the current state, but intuitive hand gestures are inherently hysteretic; depends on discrete gesture identification, with no sense of "variations on a theme"; storage space grows exponentially when faced with inconsistent data streams; NOT built for the Leap
  • 21. Size? Minimum 6 DoF per finger + 7 for the palm; 2 hands, even assuming only two modes of motion: 1.9 × 10^22
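The slide's figure can be reproduced with a little arithmetic, assuming five fingers per hand (the slide states only the per-finger and per-palm DoF): two hands give 2 × (5 × 6 + 7) = 74 dimensions, and two modes of motion per dimension yields 2^74 ≈ 1.9 × 10^22 states.

```python
# Assumed breakdown: 5 fingers x 6 DoF + 7 DoF for the palm = 37
# dimensions per hand, so 74 dimensions for two hands. With only two
# modes of motion per dimension, the state space has 2**74 configurations.
dims = 2 * (5 * 6 + 7)   # 74 dimensions
states = 2 ** dims
print(f"{states:.1e}")    # on the order of 1.9e+22, matching the slide
```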
  • 22. Motion as a Graph
  • 23. Pros: The basic mathematics is close enough to that of HMMs that much of the established infrastructure can be leveraged; path similarity doesn't rely on consistent data streams and allows for regression testing; the database can easily be trimmed to reduce size concerns
  • 24. Cons: The Leap is very fast, and subgraph comparisons are computationally intensive; lots of the data isn't hugely useful to us; continuous data ends up being very sensitive to slight perturbations in paths; a few orders of magnitude down, but just a few
  • 25. Karger's Algorithm
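The slide names Karger's algorithm without detail; in context (trimming the gesture graph, per the preceding slide) it is presumably used to cut the graph into smaller pieces. A minimal, self-contained sketch of the classic randomized edge-contraction algorithm, with `karger_min_cut` as an illustrative name:

```python
import random

def karger_min_cut(edges, trials=200, seed=0):
    """Estimate the minimum cut of an undirected graph (given as a list
    of (u, v) edges) by repeated random edge contraction."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        # Union-find over the current vertex set; contracting an edge
        # merges its endpoints into one super-node.
        verts = {v for e in edges for v in e}
        parent = {v: v for v in verts}

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v

        n = len(verts)
        while n > 2:
            # Rejection sampling over the original edge list is uniform
            # over the surviving edges of the contracted multigraph.
            u, v = edges[rng.randrange(len(edges))]
            ru, rv = find(u), find(v)
            if ru == rv:
                continue  # self-loop in the contracted graph; skip
            parent[ru] = rv
            n -= 1

        # Edges whose endpoints lie in different super-nodes cross the cut.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        if best is None or cut < best:
            best = cut
    return best
```

A single contraction run succeeds only with probability at least 2/(n(n−1)), which is why the sketch repeats the trial many times and keeps the smallest cut found.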
  • 26. Is That Really a Big Difference Though? Syncs up well with our natural perception of gestures; a reduction of almost 7 full orders of magnitude for comprehensive gesture coverage; diffs from node epicenters are more robust and improve regression results; greatly reduces the number of calls made to the REST API
  • 27. Preliminary Results: Constrained digit recognition benchmarked at 93.4%; maximum latency for immersion is ~120 ms; learning rates for the gesture-based interface are about 40% faster than for gesture-free interfaces; partnership with zSpace; continued mentoring from SolidWorks and Belmont Labs founder Scott Harris
  • 28. Probing the Future of Human-Interface Design
  • 29. What's Coming Next?
  • 30. Any Questions? Slater.r.victoroff@gmail.com