Haptics Final Project Presentation: Transcript

  • 1. Haptics Final Project: Using a Sensor Glove to Write in the Air (Paul Taele, Spring 2008)
  • 2. Goals
    • Write shapes in the air without a pen.
  • 3. Initial Gestures
  • 4. Original Posture Classifier Setup
    • Tools:
      • P5
      • CyberGlove
    • Posture Classifiers:
      • k-Nearest Neighbor (sketched below)
      • Naïve Bayes
      • Neural Network
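
As a concrete reference for the kNN option above, here is a minimal Java sketch (Java being the project language per slide 8) of a nearest-neighbor posture classifier over raw glove sensor vectors. The class and method names, and the k = 1 simplification, are illustrative assumptions, not the project's actual code.

```java
import java.util.List;

// Hypothetical kNN posture classifier over raw glove sensor vectors.
// Each training example is a float[] of sensor readings plus a posture label.
public class KnnPostureClassifier {
    private final List<float[]> samples;   // training sensor vectors
    private final List<String> labels;     // posture label per sample

    public KnnPostureClassifier(List<float[]> samples, List<String> labels) {
        this.samples = samples;
        this.labels = labels;
    }

    // Classify with k = 1: return the label of the nearest training vector.
    public String classify(float[] query) {
        double best = Double.MAX_VALUE;
        String bestLabel = null;
        for (int i = 0; i < samples.size(); i++) {
            double d = squaredDistance(samples.get(i), query);
            if (d < best) {
                best = d;
                bestLabel = labels.get(i);
            }
        }
        return bestLabel;
    }

    private static double squaredDistance(float[] a, float[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double diff = a[i] - b[i];
            sum += diff * diff;
        }
        return sum;
    }
}
```

On the P5 data, kNN reached only 50% (slide 5), which is why the project settled on the Neural Network classifier instead.
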
  • 5. Postures - Results
    • P5
      • NB: 10%
      • kNN: 50%
      • NN: 70%
    • CyberGlove
      • NN: 75% (all 23 sensors)
      • NN: 100% (3 index finger sensors)
  • 6. Postures - Analysis
    • Desired 100% accuracy for posture classification.
    • Used the CyberGlove device and a Neural Network classifier for postures.
    • Used two easily separable postures instead of four.
  • 7. Hand Gesture Segmentation
    • Simple when the two postures are highly separable.
    • Classify each time step of an input instance with the trained NN (a sketch follows below).
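
A sketch of that per-time-step idea, assuming the trained NN is wrapped as a frame-labeling function and that one posture label (the hypothetical "WRITE" below) marks active writing; contiguous runs of that label become gesture segments.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical segmenter: label every frame with the posture classifier,
// then collect contiguous runs of the "writing" posture as gesture segments.
public class GestureSegmenter {
    public static class Segment {
        public final int start, end;   // inclusive frame indices
        public Segment(int start, int end) { this.start = start; this.end = end; }
    }

    public interface PostureClassifier {
        String classify(float[] sensorFrame);   // e.g., "WRITE" or "REST"
    }

    public static List<Segment> segment(List<float[]> frames, PostureClassifier nn) {
        List<Segment> segments = new ArrayList<>();
        int start = -1;
        for (int i = 0; i < frames.size(); i++) {
            boolean writing = "WRITE".equals(nn.classify(frames.get(i)));
            if (writing && start < 0) {
                start = i;                                 // segment opens
            } else if (!writing && start >= 0) {
                segments.add(new Segment(start, i - 1));   // segment closes
                start = -1;
            }
        }
        if (start >= 0) segments.add(new Segment(start, frames.size() - 1));
        return segments;
    }
}
```

Run-length smoothing could be added to absorb single-frame misclassifications; the slides do not say whether the project needed this.
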
  • 8. Final Project Setup
    • Tools: CyberGlove
    • Language: Java
    • Posture Classifier: Neural Network
    • Sketch Classifier: $1
    • # of Postures: 2
    • # of Gestures: 4
  • 9. Final Postures
  • 10. Final Gestures
  • 11. Training Data ($1)
    • Created templates from 3 users.
    • Each user gave 5 examples of each sketch gesture (a simplified matcher is sketched below).
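
To make the template-matching step concrete, below is a simplified $1-style matcher in Java: strokes are resampled to 64 equidistant points and normalized for scale and position, and the closest template wins. The real $1 recognizer also normalizes rotation and refines the match angle with a golden-section search; both are omitted here, so treat this as an illustrative sketch rather than the project's implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified $1-style matcher (illustrative, not the project's code):
// resample each stroke to N points, normalize scale and position, then
// pick the template with the smallest average point-to-point distance.
public class DollarLikeMatcher {
    static final int N = 64;

    // Templates are assumed to be already normalized to N points.
    public static String classify(double[][] stroke,
                                  List<double[][]> templates, List<String> names) {
        double[][] candidate = normalize(stroke);
        double best = Double.MAX_VALUE;
        String bestName = null;
        for (int t = 0; t < templates.size(); t++) {
            double d = 0;
            for (int i = 0; i < N; i++) d += dist(candidate[i], templates.get(t)[i]);
            if (d / N < best) { best = d / N; bestName = names.get(t); }
        }
        return bestName;
    }

    public static double[][] normalize(double[][] pts) {
        double[][] r = resample(pts, N);
        double cx = 0, cy = 0, minX = 1e9, minY = 1e9, maxX = -1e9, maxY = -1e9;
        for (double[] p : r) {
            cx += p[0] / N; cy += p[1] / N;
            minX = Math.min(minX, p[0]); maxX = Math.max(maxX, p[0]);
            minY = Math.min(minY, p[1]); maxY = Math.max(maxY, p[1]);
        }
        double s = Math.max(maxX - minX, maxY - minY);
        if (s == 0) s = 1;   // degenerate single-point stroke
        // Translate centroid to origin; scale bounding box to unit size.
        for (double[] p : r) { p[0] = (p[0] - cx) / s; p[1] = (p[1] - cy) / s; }
        return r;
    }

    // Resample the stroke to n points spaced evenly along its arc length.
    static double[][] resample(double[][] pts, int n) {
        double pathLen = 0;
        for (int i = 1; i < pts.length; i++) pathLen += dist(pts[i - 1], pts[i]);
        double interval = pathLen / (n - 1), acc = 0;
        List<double[]> out = new ArrayList<>();
        out.add(pts[0].clone());
        double[] prev = pts[0];
        for (int i = 1; i < pts.length && out.size() < n; i++) {
            double d = dist(prev, pts[i]);
            if (acc + d >= interval && d > 0) {
                double t = (interval - acc) / d;
                double[] q = { prev[0] + t * (pts[i][0] - prev[0]),
                               prev[1] + t * (pts[i][1] - prev[1]) };
                out.add(q);
                prev = q; acc = 0; i--;   // reprocess this segment from q
            } else {
                acc += d; prev = pts[i];
            }
        }
        while (out.size() < n) out.add(pts[pts.length - 1].clone());
        return out.toArray(new double[0][]);
    }

    static double dist(double[] a, double[] b) {
        return Math.hypot(a[0] - b[0], a[1] - b[1]);
    }
}
```

Building the slide's template set (3 users, 5 examples per gesture) then amounts to calling normalize() on each recorded example and storing the result with its label.
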
  • 12.–15. [Image-only slides]
  • 16. Test Data ($1)
    • Testing used consecutively input pairs of sketch gestures.
    • Postures were first extracted from the gesture stream.
    • The time points of those postures were then used to segment and classify the sketch gestures (pipeline sketch below).
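
A hypothetical end-to-end version of this test pipeline, reusing the segmenter and matcher sketched earlier. It assumes each glove frame comes with a projected 2-D hand position; the slides do not describe the actual position source.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical test pipeline: extract writing segments from the posture
// stream, turn each segment's hand positions into a stroke, and classify
// the stroke against the $1-style templates.
public class AirWritingPipeline {
    public static List<String> recognize(List<float[]> sensorFrames,
                                         List<double[]> handPositions,   // one (x, y) per frame
                                         GestureSegmenter.PostureClassifier nn,
                                         List<double[][]> templates,
                                         List<String> templateNames) {
        List<String> results = new ArrayList<>();
        for (GestureSegmenter.Segment seg : GestureSegmenter.segment(sensorFrames, nn)) {
            double[][] stroke = new double[seg.end - seg.start + 1][];
            for (int i = seg.start; i <= seg.end; i++) {
                stroke[i - seg.start] = handPositions.get(i);
            }
            results.add(DollarLikeMatcher.classify(stroke, templates, templateNames));
        }
        return results;
    }
}
```
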
  • 17. Target: Circle -> Triangle; Actual: Rectangle -> Triangle
  • 18. Target: Rectangle -> Circle; Actual: Rectangle -> Rectangle
  • 19. Target: Triangle -> X; Actual: X -> X
  • 20. Target: X -> Rectangle; Actual: Rectangle -> X
  • 21. Conclusion
    • The $1 recognizer performed poorly here: every test pair contained at least one misclassified gesture.