Koki Yamashita, Takashi Kikuchi, Katsutoshi Masai, Maki Sugimoto, Bruce H. Thomas, Yuta Sugiura
In Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology (VRST '17), ACM, Article 19, 8 pages, November 8-10, 2017, Gothenburg, Sweden.
CheekInput: Turning Your Cheek into an Input Surface by Embedded Optical Sensors on a Head-mounted Display (VRST 2017)
1. CheekInput: Turning Your Cheek into an Input Surface by Embedded Optical Sensors on a Head-mounted Display
Koki Yamashita, Takashi Kikuchi, Katsutoshi Masai,
Maki Sugimoto, Bruce H. Thomas, Yuta Sugiura
2. Optical See-through Head-mounted Displays (OST-HMDs)
9th Nov 2017 ACM VRST2017 CheekInput
• OST-HMDs allow us to interact with augmented reality environments in our daily lives.
[Figure: Google Glass (direct touch gesture), Microsoft HoloLens (aerial gesture), EPSON MOVERIO BT-300 (mobile device)]
3. Existing Input Methods for OST-HMDs
• Input methods for interacting with HMD systems have become important, and various methods have been proposed.
• Direct touch gesture (Google Glass): image projection is disturbed when the frame is touched directly.
• Aerial gesture (Microsoft HoloLens): a certain amount of space is necessary to recognize gestures.
• Mobile device (EPSON MOVERIO BT-300): external devices must be carried for inputting information.
→ A new type of input method is required.
5. Principle
• The distance from the HMD to the cheek changes when the skin is deformed.
• Cheek deformation is measured with photo-reflective sensors (an infrared LED paired with a phototransistor).
[Figure: a photo-reflective sensor (infrared LED + phototransistor) mounted on the OST-HMD, facing the cheek]
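Since each photo-reflective sensor reports a raw reflection intensity that depends on skin tone and mounting, the distance change is usually read off after a per-sensor calibration. The sketch below assumes min-max references captured at rest and at full deformation; the slides do not describe the prototype's actual conditioning, so both reference vectors are hypothetical:

```python
# Per-sensor min-max normalization of photo-reflective sensor readings.
# `rest` and `pressed` are reference readings captured with the cheek at
# rest and fully deformed; both are hypothetical calibration inputs, as
# the slides do not specify how the prototype conditions its raw values.

def normalize_frame(raw, rest, pressed):
    """Map each raw reading to [0, 1]: 0 = resting cheek, 1 = fully deformed."""
    out = []
    for r, lo, hi in zip(raw, rest, pressed):
        if hi == lo:                 # dead or saturated sensor: report no deformation
            out.append(0.0)
        else:
            v = (r - lo) / (hi - lo)
            out.append(min(1.0, max(0.0, v)))  # clamp noise outside the range
    return out
```

Clamping keeps occasional out-of-range readings (e.g. ambient-light spikes) from leaking into the classifier as impossible deformation values.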
6. Hardware
• A device with twenty photo-reflective sensors mounted on an HMD.
[Figure: the prototype, with photo-reflective sensors mounted around the HMD]
7. Related Work
Skin as input surface:
• SenSkin [Ogata 2013]
• Understanding palm-based imaginary interfaces [Gustafson 2013]
• Touch interface on the back of the hand [Nakatsuma 2011]
Detecting cheek movement:
• AffectiveWear [Masai 2015]
• Tongue-in-Cheek [Goel 2015]
• Chewing Jockey [Koizumi 2011]
8. System Configuration
• Sensor values are transmitted wirelessly (XBee) through a microcontroller.
• A central computer recognizes the gesture and outputs the result on the HMD.
[Figure: photo-reflective sensors on the OST-HMD → microcontroller → XBee → PC (Processing: SVM, $1 Gesture) → WiFi → Android HMD]
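On the receiving side, one robust way to handle the wireless sensor stream is to frame the twenty readings and drop malformed packets before classification. The comma-separated line format below is purely a hypothetical illustration; the slides do not document the prototype's actual wire protocol:

```python
# Parse one frame of 20 photo-reflective sensor readings from the radio link.
# The CSV framing ("v0,v1,...,v19\n") is a hypothetical format chosen for
# illustration; partial or corrupted frames are dropped rather than guessed at.

def parse_frame(line):
    """Return a list of 20 ints, or None if the frame is malformed."""
    parts = line.strip().split(",")
    if len(parts) != 20:
        return None  # partial frame: wrong field count
    try:
        return [int(p) for p in parts]
    except ValueError:
        return None  # corrupted frame: non-numeric field
```

Dropping bad frames (instead of zero-filling them) keeps transmission glitches from being misread as cheek deformation.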
9. Gesture Classification
• A Support Vector Machine (SVM) is adopted for training the gesture classifier.
Training phase: touch cheek → obtain sensor values → learn touch gesture → create SVM classifier
Recognition phase: touch cheek → obtain sensor values → classify gesture → output result
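The two phases can be sketched as a train/recognize pipeline over 20-value sensor frames. The slides use an SVM; to keep this sketch dependency-free, a nearest-centroid classifier stands in for it (swapping in a real SVM, e.g. scikit-learn's `SVC`, changes only the internals of the two functions, not the phase structure):

```python
# Minimal two-phase pipeline over sensor frames. A nearest-centroid
# classifier stands in for the SVM used in the actual system, so the
# sketch runs without third-party dependencies.

def train(samples):
    """Training phase. samples: dict label -> list of equal-length frames.

    Returns a model mapping each gesture label to its mean frame (centroid).
    """
    model = {}
    for label, frames in samples.items():
        n = len(frames)
        model[label] = [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]
    return model

def classify(model, frame):
    """Recognition phase: return the label of the nearest centroid."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(frame, c))
    return min(model, key=lambda label: dist2(model[label]))
```

Usage mirrors the slide: collect labeled frames while the user touches the cheek, call `train` once, then call `classify` on each incoming frame.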
10. Recognition of Directional Gestures
• The collected sensor data enables us to recognize the direction in which the user pulled the cheek.
[Figure: training the SVM with the direction dataset; recognition of directional gestures]
11. Recognition of Double-side Gestures
• Directional gestures are recognized for both the right and left cheek at the same time.
• Double-side gestures consist of 4 directions × 4 directions = 16 gestures.
• Two ways to input double-side gestures: with both hands or with a single hand.
[Figure: double-hand gestures (Down+Up, Up+Right) and single-hand gestures (Right+Right, Down+Left)]
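The 4 × 4 = 16 count comes from pairing one direction per cheek. A tiny sketch of that combination (the `left+right` label encoding is hypothetical, chosen only to make the pairing explicit):

```python
# Combine per-cheek directional labels into one double-side gesture label.
DIRECTIONS = ("up", "down", "left", "right")

def double_side_gesture(left_cheek, right_cheek):
    """Map a (left, right) direction pair to one of 4 x 4 = 16 gestures.

    The "left+right" string encoding is a hypothetical convention; the
    slides only state that the pairings yield 16 distinct gestures.
    """
    if left_cheek not in DIRECTIONS or right_cheek not in DIRECTIONS:
        raise ValueError("unknown direction")
    return f"{left_cheek}+{right_cheek}"
```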
12. Recognition of Symbolic Gestures
• From the directional input, a stroke is created by plotting 2D points.
• The $1 Unistroke Recognizer is used for gesture recognition.
[Figure: the four symbolic gestures: line, v, caret, stairs]
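The $1 Unistroke Recognizer [Wobbrock et al.] matches a candidate stroke against stored templates after resampling and normalization. Below is a simplified sketch of that pipeline; the full algorithm additionally searches over rotations (omitted here), and the example templates in the test are illustrative, not the ones used in the prototype:

```python
import math

# Simplified $1 Unistroke pipeline: resample each stroke to N equidistant
# points, scale it to a unit bounding box, translate its centroid to the
# origin, then match by average point-to-point distance. The rotation
# search of the full $1 recognizer is omitted for brevity.

N = 32  # number of resampled points

def path_length(pts):
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def resample(pts, n=N):
    """Resample a stroke to n points spaced equally along its path."""
    step = path_length(pts) / (n - 1)
    pts = list(pts)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= step:
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:       # guard against float round-off at the tail
        out.append(pts[-1])
    return out[:n]

def normalize(pts):
    """Scale to a unit bounding box, then move the centroid to the origin."""
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    pts = [(x / w, y / h) for x, y in pts]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    return [(x - cx, y - cy) for x, y in pts]

def recognize(stroke, templates):
    """templates: dict name -> raw stroke. Returns the best-matching name."""
    cand = normalize(resample(stroke))
    def score(name):
        t = normalize(resample(templates[name]))
        return sum(math.dist(a, b) for a, b in zip(cand, t)) / N
    return min(templates, key=score)
```

Resampling first makes the match invariant to input speed, which matters here because cheek strokes are built from a variable-rate stream of directional inputs.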
14. Evaluation
• Three user studies to evaluate recognition accuracy: single-side gestures, double-side gestures, and symbolic gestures.
[Figure: examples of the three gesture types]
15. Evaluation – Single-side Gestures
• We conducted a user study to investigate the recognition accuracy of single-side gestures.
• Accuracy of 5 basic gestures under 3 conditions: sitting, walking, and re-wearing the device.
Gestures: Up, Down, Right, Left, Neutral
Participants: 8 (7 male, 1 female)
Sampling rate: 30 fps
Total samples: 5 directions × 100 samples × 5 trials
Evaluation: 5-fold cross-validation
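The 5-fold cross-validation holds out one fifth of the samples per round and averages the per-fold accuracy. A dependency-free sketch, where `train_fn` and `classify_fn` stand for whatever classifier pipeline is plugged in (the fold assignment by index stride is an arbitrary choice for illustration):

```python
# 5-fold cross-validation: hold out each fold in turn, train on the rest,
# and report the mean held-out accuracy. The classifier is injected via
# train_fn / classify_fn so the evaluation logic stays generic.

def k_fold_accuracy(samples, labels, train_fn, classify_fn, k=5):
    folds = [list(range(i, len(samples), k)) for i in range(k)]
    accs = []
    for held in folds:
        held_set = set(held)
        train_x = [s for i, s in enumerate(samples) if i not in held_set]
        train_y = [l for i, l in enumerate(labels) if i not in held_set]
        model = train_fn(train_x, train_y)
        hits = sum(classify_fn(model, samples[i]) == labels[i] for i in held)
        accs.append(hits / len(held))
    return sum(accs) / k
```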
16. Accuracy (Single-side Gestures)
• The recognition accuracy was 89.9% (sitting), 82.6% (walking) and 78.0% (re-wearing).
[Figure: recognition accuracy per condition; confusion matrix over Neutral, Up, Left, Right, Down]
17. Discussion (Single-side Gestures)
• The accuracy was lower when walking (82.6%) than when sitting (89.9%).
→ Vibration caused by body movement slightly reduced the accuracy.
[Figure: recognition accuracy per condition; confusion matrix over Neutral, Up, Left, Right, Down]
18. Discussion (Single-side Gestures)
• When re-wearing the device, the accuracy was 78.0%.
→ Each time the device is re-worn, the mounted position of the OST-HMD is close to, but not exactly, the previous position.
[Figure: recognition accuracy per condition; confusion matrix over Neutral, Up, Left, Right, Down]
19. Evaluation – Symbolic Gestures
• We conducted a user study to investigate the recognition accuracy of symbolic gestures.
• Accuracy of 4 gestures under 2 conditions: with visual aid and eyes-free.
Gestures: 4 (line, v, caret, stairs)
Participants: 3 (male)
Sampling rate: 30 fps
20. Accuracy (Symbolic Gestures)
• The recognition accuracy was 91.7% (with visual aid) and 93.4% (eyes-free).
• Many participants tried to improve their gesture input by going back and forth with visual aids.
[Figure: recognition accuracy per condition; confusion matrix over line, v, caret, stairs]
21. Evaluation – Double-side Gestures
• We conducted a user study to investigate the recognition accuracy of double-side gestures.
• Accuracy of 17 gestures under 3 conditions: single hand (dominant / non-dominant) and double hands.
Gestures: 17 (4 directions × 4 directions + Neutral)
Participants: 8 (7 male, 1 female)
Sampling rate: 30 fps
Total samples: 17 gestures × 100 samples × 5 trials
Evaluation: 5-fold cross-validation
22. Accuracy (Double-side Gestures, Single Hand)
• The recognition accuracy was 74.3% (dominant hand) and 74.8% (non-dominant hand).
• There was no significant difference between the dominant and non-dominant hand → ease of input depends little on handedness.
[Figure: confusion matrices of recognition accuracy for the dominant and non-dominant hand]
23. Accuracy (Double-side Gestures, Double Hands)
• The average recognition accuracy over all participants was 80.5%.
→ The accuracy was higher when using double hands (80.5%) than a single hand (74.3–74.8%).
[Figure: confusion matrix]
24. Applications
• Photo viewing application
• Music application
• Map application
• Character input
25. Limitations & Future Work
• Do not touch the cheek too strongly; otherwise the skin may be damaged.
• The sensors are affected by ambient light such as sunlight.
• Makeup may come off when the cheek is touched.
• User movement (e.g. running and jumping) increases false positives → combining other sensors to recognize the user's state.
26. Conclusion
Koki Yamashita, Takashi Kikuchi, Katsutoshi Masai, Maki Sugimoto, Bruce H. Thomas, Yuta Sugiura
Keio University, University of South Australia
• Motivation & basic idea: input for OST-HMDs; the cheek as an input interface.
• Implementation: system configuration; photo-reflective sensors measure skin deformation; directional and symbolic gestures are recognized.
• Evaluation: 83.5% accuracy for directional gestures; 92.6% for symbolic gestures; 74.6% average for double-side gestures with a single hand; 80.45% for double-hand gestures.