GestEarrings: Developing Gesture-Based Input Techniques for Earrings
2025 IEEE/SICE International Symposium on System Integration
Takehisa Furuuchi 1), Takumi Yamamoto 1), Takashi Amesaka 1), Katsutoshi Masai 2), Yuta Sugiura 1)
1) Keio University, 2) Kyushu University
Background: Demand for unobtrusive wearable devices
• Integration of wearable devices into daily life
  • Hearables
  • Smart rings
• Demand for devices with a discreet appearance and discreet input actions
  • Screenless input devices
[Figures: smart ring, hearables]
Background / Gesture Exploration / Gesture Identification
Objective: Developing input interfaces using earrings
• High social acceptability of earrings
  • Widely accepted as a fashion accessory
  • Natural to wear in various situations (e.g., conversations, meetings, parties)
• Three types of earrings
  • Gesture sets and sensing for each shape
  • Design variations considered as a fashion item
[Figures: three types of earrings, earring devices with gesture examples, and application scenarios]
Related Work: Input Interfaces with Accessories
• Transforming existing items into interfaces
  • The appearance of wearing the device is socially accepted
  • Hat: interaction between the hat and the wearer [1]
  • Hair extension: inputting information by touching the extensions [2]
  • Mask: inputting information by moving the mask straps [3]
• Positioning of this study
  • Gesture input by touching the earrings
  • Discreet input operation
  • Eyes-free input operation
  • Diverse designs
[1] Christine Dierk, Scott Carter, Patrick Chiu, Tony Dunnigan, and Don Kimber. Use Your Head! Exploring Interaction Modalities for Hat Technologies. DIS '19: Proceedings of the 2019 on Designing Interactive Systems Conference.
[2] Katia Vega, Marcio Cunha, and Hugo Fuks. Hairware: Conductive Hair Extensions as a Capacitive Touch Input Device. IUI '15 Companion: Companion Proceedings of the 20th International Conference on Intelligent User Interfaces.
[3] Takumi Yamamoto, Katsutoshi Masai, Anusha Withana, and Yuta Sugiura. Masktrap: Designing and Identifying Gestures to Transform Mask Strap into an Input Interface. IUI '23: Proceedings of the 28th International Conference on Intelligent User Interfaces.
Related Work: Earring Device
• Facial expression recognition using infrared distance sensors [4]
  • Limitation: conspicuous facial gesture input
• Exploring the gesture design space of jewelry [5]
  • Limitations: gestures were designed by researchers, not users; not implemented
[4] Kyosuke Futami, Kohei Oyama, and Kazuya Murao. Augmenting Ear Accessories for Facial Gesture Input Using Infrared Distance Sensor Array. Electronics 2022, 11(9), 1480; https://doi.org/10.3390/electronics11091480.
[5] Jatin Arora, Kartik Mathur, Aryan Saini, and Aman Parnami. Gehna: Exploring the Design Space of Jewelry as an Input Modality. CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems.
Flow of Our Study
1. Gesture Exploration
  • Investigation of gestures preferred by users
2. Gesture Identification
  • Implementation of hardware based on (1)
  • Implementation of gesture identification software
  • Performance evaluation through user experiments
[Figure: flow of our study — in the gesture exploration study, the experimenter presents a task (e.g., scroll right) and the participant devises a gesture for it]
Overview of Gesture Exploration Study
• Objective
  • Survey of gestures defined by users
• Gestures devised by the experiment participants
  • Three types of earrings: hanging, surface, and hoop
• Final gesture set
  • Decided by majority vote
[Note] Approved by the Keio University Research Ethics Committee
[Figures: target actions, the three types of earrings used in the study, and participant information]
Gesture Set: Hanging Earring
• The largest number of gesture groups, owing to the earring's complex mechanism
• Gesture groups: Tap, Clasp, Flip, Orb Glide, Twist
[Figure: Gesture Set for Hanging Earring (N=11) — gestures include Tap, Pinch, Flip Forward/Backward, Orb Glide Up/Down/Left/Right, and Twist Clockwise/Counterclockwise]
Gesture Set: Surface Earring
• The gestures are limited because the earring is fixed in place
• Gesture groups: Tap, Swipe
[Figure: Gesture Set for Surface Earring (N=8) — Tap, Double Tap, Long Tap, Swipe Up/Down/Left/Right, and Swipe Counterclockwise]
Gesture Set: Hoop Earring
• Few gesture groups, but the largest variety of gestures
• Both gesture groups have directionality
• Gesture groups: Swipe, Pinch and Glide
[Figure: Gesture Set for Hoop Earring (N=12) — Tap, Pinch, Pinch In, Pinch Out, Pull Down, Swipe Up/Down, and Pinch&Glide Forward/Backward/Left/Right/Around the Ear]
Gesture Set List
• Characteristics of the decided gestures
  • Operations using the fingertip
  • Intuitive operations involving direction
  • Simple operations with minimal motion and operation time
[Figure: gesture sets (a: hanging earring, b: surface earring, c: hoop earring)]
Implementation: Hardware
• Device design capable of identifying the defined gestures
• Hanging earring — Sensor: 9-axis sensor; Data: acceleration, angular velocity, and geomagnetism (9 dimensions); a device with the 9-axis sensor fixed to the earring
• Surface earring — Sensor: photo-reflective sensor; Data: distance between the surface and the finger (4 dimensions); a device with the sensor embedded in a 3D-printed earring
• Hoop earring — Sensor: capacitive touch sensor; Data: discrete contact position (5 dimensions); a device with the capacitive sensor attached to the hoop
Implementation: Software (Data Collection)
• Data acquisition through an Arduino
• Record data for a fixed number of frames
  • Hanging earring: 20 frames (about 1.1 seconds)
  • Surface earring: 150 frames (about 2.1 seconds)
  • Hoop earring: 20 frames (about 2.1 seconds)
• Start timing for gesture data recording
  • Surface and hoop earrings: threshold on the difference between consecutive frames
  • Hanging earring: automatic start
[Figure: example of time-series sensor data (surface earring, Swipe Up) — sensor value vs. frame count]
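The threshold-based recording start described for the surface and hoop earrings can be sketched as follows. This is a hypothetical illustration, not the authors' implementation; the threshold value and function name are assumptions.

```python
# Hypothetical sketch (not the authors' code): recording starts when the
# change between consecutive sensor frames exceeds a threshold, as
# described for the surface and hoop earrings. Threshold value assumed.
START_THRESHOLD = 30  # assumed value, in raw sensor units

def detect_gesture_start(frames, threshold=START_THRESHOLD):
    """Return the index of the first frame whose change from the previous
    frame exceeds `threshold` on any channel, or None if no frame does."""
    for i in range(1, len(frames)):
        if max(abs(a - b) for a, b in zip(frames[i], frames[i - 1])) > threshold:
            return i
    return None

# Example: 4-channel photo-reflective readings (surface earring)
stream = [[500, 500, 500, 500]] * 5 + [[560, 505, 500, 500]] * 4
print(detect_gesture_start(stream))  # -> 5
```

Once a start frame is detected, the fixed-length window (e.g., 150 frames for the surface earring) would be recorded from that point.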
Implementation: Software (Gesture Identification)
• Machine learning
• Features
  • 5 representative values (average, variance, median, maximum, minimum) × 5 time blocks × sensor dimensions
• Algorithms
  • Random Forest
  • SVC
  • XGBoost
[Figure: feature extraction flow — each sensor's time series is split into 5 blocks, and the 5 representative values are extracted per block]
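The described feature extraction (5 representative values × 5 blocks × sensor dimensions) can be sketched in a few lines of NumPy. The function name and exact block-splitting scheme are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Hypothetical sketch (not the authors' code) of the described feature
# extraction: each sensor's time series is split into 5 blocks, and
# 5 representative values (average, variance, median, maximum, minimum)
# are computed per block, per sensor dimension.

def extract_features(data, n_blocks=5):
    """data: (n_frames, n_sensors) array -> feature vector of length
    n_blocks * 5 * n_sensors."""
    feats = []
    for block in np.array_split(data, n_blocks, axis=0):  # split along time
        feats.extend([
            block.mean(axis=0),
            block.var(axis=0),
            np.median(block, axis=0),
            block.max(axis=0),
            block.min(axis=0),
        ])
    return np.concatenate(feats)

# Example: 150 frames of 4-dimensional data (surface earring)
x = np.random.default_rng(0).normal(size=(150, 4))
print(extract_features(x).shape)  # (100,) = 5 blocks x 5 values x 4 sensors
```

The resulting fixed-length vectors would then be fed to any of the listed classifiers (Random Forest, SVC, XGBoost).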
Overview of Gesture Identification Experiment
• Gesture data collection by users
  • 20 repetitions of each gesture
  • Device worn on the right ear
• User-defined gesture sets
  • Hanging earring: 11 gestures
  • Surface earring: 8 gestures
  • Hoop earring: 12 gestures
• Evaluation
  • Within-individual: 10-fold cross-validation for each participant
  • Between-individual: 10-fold cross-validation, with one user's data as the test set in each fold
[Figures: appearance when wearing each earring, and participant information]
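The two evaluation schemes can be illustrated with a minimal sketch of the index splitting, assuming illustrative sample and user counts; this is not the authors' evaluation code.

```python
import numpy as np

# Minimal sketch of the two evaluation schemes (sample counts assumed):
# within-individual = 10-fold CV over one participant's samples;
# between-individual = one user's data held out as the test set per fold.

def kfold_indices(n_samples, k=10):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation."""
    folds = np.array_split(np.arange(n_samples), k)
    for i, test_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train_idx, test_idx

def leave_one_user_out(user_ids):
    """Yield (train_idx, test_idx) pairs, holding out one user per fold."""
    user_ids = np.asarray(user_ids)
    for u in np.unique(user_ids):
        yield np.where(user_ids != u)[0], np.where(user_ids == u)[0]

# Within-individual: e.g. 8 gestures x 20 repetitions = 160 samples per user
splits = list(kfold_indices(160, k=10))
print(len(splits), len(splits[0][1]))  # 10 folds, 16 test samples each

# Between-individual: e.g. 3 users with 160 samples each (counts assumed)
ids = np.repeat([0, 1, 2], 160)
print(len(list(leave_one_user_out(ids))))  # 3 folds
```

In practice the equivalent splitters from scikit-learn (`KFold`, `LeaveOneGroupOut`) would do the same job.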
Within-individual Model: Hanging Earring
• Accuracy: avg. 83.6% (std. 6.3%)
• Results
  • False detections between specific gestures
    • Flip Forward / Flip Backward
    • Orb Glide Left / Right / Up
• Insights
  • The direction of a movement cannot be identified with high accuracy
  • The type of movement can be identified
[Figure: confusion matrix of the within-individual model for the hanging earring]
Within-individual Model: Surface Earring
• Accuracy: avg. 96.6% (std. 2.0%)
• Result
  • All gestures have similar accuracy
• Insight
  • Gestures that involve touching the surface have high reproducibility
[Figure: confusion matrix of the within-individual model for the surface earring]
Within-individual Model: Hoop Earring
• Accuracy: avg. 87.0% (std. 6.2%)
• Result
  • Gestures that involve moving the earring itself have low accuracy
• Insight
  • It is difficult to capture the movement of the earring through contact with the finger or cheek
• Solution
  • Add sensors
[Figure: confusion matrix of the within-individual model for the hoop earring]
Between-individual Model
• Accuracy
  • Hanging earring: 73.6% (within-individual: 83.6%)
  • Surface earring: 84.4% (within-individual: 96.6%)
  • Hoop earring: 69.3% (within-individual: 87.0%)
[Figures: confusion matrices of the within-individual and between-individual models]
Future Work
• Gesture detection
  • Hanging earring: distinguishing gestures from vibrations caused by walking and other everyday activities
• Improved appearance
  • Wireless implementation
  • Built-in sensor
[Figure: current earring device]
Summary
Background: demand for devices with a discreet appearance and inconspicuous input actions
Related work: input interfaces with accessories; gesture elicitation studies
Proposed method: exploration and implementation of gestures using three different earring shapes
Evaluation: user study (within-individual and between-individual models), 10-fold cross-validation
Results:
• Hanging earring: 83.6% (within-individual), 73.6% (between-individual)
• Surface earring: 96.6% (within-individual), 84.4% (between-individual)
• Hoop earring: 87.0% (within-individual), 69.3% (between-individual)
Future work: gesture detection, improved appearance
GestEarrings: Developing Gesture-Based Input Techniques for Earrings
Takehisa Furuuchi, Takumi Yamamoto, Takashi Amesaka, Katsutoshi Masai, Yuta Sugiura
Appendix
3D Modeling of a Surface Earring
[Figure: 3D model of the surface earring]
Related Work: Gesture Elicitation Study
• Gesture elicitation study
  • Construction of a user-defined gesture set
• Flow
  1. The experimenter presents a task (e.g., place a call)
  2. The participants design gestures for the task
  3. The experimenter categorizes the gestures
  4. The final gesture set is determined
• Examples: nose-based interaction [6], facial gestures for smart glasses [7], ear-based interaction [8]
[6] J.-L. Perez-Medina, S. Villarreal, and J. Vanderdonckt, "A gesture elicitation study of nose-based gestures," Sensors, vol. 20, no. 24, 2020. [Online]. Available: https://www.mdpi.com/1424-8220/20/24/7118
[7] K. Masai, K. Kunze, D. Sakamoto, Y. Sugiura, and M. Sugimoto, "Face commands - user-defined facial gestures for smart glasses," in 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2020, pp. 374–386.
[8] Y.-C. Chen, C.-Y. Liao, S.-W. Hsu, D.-Y. Huang, and B.-Y. Chen, "Exploring user defined gestures for ear-based interactions," Proc. ACM Hum.-Comput. Interact., vol. 4, no. ISS, Nov. 2020. [Online]. Available: https://doi.org/10.1145/3427314
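The majority-vote step that closes the elicitation flow above can be illustrated with a small, hypothetical example; task names and gesture proposals here are invented, not data from the study.

```python
from collections import Counter

# Hypothetical illustration of the final, majority-vote step of the
# elicitation flow. Task names and proposals are invented, not study data.
proposals = {
    "scroll right": ["swipe right", "swipe right", "tap", "swipe right"],
    "place a call": ["double tap", "tap", "double tap"],
}

# For each task, keep the gesture proposed by the most participants
gesture_set = {task: Counter(gestures).most_common(1)[0][0]
               for task, gestures in proposals.items()}
print(gesture_set)  # {'scroll right': 'swipe right', 'place a call': 'double tap'}
```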
