SKINPUT: APPROPRIATING THE BODY AS AN INPUT SURFACE
P. PRANNOY CHAKRAVARTHI & K. B. S. MANIKANTA
BHIMAVARAM INSTITUTE OF ENGINEERING & TECHNOLOGY
EVOLUTION OF THE TOUCHSCREEN
Motion tracking is the principle used in touchscreens: the screen tracks the position and movement of the finger.
WHAT'S NEXT?
WHAT IS IT?
A novel input technique that allows the skin to be used as a finger input surface. To capture this acoustic information, they developed a wearable armband that is non-invasive and easily removable.
Image: http://walyou.com/touch-screen-interface-on-your-arm-with-skinput/
ENERGY THROUGH THE ARM
A finger tap transmits energy through the arm in two ways:
Transverse wave propagation (ripples along the skin's surface).
Longitudinal wave propagation (travels through the soft tissue and bone).
ARMBAND
Two arrays of five sensing elements (bone-conduction microphones).
Microphones placed near the Humerus, Radius, and Ulna.
Image: http://www.robaid.com/gadgets/skinput-.htm
HOW SKINPUT WORKS
Data was then sent from the client over a local socket to our primary application, written in Java. Key functions of the application are:
Live visualization of the sensor data.
Segmentation of the data stream into input instances.
Classification of input instances.
Image: http://www.bestnweb.com/skinput-touch.html
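The segmentation and classification steps above can be sketched in Java (the deck's stated language). This is a minimal illustration, not the actual Skinput application: the class name, the amplitude-threshold segmentation rule, and the toy nearest-centroid classifier are all assumptions made for the example.

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical sketch of the host-side pipeline: segment a sensor
 *  stream into input instances, then classify each instance. */
public class SkinputPipeline {

    /** A segment is a maximal run of samples whose absolute
     *  amplitude exceeds the trigger threshold. */
    public static List<double[]> segment(double[] stream, double threshold) {
        List<double[]> segments = new ArrayList<>();
        int start = -1;
        for (int i = 0; i < stream.length; i++) {
            boolean active = Math.abs(stream[i]) > threshold;
            if (active && start < 0) start = i;            // segment begins
            if (start >= 0 && (!active || i == stream.length - 1)) {
                int end = active ? i + 1 : i;              // segment ends
                double[] seg = new double[end - start];
                System.arraycopy(stream, start, seg, 0, seg.length);
                segments.add(seg);
                start = -1;
            }
        }
        return segments;
    }

    /** Toy classifier: label a segment by the nearest centroid of
     *  its mean absolute amplitude (stand-in for real features). */
    public static int classify(double[] seg, double[] centroids) {
        double mean = 0;
        for (double s : seg) mean += Math.abs(s);
        mean /= seg.length;
        int best = 0;
        for (int k = 1; k < centroids.length; k++)
            if (Math.abs(mean - centroids[k]) < Math.abs(mean - centroids[best]))
                best = k;
        return best;
    }
}
```

The real system extracts far richer acoustic features per segment; a single mean-amplitude feature is used here only to keep the sketch short.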
EXPERIMENT
Participants: 13 (7 female, 6 male).
Ages ranged from 20 to 56.
Body mass indexes (BMIs) ranged from 20.5 (normal) to 31.9 (obese).
Each participant was given a minute to memorize the input locations.
RESULTS: Five Fingers
When classification was incorrect, the system believed the input to be an adjacent finger 60.5% of the time.
The ring finger accounted for 63.3% of the misclassifications.
RESULTS: Whole Arm
The below-elbow placement put the sensors closer to the input targets than the other conditions.
The error rate roughly doubled or tripled when participants' eyes were closed.
RESULTS: Forearm
Classification accuracy for the ten-location forearm condition was 81.5%.
BMI EFFECTS
Higher BMI was correlated with decreased classification accuracy.
No direct relation to the participant's gender was found.
ADVANTAGES
The UI can appear much larger than an on-screen one.
Can be used without a visual screen.
Ideal for anyone with little or no eyesight.
CONCLUSION
Since we cannot simply make buttons and screens larger, Skinput offers an alternative approach.
Skinput devices are not available yet, but could be within the next few years.