Skinput by AVINASH AMBATI, VNR Vignana Jyothi Institute of Engg & Tech
Skinput is a technology that uses the skin as a touch-screen input surface.
Anyone who wants this ppt can ask me: avinashambati.avi@gmail.com, or message 8818869222.


  • One option is to opportunistically appropriate surface area from the environment for interactive purposes, such as tables and walls. However, one surface has been previously overlooked as an input canvas, and it happens to always travel with us: our skin.
  • Variations in bone density, size, and mass, as well as filtering effects from soft tissue and joints, mean different locations are acoustically distinct. The software that was developed listens for impacts and classifies them, and different interactive capabilities can be bound to different locations. Here the user uses his fingers as control pads to play the game of Tetris.

Skinput by AVINASH AMBATI, VNR Vignana Jyothi Institute of Engg & Tech Presentation Transcript

  • 1. Appropriating the body as the input surface Presented by : Avinash Ambati ECE B.Tech IV year
  • 2. Introduction
    • Advances in Electronics
    • Mobile devices are becoming ever smaller
    • Limited user-interaction space
    • Need for a large user-interaction space without losing the primary benefit of small size
    • Alternative approaches that enhance interaction with small mobile systems
  • 3. Proprioception and easy accessibility
  • 4. BASIC PRINCIPLE Skinput  is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. 
    • When augmented with a Pico-projector, the device can provide a direct manipulation, graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris, at Microsoft Research's Computational User Experiences Group. 
  • 5. EEG and fNIR: Brain-sensing technologies such as electroencephalography (EEG) and functional near-infrared spectroscopy (fNIR) have been used by HCI researchers to assess cognitive and emotional state; this work primarily looked at involuntary signals. BCI: Direct brain-computer interfaces (BCIs), generally used by paralyzed users, still lack the bandwidth required for everyday computing tasks, and require levels of focus, training, and concentration that are incompatible with typical computer interaction. EMG: Electrical signals generated by muscle activation during normal hand movement can be sensed through electromyography.
  • 6.
    • 1. Transverse wave propagation: finger impacts displace the skin, creating transverse waves (ripples). The sensor is activated as the wave passes underneath it.
    • 2. Longitudinal wave propagation: finger impacts create longitudinal (compressive) waves that cause internal skeletal structures to vibrate. This, in turn, creates longitudinal waves that emanate outward from the bone (along its entire length) toward the skin.
  • 7. Variations in bone density, size, and mass, together with filtering by soft tissue and joints, make different locations acoustically distinct.
  • 8. Playing Tetris: using the fingers as a control pad, connected to any mobile device, such as an iPod.
  • 9. Projection of a Dynamic Graphical Interface Finger Inputs are Classified & processed in Real Time
  • 10. A wearable, bio-acoustic sensing array built into an armband. Sensing elements detect vibrations transmitted through the body. The two sensor packages shown above each contain five, specially weighted, cantilevered piezo films, responsive to a particular frequency range.
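Each sensor package described above responds to a distinct frequency band. A rough software analogue of one five-element package is a band-pass filter bank applied to a raw acoustic channel. This is a sketch, not the armband's actual design: the band edges in `BANDS_HZ` and the sampling rate are illustrative assumptions, whereas the real films are tuned mechanically by their weights.

```python
import numpy as np

# Assumed band edges in Hz, one per piezo film; not the paper's actual tuning.
BANDS_HZ = [(25, 50), (50, 100), (100, 200), (200, 400), (400, 600)]

def filter_bank(signal, fs=5500):
    """Split one raw acoustic channel into five band-limited channels,
    mimicking a package of five frequency-tuned piezo films.
    Uses simple FFT masking for clarity rather than a real-time filter."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    channels = []
    for lo, hi in BANDS_HZ:
        mask = (freqs >= lo) & (freqs < hi)
        channels.append(np.fft.irfft(spectrum * mask, n=len(signal)))
    return np.stack(channels)  # shape: (5, len(signal))
```

A tap whose energy sits around 150 Hz would then show up mainly on the third channel, which is how differently tuned elements separate tap locations by their spectral signatures.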
  • 11. Two sensor packages :
  • 12. Ten channels of acoustic data generated by three finger taps on the forearm, followed by three taps on the wrist. The exponential average of the channels is shown in red. Segmented input windows are highlighted in green. Note how different sensing elements are actuated by the two locations.
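The segmentation the slide describes (red exponential average, green input windows) can be sketched as a threshold detector over the smoothed channel intensity. The smoothing constant `alpha` and `threshold` here are illustrative values, not parameters from the paper.

```python
import numpy as np

def segment_taps(channels, alpha=0.02, threshold=0.1):
    """Find tap windows in a (num_channels, num_samples) acoustic array.
    Computes an exponential moving average of the combined channel
    intensity (the red trace) and returns (start, end) sample spans
    where it exceeds the threshold (the green windows)."""
    intensity = np.abs(channels).mean(axis=0)  # combine all sensors
    ema = np.empty_like(intensity)
    acc = 0.0
    for i, x in enumerate(intensity):
        acc = alpha * x + (1 - alpha) * acc    # exponential average
        ema[i] = acc
    active = ema > threshold
    windows, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        if not a and start is not None:
            windows.append((start, i))
            start = None
    if start is not None:
        windows.append((start, len(active)))
    return windows
```

On ten channels containing two synthetic bursts, this returns two windows, one per tap, each beginning shortly after its burst starts.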
  • 13. The designed software listens for impacts and classifies them. Different interactive capabilities are then bound to different regions.
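A minimal sketch of the "classify, then bind to a location" step: extract a per-channel energy feature from each segmented tap window and match it against labeled examples. The nearest-centroid rule and log-energy features here are deliberate simplifications; the actual system trained a richer classifier on many per-channel acoustic features.

```python
import numpy as np

def tap_features(window):
    """Per-channel log-energy of one tap window, (channels, samples) -> vector."""
    return np.log1p((window ** 2).sum(axis=1))

class TapClassifier:
    """Nearest-centroid classifier mapping tap features to body locations.
    Stands in for the trained classifier the real system used."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = np.stack(
            [np.mean([x for x, lab in zip(X, y) if lab == label], axis=0)
             for label in self.labels_])
        return self

    def predict(self, x):
        # Pick the location whose training centroid is closest in feature space.
        dists = np.linalg.norm(self.centroids_ - x, axis=1)
        return self.labels_[int(dists.argmin())]
```

Once a tap is labeled with a location, dispatching is a lookup from location name to an interface action, which is what "binding capabilities to regions" amounts to.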
  • 14. Five fingers: classification accuracy remained high for the five-finger condition, averaging 87.7% (SD=10.0%, chance=20%) across participants. Whole arm: the below-elbow placement performed best, posting a 95.5% (SD=5.1%, chance=20%) average accuracy. This is not surprising, as this condition placed the sensors closer to the input targets than the other conditions. Moving the sensor above the elbow reduced accuracy to 88.3% (SD=7.8%, chance=20%). The eyes-free input condition yielded lower accuracies than the other conditions, averaging 85.0% (SD=9.4%, chance=20%), a 10.5% drop from its vision-assisted counterpart.
  • 15. Higher accuracies can be achieved by collapsing the ten input locations into groups. Groups A-E and G were created using a design-centric strategy; F was created following analysis of per-location accuracy data. Forearm: classification accuracy for the ten-location forearm condition stood at 81.5% (SD=10.5%, chance=10%), a surprisingly strong result.
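The re-scoring behind "collapsing locations into groups" is simple: a prediction counts as correct if it lands anywhere in the true location's group. A sketch, with hypothetical group and location names for illustration:

```python
def grouped_accuracy(true_locs, pred_locs, groups):
    """Accuracy after collapsing fine-grained locations into coarser groups.
    groups: dict mapping a group name to the list of locations it contains."""
    group_of = {loc: g for g, locs in groups.items() for loc in locs}
    hits = sum(group_of[t] == group_of[p]
               for t, p in zip(true_locs, pred_locs))
    return hits / len(true_locs)
```

Confusions between acoustically similar neighbors (e.g. two adjacent fingers placed in the same group) stop counting as errors, which is why the grouped accuracies exceed the raw ten-location figure.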
  • 16. The testing phase took roughly three minutes to complete (four trials total: two participants, two conditions). The male walked at 2.3 mph and jogged at 4.3 mph; the female at 1.9 and 3.1 mph, respectively. In both walking trials, the system never produced a false-positive input, while true-positive accuracy was 100%. Classification accuracy was 100% for the male and 86% for the female. In the jogging trials, the system also showed 100% true-positive accuracy; classification accuracy was 100% for the male and 83.4% for the female.
  • 17. The prevalence of fatty tissue and the density/mass of bones tend to dampen or facilitate the transmission of acoustic energy through the body. The participants with the three highest BMIs (29.2, 29.6, and 31.9 — borderline obese to obese) produced the three lowest average accuracies.
  • 18. Skinput is a novel, wearable bio-acoustic sensing array, built into an armband, that detects and localizes finger taps on the forearm and hand. Results from the experiments show that the system performs well for a range of gestures, even when the body is in motion; further work aims to improve this.