A Technical Seminar Report On
TOUCHLESS TOUCHSCREEN
Submitted in partial fulfilment of the requirements Of Technical
Seminar for the award of the Degree of
BACHELOR OF TECHNOLOGY
In
Computer Science and Engineering
By
A.SWAPNA PRIYA
14A81A0561
3rd Year
CSE-B
SRI VASAVI ENGINEERING COLLEGE
Pedatadepalli, Tadepalligudem-534101, A.P.
2015-2016
Contents
Introduction
What's New?
System Requirements
Software Installation
Hardware Installation
Sensor Mounting
Configuration
Calibration
Calibration with Reflective Surfaces (Whiteboards, Glass, etc.)
Licensing
Troubleshooting
Analysis
Applications
Acoustics
Conclusion
ANALYSIS
The system obviously requires a sensor, but the sensor is neither hand-mounted nor present on the screen; it can be placed either on the table or near the screen. The hardware setup is compact enough to fit into a device as small as an MP3 player or a mobile phone, and it can recognize the position of an object from as far as 5 feet away.
WORKING:
The system is capable of detecting movements in three dimensions without the user ever having to put a finger on the screen. The patented touchless interface does not require any special sensors to be worn on the hand either: you simply point at the screen (from as far as 5 feet away) and manipulate objects in 3D.
Sensors are mounted around the screen that is being used; by interacting in the line of sight of these sensors, motion is detected and interpreted into on-screen movements. What stops unintentional gestures from being registered as input is not entirely clear, but the approach looks promising nonetheless.
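The report does not describe how a sensed fingertip position actually becomes cursor movement. A minimal sketch of the idea, assuming a fixed rectangular sensing area mapped linearly onto the display (the area dimensions and resolution here are hypothetical, not from the source):

```python
# Hypothetical sketch: mapping a sensed fingertip position to 2-D screen
# coordinates. The real Elliptic Labs mapping is not public; this only
# illustrates interpreting line-of-sight motion as on-screen movement.

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution (pixels)
SENSE_W, SENSE_H = 1.0, 0.6       # assumed sensing area in metres

def to_screen(x_m, y_m):
    """Map a fingertip position (metres, origin at the sensing area's
    lower-left corner) to pixel coordinates, clamped to the display."""
    px = max(0, min(SCREEN_W - 1, round(x_m / SENSE_W * (SCREEN_W - 1))))
    py = max(0, min(SCREEN_H - 1, round(y_m / SENSE_H * (SCREEN_H - 1))))
    return px, py
```

Clamping matters in practice: a hand that drifts past the edge of the sensing area should pin the cursor to the screen border rather than produce out-of-range coordinates.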
APPLICATIONS
TOUCHLESS MONITOR:
Sure, everybody is doing touchscreen interfaces these days, but this is the first time I've seen a monitor that can respond to gestures without the screen actually being touched. The monitor, based on technology from TouchKo, was recently demonstrated by White Electronic Designs and Tactyl Services at the CeBIT show. Designed for applications where touch may be difficult, such as for doctors who might be wearing surgical gloves, the display features capacitive sensors that can read movements from up to 15 cm away from the screen. Software then translates these gestures into screen commands.
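The source does not say how the sensor readings become commands. One plausible, purely illustrative translation step (the sensor layout and event format here are assumptions, not the TouchKo design) is to infer a swipe from the order in which a row of proximity sensors peaks:

```python
# Hypothetical sketch: inferring a swipe gesture from a row of capacitive
# proximity sensors. A hand passing over the row triggers the sensors in
# sequence; the direction of that sequence is the swipe direction.

def classify_swipe(activations):
    """activations: list of (timestamp, sensor_index) peak events in
    time order. Returns 'swipe_right', 'swipe_left' or None."""
    if len(activations) < 2:
        return None
    indices = [idx for _, idx in activations]
    if all(b > a for a, b in zip(indices, indices[1:])):
        return "swipe_right"   # activation moved left -> right
    if all(b < a for a, b in zip(indices, indices[1:])):
        return "swipe_left"    # activation moved right -> left
    return None                # ambiguous motion is ignored
```

Returning None for ambiguous sequences is one simple way to address the unintentional-gesture problem raised earlier: only clearly directional motion is accepted as input.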
Touchscreen interfaces are great, but all that touching can be a little bit of a drag. Enter the wonder kids from Elliptic Labs, who are hard at work on a touchless interface. The input method is, well, thin air: the technology detects motion in 3D and requires no worn sensors for operation. By simply pointing at the screen, users can manipulate the object being displayed in 3D. Details are light on how this actually functions.
Touch-less Gives a Glimpse of the GBUI:
We have seen the futuristic user interfaces of movies like Minority Report and The Matrix Revolutions, where people wave their hands in three dimensions and the computer understands what the user wants, shifting and sorting data with precision. Microsoft's X.D. Huang demonstrated how his company sees the future of the GUI at ITEXPO this past September, although at that show the example was in two dimensions.
Microsoft's vision of the UI:
Microsoft has demonstrated a GBUI reminiscent of Minority Report at its Redmond headquarters, and it involves lots of gestures that allow you to take applications and forward them to others with simple hand movements. The demos included the concept of software understanding business processes and helping you work: after reading a document, you could just push it off the side of your screen, and the system would know to post it on an intranet and also send a link to a specific group of people.
Touch-less UI:
The basic idea described in the patent is that sensors arrayed around the perimeter of the device would be capable of sensing finger movements in 3-D space. The user could use her fingers much as on a touch phone, but without actually having to touch the screen. That's cool, isn't it? I think the idea is great not only because user input will no longer be limited to 2-D, but because thick, dirty, or bandaged fingers can be used as well (as opposed to a plain touch UI). I'm a bit skeptical, though, about how accurate it can be and how intelligent the software will have to be. Finally, there is one more thing to mention: the built-in accelerometer.
For the first time, to our knowledge, our team used a touchless NUI system with a Leap
Motion controller during dental surgery. Previous reports have used a different motion sensor
(MS Kinect, Microsoft Corp., Redmond, USA) for touchless control of images in general
surgery,5,6 and for controlling virtual geographic maps7 among other uses.
A Kinect sensor works on a different principle from that of the Leap Motion. The MS Kinect
and Xtion PRO (ASUS Computer Inc., Taipei, Taiwan) are basically infrared depth-sensing
cameras based on the principle of structured light.3 Our team has been using the MS Kinect
sensor with an NUI system for the last two years as an experimental educational tool during
workshops of courses on digital dental photography.
It has been very useful for this purpose. It has also been tested during clinical situations at
dental offices but found to be inadequate for dental scenarios, mainly because the Kinect uses
a horizontal tracking approach and needs a minimal working distance (approximately 1.2 m);
in contrast, the Leap Motion tracks the user's hands from below.
The interaction zone of the MS Kinect (approximately 18 m³) is larger than that of the Leap Motion (approximately 0.23 m³). This means that when using the MS Kinect, the operating
room has to be considerably wider and the user has to walk out of the interaction zone in
order to stop interacting with the system. In the case of the Leap Motion, the surgeon just has
to move his hands out of the smaller interaction zone. The MS Kinect tracks the whole body
or the upper part of the body,6 which implies unnecessarily wider movements of the user's
arms; this could lead to fatigue during the procedure. On the other hand, the Leap Motion
tracks only the user's hands and fingers and has a higher spatial resolution and faster frame
rate, which leads to a more precise control for image manipulation.
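The "move your hands out of the zone to stop interacting" behaviour can be sketched as a simple gate on the tracked hand position. The box dimensions below are illustrative (chosen so the volume is roughly the 0.23 m³ cited for the Leap Motion), not the device's actual specification:

```python
# Hypothetical sketch: input is forwarded to the gesture handler only while
# the tracked hand is inside a small interaction box; moving the hand out
# of the box stops the interaction, as described for the Leap Motion.

ZONE = {"x": (-0.3, 0.3), "y": (0.05, 0.65), "z": (-0.3, 0.3)}  # metres

def in_zone(x, y, z):
    """True if the hand position lies inside the interaction box."""
    return (ZONE["x"][0] <= x <= ZONE["x"][1]
            and ZONE["y"][0] <= y <= ZONE["y"][1]
            and ZONE["z"][0] <= z <= ZONE["z"][1])

def handle_frame(x, y, z, dispatch):
    """Forward the hand position to dispatch only while inside the zone.
    Returns True if the frame was handled, False if it was ignored."""
    if in_zone(x, y, z):
        dispatch(x, y, z)
        return True
    return False
```

With the Kinect's ~18 m³ zone the equivalent gate would require the surgeon to physically leave the tracked area, which is exactly the ergonomic difference the text describes.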
The proposed system performed quite well and fulfilled the objective of providing access and
control of the system of images and surgical plan, without touching any device, thus allowing
the maintenance of sterile conditions. This motivated a perceived increase in the frequency of
intraoperative consultation of the images. Further, the use of modified dental equipment made
the experience of using an NUI for intraoperative manipulation of dental images particularly
ergonomic.
The great potential of the amazing developments in the fields of dental and medical imaging
can only be exploited if these developments help healthcare professionals during medical
procedures. The interaction between surgical staff, under sterile conditions, and computer
equipment has been a key point. A possible solution was to use an assistant outside the
surgical field for manipulating the computer-generated images, which was not practical and
required one additional person.
The proposed solution seems to be closer to an ideal one. A very important point is that the
cost of the sensor is quite low and all the system components are available at a relatively low
cost; this could allow the easy incorporation of this technology in the budget of clinical
facilities of poor countries across the globe, allowing them to reduce the number of personnel
required in the operating room, who could otherwise be doing more productive work. On the
basis of the success of this proof of concept as demonstrated by this pilot report, we have
undertaken further research to optimize the gestures recognized by the NUI.
The NUI is producing a revolution in human-machine interaction; it has only just begun and will probably evolve over the next 20 years. User interfaces are becoming less visible as
computers communicate more like people, and this has the potential to bring humans and
technology closer. The contribution of this revolutionary new technology is and will be
extremely important in the field of healthcare and has enormous potential in dental and
general surgery, as well as in daily clinical practice. More importantly, this would greatly benefit the diagnosis and treatment of a number of diseases and improve the care of people, which is our ultimate and greatest goal.
Touch-less SDK:
The Touchless SDK is an open source SDK for .NET applications. It enables developers to
create multi-touch based applications using a webcam for input. Color based markers defined
by the user are tracked and their information is published through events to clients of the
SDK. In a nutshell, the Touchless SDK enables touch without touching. Microsoft Office Labs has released "Touchless," a webcam-driven multi-touch interface SDK that enables "touch without touching."
Using the SDK lets developers offer users "a new and cheap way of experiencing multi-touch capabilities, without the need of expensive hardware or software. All the user needs is a camera" to track the multi-colored objects defined by the developer. Just about any webcam will work.
Touchless started as Mike Wasserman's college project at Columbia University. The main
idea: to offer users a new and cheap way of experiencing multi-touch capabilities, without the
need of expensive hardware or software. All the user needs is a camera, which will track
colored markers defined by the user.
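The Touchless SDK itself is written in C# for .NET, but the core idea of colour-based marker tracking fits in a few lines. This pure-Python sketch (using a nested list as a stand-in for a webcam frame; the function names are mine, not the SDK's) thresholds each pixel against the marker colour and reports the centroid of the matches:

```python
# Sketch of colour-based marker tracking, the technique the Touchless SDK
# uses: keep pixels close to the marker colour, then take their centroid
# as the marker position for that frame.

def track_marker(frame, target, tol=30):
    """frame: 2-D list of (r, g, b) pixels; target: marker colour.
    Returns the (row, col) centroid of matching pixels, or None."""
    hits = []
    for r, row in enumerate(frame):
        for c, (pr, pg, pb) in enumerate(row):
            if (abs(pr - target[0]) <= tol and
                    abs(pg - target[1]) <= tol and
                    abs(pb - target[2]) <= tol):
                hits.append((r, c))
    if not hits:
        return None
    return (sum(r for r, _ in hits) / len(hits),
            sum(c for _, c in hits) / len(hits))

# A 3x3 synthetic "frame" with a bright green marker pixel at (1, 2):
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][2] = (0, 255, 0)
print(track_marker(frame, (0, 255, 0)))   # -> (1.0, 2.0)
```

Running this per webcam frame and raising an event when the centroid moves is essentially what the SDK publishes to its clients; a real implementation would add smoothing and handle lighting changes.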
Mike presented the project at the Microsoft Office Labs Productivity Science Fair, Office
Labs fell in love with it, and Touchless was chosen as a Community Project. Our deliverables
include an extensible demo application to showcase a limited set of multi-touch capabilities,
but mainly we are delivering an SDK to allow users to build their own multi-touch
applications.
Now, Touchless is released free and open-source to the world under the Microsoft Public
License (Ms-PL) on CodePlex. Our goals are to drive community involvement and use of the
SDK as it continues to develop.
Remember that this is just the beginning; and you're invited to join our journey. Send us your
questions and feedback, use Touchless SDK in your .NET applications and XNA games, and
support the community by contributing to the source code
System requirements
- Visual Studio 2005 or 2008, or Visual Studio Express Edition
- .NET 3.0 or greater
- "TouchlessLib.dll" and "WebcamLib.dll"
Touch-less demo:
The Touchless Demo is an open source application that anyone with a webcam can use to experience multi-touch, no geekiness required. The demo was created using the Touchless SDK and Windows Forms with C#. There are four fun demos: Snake, where you control a snake with a marker; Defender, an up-to-four-player version of a pong-like game; Map, where you can rotate, zoom, and move a map using two markers; and Draw, where the marker is used to, you guessed it, draw!
Mike demonstrated Touchless at a recent Office Labs' Productivity Science Fair, where attendees voted it the "most interesting project." If you wind up using the SDK, the team would love to hear what use you make of it!
This project is an Office Labs community project, which means a Microsoft employee
worked on this in their spare time. It is also an open source project, which means that anyone
can view, use, and contribute to the code.
In addition, it is worth pointing out that you may need a few cameras in stereo to maximize
accuracy and you could theoretically use your hands as a mouse - meaning you can likely
take advantage of all the functions of the GBUI while resting your hand on the desk in front
of you for most of the day.
At some point this technology will reach the OS, and when that happens, consumers can decide whether the mouse and keyboard will rule the future or the GBUI will be the killer tech of the next decade.
Touch wall:
Touch Wall refers to the touch screen hardware setup itself; the corresponding software to
run Touch Wall, which is built on a standard version of Vista, is called Plex. Touch Wall and
Plex are superficially similar to Microsoft Surface, a multi-touch table computer that was
introduced in 2007 and which recently became commercially available in
select AT&T stores. It is a fundamentally simpler mechanical system, and is also
significantly cheaper to produce. While Surface retails at around $10,000, the hardware to
“turn almost anything into a multi-touch interface” for Touch Wall is just “hundreds of
dollars”.
Touch Wall consists of three infrared lasers that scan a surface. A camera notes when
something breaks through the laser line and feeds that information back to the Plex software.
Early prototypes, say Pratley and Sands, were made simply on a cardboard screen: a projector was used to show the Plex interface on the cardboard, and the system worked fine. Touch Wall certainly isn't the first multi-touch product we've seen (see iPhone). In
addition to Surface, of course, there are a number of early prototypes emerging in this space.
But what Microsoft has done with a few hundred dollars worth of readily available hardware
is stunning.
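The sensing step described above, where a camera notes the point at which a finger breaks the laser line, still needs a calibration to turn a camera pixel into a position on the projected surface. A minimal sketch, assuming a simple linear fit from two known reference touches (Microsoft's actual Plex pipeline is not public):

```python
# Hypothetical sketch of Touch Wall calibration: given the camera pixel
# columns observed for touches at the wall's left and right edges, build
# a function mapping any detected laser-break pixel to a horizontal
# position (metres) on the wall.

def make_calibration(px_left, px_right, width_m):
    """Return a pixel-column -> wall-position (metres) mapping."""
    scale = width_m / (px_right - px_left)
    return lambda px: (px - px_left) * scale

to_wall = make_calibration(px_left=0, px_right=1024, width_m=2.0)
print(to_wall(512))   # midway between the reference pixels -> 1.0
```

This linear model is why "almost anything" can become a touch surface: recalibrating with two reference touches adapts the same camera to any projection size.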
It's also clear that the only real limit on the screen size is the projector, meaning that entire
walls can easily be turned into a multi touch user interface. Scrap those white boards in the
office, and make every flat surface into a touch display instead. You might even save some
money.
Sounds like the future, right? Yeah, well the future is here today and picking up steam faster
than some marketers might care to acknowledge.
Kiosks powered by webcams and flash-driven augmented reality apps have been on the
market for years, some used in short-lived campaigns and others finding long-term adoption
and retention in theme parks, airports and other high-traffic spaces.
In 2010 Microsoft released the Kinect, which shook up the possibilities of touch-free gestures as a means to interact with things. Smartphones and the browser are nearly ubiquitous, and GPS and natural-language interaction systems are getting better by the moment, making Siri seem as sophisticated as a Speak & Spell.
Preliminary usability testing was carried out by two surgeons for accessing all kinds of
supported digital images, simulating typical dental surgery situations. During this phase, the
positions of all components were calibrated and adjusted. After trying different positions, we
chose the final location of the controller, taking into account the fact that the interaction space
of the controller allowed the operator to move his/her hands in an ergonomic way in order to
avoid fatigue during the gestures. Different light conditions were tested to verify whether the
controller performance was affected. The data transfer rate of the universal serial bus (USB)
line and hub was checked. Different Leap Motion control settings were tested for a smooth
and stable interaction, and the proposed system was set at 42 fps with a processing time of
23.2 ms; the interaction height was set to automatic. The Touchless application (Leap Motion Inc., San Francisco, USA) was used in advanced mode.
In this pilot study, 11 dental surgery procedures were conducted using the abovementioned
custom system and included in this report as a case series. An overview of the procedures accomplished is presented. This study was in compliance with the Helsinki Declaration, and
the protocol was approved by the Institutional Review Board at CORE Dental Clinic in
Chaco, Argentina. Each subject signed a detailed informed consent form.
Acoustic Touch Panel
Have you seen Minority Report (come on, who hasn't?) and watched as John Anderton went through holographic reports and databases solely with his gloved hands? Touchless technology based on gestures instead of clicks and typing may have been an element from a sci-fi movie in 2002, but it's no longer science fiction today.
As we make more advancements in tech, design and gesture navigation, we have reached the point where we can navigate through a computing system without a keyboard, a mouse, or touching anything at all. Feast your eyes on these amazing technologies, working with motion sensors and gestures, that probably grew from the seeds planted in this amazingly accurate Steven Spielberg movie.
1. Tobii Rex
Tobii Rex is an eye-tracking device from Sweden which works with any computer running Windows 8. The device has a pair of built-in infrared sensors that track the user's eyes. Users need only place the Tobii Rex at the bottom of the screen, and it will capture eye movements, enabling Gaze Interaction.
Basically, you use your eyes like you would the mouse cursor: wherever you look, the cursor appears at that precise spot on screen. To select, you can use the touchpad. Although not entirely touchless, at least you no longer need to move a bulky mouse around. It's also a great alternative to using a finger on a touch tablet, which blocks the view of what you want to click or select. As of now, Tobii Rex is not on the consumer market yet, but you can get an invite for early access.
2. Elliptic Labs
Elliptic Labs allows you to operate your computer without touching it, via the Windows 8 Gesture Suite. It uses ultrasound, so it works not with cameras but with your audio hardware. Ideally you need 6 speakers and 8 microphones, but a laptop's dedicated speakers and a normal microphone can work too.
The speakers emit ultrasound which bounces off your hand to the microphones, allowing the Elliptic Labs software to track and interpret your hand movements. This technology is designed for the Windows 8 platform and is expected to work on tablets, smartphones and even cars. Elliptic Labs is not out for consumers to buy, as the company is focusing on marketing it to Original Equipment Manufacturers (OEMs).
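The physical principle behind the echo tracking is ordinary time-of-flight ranging: the delay between emitting the ultrasound and hearing its reflection gives the round-trip path length. A minimal sketch, assuming a co-located speaker and microphone (the real system fuses many speaker-microphone pairs to localise the hand in 3D):

```python
# Minimal time-of-flight sketch of the ultrasonic principle Elliptic Labs
# relies on: sound travels out to the hand and back, so the one-way
# distance is speed_of_sound * delay / 2.

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def hand_distance(echo_delay_s):
    """Distance to the reflecting hand, in metres."""
    return SPEED_OF_SOUND * echo_delay_s / 2

print(hand_distance(0.001))   # a 1 ms round trip puts the hand ~0.17 m away
```

With several microphones, each delay constrains the hand to an ellipsoid, and intersecting those constraints yields the 3-D position the gesture software tracks.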
3. Airwriting
Airwriting is a technology that allows you to write text messages or compose emails by
writing in the air. Airwriting comes in the form of a glove which recognizes the path your
hands and fingers move in as you write. The glove contains sensors that can record hand
movements.
When the user starts "airwriting," the glove detects it and sends the data to the computer over a wireless connection, where the movements are decoded. The system is capable of recognizing capital letters and has an 8,000-word vocabulary. For now, the glove is only a prototype, and it's nowhere near perfect: it still has an 11% error rate. However, the system self-corrects and adapts to the user's writing style, pushing the error rate down to 3%.
Google has awarded the creator, Christoph Amma, its Google Faculty Research Award (of over $80,000) in hopes that it will help him develop this system further.
4. eyeSight
eyeSight is a gesture technology which allows you to navigate your devices by just pointing at them, much like using a remote to control your TV, without touching the screen. And get this: the basic requirements for eyeSight to work are a basic 2D webcam (even built-in ones work) and the software. Your screen need not even be a touchscreen.
To navigate, you just move your finger to move the cursor, and push your finger forward (like pushing a button) to click. eyeSight works not only with laptops and computers but also with many other devices such as tablets and televisions. As of now, eyeSight is not available for consumer use, but the company is offering software development kits (SDKs) for the Windows, Android and Linux platforms.
5. Mauz
Mauz is a third-party device that turns your iPhone into a trackpad or mouse. Download the driver onto your computer and the app onto your iPhone, then connect the device to your iPhone via the charging port. Mauz connects to the computer over a Wi-Fi connection. You can then navigate your computer as you would with a regular mouse: left click, right click and scroll as normal.
Now comes the fun part: you can use gestures with Mauz too. With the iPhone camera on, move your hand to the left to go back a page in your browser, and move it to the right to go forward a page. If there's an incoming call or a text message, simply handle it and resume using Mauz right after. Unfortunately, Mauz is not out for consumers to buy just yet.
6. PointGrab
PointGrab is similar to eyeSight in that it enables users to control their computer just by pointing at it. PointGrab comes as software and needs only a 2D webcam; the camera detects your hand movements, and with those you control your computer. PointGrab works with computers running Windows 7 and 8, as well as smartphones, tablets and televisions.
Fujitsu, Acer and Lenovo have already implemented this technology in their laptops and computers running Windows 8. The software comes bundled with those specific machines and is not available for purchase by itself.
7. Leap Motion
Leap Motion is a motion-sensing device that recognizes the user's fingers with its infrared LEDs and cameras. Because it recognizes only your fingers, nothing registers when your hands rest on the keyboard to type; but when you hover your fingers above the device, you can navigate your desktop like you would a smartphone or tablet: flick to browse pages, pinch to zoom, and so on.
It's a small USB device that works the moment you connect it to your computer. You don't need to charge it, and it works even with non-touch-sensitive screens. Leap Motion works well with gaming and 3D-related software. You can pre-order Leap Motion for $79.99.
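The pinch-to-zoom gesture mentioned above can be sketched from fingertip data of the kind the Leap Motion reports: compare the thumb-index gap between frames and emit a zoom factor. This is only the idea, with hypothetical function names; the real Leap Motion API differs:

```python
# Hypothetical sketch of pinch-to-zoom from tracked fingertip positions:
# the gap between thumb and index finger widening means zoom in,
# narrowing means zoom out.

import math

def fingertip_gap(thumb, index):
    """Euclidean distance between two 3-D fingertip positions (metres)."""
    return math.dist(thumb, index)

def zoom_factor(prev_gap, curr_gap, dead_zone=0.005):
    """>1 means zoom in (fingers spreading), <1 zoom out (pinching),
    1.0 when the change is within the dead zone (ignores jitter)."""
    if abs(curr_gap - prev_gap) < dead_zone:
        return 1.0
    return curr_gap / prev_gap
```

The dead zone is the practical detail: without it, the sensor's frame-to-frame jitter would be interpreted as a continuous stream of tiny zoom gestures.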
8. Myoelectric Armband
The myoelectric armband, or MYO armband, is a gadget that allows you to control other Bluetooth-enabled devices using your fingers or hands. When put on, the armband detects movements in your muscles and translates them into gestures that interact with your computer.
By moving your hand up or down, you scroll the page you are browsing; by waving, you slide through pictures in a photo album or switch between applications running on your system. What would this be good for? At the very least, it will be very good for action games. The MYO armband is out for pre-order at the price of $149.
Conclusion:
Today's thoughts are again around the user interface, and efforts are being made to better the technology day in and day out. The touchless touchscreen user interface can be used effectively in computers, cell phones, webcams and laptops. Maybe a few years down the line, our bodies could be transformed into a virtual mouse and virtual keyboard; our body may be turned into an input device!
Many personal computers will likely have similar screens in the near future. But touch interfaces are nothing new: witness the humble ATM.
How about getting completely out of touch? A startup called LM3Labs says it's working with major computer makers in Japan, Taiwan and the US to incorporate touchless navigation into their laptops. Called Airstrike, the system uses tiny charge-coupled device (CCD) cameras integrated into each side of the keyboard to detect user movements. You can drag windows around or close them, for instance, by pointing and gesturing in midair above the keyboard. You should be able to buy an Airstrike-equipped laptop next year, with high-end stand-alone keyboards to follow.
Any such system is unlikely to replace typing and mousing. But that's not the point.
Airstrike aims to give you an occasional quick break from those activities.
REFERENCES
- http://dvice.com/archives/2008/05/universal_remot.php?p=4&cat=undefined#more
- www.hitslot.com
- http://hitslot.com/?p=214
- http://www.touchuserinterface.com/2008/09/touchless-touch-screen-that-senses-your.html
- http://technabob.com/blog/2007/03/19/the-touchless-touchscreen-monitor/
- http://www.etre.com/blog/2008/02/elliptic_labs_touchless_user_interface/
- http://lewisshepherd.wordpress.com/2008/10/13/stop-being-so-touchy
- http://www.engadget.com/tag/interface/
- http://comogy.com/concepts/170-universal-remote-concept.html
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
IRJET Journal
 
Paper id 25201413
Paper id 25201413Paper id 25201413
Paper id 25201413
IJRAT
 
Accessing Operating System using Finger Gesture
Accessing Operating System using Finger GestureAccessing Operating System using Finger Gesture
Accessing Operating System using Finger Gesture
IRJET Journal
 
IRJET- Human Activity Recognition using Flex Sensors
IRJET- Human Activity Recognition using Flex SensorsIRJET- Human Activity Recognition using Flex Sensors
IRJET- Human Activity Recognition using Flex Sensors
IRJET Journal
 

Similar to 14 561 (20)

Virtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand GesturesVirtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand Gestures
 
Real time hand gesture recognition system for dynamic applications
Real time hand gesture recognition system for dynamic applicationsReal time hand gesture recognition system for dynamic applications
Real time hand gesture recognition system for dynamic applications
 
Real time hand gesture recognition system for dynamic applications
Real time hand gesture recognition system for dynamic applicationsReal time hand gesture recognition system for dynamic applications
Real time hand gesture recognition system for dynamic applications
 
SIXTH SENSE TECHNOLOGY REPORT
SIXTH SENSE TECHNOLOGY REPORTSIXTH SENSE TECHNOLOGY REPORT
SIXTH SENSE TECHNOLOGY REPORT
 
A Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand GesturesA Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand Gestures
 
IRJET- Finger Gesture Recognition Using Linear Camera
IRJET-  	  Finger Gesture Recognition Using Linear CameraIRJET-  	  Finger Gesture Recognition Using Linear Camera
IRJET- Finger Gesture Recognition Using Linear Camera
 
Virtual surgery [new].ppt
Virtual surgery [new].pptVirtual surgery [new].ppt
Virtual surgery [new].ppt
 
Controlling Computer using Hand Gestures
Controlling Computer using Hand GesturesControlling Computer using Hand Gestures
Controlling Computer using Hand Gestures
 
Skinput
SkinputSkinput
Skinput
 
An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...
 
An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...
 
Virtual Smart Phones
Virtual Smart PhonesVirtual Smart Phones
Virtual Smart Phones
 
virtual surgery
virtual surgeryvirtual surgery
virtual surgery
 
Sixth sense technology
Sixth sense technologySixth sense technology
Sixth sense technology
 
HGR-thesis
HGR-thesisHGR-thesis
HGR-thesis
 
complete
completecomplete
complete
 
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNINGSLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
 
Paper id 25201413
Paper id 25201413Paper id 25201413
Paper id 25201413
 
Accessing Operating System using Finger Gesture
Accessing Operating System using Finger GestureAccessing Operating System using Finger Gesture
Accessing Operating System using Finger Gesture
 
IRJET- Human Activity Recognition using Flex Sensors
IRJET- Human Activity Recognition using Flex SensorsIRJET- Human Activity Recognition using Flex Sensors
IRJET- Human Activity Recognition using Flex Sensors
 

More from Chaitanya Ram

14 599
14 59914 599
14 598
14 59814 598
14 595
14 59514 595
14 593
14 59314 593
14 587
14 58714 587
14 586
14 58614 586
14 585
14 58514 585
14 584
14 58414 584
14 583
14 58314 583
14 581
14 58114 581
14 577
14 57714 577
14 576
14 57614 576
14 575
14 57514 575
14A81A0574
14A81A057414A81A0574
14A81A0574
Chaitanya Ram
 
14 572
14 57214 572
14 571
14 57114 571
14 570
14 57014 570
14 569
14 569 14 569
14 569
Chaitanya Ram
 
14 568
14 56814 568
14 567
14 56714 567

More from Chaitanya Ram (20)

14 599
14 59914 599
14 599
 
14 598
14 59814 598
14 598
 
14 595
14 59514 595
14 595
 
14 593
14 59314 593
14 593
 
14 587
14 58714 587
14 587
 
14 586
14 58614 586
14 586
 
14 585
14 58514 585
14 585
 
14 584
14 58414 584
14 584
 
14 583
14 58314 583
14 583
 
14 581
14 58114 581
14 581
 
14 577
14 57714 577
14 577
 
14 576
14 57614 576
14 576
 
14 575
14 57514 575
14 575
 
14A81A0574
14A81A057414A81A0574
14A81A0574
 
14 572
14 57214 572
14 572
 
14 571
14 57114 571
14 571
 
14 570
14 57014 570
14 570
 
14 569
14 569 14 569
14 569
 
14 568
14 56814 568
14 568
 
14 567
14 56714 567
14 567
 

Recently uploaded

Skybuffer SAM4U tool for SAP license adoption
Skybuffer SAM4U tool for SAP license adoptionSkybuffer SAM4U tool for SAP license adoption
Skybuffer SAM4U tool for SAP license adoption
Tatiana Kojar
 
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers
Your One-Stop Shop for Python Success: Top 10 US Python Development ProvidersYour One-Stop Shop for Python Success: Top 10 US Python Development Providers
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers
akankshawande
 
Demystifying Knowledge Management through Storytelling
Demystifying Knowledge Management through StorytellingDemystifying Knowledge Management through Storytelling
Demystifying Knowledge Management through Storytelling
Enterprise Knowledge
 
Dandelion Hashtable: beyond billion requests per second on a commodity server
Dandelion Hashtable: beyond billion requests per second on a commodity serverDandelion Hashtable: beyond billion requests per second on a commodity server
Dandelion Hashtable: beyond billion requests per second on a commodity server
Antonios Katsarakis
 
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyFreshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency
ScyllaDB
 
Apps Break Data
Apps Break DataApps Break Data
Apps Break Data
Ivo Velitchkov
 
“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...
“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...
“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...
Edge AI and Vision Alliance
 
"Choosing proper type of scaling", Olena Syrota
"Choosing proper type of scaling", Olena Syrota"Choosing proper type of scaling", Olena Syrota
"Choosing proper type of scaling", Olena Syrota
Fwdays
 
Main news related to the CCS TSI 2023 (2023/1695)
Main news related to the CCS TSI 2023 (2023/1695)Main news related to the CCS TSI 2023 (2023/1695)
Main news related to the CCS TSI 2023 (2023/1695)
Jakub Marek
 
GraphRAG for LifeSciences Hands-On with the Clinical Knowledge Graph
GraphRAG for LifeSciences Hands-On with the Clinical Knowledge GraphGraphRAG for LifeSciences Hands-On with the Clinical Knowledge Graph
GraphRAG for LifeSciences Hands-On with the Clinical Knowledge Graph
Neo4j
 
GNSS spoofing via SDR (Criptored Talks 2024)
GNSS spoofing via SDR (Criptored Talks 2024)GNSS spoofing via SDR (Criptored Talks 2024)
GNSS spoofing via SDR (Criptored Talks 2024)
Javier Junquera
 
AppSec PNW: Android and iOS Application Security with MobSF
AppSec PNW: Android and iOS Application Security with MobSFAppSec PNW: Android and iOS Application Security with MobSF
AppSec PNW: Android and iOS Application Security with MobSF
Ajin Abraham
 
The Microsoft 365 Migration Tutorial For Beginner.pptx
The Microsoft 365 Migration Tutorial For Beginner.pptxThe Microsoft 365 Migration Tutorial For Beginner.pptx
The Microsoft 365 Migration Tutorial For Beginner.pptx
operationspcvita
 
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...
Alex Pruden
 
Christine's Product Research Presentation.pptx
Christine's Product Research Presentation.pptxChristine's Product Research Presentation.pptx
Christine's Product Research Presentation.pptx
christinelarrosa
 
Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...
Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...
Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...
Pitangent Analytics & Technology Solutions Pvt. Ltd
 
A Deep Dive into ScyllaDB's Architecture
A Deep Dive into ScyllaDB's ArchitectureA Deep Dive into ScyllaDB's Architecture
A Deep Dive into ScyllaDB's Architecture
ScyllaDB
 
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
Jason Yip
 
"Scaling RAG Applications to serve millions of users", Kevin Goedecke
"Scaling RAG Applications to serve millions of users",  Kevin Goedecke"Scaling RAG Applications to serve millions of users",  Kevin Goedecke
"Scaling RAG Applications to serve millions of users", Kevin Goedecke
Fwdays
 
Introduction of Cybersecurity with OSS at Code Europe 2024
Introduction of Cybersecurity with OSS  at Code Europe 2024Introduction of Cybersecurity with OSS  at Code Europe 2024
Introduction of Cybersecurity with OSS at Code Europe 2024
Hiroshi SHIBATA
 

Recently uploaded (20)

Skybuffer SAM4U tool for SAP license adoption
Skybuffer SAM4U tool for SAP license adoptionSkybuffer SAM4U tool for SAP license adoption
Skybuffer SAM4U tool for SAP license adoption
 
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers
Your One-Stop Shop for Python Success: Top 10 US Python Development ProvidersYour One-Stop Shop for Python Success: Top 10 US Python Development Providers
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers
 
Demystifying Knowledge Management through Storytelling
Demystifying Knowledge Management through StorytellingDemystifying Knowledge Management through Storytelling
Demystifying Knowledge Management through Storytelling
 
Dandelion Hashtable: beyond billion requests per second on a commodity server
Dandelion Hashtable: beyond billion requests per second on a commodity serverDandelion Hashtable: beyond billion requests per second on a commodity server
Dandelion Hashtable: beyond billion requests per second on a commodity server
 
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyFreshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency
 
Apps Break Data
Apps Break DataApps Break Data
Apps Break Data
 
“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...
“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...
“Temporal Event Neural Networks: A More Efficient Alternative to the Transfor...
 
"Choosing proper type of scaling", Olena Syrota
"Choosing proper type of scaling", Olena Syrota"Choosing proper type of scaling", Olena Syrota
"Choosing proper type of scaling", Olena Syrota
 
Main news related to the CCS TSI 2023 (2023/1695)
Main news related to the CCS TSI 2023 (2023/1695)Main news related to the CCS TSI 2023 (2023/1695)
Main news related to the CCS TSI 2023 (2023/1695)
 
GraphRAG for LifeSciences Hands-On with the Clinical Knowledge Graph
GraphRAG for LifeSciences Hands-On with the Clinical Knowledge GraphGraphRAG for LifeSciences Hands-On with the Clinical Knowledge Graph
GraphRAG for LifeSciences Hands-On with the Clinical Knowledge Graph
 
GNSS spoofing via SDR (Criptored Talks 2024)
GNSS spoofing via SDR (Criptored Talks 2024)GNSS spoofing via SDR (Criptored Talks 2024)
GNSS spoofing via SDR (Criptored Talks 2024)
 
AppSec PNW: Android and iOS Application Security with MobSF
AppSec PNW: Android and iOS Application Security with MobSFAppSec PNW: Android and iOS Application Security with MobSF
AppSec PNW: Android and iOS Application Security with MobSF
 
The Microsoft 365 Migration Tutorial For Beginner.pptx
The Microsoft 365 Migration Tutorial For Beginner.pptxThe Microsoft 365 Migration Tutorial For Beginner.pptx
The Microsoft 365 Migration Tutorial For Beginner.pptx
 
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...
 
Christine's Product Research Presentation.pptx
Christine's Product Research Presentation.pptxChristine's Product Research Presentation.pptx
Christine's Product Research Presentation.pptx
 
Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...
Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...
Crafting Excellence: A Comprehensive Guide to iOS Mobile App Development Serv...
 
A Deep Dive into ScyllaDB's Architecture
A Deep Dive into ScyllaDB's ArchitectureA Deep Dive into ScyllaDB's Architecture
A Deep Dive into ScyllaDB's Architecture
 
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
 
"Scaling RAG Applications to serve millions of users", Kevin Goedecke
"Scaling RAG Applications to serve millions of users",  Kevin Goedecke"Scaling RAG Applications to serve millions of users",  Kevin Goedecke
"Scaling RAG Applications to serve millions of users", Kevin Goedecke
 
Introduction of Cybersecurity with OSS at Code Europe 2024
Introduction of Cybersecurity with OSS  at Code Europe 2024Introduction of Cybersecurity with OSS  at Code Europe 2024
Introduction of Cybersecurity with OSS at Code Europe 2024
 

14 561

A Technical Seminar Report On

TOUCHLESS TOUCHSCREEN

Submitted in partial fulfilment of the requirements of Technical Seminar for the award of the Degree of

BACHELOR OF TECHNOLOGY
In
Computer Science and Engineering

By
A. SWAPNA PRIYA
14A81A0561
3rd CSE-B

SRI VASAVI ENGINEERING COLLEGE
Pedatadepalli, Tadepalligudem-534101, AP.
2015-2016
Contents

Introduction
What's New?
System Requirements
Software Installation
Hardware Installation
Sensor Mounting
Configuration
Calibration
Calibration with Reflective Surfaces (Whiteboards, Glass, etc.)
Licensing
Troubleshooting
Analysis
Applications
Acoustics
Conclusion
ANALYSIS

It obviously requires a sensor, but the sensor is neither hand-mounted nor present on the screen. The sensor can be placed either on the table or near the screen, and the hardware setup is so compact that it can be fitted into a tiny device such as an MP3 player or a mobile phone. It recognizes the position of an object from as far as 5 feet away.

WORKING:

The system is capable of detecting movements in three dimensions without the user ever having to put a finger on the screen. The patented touchless interface doesn't require wearing any special sensors on the hand either. You just point at the screen (from as far as 5 feet away), and you can manipulate objects in 3D. Sensors are mounted around the screen being used; by interacting in the line of sight of these sensors, the motion is detected and interpreted into on-screen movements. What is to stop unintentional gestures from being used as input is not entirely clear, but it looks promising nonetheless.

APPLICATIONS

TOUCHLESS MONITOR:

Sure, everybody is doing touchscreen interfaces these days, but this is the first time I've seen a monitor that can respond to gestures without actually having to touch the screen. The monitor, based on technology from TouchKo, was recently demonstrated by White Electronic Designs and Tactyl Services at the CeBIT show. Designed for applications where touch may be difficult, such as for doctors who might be wearing surgical gloves, the display features capacitive sensors that can read movements from up to 15 cm away from the screen. Software can then translate gestures into screen commands.

Touchscreen interfaces are great, but all that touching can be a bit of a drag. Enter the wonder kids from Elliptic Labs, who are hard at work on implementing a touchless interface. The input method is, well, thin air. The technology detects motion in 3D and requires no worn sensors for operation.
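The pointing mechanism described above amounts to mapping a sensed 3-D fingertip position onto 2-D screen coordinates. The following is a minimal sketch of that idea only; the function name, coordinate ranges, and sensor frame are hypothetical illustrations, not the vendor's actual algorithm:

```python
def point_to_screen(finger, screen_w=1920, screen_h=1080,
                    x_range=(-0.5, 0.5), y_range=(0.0, 1.0)):
    """Map a sensed 3-D fingertip position (metres, in a hypothetical
    sensor frame) to 2-D pixel coordinates on the display.

    finger: (x, y, z) tuple reported by the sensor; z (distance from
    the screen) is ignored here -- a real system could use it for
    hover vs. select decisions.
    """
    x, y, _z = finger
    # Normalise the horizontal and vertical components into [0, 1].
    u = (x - x_range[0]) / (x_range[1] - x_range[0])
    v = (y - y_range[0]) / (y_range[1] - y_range[0])
    # Clamp so that pointing outside the tracked volume pins the
    # cursor to the screen edge instead of leaving the screen.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    # Screen origin is top-left, so the vertical axis is inverted.
    return int(u * (screen_w - 1)), int((1.0 - v) * (screen_h - 1))
```

A hand held dead centre of the tracked volume, e.g. `point_to_screen((0.0, 0.5, 0.4))`, lands near the middle of a 1920x1080 display.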
By simply pointing at the screen, users can manipulate the object being displayed in 3D. Details are light on how this actually functions.

Touch-less Gives Glimpse of GBUI:

We have seen the futuristic user interfaces of movies like Minority Report and The Matrix Revolutions, where people wave their hands in three dimensions and the computer understands what the user wants, shifting and sorting data with precision. Microsoft's XD Huang demonstrated how his company sees the future of the GUI at ITEXPO this past September. At the show, however, the example was in two dimensions.
Microsoft's vision of the UI, reminiscent of the GBUI seen in Minority Report, was demonstrated at their Redmond headquarters, and it involves lots of gestures which allow you to take applications and forward them on to others with simple hand movements. The demos included the concept of software understanding business processes and helping you work. So after reading a document, you could just push it off the side of your screen and the system would know to post it on an intranet and also send a link to a specific group of people.

Touch-less UI:

The basic idea described in the patent is that there would be sensors arrayed around the perimeter of the device, capable of sensing finger movements in 3-D space. The user could use her fingers much as on a touch phone, but without actually having to touch the screen. That's cool, isn't it? I think the idea is great not only because user input will no longer be limited to 2-D, but also because I could use my thick, dirty, bandaged, etc. fingers as well (as opposed to a "plain" touch UI). I'm a bit skeptical, though, about how accurate it can be and how intelligent the software will need to be. Finally, there is one more thing to mention: the built-in accelerometer.

For the first time, to our knowledge, our team used a touchless NUI system with a Leap Motion controller during dental surgery. Previous reports have used a different motion sensor (MS Kinect, Microsoft Corp., Redmond, USA) for touchless control of images in general surgery,5,6 and for controlling virtual geographic maps,7 among other uses. A Kinect sensor works on a different principle from that of the Leap Motion. The MS Kinect and Xtion PRO (ASUS Computer Inc., Taipei, Taiwan) are basically infrared depth-sensing cameras based on the principle of structured light.3 Our team has been using the MS Kinect sensor with an NUI system for the last two years as an experimental educational tool during workshops of courses on digital dental photography, and it has been very useful for this purpose.
It has also been tested in clinical situations at dental offices but was found to be inadequate for dental scenarios, mainly because the Kinect uses a horizontal tracking approach and needs a minimal working distance (approximately 1.2 m); in contrast, the Leap Motion tracks the user's hands from below. The interaction zone of the MS Kinect is larger (approximately 18 m³) than that of the Leap Motion (approximately 0.23 m³). This means that when using the MS Kinect, the operating room has to be considerably wider and the user has to walk out of the interaction zone in order to stop interacting with the system; in the case of the Leap Motion, the surgeon just has to move his hands out of the smaller interaction zone. The MS Kinect tracks the whole body or the upper part of the body,6 which implies unnecessarily wide movements of the user's arms; this could lead to fatigue during the procedure. The Leap Motion, on the other hand, tracks only the user's hands and fingers and has a higher spatial resolution and faster frame rate, which leads to more precise control for image manipulation. The proposed system performed quite well and fulfilled the objective of providing access to and control of the system of images and the surgical plan without touching any device, thus allowing
the maintenance of sterile conditions. This motivated a perceived increase in the frequency of intraoperative consultation of the images. Further, the use of modified dental equipment made the experience of using an NUI for intraoperative manipulation of dental images particularly ergonomic. The great potential of the amazing developments in the fields of dental and medical imaging can only be exploited if these developments help healthcare professionals during medical procedures. The interaction between surgical staff, under sterile conditions, and computer equipment has been a key point. One possible solution was to use an assistant outside the surgical field to manipulate the computer-generated images, but this was not practical and required one additional person. The proposed solution seems closer to an ideal one. A very important point is that the sensor and all the other system components are available at relatively low cost; this could allow the easy incorporation of this technology into the budgets of clinical facilities in low-income countries, allowing them to reduce the number of personnel required in the operating room, who could otherwise be doing more productive work. On the basis of the success of this proof of concept, as demonstrated by this pilot report, we have undertaken further research to optimize the gestures recognized by the NUI. The NUI is producing a revolution in human-machine interaction; it has only just begun and will probably evolve over the next 20 years. User interfaces are becoming less visible as computers communicate more like people, and this has the potential to bring humans and technology closer. The contribution of this revolutionary new technology is, and will be, extremely important in the field of healthcare and has enormous potential in dental and general surgery, as well as in daily clinical practice.
More importantly, this would greatly benefit the diagnosis and treatment of a number of diseases and improve the care of people, which is our ultimate and greatest goal.

Touch-less SDK:

The Touchless SDK is an open source SDK for .NET applications. It enables developers to create multi-touch applications using a webcam for input. Color-based markers defined by the user are tracked, and their information is published through events to clients of the SDK. In a nutshell, the Touchless SDK enables touch without touching. Microsoft Office Labs has just released "Touchless," a webcam-driven multi-touch interface SDK that enables "touch without touching." The SDK lets developers offer users "a new and cheap way of experiencing multi-touch capabilities, without the need of expensive hardware or software. All the user needs is a camera" to track the multi-colored objects defined by the developer. Just about any webcam will work. Touchless started as Mike Wasserman's college project at Columbia University. The main idea: to offer users a new and cheap way of experiencing multi-touch capabilities, without the
need of expensive hardware or software. All the user needs is a camera, which will track colored markers defined by the user. Mike presented the project at the Microsoft Office Labs Productivity Science Fair, Office Labs fell in love with it, and Touchless was chosen as a Community Project. The deliverables include an extensible demo application showcasing a limited set of multi-touch capabilities, but mainly an SDK that allows users to build their own multi-touch applications. Touchless is now released free and open source to the world under the Microsoft Public License (Ms-PL) on CodePlex. The goals are to drive community involvement and use of the SDK as it continues to develop. Remember that this is just the beginning; you're invited to join the journey. Send in your questions and feedback, use the Touchless SDK in your .NET applications and XNA games, and support the community by contributing to the source code.

System requirements:
- Visual Studio 2005 or 2008, or Visual Studio Express Edition
- .NET 3.0 or greater
- "TouchlessLib.dll" and "WebcamLib.dll"

Touch-less demo:

The Touchless Demo is an open source application that anyone with a webcam can use to experience multi-touch; no geekiness required. The demo was created using the Touchless SDK and Windows Forms with C#. There are 4 fun demos: Snake, where you control a snake with a marker; Defender, an up-to-4-player version of a pong-like game; Map, where you can rotate, zoom, and move a map using 2 markers; and Draw, where the marker is used to, you guessed it, draw! Mike demonstrated Touchless at a recent Office Labs Productivity Science Fair, where attendees voted it the "most interesting project." If you wind up using the SDK, one would love to hear what use you make of it!
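The color-marker tracking that the Touchless SDK performs can be approximated in a few lines: threshold each pixel of a webcam frame against a color range and report the centroid of the matching pixels as the marker position. This is a from-scratch sketch of the general technique (pure Python, no webcam, hypothetical names), not the SDK's actual implementation:

```python
def track_marker(frame, lower, upper):
    """Locate a colored marker in one webcam frame.

    frame: a list of rows, each row a list of (r, g, b) tuples.
    lower, upper: per-channel bounds of the marker's color range.
    Returns the (row, col) centroid of all matching pixels, or None
    if the marker is not visible in this frame.
    """
    # Collect the coordinates of every pixel inside the color box.
    pts = [(y, x)
           for y, row in enumerate(frame)
           for x, px in enumerate(row)
           if all(lo <= c <= hi for c, lo, hi in zip(px, lower, upper))]
    if not pts:
        return None
    # The centroid of the matching region stands in for the marker.
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))
```

Feeding each successive frame through `track_marker` yields a stream of marker positions; an event-based wrapper (as in the SDK) would then publish position changes to subscribers.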
The Touchless SDK is an open source SDK that enables developers to create multi-touch applications using a webcam for input; geekiness recommended. This project is an Office Labs community project, which means a Microsoft employee worked on it in their spare time. It is also an open source project, which means that anyone can view, use, and contribute to the code.

[Figure: window-shop demo]

In addition, it is worth pointing out that you may need a few cameras in stereo to maximize accuracy, and you could theoretically use your hands as a mouse, meaning you can likely take advantage of all the functions of the GBUI while resting your hand on the desk in front of you for most of the day. At some point this technology will hit the OS, and when that happens the consumer can decide whether the mouse and keyboard will rule the future or the GBUI will be the killer tech of the next decade.

Touch wall:

Touch Wall refers to the touchscreen hardware setup itself; the corresponding software to run Touch Wall, which is built on a standard version of Vista, is called Plex. Touch Wall and Plex are superficially similar to Microsoft Surface, a multi-touch table computer that was introduced in 2007 and recently became commercially available in select AT&T stores. It is a fundamentally simpler mechanical system and is also significantly cheaper to produce. While Surface retails at around $10,000, the hardware to "turn almost anything into a multi-touch interface" for Touch Wall is just "hundreds of dollars." Touch Wall consists of three infrared lasers that scan a surface. A camera notes when something breaks through the laser line and feeds that information back to the Plex software. Early prototypes, say Pratley and Sands, were made simply on a cardboard screen: a projector was used to show the Plex interface on the cardboard, and the system worked fine. Touch Wall certainly isn't the first multi-touch product we've seen (see iPhone).
In addition to Surface, of course, there are a number of early prototypes emerging in this space. But what Microsoft has done with a few hundred dollars' worth of readily available hardware is stunning. It is also clear that the only real limit on screen size is the projector, meaning that entire walls can easily be turned into a multi-touch user interface. Scrap those whiteboards in the office and make every flat surface a touch display instead; you might even save some money. Sounds like the future, right? Well, the future is here today and picking up steam faster than some marketers might care to acknowledge.
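The Touch Wall mechanism described above (lasers scanning a plane just above the surface, a camera spotting where something breaks the beam) can be sketched in a few lines. This is a hedged illustration, not Microsoft's actual Plex code: the frame layout, brightness threshold and single-blob assumption are all simplifications for the example.

```python
# Sketch of Touch Wall's detection idea: an IR camera watches the laser plane
# just above the surface; a fingertip scatters the laser and shows up as a
# bright blob. Thresholding the frame and taking the blob centroid gives the
# touch point that a Plex-style application would consume.

def find_touch_point(frame, threshold=200):
    """Return the (row, col) centroid of bright pixels, or None.

    `frame` is a 2D list of 0-255 intensity values from the IR camera.
    A real system would run connected-component labelling to separate
    multiple touches; this sketch averages all bright pixels into one point.
    """
    bright = [(r, c)
              for r, row in enumerate(frame)
              for c, v in enumerate(row)
              if v >= threshold]
    if not bright:
        return None  # nothing is breaking the laser plane
    row_mean = sum(r for r, _ in bright) / len(bright)
    col_mean = sum(c for _, c in bright) / len(bright)
    return (row_mean, col_mean)

# A small synthetic frame with a bright blob where a finger crosses the plane.
frame = [
    [0,   0,   0,   0, 0, 0],
    [0, 250, 240,   0, 0, 0],
    [0, 230, 255,   0, 0, 0],
    [0,   0,   0,   0, 0, 0],
]
print(find_touch_point(frame))  # -> (1.5, 1.5)
```

Multiple touch points would only need a blob-labelling pass in place of the single averaging step, which is why the hardware stays so cheap: all the intelligence is in software.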
  • 8. Kiosks powered by webcams and Flash-driven augmented reality apps have been on the market for years, some used in short-lived campaigns and others finding long-term adoption in theme parks, airports and other high-traffic spaces. In 2010 Microsoft released the Kinect, which shook up the possibilities of touch-free gestures as a means of interacting with things. Smartphones and the browser are nearly ubiquitous, and GPS and natural-language interaction systems are getting better by the moment, making Siri seem as sophisticated as a Speak & Spell.

Preliminary usability testing was carried out by two surgeons accessing all kinds of supported digital images, simulating typical dental surgery situations. During this phase, the positions of all components were calibrated and adjusted. After trying different positions, we chose the final location of the controller, taking into account that its interaction space had to let the operator move his or her hands ergonomically, to avoid fatigue during the gestures. Different light conditions were tested to verify whether the controller's performance was affected, and the data transfer rate of the universal serial bus (USB) line and hub was checked. Different Leap Motion control settings were tested for a smooth and stable interaction; the proposed system was set at 42 fps with a processing time of 23.2 ms, and the interaction height was set to automatic. The Touchless application (Leap Motion Inc., San Francisco, USA) in advanced mode was used. In this pilot study, 11 dental surgery procedures were conducted using the abovementioned custom system and included in this report as a case series; an overview of the procedures accomplished is presented. This study was in compliance with the Helsinki Declaration, and the protocol was approved by the Institutional Review Board at CORE Dental Clinic in Chaco, Argentina. Each subject signed a detailed informed consent form.
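The touchless image browsing used in the dental study comes down to classifying hand trajectories from the tracker into discrete commands. The sketch below is an assumption-laden stand-in, not the study's actual software: a plain list of palm x-coordinates (in millimetres) replaces real tracker frames, and the travel threshold is made up for illustration.

```python
# Hedged sketch of touchless image navigation for a sterile environment:
# the tracker samples the surgeon's palm position, and a sufficiently long
# horizontal sweep is classified as "next image" / "previous image",
# while small jitter is ignored so images don't flip accidentally.

def classify_swipe(x_positions, min_travel_mm=80):
    """Classify a horizontal palm trajectory as a swipe direction.

    Returns 'next' for a rightward sweep, 'previous' for leftward,
    or None if the hand did not travel far enough to count.
    """
    if len(x_positions) < 2:
        return None
    travel = x_positions[-1] - x_positions[0]
    if travel >= min_travel_mm:
        return "next"
    if travel <= -min_travel_mm:
        return "previous"
    return None

print(classify_swipe([0, 30, 65, 110]))   # rightward sweep -> 'next'
print(classify_swipe([110, 60, 20, -5]))  # leftward sweep -> 'previous'
print(classify_swipe([0, 5, 3, 8]))       # jitter -> None
```

The dead-zone threshold matters clinically: it is what lets the surgeon rest or reposition a hand over the sensor without the viewer changing images mid-procedure.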
Acoustic Touch Panel: Have you seen Minority Report (come on, who hasn't?) and watched as John Anderton went through holographic reports and databases solely with his gloved hands? Touchless technology based on gestures instead of clicks and typing may have been an element of a sci-fi movie in 2002, but it is no longer science fiction today. With continuing advances in technology, design and gesture navigation, we can now navigate a computing system without a keyboard, a mouse, or touching anything at all. Feast your eyes on these technologies that work with motion sensors and gestures, and that probably grew from the seeds planted in Steven Spielberg's amazingly accurate movie.

1. Tobii Rex: Tobii Rex is an eye-tracking device from Sweden which works with any computer running Windows 8. The device has a pair of built-in infrared sensors that track the user's eyes. Users need only place Tobii Rex at the bottom of the screen and it will capture eye movements, engaging in Gaze Interaction.
  • 9. Basically, you use your eyes like you would the mouse cursor: wherever you look, the cursor appears at that precise spot on screen. To select, you use the touchpad. Although not entirely touchless, at least you no longer need to move a bulky mouse around. It is also a great alternative to using a finger on a touch tablet, which blocks the view of what you want to click or select. As of now, Tobii Rex is not on the market for consumers, but you can request an invite for early access.

2. Elliptic Labs: Elliptic Labs allows you to operate your computer without touching it, via the Windows 8 Gesture Suite. It uses ultrasound, so it works not with cameras but with your audio hardware. Ideally you need 6 speakers and 8 microphones, but the speakers built into a laptop and a normal microphone can work too.
  • 10. The speakers emit ultrasound which bounces back to the microphones, allowing the Elliptic Labs software to track the user's hand movements. The technology is designed for the Windows 8 platform and is expected to work on tablets, smartphones and even cars. Elliptic Labs is not available for consumers to buy, as the company is focusing on marketing it to Original Equipment Manufacturers (OEMs).

3. Airwriting: Airwriting is a technology that allows you to write text messages or compose emails by writing in the air. It comes in the form of a glove that recognizes the path your hands and fingers trace as you write. The glove contains sensors that record hand movements: when the user starts 'airwriting', the glove detects it and sends the motion data to the computer over a wireless connection, where software decodes the movements.
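The decoding step just described, turning a recorded hand trajectory into a letter, can be illustrated with a toy matcher. The real Airwriting system uses statistical models (hidden Markov models) over a large vocabulary; this hedged sketch only shows the template-matching idea, with two made-up letter templates and a trajectory already reduced to 2D points.

```python
import math

# Toy illustration of airwriting recognition: the glove's sensors yield a 2D
# trajectory, and the recognizer returns the stored letter template whose
# points lie closest to it. Templates and coordinates are invented for the
# example; a production system would use HMMs and proper resampling.

TEMPLATES = {
    "L": [(0, 2), (0, 1), (0, 0), (1, 0), (2, 0)],
    "I": [(0, 2), (0, 1.5), (0, 1), (0, 0.5), (0, 0)],
}

def mean_distance(traj, template):
    """Average point-to-point distance between trajectory and template."""
    return sum(math.dist(p, q) for p, q in zip(traj, template)) / len(template)

def recognize(traj):
    """Nearest-neighbour classification over the letter templates."""
    return min(TEMPLATES, key=lambda letter: mean_distance(traj, TEMPLATES[letter]))

# A slightly noisy "L" stroke written in the air.
stroke = [(0.1, 2.0), (0.0, 0.9), (0.1, 0.1), (1.1, 0.0), (1.9, 0.1)]
print(recognize(stroke))  # -> 'L'
```

The gap between this sketch and the real thing is exactly where the reported error rates come from: with only rigid templates the error stays high, and it is the per-user adaptation that pushes it down.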
  • 11. The system is capable of recognizing capital letters and has an 8,000-word vocabulary. For now the glove is only a prototype, and it is nowhere near perfect, with an 11% error rate; however, the system self-corrects and adapts to the user's writing style, pushing the error rate down to 3%. Google has awarded its creator, Christoph Amma, a Google Faculty Research Award (of over $80,000) in the hope that it will help him develop the system further.

4. eyeSight: eyeSight is a gesture technology which allows you to navigate your devices by just pointing at them, much like using a remote to control your TV, without touching the screen. And get this: the basic requirements for eyeSight to work are a basic 2D webcam (even a built-in one works) and the software; your screen need not even be a touch screen. To navigate, you move your finger to move the cursor, and push your finger forward (like pressing a button) to click. eyeSight works not only with laptops and computers but also with many other devices such as tablets and televisions. As of now, eyeSight is not available for consumer use, but it offers software development kits (SDKs) for the Windows, Android and Linux platforms.

5. Mauz: Mauz is a third-party device that turns your iPhone into a trackpad or mouse. Download the driver to your computer and the app to your iPhone, then connect the device
  • 12. to your iPhone via the charger port. Mauz connects to the computer over Wi-Fi. Start navigating your computer as you would with a regular mouse: left-click, right-click and scroll as normal. Now comes the fun part: you can use gestures with Mauz too. With the iPhone camera on, move your hand to the left to go back a page in your browser, and move it right to go forward a page. If there is an incoming call or a text message, simply handle it and resume using Mauz right after. Unfortunately, Mauz is not available for consumers to buy just yet.

6. PointGrab: PointGrab is similar to eyeSight in that it lets users navigate their computer just by pointing at it. PointGrab comes as software and needs only a 2D webcam; the camera detects your hand movements, and with those you control the computer. PointGrab works with computers running Windows 7 and 8, smartphones, tablets and televisions.
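Point-to-navigate systems like eyeSight and PointGrab all need the same core step: mapping a fingertip found in the webcam image onto screen coordinates, plus some signal for a click. The sketch below is a guess at that plumbing, not either vendor's implementation; the resolutions, mirroring choice and blob-growth click threshold are illustrative assumptions.

```python
# Hedged sketch of webcam pointing: the tracker reports a fingertip in camera
# coordinates, which are scaled to screen pixels, and a quick forward "push"
# (seen as the fingertip blob growing between frames) counts as a click.

CAM_W, CAM_H = 640, 480          # assumed webcam resolution
SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

def fingertip_to_cursor(cam_x, cam_y):
    """Scale a fingertip position from camera space to screen space.

    The x axis is mirrored so moving your hand right moves the cursor right
    (the webcam sees you reversed, like a mirror).
    """
    sx = (1 - cam_x / CAM_W) * SCREEN_W
    sy = (cam_y / CAM_H) * SCREEN_H
    return (round(sx), round(sy))

def is_push_click(prev_area, area, growth=1.5):
    """A push toward the camera makes the fingertip blob grow abruptly."""
    return area >= prev_area * growth

print(fingertip_to_cursor(320, 240))           # centre of frame -> (960, 540)
print(is_push_click(prev_area=100, area=180))  # blob grew 1.8x -> True
```

A real product would add smoothing (the raw fingertip position jitters by a few pixels per frame), but the mirror-and-scale mapping is the essential part.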
  • 13. Fujitsu, Acer and Lenovo have already implemented this technology in laptops and computers that run Windows 8. The software ships with those specific machines and is not available for purchase by itself.

7. Leap Motion: Leap Motion is a motion-sensing device that recognizes the user's fingers with its infrared LEDs and cameras. Because it responds only to fingers held in its field of view, typing on the keyboard registers nothing; but when you hover your fingers above the device, you can navigate your desktop as you would a smartphone or tablet: flick to browse pages, pinch to zoom, and so on.
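The pinch-to-zoom gesture mentioned above reduces to a simple geometric idea once a tracker reports fingertip positions: the zoom factor is the ratio of the current fingertip separation to the separation when the pinch began. This is a generic sketch with made-up millimetre coordinates, not output from the real Leap Motion API.

```python
import math

# Sketch of pinch-to-zoom from tracked fingertips: spread two fingers apart
# and the ratio of current to initial separation gives the zoom factor
# (>1 zooms in, <1 zooms out).

def pinch_zoom_factor(thumb_start, index_start, thumb_now, index_now):
    """Return current/initial fingertip separation as a zoom multiplier."""
    start = math.dist(thumb_start, index_start)
    now = math.dist(thumb_now, index_now)
    return now / start

# Fingers start 40 mm apart and spread to 80 mm: a 2x zoom-in.
print(pinch_zoom_factor((0, 0), (40, 0), (0, 0), (80, 0)))  # -> 2.0
```

Flick-to-browse works the same way as the swipe classification shown earlier, only driven by fingertip velocity rather than palm displacement.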
  • 14. It is a small USB device that works the moment you connect it to your computer. You do not need to charge it, and it works even with non-touch screens. Leap Motion works well with gaming and 3D-related software. You can pre-order Leap Motion for $79.99.

8. Myoelectric Armband: The myoelectric armband, or MYO armband, is a gadget that allows you to control your other Bluetooth-enabled devices using your fingers or hands. When put on, the armband detects movements in your muscles and translates them into gestures that interact with your computer.
  • 15. Moving your hand up or down scrolls the page you are browsing; waving slides through pictures in a photo album or switches between applications running on your system. What would this be good for? At the very least, it will be very good for action games. The MYO armband is available for pre-order at $149.

Conclusion: Today's thoughts are again around the user interface, and efforts are being made to better the technology day in and day out. The touchless touchscreen user interface can be used effectively in computers, cell phones, webcams and laptops. Maybe a few years down the line our bodies can be transformed into a virtual mouse and virtual keyboard; our body may be turned into an input device! Many personal computers will likely have similar screens in the near future. But touch interfaces are nothing new; witness ATM machines. How about getting completely out of touch? A startup called LM3Labs says it is working with major computer makers in Japan, Taiwan and the US to incorporate touchless navigation into their laptops. Called Airstrike, the system uses tiny charge-coupled device (CCD) cameras integrated into each side of the keyboard to detect user movements. You can drag windows around or close them, for instance, by pointing and gesturing in midair above the keyboard. You should be able to buy an Airstrike-equipped laptop next year, with high-end stand-alone keyboards to follow.
  • 16. Any such system is unlikely to replace typing and mousing. But that's not the point. Airstrike aims to give you an occasional quick break from those activities.