SIXTH SENSE TECHNOLOGY
Seminar report submitted in partial fulfillment of the
requirements for the award of the degree of
Bachelor of Technology (B. Tech)
In
Information Technology
Of
University of Calicut
By
JISMI K JACOB
Under the Guidance of
Ms. JISNA V A
Asst. Professor, IT Department
April 2014
Department of Information Technology
JYOTHI ENGINEERING COLLEGE,
Cheruthuruthy, Thrissur – 679 531.
DEPARTMENT OF INFORMATION TECHNOLOGY
JYOTHI ENGINEERING COLLEGE
CHERUTHURUTHY, THRISSUR- 679 531
April 2014
CERTIFICATE
This is to certify that the seminar report entitled “SIXTH SENSE
TECHNOLOGY” being submitted by Ms. JISMI K JACOB in partial
fulfillment of the requirements for the award of the degree of Bachelor of
Technology of the University of Calicut is a bonafide record of the work carried out at
Department of Information Technology, JECC by her during the period
December 2013 – April 2014 under our supervision.
Seminar Guide: Ms. JISNA V A, Asst. Professor
Seminar Coordinator: Ms. SABNA A B, Asst. Professor
H.O.D in charge: Ms. DIVYA M MENON, IT Department
ABSTRACT
Sixth Sense technology is a technology with which a system can be trained to
recognize and perceive real-world objects and react as desired. Sixth Sense technology
bridges the gap between the physical world and the digital world, bringing intangible, digital
information out into the tangible world and allowing us to interact with this information via
natural hand gestures. Sixth Sense technology is implemented in 'Sixth Sense / WUW (Wear
Your World)' using gesture recognition, augmented reality, computer vision and radio
frequency identification. It is a newly born concept that allows the user to connect with the
internet seamlessly. Without a keyboard or mouse, we can view videos and access, change
and move data with simple gestures. However, the bottlenecks of this concept have led to a
modification of the same approach that uses commands instead of gestures: Sixth Sense
technology can be integrated with voice recognition, and a Bluetooth device and laser
projectors can be used.
INDEX
Contents
1. INTRODUCTION
2. OVERVIEW
3. ORIGIN OF IDEA
   3.1 What is Sixth Sense Technology?
   3.2 Earlier Sixth Sense Prototype
   3.3 Recent Prototype
4. DETAILED DESCRIPTION
   4.1 Components
   4.2 Working of Sixth Sense Technology
   4.3 Related Technologies
   4.4 Applications
   4.5 Advantages
   4.6 Future Enhancements
5. CONCLUSION
6. BIBLIOGRAPHY
List of Figures
3.1 Six Senses
3.2 Earlier Prototype
3.3 Recent Prototypes
4.1.1 Camera
4.1.2 Projector
4.1.3 Mirror
4.1.4 Smartphone
4.1.5 Color Marker
4.1.6 Microphone
4.2 Working of the Sixth Sense
4.3.1 Augmented Reality
4.3.2 Computer Vision
4.3.3 Hand Gestures
4.3.4 Components of RFID
CHAPTER 1
INTRODUCTION
We use our five natural senses to perceive any information; that information helps us
make decisions and choose the right actions to take. But arguably the most useful information
that can help us make the right decision is not naturally perceivable with our five senses,
namely the data, information and knowledge that mankind has accumulated about everything
and which is increasingly all available online. Although the miniaturization of computing
devices allows us to carry computers in our pockets, keeping us continually connected to the
digital world, there is no link between our digital devices and our interactions with the
physical world. Traditionally, information is confined to paper or to a screen. Sixth
Sense Technology bridges this gap, bringing intangible, digital information out into the
tangible world and allowing us to interact with this information via natural hand gestures.
'Sixth Sense' frees information from its confines by seamlessly integrating it with reality,
thus making the entire world your computer. WUW was developed by Pranav Mistry, a Ph.D.
student in the Fluid Interfaces Group at the MIT Media Lab. The Sixth Sense prototype
implements several applications that demonstrate the usefulness, viability and flexibility of
the system; the mobile device acts as the computer and as the connection to the cloud, where
the information is stored on the web.
Sixth Sense recognizes the objects around you, displaying information automatically
and letting you access it in any way you want, in the simplest way possible. The device
brings us closer to reality and assists us in making right decisions by providing the relevant
information, thereby, making the entire world a computer. The technology is mainly based on
hand gesture recognition, image capturing, processing, and manipulation, etc. The software of
the technology uses the video stream, which is captured by the camera, and also tracks the
location of the tips of the fingers to recognize the gestures. This process is done using some
techniques of computer vision. Mistry invented 'Sixth Sense / WUW (Wear UR World)', a
wearable, gestural, user-friendly interface that links the physical world around us with
digital information and uses hand gestures to interact with it. This technology is a
revolutionary way to interface the physical world with digital information. Modern
technologies include touch-screen techniques, which are widely used and make operation
easier and faster.
This paper deals with the latest technology called Sixth Sense. It is a wearable
interface that augments the physical world around us with digital information. It is a newly
born concept that allows the user to connect with the internet seamlessly. Without a keyboard
or mouse, we can view videos and access, change and move data with simple gestures. The
bottlenecks of this concept, however, lead to a modification of the same approach that uses
commands instead of gestures. A speech IC is used as a database of commands, which is
initially trained for storage. It executes the corresponding command by accessing the
operation on the mobile device connected to it, and the resulting action is projected onto any
surface using a projector.
CHAPTER 2
OVERVIEW
Previously, many technologies have evolved, such as augmented reality, whose purpose is to
add information and meaning to a real object or place. Unlike virtual reality, augmented
reality does not create a simulation of reality; instead, it takes a real object or space as the
foundation and incorporates technologies that add contextual data to deepen a person's
understanding of the subject. It is a term for a live, direct or indirect view of a physical,
real-world environment whose elements are augmented by virtual, computer-generated
imagery. Gesture recognition aims to interpret human gestures through mathematical
algorithms. Computer vision is the science and technology of machines that see; it is
concerned with the theory behind artificial systems that extract information from images.
As a technological discipline, computer vision seeks to apply its theories and models
to the construction of computer vision systems. Examples include controlling processes,
detecting events, organising information, modelling objects or environments, and interaction.
Recently, speech integrated circuits have evolved and are widely used in car automation and
home appliances. They ease operation and save the time spent on the manual operations that
humans perform every day. The speech recognition process is performed by a software
component known as the speech recognition engine. Its primary function is to process the
spoken input and translate it into text that the application understands. The application can
then do one of two things: it can interpret the result of the recognition as a command, in
which case it is a command-and-control application, or it can handle the recognized text
simply as text, in which case it is a dictation application. When the user says something, it is
known as an utterance; an utterance is a stream of speech between two periods of silence. The
speech IC draws on stored data, statistical models and algorithms to convert spoken input
into text.
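To make the command-and-control idea concrete, the following is a minimal sketch in Python, assuming the third-party speech_recognition package and a small, hypothetical command table; the speech IC described above would implement the same lookup in firmware rather than in software.

import speech_recognition as sr

# Hypothetical command vocabulary (an illustration, not the report's trained IC database).
COMMANDS = {
    "open inbox": "INBOX",
    "show clock": "CLOCK",
    "open gallery": "GALLERY",
}

def listen_for_command(recognizer, microphone):
    # One utterance is the stream of speech between two periods of silence.
    with microphone as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        text = recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return None                      # the utterance was not understood
    return COMMANDS.get(text)            # None if the text is not a trained command

if __name__ == "__main__":
    action = listen_for_command(sr.Recognizer(), sr.Microphone())
    print("Recognized action:", action)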
CHAPTER 3
ORIGIN OF IDEA
This technology is a revolutionary way to interface the physical world with digital
information. Modern technologies include touch-screen techniques, which are widely used
and make operation easier and faster. Sixth Sense is a wearable gestural interface that
augments the physical world around us with digital information and lets us use natural hand
gestures to interact with that information. But the bottlenecks of this method, such as the
position of the camera used for capturing gestures, which affects the accuracy of the projected
output, lead to the use of commands instead of hand gestures. The position of the camera is a
major constraint on the efficiency and accuracy of image capture and of the projected output.
Therefore, the actions that we regularly perform in our daily life are converted to commands
and trained into a speech IC. They are stored as a database in the integrated circuit, and the
corresponding actions are performed when the user's speech is recognized. It is a hi-tech
device that seamlessly integrates analog information with our everyday physical world. The
voice command is turned into an operation within a fraction of a second, and the action is
projected onto the surface. It is a portable device and eases the operations that we regularly
perform. Basically, the Sixth Sense technology concept involves the use of hand gestures: the
fingertips carry colored markers, and the gestures performed are captured by the camera.
They are then given to the mobile device for the corresponding action to be performed, and
the action is projected onto the surface through the projector. Software algorithms and
computer vision technologies are used to trigger the action on the mobile device for the
corresponding gesture captured by the camera. This gesture-based technology is used for a
variety of applications such as performing basic actions, locating points on a map, watching
video in a newspaper, and dialing a number on the hand. A slight modification of this method
leads to the use of commands, that is, bringing analog information into the real world. The
analog data is converted into digital form and performed as an action, since hand gestures
cannot be used at all times. This is how the wearable device is fitted to the human body, with
color markers used on the fingertips.
In this approach, commands are used to perform the same operations. Many
high-technology speech integrated circuits have evolved, which enhance operation with more
advanced features. To ensure accurate gesture recognition and an intuitive interface, a number
of constraints are applied. A region in front of the projection screen is defined as the active
zone, and gestures are ignored if they are performed outside this area. Gestures are also
defined by a set start posture, end posture and dynamic motion between the start and end
postures. Perhaps the use of gestures is most powerful when combined with other input
modalities, especially voice. Allowing combined voice and gestural input has several tangible
advantages. The first is purely practical: ease of expression. Ease corresponds to the
efficiency with which commands can be remembered, and expressiveness to the size of the
command vocabulary.
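The following is a minimal Python sketch of these two constraints; the rectangular active-zone bounds and the posture labels are hypothetical placeholders, not values taken from the report.

# Sketch of the gesture constraints described above: a gesture is ignored if it
# leaves the active zone, and is accepted only if its start and end postures
# match a defined pair (zone bounds and posture names are hypothetical).
ACTIVE_ZONE = (100, 100, 540, 380)          # x_min, y_min, x_max, y_max in pixels

GESTURES = {
    ("pinch_open", "pinch_closed"): "ZOOM_OUT",
    ("pinch_closed", "pinch_open"): "ZOOM_IN",
}

def in_active_zone(x, y):
    x_min, y_min, x_max, y_max = ACTIVE_ZONE
    return x_min <= x <= x_max and y_min <= y <= y_max

def classify(start_posture, end_posture, path):
    # path: the (x, y) samples of the dynamic motion between the two postures.
    if not all(in_active_zone(x, y) for x, y in path):
        return None                          # performed outside the active zone
    return GESTURES.get((start_posture, end_posture))

print(classify("pinch_closed", "pinch_open", [(200, 200), (240, 210)]))   # ZOOM_IN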
3.1 WHAT IS SIXTH SENSE?
Sixth sense, in scientific or non-scientific terms, is defined as extra sensory perception. It
involves the reception of information not gained through any of the five senses: namely, the
data, information and knowledge that mankind has accumulated about everything and which
is increasingly available online.
Figure 3.1: Six senses
3.2 EARLIER SIXTH SENSE PROTOTYPE
Steve Mann, regarded as the father of Sixth Sense technology, built a wearable
computer in 1990 and implemented a neck-worn projector with a camera system. Maes'
MIT group, which includes seven graduate students, was thinking about how a person could
be more integrated into the world around them and access information without having to do
something like take out a phone. They initially produced a wristband that would read a
Radio Frequency Identification tag to know, for example, which book a user is holding in a
store. They also had a ring that used infrared to communicate by beacon with supermarket
smart shelves to give you information about products. When we grab a package of macaroni,
the ring would glow red or green to tell us whether the product was organic or free of peanut
traces, or whatever criteria we program into the system. They wanted to make information
more useful to people in real time with minimal effort and in a way that does not require any
behavior changes. The wristband was getting close, but the user still had to take out a cell
phone to look at the information. That is when they struck on the idea of accessing
information from the internet and projecting it. So someone wearing the wristband could pick
up a paperback in the bookstore and immediately call up reviews about the book, projecting
them onto a surface in the store, or do a keyword search through the book by accessing
digitized pages on Amazon or Google Books. They started with a larger projector that was
mounted on a helmet. But that proved cumbersome: if someone was projecting data onto a
wall and then turned to speak to a friend, the data would project onto the friend's face.
Figure 3.2: Earlier prototype
3.3 RECENT PROTOTYPE
WUW was developed by Pranav Mistry, a Ph.D. student in the Fluid Interfaces Group
at the MIT Media Lab. The Sixth Sense prototype implements several applications that
demonstrate the usefulness, viability and flexibility of the system; the mobile device acts as
the computer and as the connection to the cloud, where the information is stored on the web.
The key here is that Sixth Sense recognizes the objects around you, displaying information
automatically and letting you access it in any way you want, in the simplest way possible.
The device brings us closer to reality and assists us in making the right decisions by
providing the relevant information, thereby making the entire world a computer. The
technology is mainly based on hand gesture recognition, image capturing, processing and
manipulation. The software uses the video stream captured by the camera and tracks the
locations of the fingertips to recognize gestures, using techniques of computer vision. Mistry
named the system 'Sixth Sense / WUW (Wear UR World)': a wearable, gestural,
user-friendly interface that links the physical world around us with digital information and
uses hand gestures to interact with it.
Figure 3.3: Recent prototypes
CHAPTER 4
DETAILED DESCRIPTION
4.1 COMPONENTS
The hardware components are coupled in a pendant-like mobile wearable device. The
components are:
1. Camera
2. Projector
3. Mirror
4. Mobile component
5. Color markers
6. Microphone
4.1.1 Camera
The camera captures objects in view and tracks the user's hand gestures, sending the
data to the smartphone. The system recognizes and tracks the user's hand gestures and physical objects
using computer-vision based techniques. Sixth Sense system implements a gestural camera
that takes photos of the scene the user is looking at by detecting the ‘framing’ gesture. It acts
as a digital eye, connecting you to the world of digital information.
Figure 4.1.1: Camera
4.1.2 Projector
A tiny LED projector displays data sent from the smartphone onto any surface in view: an
object, a wall, or a person. The projector projects visual information, enabling surfaces, walls
and physical objects around us to be used as interfaces. The aim is for this to merge with the
physical world in a real, physical sense: you are touching an object and projecting information
onto that object, so the information looks like it is part of the object. The projector contains an
internal battery with about 3 hours of battery life.
Figure 4.1.2: Projector
4.1.3 Mirror
The mirror reflects the projection coming out from the projector and thus helps in
projecting onto the desired locations on walls or surfaces. The user can manually change the
tilt of the mirror to change the location of the projection. For example, if the user wants the
projection to fall on the ground instead of on the surface in front, he or she can change the tilt
of the mirror accordingly. Thus, the mirror in the Sixth Sense helps in overcoming the
limitation of the projector's limited projection space.
Figure 4.1.3: Mirror
4.1.4 Mobile component
The Sixth Sense system uses a mobile computing device in the user's pocket as the
processing device. The software program enabling all the features of the system runs on this
computing device. This device can be a mobile phone or a small laptop computer. The
camera, the projector and the microphone are connected to this device using a wired or
wireless connection. The details of the software program that runs on this device are provided
in the next section. The mobile computing device is also connected to the Internet via a 3G
network or a wireless connection.
A Web-enabled smart phone in the user's pocket processes the video data, using
vision algorithms to identify the object. Other software searches the Web and interprets the
hand gestures.
Figure 4.1.4: Smartphone
4.1.5 Color marker
Color markers are placed at the tips of the user's fingers. Marking the user's fingers with red,
yellow, green and blue tape helps the webcam recognize gestures. The camera tracks the
movements of these color markers, and their movements and arrangements are interpreted
into gestures that act as interaction instructions for the projected application interfaces.
The software program processes the video stream data captured by the camera and
tracks the locations of the colored markers (visual tracking fiducials) at the tip of the user’s
fingers using simple computer-vision techniques. The movements and arrangements of these
fiducials are interpreted into gestures that act as interaction instructions for the projected
application interfaces. The maximum number of tracked fingers is only constrained by the
number of unique fiducials, thus Sixth Sense also supports multi-touch and multi-user
interaction.
Figure 4.1.5: Color marker
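As an illustration of this kind of visual tracking, the following is a minimal Python/OpenCV sketch that finds the centre of one coloured marker by HSV thresholding; the HSV range is an illustrative value for a red cap, not the prototype's calibration, and the OpenCV 4 return convention of findContours is assumed.

import cv2
import numpy as np

# Illustrative HSV range for a red marker cap (not the prototype's calibration).
LOWER_RED = np.array([0, 120, 70])
UPPER_RED = np.array([10, 255, 255])

def track_marker(frame):
    # Threshold the frame in HSV space and keep only marker-coloured pixels.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    # Centroid of the largest blob = estimated fingertip position.
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("Fingertip marker at:", track_marker(frame))
cap.release()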
4.1.6 Microphone
The microphone is an optional component of the Sixth Sense. It is required when
using a paper as a computing interface. When the user wants to use a sheet of paper as an
interactive surface, he or she clips the microphone to the paper. The microphone attached in
this way captures the sound signals of the user touching the paper. This data is passed to the
computing device for processing. Combined with the tracking information about the user's
finger, the system is able to identify precise touch events on the paper. Here, the sound signal
captured by the microphone provides the timing information, whereas the camera performs
the tracking. The applications enabled by this technique are described in the applications
section.
Figure 4.1.6: Microphone
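A minimal sketch of this timing-plus-tracking idea is shown below; it assumes the audio has already been captured as a NumPy amplitude array with a known sample rate and that the camera tracker supplies a timestamped fingertip log, both of which are hypothetical inputs.

import numpy as np

def touch_times(audio, sample_rate, threshold=0.6):
    # Sample indices where the amplitude spikes above a fraction of the peak
    # (a real implementation would merge consecutive samples into one event).
    peaks = np.flatnonzero(np.abs(audio) > threshold * np.max(np.abs(audio)))
    return peaks / sample_rate                     # seconds from recording start

def touch_events(audio, sample_rate, fingertip_log):
    # fingertip_log: list of (timestamp_seconds, (x, y)) from the camera tracker.
    events = []
    for t in touch_times(audio, sample_rate):
        earlier = [(ts, pos) for ts, pos in fingertip_log if ts <= t]
        if earlier:
            events.append((t, earlier[-1][1]))     # pair spike with latest fingertip
    return events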
4.2 WORKING OF SIXTH SENSE TECHNOLOGY
The hardware that makes Sixth Sense work is a pendant-like mobile wearable
interface. It has a camera, a mirror and a projector, and is connected wirelessly to a Bluetooth
smartphone that can slip comfortably into one's pocket. The camera recognizes individuals,
images, pictures and the gestures one makes with one's hands. This information is sent to the
smartphone for processing. The downward-facing projector projects the output image onto
the mirror, and the mirror reflects the image onto the desired surface. Thus, digital
information is freed from its confines and placed in the physical world.
Figure 4.2: Working of the Sixth Sense
The entire hardware apparatus is encompassed in a pendant-shaped mobile wearable
device. Basically, the camera recognizes individuals, images, pictures and the gestures one
makes with one's hands, and the projector assists in projecting information onto whatever
type of surface is present in front of the person. The mirror is significant because the
projector dangles from the neck pointing downwards. In the demo video broadcast to
showcase the prototype to the world, Mistry uses colored caps on his fingers so that it
becomes simpler for the software to differentiate between the fingers for different
applications.
The software program analyses the video data captured by the camera and tracks the
locations of the colored markers using simple computer vision techniques. One can have any
number of hand gestures and movements as long as they can be reliably identified and
differentiated by the system, preferably through unique and varied fiducials. This is what
allows the 'Sixth Sense' device to support multi-touch and multi-user interaction.
The technology is mainly based on hand gesture recognition, image capturing,
processing, and manipulation, etc. The map application lets the user navigate a map displayed
on a nearby surface using hand gestures, similar to gestures supported by multi-touch based
systems, letting the user zoom in, zoom out or pan using intuitive hand movements. The
drawing application lets the user draw on any surface by tracking the fingertip movements of
the user’s index finger.
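As a concrete illustration of the zoom gesture, a minimal sketch is given below: the zoom factor follows the change in distance between the thumb and index fiducials. The mapping is an illustrative assumption, not the prototype's actual implementation.

import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def zoom_factor(thumb_prev, index_prev, thumb_now, index_now):
    # Ratio of fingertip separation after vs. before the movement.
    before = distance(thumb_prev, index_prev)
    after = distance(thumb_now, index_now)
    if before == 0:
        return 1.0
    return after / before          # > 1 means the fingers moved apart: zoom in

# Fingers move from 80 px apart to 120 px apart, so the map zooms in by 1.5x.
print(zoom_factor((100, 100), (180, 100), (90, 100), (210, 100)))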
4.3 RELATED TECHNOLOGIES
Previously, many technologies have evolved, such as augmented reality, whose purpose is to
add information and meaning to a real object or place. Unlike virtual reality, augmented
reality does not create a simulation of reality; instead, it takes a real object or space as the
foundation and incorporates technologies that add contextual data to deepen a person's
understanding of the subject. It is a term for a live, direct or indirect view of a physical,
real-world environment whose elements are augmented by virtual, computer-generated
imagery. Sixth Sense technology takes a different approach to computing and tries to make
the digital aspect of our lives more intuitive, interactive and, above all, more natural. We
shouldn't have to think about it separately. It is a lot of complex technology squeezed into a
simple portable device. When we bring in connectivity, we can get instant, relevant visual
information projected onto any object we pick up or interact with. The technology is mainly
based on augmented reality, gesture recognition, computer vision based algorithms, etc.
4.3.1 AUGMENTED REALITY
Augmented reality is a term for a live, direct or indirect view of a physical, real-world
environment whose elements are augmented by virtual, computer-generated imagery.
Augmented reality blurs the line between what is real and what is computer-generated by
enhancing what we see, hear, feel and smell. It is one of the newest innovations in the
electronics industry: it superimposes graphics, audio and other sense enhancements from
computer screens onto real-time environments. Augmented reality goes far beyond the static
graphics technology of television, where the imposed graphics do not change with the
perspective. Augmented reality systems superimpose graphics for every perspective and
adjust to every movement of the user's head and eyes. The basic idea of augmented reality is
to superimpose graphics, audio and other sensory enhancements over a real-world
environment in real time. However, augmented reality is more advanced than any technology
you have seen in television broadcasts, although some new TV effects come close, such as
RACEf/x and the superimposed first-down line on televised U.S. football games, both created
by Sportvision. But these systems display graphics for only one point of view. Mobile AR
browsers such as Layar go further: you can point the phone at a building, and Layar will tell
you if any companies in that building are hiring, or it might be able to find photos of the
building on Flickr. Augmented reality adds graphics, sounds, haptic feedback and smell to the
natural world as it exists. Everyone from tourists, to soldiers, to someone looking for the
closest subway stop can now benefit from the ability to place computer-generated graphics in
their field of vision.
The main hardware components for augmented reality are: display, tracking, input
devices, and computer. A combination of a powerful CPU, camera, accelerometers, GPS and
solid-state compass is often present in modern smartphones, which makes them prospective
platforms. There are three major display techniques for augmented reality:
1. Head Mounted Displays
2. Handheld Displays
3. Spatial Displays
1. Head Mounted Displays
A Head Mounted Display (HMD) places images of both the physical world and
registered virtual graphical objects over the user's view of the world. HMDs are either
optical see-through or video see-through in nature.
2. Handheld Displays
Handheld augmented reality employs a small computing device with a display that fits in a
user's hand. All handheld AR solutions to date have employed video see-through techniques
to overlay graphical information onto the physical world. Initially, handheld AR employed
sensors such as digital compasses and GPS units for six-degree-of-freedom tracking.
3. Spatial Displays
Instead of the user wearing or carrying the display, as with head mounted displays or
handheld devices, Spatial Augmented Reality (SAR) makes use of digital projectors to
display graphical information onto physical objects.
Figure 4.3.1: Augmented reality
4.3.2 COMPUTER VISION
Computer vision is the science and technology of machines that see. It is concerned
with the theory behind artificial systems that extract information from images. An image is a
huge array of gray-level (brightness) values of individual pixels. Taken individually, these
numbers are almost meaningless, because they contain very little information about the scene.
A robot needs information like "object ahead", "table to the left", or "person approaching" to
perform its tasks. The conversion of this huge amount of low-level information into usable
high-level information is the subject of computer vision. Earlier algorithms were too
computationally expensive to run in real time and also required substantial memory and
modelling. We concentrate on two types of images frequently used in computer vision:
intensity images, which are photograph-like images encoding light intensities, and range
images, which encode shape and distance (e.g., from sonar or laser sensors).
Figure 4.3.2: Computer vision
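The low-level to high-level conversion can be pictured with a tiny Python sketch; the synthetic brightness array and the thresholds below are illustrative values only.

import numpy as np

frame = np.full((6, 8), 200, dtype=np.uint8)   # bright background pixels
frame[2:5, 3:6] = 30                           # a dark region: a nearby object

dark = frame < 80                              # per-pixel, low-level decision
if dark.mean() > 0.1:                          # region-level, high-level decision
    print("object ahead")                      # the kind of statement a robot needs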
4.3.3 GESTURE RECOGNITION
Gesture recognition is a topic in computer science and language technology with the
goal of interpreting human gestures via mathematical algorithms. Gestures can originate from
any bodily motion or state but commonly originate from the face or hand. Current focuses in
the field include emotion recognition from the face and hand gesture recognition. The
keyboard and mouse are currently the main interfaces between man and computer. Humans
communicate mainly by vision and sound, therefore, a man-machine interface would be more
intuitive if it made greater use of vision and audio recognition. Another advantage is that the
user can not only communicate from a distance but also needs no physical contact with the
computer. Moreover, unlike audio commands, a visual system would be preferable in noisy
environments or in situations where sound would cause a disturbance. Many approaches have
been made using cameras and computer vision algorithms to interpret sign language.
However, the identification and recognition of posture, gait, proxemics, and human behaviors
is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way
for computers to begin to understand human body language, thus building a richer bridge
between machines and humans than primitive text user interfaces or even GUIs (graphical
user interfaces), which still limit the majority of input to keyboard and mouse. Gesture
recognition enables humans to interface with the machine (HMI) and interact naturally
without any mechanical devices. Using the concept of gesture recognition, it is possible to
point a finger at the computer screen so that the cursor will move accordingly. Gesture
recognition is useful for processing information from humans which is not conveyed through
speech or typing. There are also various types of gestures which can be identified by
computers.
Figure 4.3.3: Hand gestures
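As a sketch of the finger-pointing-as-cursor idea mentioned above, the mapping below scales a fingertip position in the camera frame to a screen cursor position; the resolutions are illustrative assumptions.

# Map a tracked fingertip in the camera frame to a screen cursor position
# (the camera and screen resolutions are illustrative assumptions).
CAM_W, CAM_H = 640, 480
SCREEN_W, SCREEN_H = 1920, 1080

def fingertip_to_cursor(x_cam, y_cam):
    x = int(x_cam / CAM_W * SCREEN_W)
    y = int(y_cam / CAM_H * SCREEN_H)
    # Clamp to the screen so a fingertip at the frame edge stays on screen.
    return min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1)

print(fingertip_to_cursor(320, 240))   # centre of the frame maps to (960, 540)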
4.3.4 RADIO FREQUENCY IDENTIFICATION
Radio Frequency Identification (RFID) is basically an electronic tagging technology
that allows the detection and tracking of tags, and consequently of the objects they are affixed
to. RFID uses communication via radio waves to exchange data between a reader and an
electronic tag attached to an object, for the purpose of identification and tracking. Some tags
can be read from several meters away and beyond the line of sight of the reader. Bulk reading
enables an almost parallel reading of tags. RFID is a generic term that is used
to describe a system that transmits the identity (in the form of a unique serial number) of an
object or person wirelessly, using radio waves. It's grouped under the broad category of
automatic identification technologies. Radio-frequency identification involves interrogators
(also known as readers), and tags (also known as labels). Most RFID tags contain at least two
parts. One is an integrated circuit for storing and processing information, modulating and
demodulating a radio-frequency (RF) signal, and other specialized functions. The other is an
antenna for receiving and transmitting the signal. RFID is in use all around us. If you have
ever chipped your pet with an ID tag, used EZ Pass through a toll booth, or paid for gas using
Speed Pass, you've used RFID. In addition, RFID is increasingly used with biometric
technologies for security.
The idea of Sixth Sense is to use Radio Frequency Identification technology in
conjunction with a bunch of other enterprise systems such as the calendar system or online
presence that can track user activity. Here, we consider an enterprise setting of the future
where people (or rather their employee badges) and their personal objects such as books,
laptops, and mobile phones are tagged with cheap, passive RFID tags, and there is good
coverage of RFID readers in the workplace. Sixth Sense incorporates algorithms that start
with a mass of undifferentiated tags and automatically infer a range of information based on
an accumulation of observations. The technology is able to automatically differentiate
between people tags and object tags, learn the identities of people, infer the ownership of
objects by people, learn the nature of different zones in a workspace (e.g., private office
versus conference room), and perform other such inferences. By combining information from
these diverse sources, Sixth Sense records all tag-level events in a raw database. The
inference algorithms consume these raw events to infer events at the level of people, objects,
and workspace zones, which are then recorded in a separate processed database. Applications
can either poll these databases by running SQL queries or set up triggers to be notified of
specific events of interest. Sixth Sense infers when a user has interacted with an object, for
example, when you pick up your mobile phone. It is a platform in that its programming
model makes the inferences made automatically available to applications via a rich set of
APIs. To demonstrate the capabilities of the platform, the researchers have prototyped a few
applications using these APIs, including a misplaced object alert service, an enhanced
calendar service, and rich annotation of video with physical events.
Figure 4.3.4: Components of RFID
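To illustrate the polling model described above, the following Python sketch runs a SQL query against an in-memory SQLite database with a hypothetical schema; the actual platform's database, tables and columns are not specified in this report.

import sqlite3

# Hypothetical processed-events table; SQLite stands in for whatever database
# the actual Sixth Sense RFID platform uses.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE inferred_events (
                    ts TEXT, person TEXT, object TEXT, event TEXT)""")
conn.execute("INSERT INTO inferred_events VALUES "
             "('2014-04-01 09:15', 'alice', 'mobile_phone', 'picked_up')")

# Poll for recent interactions with a given object (misplaced-object style query).
rows = conn.execute(
    "SELECT ts, person, event FROM inferred_events "
    "WHERE object = ? ORDER BY ts DESC LIMIT 5",
    ("mobile_phone",)).fetchall()
print(rows)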
4.4 APPLICATIONS
Basic operations such as checking the clock, the inbox, the calendar and the contact
list, browsing, and searching the gallery are performed on the mobile device all the time.
These operations can be stored as commands in the IC and then accessed on the screen or
over any surface using this technology within a fraction of a second.
4.4.1 MAKE A CALL
The Sixth Sense can project a keypad onto one's hand, which can then be used as a
virtual keypad to make a call. Calling a number is no longer a great task: no mobile device is
required, and the user simply types in the number with the palm acting as the virtual keypad.
The keys come up on the fingers of one hand, and the fingers of the other hand are used to
key in the number and place the call.
Figure 4.4.1: Making a call
4.4.2 CHECK THE TIME
When you draw a circle on your wrist with your index finger, a virtual watch appears
that gives you the correct time. The computer tracks the red marker cap or piece of tape,
recognizes the gesture, and instructs the projector to flash the image of a watch onto the
wrist.
Figure 4.4.2: Checking time
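One simple way to recognise this "draw a circle" gesture is sketched below: the tracked fingertip path is accepted as a circle when every point lies at roughly the same distance from the path's centroid. The tolerance and sample path are illustrative values, not the prototype's actual recogniser.

import math

def is_circle(path, tolerance=0.25):
    # path: the (x, y) fingertip positions sampled while the gesture is drawn.
    if len(path) < 8:
        return False
    cx = sum(x for x, _ in path) / len(path)
    cy = sum(y for _, y in path) / len(path)
    radii = [math.hypot(x - cx, y - cy) for x, y in path]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    # Accept if no radius deviates from the mean by more than the tolerance.
    return all(abs(r - mean_r) / mean_r < tolerance for r in radii)

# A rough circle of radius 50 px around (200, 200), sampled every 45 degrees.
path = [(200 + 50 * math.cos(a), 200 + 50 * math.sin(a))
        for a in (i * math.pi / 4 for i in range(8))]
print(is_circle(path))   # True, so the virtual watch would be projected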
4.4.3 ZOOMING FEATURES
The user can zoom in or zoom out by using their intuitive hand movements.
Figure 4.4.3: Zooming
4.4.4 GET PRODUCT INFORMATION
The sixth sense uses image recognition or a marker technology to recognize the products
we pick up and gives us the information on those products. For example, if you're trying to
shop "green" and are looking for paper towels with the least amount of bleach in them, the
system will scan the product you pick up off the shelf and give you guidance on whether this
product is a good choice for you.
Figure 4.4.4: Product information
4.4.5 GET FLIGHT UPDATES
When we show our boarding pass, the system recognizes it and lets us know whether
the flight is on time and whether the gate has changed.
Figure 4.4.5: Flight updates
4.4.6 TAKE A PICTURE
If you fashion your index fingers and thumbs into a square (the “framing” gesture), the
system snaps a photo at that time. After taking the desired number of photos, we can project
them onto a surface to view them.
Figure 4.4.6: Take pictures
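A minimal sketch of detecting the framing gesture is given below: the four tracked fingertips (both thumbs and both index fingers) should sit roughly at the corners of a rectangle. The tolerance check and the sample coordinates are illustrative assumptions.

def is_framing(corners, tolerance=0.2):
    # corners: four (x, y) fingertip positions in camera coordinates.
    xs = sorted(x for x, _ in corners)
    ys = sorted(y for _, y in corners)
    width, height = xs[-1] - xs[0], ys[-1] - ys[0]
    if width == 0 or height == 0:
        return False
    # Every fingertip should lie close to an edge of the bounding box.
    for x, y in corners:
        near_x = min(abs(x - xs[0]), abs(x - xs[-1])) < tolerance * width
        near_y = min(abs(y - ys[0]), abs(y - ys[-1])) < tolerance * height
        if not (near_x and near_y):
            return False
    return True

# Four fingertips near the corners of a frame, so a photo would be taken.
print(is_framing([(100, 100), (300, 105), (98, 220), (305, 218)]))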
4.4.7 CALL UP A MAP
Sixth Sense lets the user call up a map of his or her choice and display it on any
physical surface, and then use the thumbs and index fingers to navigate it, for example to
zoom in and out and perform other controls.
Figure 4.4.7: Virtual map
4.4.8 CHECK THE EMAIL
Drawing an '@' symbol on any surface lets the user check email using the web service.
4.4.9 CREATE MULTIMEDIA READING EXPERIENCES
The Sixth Sense system also augments the physical objects the user is interacting
with by projecting more information about these objects onto them. For example, a
newspaper can show live video news, or dynamic information can be provided on a regular
piece of paper; thus a piece of paper turns into a video display.
Figure 4.4.9: Multimedia reading
4.4.10 FEED INFORMATION ON PEOPLE
Sixth Sense can display relevant information about a person we are looking at. It is
also capable of a more controversial use: when you go out and meet someone, it can project
relevant information such as what they do and where they work, displaying tags about the
person floating on their shirt. It could even display their Facebook relationship status so that
you know not to waste your time.
Figure 4.4.10: info about person
4.5 ADVANTAGES
1. Portable and cost effective
2. Supports multi-touch and multi-user interaction
3. Connectedness between the world and information
4. Data access directly from the machine in real time
5. Mind-map ideas anywhere
6. Assists us in making the right decisions
7. Serves the purpose of a computer and saves the time spent searching for information
8. Recognizes the user's freehand gestures (postures)
9. Saves electricity
4.6 FUTURE ENHANCEMENTS
Imagine a world where Sixth Sense technology is applied everywhere. In the
educational field, the number of hardware components could be reduced, and the usage of
paper and electricity could decrease; students could use any wall or surface, wherever they
are, to carry out activities normally done on a PC. Security could be improved, and the
technology could help in rendering defense services. In the medical field, it could be applied
to check the genuineness of drugs. It could be used to monitor agricultural land. Blind people
could be able to read books and recognize objects, and it could be used for the betterment of
handicapped people. Sixth Sense could make the world magical.
CHAPTER 5
CONCLUSION
Sixth Sense technology recognizes the objects around us, displays information
automatically and lets us access it in any way we need, allowing us to interact with this
information through natural hand gestures. The Sixth Sense prototype implements several
applications that demonstrate the usefulness, viability and flexibility of the system, and it has
the potential of becoming the ultimate "transparent" user interface for accessing information
about everything around us. Currently the prototype of the device costs around $350 to build.
It could change the way we interact with the real world and truly give everyone complete
awareness of the environment around them. It will definitely revolutionize the world.
The Sixth Sense software will be open source. Since the hardware is a small set of
components, there will not be elaborate user interfaces or highly advanced programs for
users, but the coding inside the device will need to be hardened and secured to ensure the
security of the software. It will be interesting to see what languages are used for coding a
Sixth Sense device.
BIBLIOGRAPHY
[1] T. Kirishima, K. Sato and K. Chihara, "Gesture Spotting and Recognition for Human–Robot
Interaction", IEEE Transactions on Robotics, Vol. 23, No. 2, pp. 256-270, April 2007.
[2] J. Alon, V. Athitsos, Q. Yuan and S. Sclaroff, "A Unified Framework for Gesture
Recognition and Spatiotemporal Gesture Segmentation", IEEE Transactions on Pattern
Analysis and Machine Intelligence, Vol. 31, No. 9, pp. 1685-1699, Sept. 2009.
[3] A. M. Gomez, A. M. Peinado, V. Sanchez and A. J. Rubio, "Recognition of Coded Speech
Transmitted over Wireless Channels", IEEE Transactions on Wireless Communications,
Vol. 5, No. 9, pp. 2555-2562, Sept. 2006.
[4] J. R. Evans, W. A. Tjoland and L. G. Allred, "Achieving a Hands-Free Computer Interface
Using Voice Recognition and Speech Synthesis", IEEE, Vol. 15, No. 1, pp. 14-16, Jan. 2000.
[5] C. Pelaez-Moreno, A. Gallardo-Antolin and Diaz-de-Maria, "Recognizing Voice over IP:
A Robust Front-End for Speech Recognition on the World Wide Web", IEEE Transactions
on Multimedia, Vol. 3, No. 2, pp. 209-218, June 2001.

Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
 
Cosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdfCosmetic shop management system project report.pdf
Cosmetic shop management system project report.pdf
 
Architectural Portfolio Sean Lockwood
Architectural Portfolio Sean LockwoodArchitectural Portfolio Sean Lockwood
Architectural Portfolio Sean Lockwood
 
CME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional ElectiveCME397 Surface Engineering- Professional Elective
CME397 Surface Engineering- Professional Elective
 

tangible world, and allowing us to interact with this information via natural hand gestures. 'Sixth Sense' frees information from its confines by seamlessly integrating it with reality, making the entire world your computer.

Sixth Sense/WUW was developed by Pranav Mistry, a Ph.D. student in the Fluid Interfaces Group at the MIT Media Lab. He named it 'Sixth Sense / WUW (Wear UR World)': a wearable, gestural, user-friendly interface that links the physical world around us with digital information and lets us use hand gestures to interact with it. The prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system; the wearable device acts as the computer and as the connection to the cloud, all the information stored on the web. Sixth Sense recognizes the objects around you, displays information automatically and lets you access it in the simplest way possible. By providing relevant information, the device brings us closer to reality and assists us in making the right decisions, thereby making the entire world a computer. The technology is based mainly on hand gesture recognition and on image capturing, processing and manipulation. The software uses the video stream captured by the camera and tracks the locations of the fingertips to recognize gestures, using techniques from computer vision. This technology is a revolutionary way to interface the physical world with digital information. Widely used modern technologies such as touch screens ease operation and save time.
This report deals with the latest technology, called Sixth Sense. It is a wearable interface that augments the physical world around us with digital information. The concept is newly born and allows the user to connect with the internet seamlessly: without a keyboard or mouse we can view videos and access, change and move data. Bottlenecks of the gesture-based approach, however, lead to a modification of the concept that uses voice commands instead of gestures. A speech IC serves as a database of commands on which it is initially trained; the recognized command triggers the corresponding operation on the connected mobile device, and the resulting action is projected onto any surface by the projector.
CHAPTER 2
OVERVIEW
Several technologies evolved earlier, such as augmented reality, whose aim is to add information and meaning to a real object or place. Unlike virtual reality, augmented reality does not create a simulation of reality; instead it takes a real object or space as the foundation and incorporates technologies that add contextual data to deepen a person's understanding of the subject. It is a term for a live, direct or indirect view of a physical, real-world environment whose elements are augmented by virtual, computer-generated imagery. Gesture recognition aims at interpreting human gestures through mathematical algorithms. Computer vision is the science and technology of machines that see; it is concerned with the theory behind artificial systems that extract information from images. As a technological discipline, computer vision seeks to apply its theories and models to the construction of computer vision systems, with examples including controlling processes, detecting events, organising information, modelling objects or environments, and interaction.

Recently, speech integrated circuits have evolved and are widely used in car automation and home appliances. They ease operation and save the time spent on manual operations that people perform every day. The speech recognition process is performed by a software component known as the speech recognition engine. Its primary function is to process the spoken input and translate it into text that the application understands. The application can then do one of two things: it can interpret the result of the recognition as a command, in which case it is a command-and-control application, or it can handle the recognized text simply as text, in which case it is a dictation application. What the user says is known as an utterance: a stream of speech between two periods of silence. The speech IC can draw on all sorts of data, statistical models and algorithms to convert spoken input into text.
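The command-and-control behaviour described above can be sketched in a few lines of Python. The SpeechRecognition package, the Google recognizer it calls, and the command names below are illustrative stand-ins for the trained speech IC and its command database, not the actual implementation.

    import speech_recognition as sr

    # Illustrative command database; the real speech IC's vocabulary is trained
    # separately and is not reproduced here.
    COMMANDS = {
        "open inbox": lambda: print("projecting inbox..."),
        "show calendar": lambda: print("projecting calendar..."),
        "check time": lambda: print("projecting clock..."),
    }

    recognizer = sr.Recognizer()
    with sr.Microphone() as source:                 # requires a working microphone
        recognizer.adjust_for_ambient_noise(source)
        print("say a command...")
        audio = recognizer.listen(source)           # one utterance: speech between two silences

    try:
        text = recognizer.recognize_google(audio).lower()
        # command-and-control: the recognized text selects an action
        COMMANDS.get(text, lambda: print("unrecognized command:", text))()
    except sr.UnknownValueError:
        print("could not understand the utterance")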
CHAPTER 3
ORIGIN OF IDEA
This technology is a revolutionary way to interface the physical world with digital information. Widely used modern technologies such as touch screens ease operation and save time. Sixth Sense is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information. Bottlenecks of this method, such as the position of the camera that captures the gestures, affect the accuracy of the projected output and lead to the use of voice commands instead of hand gestures. The camera position is a major constraint on image capturing and on the efficiency and accuracy of the projected output. Therefore, the actions we regularly perform in daily life are converted to commands and trained into a speech IC. They are stored as a database in the integrated circuit, and the corresponding actions are performed when the user's speech is recognized. The result is a hi-tech device that seamlessly integrates analog information with our everyday physical world: the spoken command is turned into an operation within fractions of a second and the action is projected onto a surface. It is a portable device and eases the operations we perform regularly.

Basically, the Sixth Sense concept involves the use of hand gestures: the fingertips carry colored markers, the gestures performed are captured by the camera and passed to the mobile device, the corresponding action is carried out, and the result is projected onto a surface through the projector. Software algorithms and computer vision techniques enable the mobile device to act on the gesture captured by the camera. This gesture-based technology is used for a variety of applications such as performing basic actions, locating points on a map, watching video in a newspaper, dialing a number on one's hand, and so on. The slight modification of this method leads to the use of commands, bringing analog information into the real world: the analog data is converted into digital form and executed as an action, since hand gestures cannot be used at all times. This is how the wearable device fits the human body; color markers are worn on the fingertips, while in our approach commands are used to perform the same operations. Many high-technology speech integrated circuits have evolved that enhance these operations with more advanced features.
To ensure accurate gesture recognition and an intuitive interface, a number of constraints are applied. A region in front of the projection surface is defined as the active zone, and gestures performed outside this area are ignored. Gestures are also defined by a set start posture, an end posture and the dynamic motion between them. Gestures are perhaps most powerful when combined with other input modalities, especially voice. Allowing combined voice and gestural input has several tangible advantages. The first is purely practical: ease of expression, where ease corresponds to the efficiency with which commands can be remembered and expressiveness to the size of the command vocabulary.

3.1 WHAT IS SIXTH SENSE?
The sixth sense, in scientific or non-scientific terms, is defined as extra-sensory perception: the reception of information not gained through any of the five senses, namely the data, information and knowledge that mankind has accumulated about everything and that is available online.
Figure 3.1: Six senses

3.2 EARLIER SIXTH SENSE PROTOTYPE
Steve Mann, regarded as the father of Sixth Sense Technology, built a wearable computer in 1990 and implemented a neck-worn projector with a camera system. Maes' MIT group, which includes seven graduate students, was thinking about how a person could be more integrated into the world around them and access information without having to do something like take out a phone. They initially produced a wristband that would read a Radio Frequency Identification tag to know, for example, which book a user is holding in a store. They also had a ring that used infrared to communicate with supermarket smart shelves to give information about products: as we grab a package of macaroni, the ring would glow red or green to tell us whether the product is organic or free of peanut traces, or whatever criteria we program into the system. They wanted to make information more useful to people in real time with minimal effort and without requiring any change in behavior.
The wristband was getting close, but the user still had to take out a cell phone to look at the information. That is when they struck on the idea of accessing information from the internet and projecting it, so someone wearing the wristband could pick up a paperback in a bookstore and immediately call up reviews about the book, projecting them onto a surface in the store, or do a keyword search through the book by accessing digitized pages on Amazon or Google Books. They started with a larger projector mounted on a helmet, but that proved cumbersome: if someone projecting data onto a wall turned to speak to a friend, the data would be projected on the friend's face.
Figure 3.2: Earlier prototype

3.3 RECENT PROTOTYPE
WUW was developed by Pranav Mistry, a Ph.D. student in the Fluid Interfaces Group at the MIT Media Lab. He invented 'Sixth Sense / WUW (Wear UR World)', a wearable, gestural, user-friendly interface that links the physical world around us with digital information and uses hand gestures to interact with it. The Sixth Sense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system; the device acts as the computer and as the connection to the cloud, all the information stored on the web. The key is that Sixth Sense recognizes the objects around you, displays information automatically and lets you access it in any way you want, in the simplest way possible. By providing relevant information, the device brings us closer to reality and assists us in making the right decisions, thereby making the entire world a computer. The technology is based mainly on hand gesture recognition and on image capturing, processing and manipulation. The software uses the video stream captured by the camera and tracks the locations of the fingertips to recognize gestures, using techniques from computer vision.
Figure 3.3: Recent prototypes
CHAPTER 4
DETAILED DESCRIPTION
4.1 COMPONENTS
The hardware components are coupled in a pendant-like mobile wearable device. The components are:
1. Camera
2. Projector
3. Mirror
4. Mobile component
5. Color markers
6. Microphone

4.1.1 Camera
The camera captures an object in view, recognizes and tracks the user's hand gestures and physical objects using computer-vision based techniques, and sends the data to the smartphone. The Sixth Sense system implements a gestural camera that takes photos of the scene the user is looking at when it detects the 'framing' gesture. It acts as a digital eye, connecting the user to the world of digital information.
Figure 4.1.1: Camera

4.1.2 Projector
A tiny LED projector displays data sent from the smartphone on any surface in view: an object, a wall, or a person. The projected visual information lets surfaces, walls and physical objects around us be used as interfaces. The aim is for the projection to merge with the physical world in a real, physical sense: the user touches an object and information is projected onto that object, so the information looks like part of the object.
The projector contains an internal battery with about three hours of battery life.
Figure 4.1.2: Projector

4.1.3 Mirror
The mirror reflects the projection coming out of the projector and thus helps in directing it onto the desired locations on walls or surfaces. The user can manually change the tilt of the mirror to change the location of the projection; for example, when the user wants the projection to fall on the ground instead of the surface in front, he can change the tilt accordingly. The mirror thus helps to overcome the limited projection space of the projector.
Figure 4.1.3: Mirror

4.1.4 Mobile component
The Sixth Sense system uses a mobile computing device in the user's pocket as the processing device. The software program that enables all the features of the system runs on this device, which can be a mobile phone or a small laptop computer. The camera, the projector and the microphone are connected to it using a wired or wireless connection, and the device is also connected to the Internet via a 3G network or a wireless connection. A web-enabled smartphone in the user's pocket processes the video data using vision algorithms to identify objects, while other software searches the web and interprets the hand gestures. The details of the software are provided in the next section.
Figure 4.1.4: Smartphone

4.1.5 Color marker
Color markers sit at the tips of the user's fingers. Marking the fingers with red, yellow, green and blue tape helps the camera recognize gestures: the software processes the video stream captured by the camera and tracks the locations of these colored markers (visual tracking fiducials) using simple computer-vision techniques. The movements and arrangements of the fiducials are interpreted into gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so Sixth Sense also supports multi-touch and multi-user interaction.
Figure 4.1.5: Color marker
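To make the fiducial tracking concrete, the following Python sketch isolates one marker colour in the video stream and reports that fingertip's position frame by frame. It assumes OpenCV 4 and an ordinary webcam, and the HSV range chosen for the red tape is only a rough guess; the real software is not published in this form.

    import cv2
    import numpy as np

    # Approximate HSV range for a red marker cap; the yellow, green and blue
    # fiducials would each get their own range. Values are rough assumptions.
    RED_LOW = np.array([0, 120, 70])
    RED_HIGH = np.array([10, 255, 255])

    cap = cv2.VideoCapture(0)              # an ordinary webcam stands in for the pendant camera
    for _ in range(300):                   # process a few hundred frames and stop
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, RED_LOW, RED_HIGH)    # keep only the red tape
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            blob = max(contours, key=cv2.contourArea)  # largest red region
            m = cv2.moments(blob)
            if m["m00"] > 0:
                cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
                # (cx, cy) is the fingertip position handed to the gesture interpreter
                print("red fingertip at", cx, cy)
    cap.release()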
4.1.6 Microphone
The microphone is an optional component of Sixth Sense, required when using paper as a computing interface. When the user wants to use a sheet of paper as an interactive surface, he or she clips the microphone to the paper. Attached this way, the microphone captures the sound of the user touching the paper, and this data is passed to the computing device for processing. Combined with the tracking information about the user's finger, the system can identify precise touch events on the paper: the sound signal captured by the microphone provides the time information, while the camera performs the tracking. The applications enabled by this technique are described later.
Figure 4.1.6: Microphone
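The time-plus-position fusion described above can be illustrated with a small sketch that timestamps taps in the microphone signal; in the real system these timestamps would be matched against the fingertip position reported by the camera. The threshold, sampling rate and toy signal below are assumptions for illustration only.

    import numpy as np

    def tap_times(samples, rate, threshold=0.3, gap=0.15):
        """Return the times (in seconds) at which the normalised audio amplitude
        first rises above `threshold`; each such onset is treated as one tap on
        the paper. `samples` is a 1-D NumPy array of microphone samples."""
        env = np.abs(samples.astype(float))
        env /= max(env.max(), 1e-9)
        taps, last = [], -gap
        for i, level in enumerate(env):
            t = i / rate
            if level > threshold and t - last >= gap:
                taps.append(t)
                last = t
        return taps

    # toy signal: silence with two short spikes at 0.5 s and 1.2 s
    rate = 8000
    sig = np.zeros(2 * rate)
    sig[int(0.5 * rate)] = sig[int(1.2 * rate)] = 1.0
    print(tap_times(sig, rate))        # -> [0.5, 1.2]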
4.2 WORKING OF SIXTH SENSE TECHNOLOGY
The hardware that makes Sixth Sense work is a pendant-like mobile wearable interface. It has a camera, a mirror and a projector and is connected wirelessly to a Bluetooth smartphone that slips comfortably into one's pocket. The camera recognizes individuals, images, pictures and the gestures one makes with one's hands, and the information is sent to the smartphone for processing. The downward-facing projector projects the output image onto the mirror, and the mirror reflects the image onto the desired surface. Thus digital information is freed from its confines and placed in the physical world.
Figure 4.2: Working of the Sixth Sense
The entire hardware apparatus is encompassed in a pendant-shaped mobile wearable device. The camera recognizes individuals, images, pictures and hand gestures, and the projector projects the relevant information onto whatever surface is in front of the person. The mirror is significant because the projector dangles from the neck pointing downwards. In the demo video broadcast to showcase the prototype to the world, Mistry uses colored caps on his fingers so that it becomes simpler for the software to differentiate between the fingers across the various applications. The software analyses the video data captured by the camera and tracks the locations of the colored markers using simple computer vision techniques. One can have any number of hand gestures and movements as long as they can be reliably identified and differentiated by the system, preferably through unique and varied fiducials; this is what allows the Sixth Sense device to support multi-touch and multi-user interaction.
The technology is based mainly on hand gesture recognition and on image capturing, processing and manipulation. The map application lets the user navigate a map displayed on a nearby surface using hand gestures similar to those supported by multi-touch systems, letting the user zoom in, zoom out or pan with intuitive hand movements. The drawing application lets the user draw on any surface by tracking the movements of the user's index fingertip.

4.3 RELATED TECHNOLOGIES
Many technologies evolved previously, such as augmented reality, whose aim is to add information and meaning to a real object or place. Unlike virtual reality, augmented reality does not create a simulation of reality; instead it takes a real object or space as the foundation and incorporates technologies that add contextual data to deepen a person's understanding of the subject. It is a term for a live, direct or indirect view of a physical, real-world environment whose elements are augmented by virtual, computer-generated imagery. Sixth Sense technology takes a different approach to computing and tries to make the digital aspect of our lives more intuitive, interactive and, above all, more natural; we should not have to think about it separately. It is a lot of complex technology squeezed into a simple portable device, and when connectivity is added, we can get instant, relevant visual information projected onto any object we pick up or interact with. The technology is based mainly on augmented reality, gesture recognition, and computer-vision-based algorithms.

4.3.1 AUGMENTED REALITY
Augmented reality is a term for a live, direct or indirect view of a physical, real-world environment whose elements are augmented by virtual, computer-generated imagery. It blurs the line between what is real and what is computer-generated by enhancing what we see, hear, feel and smell. Augmented reality is one of the newest innovations in the electronics industry: it superimposes graphics, audio and other sensory enhancements from computer screens onto real-time environments. It goes far beyond the static graphics technology of television, where the superimposed graphics do not change with the viewer's perspective; augmented reality systems superimpose graphics for every perspective and adjust to every movement of the user's head and eyes.
The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. Augmented reality is more advanced than any technology seen in television broadcasts, although some newer TV effects come close, such as RACEf/x and the superimposed first-down line on televised U.S. football games, both created by Sportvision; these systems display graphics for only one point of view. You can even point a phone at a building, and Layar will tell you whether any companies in that building are hiring, or it might be able to find photos of the building on Flickr. Augmented reality adds graphics, sounds, haptic feedback and smell to the natural world as it exists. Everyone from tourists to soldiers to someone looking for the closest subway stop can benefit from the ability to place computer-generated graphics in their field of vision. The main hardware components for augmented reality are the display, tracking, input devices and computer. The combination of a powerful CPU, camera, accelerometers, GPS and solid-state compass often present in modern smartphones makes them prospective platforms. There are three major display techniques for augmented reality:
1. Head-mounted displays: a head-mounted display (HMD) places images of both the physical world and registered virtual graphical objects over the user's view of the world. HMDs are either optical see-through or video see-through in nature.
2. Handheld displays: handheld augmented reality employs a small computing device with a display that fits in a user's hand. All handheld AR solutions to date have used video see-through techniques to overlay graphical information onto the physical world. Initially, handheld AR relied on sensors such as digital compasses and GPS units for six-degree-of-freedom tracking.
3. Spatial displays: instead of the user wearing or carrying the display as with head-mounted or handheld devices, spatial augmented reality (SAR) makes use of digital projectors to display graphical information onto physical objects.
Figure 4.3.1: Augmented reality

4.3.2 COMPUTER VISION
Computer vision is the science and technology of machines that see; it is concerned with the theory behind artificial systems that extract information from images. An image is a huge array of gray-level (brightness) values of individual pixels. Taken individually, these numbers are almost meaningless because they contain very little information about the scene, whereas a robot needs information like "object ahead", "table to the left" or "person approaching" to perform its tasks. The conversion of this huge amount of low-level information into usable high-level information is the subject of computer vision. Earlier algorithms were too computationally expensive to run in real time and also demanded substantial memory and modelling. Two types of images are frequently used in computer vision: intensity images, photograph-like images encoding light intensities, and range images, which encode shape and distance (for example from sonar or laser sensors).
Figure 4.3.2: Computer vision
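As a small illustration of the step from low-level pixels to higher-level information, the sketch below builds a synthetic grayscale "scene" and extracts the edge pixels that outline the single object in it; the scene and thresholds are invented purely for demonstration.

    import cv2
    import numpy as np

    # A synthetic 120x160 grayscale "scene": dark background with one bright object.
    gray = np.zeros((120, 160), dtype=np.uint8)
    gray[40:80, 60:110] = 200                      # the "object ahead"

    print(gray.shape, gray.dtype)                  # (120, 160) uint8 -- just an array of brightness values

    # One small step from low-level pixels toward higher-level information:
    # edge pixels outline object boundaries that later stages can reason about.
    edges = cv2.Canny(gray, 100, 200)
    ys, xs = np.nonzero(edges)
    print("edge pixels:", len(xs), "bounding box:", xs.min(), ys.min(), xs.max(), ys.max())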
4.3.3 GESTURE RECOGNITION
Gesture recognition is a topic in computer science and language technology whose goal is to interpret human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state, but commonly originate from the face or hand; current focuses in the field include emotion recognition from the face and hand gesture recognition. The keyboard and mouse are currently the main interfaces between man and computer. Humans communicate mainly by vision and sound, so a man-machine interface would be more intuitive if it made greater use of vision and audio recognition. A further advantage is that the user can not only communicate from a distance but needs no physical contact with the computer. Unlike audio commands, a visual system is preferable in noisy environments or in situations where sound would cause a disturbance. Many approaches have been made using cameras and computer vision algorithms to interpret sign language, and the identification and recognition of posture, gait, proxemics and human behaviour is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, building a richer bridge between machines and humans than primitive text user interfaces or even GUIs (graphical user interfaces), which still limit the majority of input to the keyboard and mouse. Gesture recognition enables humans to interface with the machine and interact naturally without any mechanical devices. Using gesture recognition, it is possible to point a finger at the computer screen so that the cursor moves accordingly. Gesture recognition is useful for processing information from humans that is not conveyed through speech or typing, and various types of gestures can be identified by computers.
Figure 4.3.3: Hand gestures
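A minimal sketch of the pointing idea mentioned above: a fingertip position found in the camera frame is scaled to a cursor position on the screen. The frame and screen dimensions are assumptions.

    def fingertip_to_cursor(fx, fy, frame_w, frame_h, screen_w=1920, screen_h=1080):
        """Scale a fingertip position in camera coordinates to screen coordinates
        so that pointing moves the cursor; the screen size here is an assumption."""
        return int(fx / frame_w * screen_w), int(fy / frame_h * screen_h)

    # a marker found at (320, 240) in a 640x480 frame lands mid-screen
    print(fingertip_to_cursor(320, 240, 640, 480))    # (960, 540)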
4.3.4 RADIO FREQUENCY IDENTIFICATION
Radio frequency identification (RFID) is basically an electronic tagging technology that allows the detection and tracking of tags, and consequently of the objects they are affixed to. It uses communication via radio waves to exchange data between a reader and an electronic tag attached to an object, for the purpose of identification and tracking. Some tags can be read from several metres away and beyond the line of sight of the reader, and bulk reading enables an almost parallel reading of tags. RFID is a generic term for a system that transmits the identity of an object or person (in the form of a unique serial number) wirelessly using radio waves; it is grouped under the broad category of automatic identification technologies. RFID involves interrogators (also known as readers) and tags (also known as labels). Most RFID tags contain at least two parts: an integrated circuit for storing and processing information, modulating and demodulating a radio-frequency (RF) signal, and other specialized functions, and an antenna for receiving and transmitting the signal. RFID is in use all around us: if you have ever chipped your pet with an ID tag, used E-ZPass through a toll booth, or paid for gas using Speedpass, you have used RFID. In addition, RFID is increasingly used alongside biometric technologies for security.

The idea here is to use RFID in conjunction with other enterprise systems, such as the calendar system or online presence, that can track user activity. Consider an enterprise setting of the future where people (or rather their employee badges) and their personal objects such as books, laptops and mobile phones are tagged with cheap, passive RFID tags, and there is good coverage of RFID readers in the workplace. Sixth Sense incorporates algorithms that start with a mass of undifferentiated tags and automatically infer a range of information from an accumulation of observations. The technology can automatically differentiate between people tags and object tags, learn the identities of people, infer the ownership of objects by people, learn the nature of different zones in a workspace (for example a private office versus a conference room), and perform other such inferences. Sixth Sense records all tag-level events in a raw database, and the inference algorithms consume these raw events to infer events at the level of people, objects and workspace zones, which are then recorded in a separate processed database. Applications can either poll these databases by running SQL queries or set up triggers to be notified of specific events of interest. Sixth Sense infers when a user has interacted with an object, for example when you pick up your mobile phone. It is a platform in that its programming model makes the automatically derived inferences available to applications via a rich set of APIs; to demonstrate the capabilities of the platform, the researchers have prototyped a few applications using these APIs, including a misplaced-object alert service, an enhanced calendar service, and rich annotation of video with physical events.
Figure 4.3.4: Components of RFID
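Since the platform is described as recording raw tag-level events in a database that applications poll with SQL queries, the sketch below shows what such a store and a poll could look like. The table layout, column names and tag identifiers are assumptions made for illustration, not the actual schema.

    import sqlite3, time

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE tag_events (tag_id TEXT, reader_id TEXT, ts REAL)")

    # A reader observing two tags (a person badge and an object tag); ids are hypothetical.
    db.execute("INSERT INTO tag_events VALUES (?, ?, ?)", ("badge-042", "reader-lobby", time.time()))
    db.execute("INSERT INTO tag_events VALUES (?, ?, ?)", ("phone-17", "reader-lobby", time.time()))

    # An application polling for everything a given reader has seen, newest first.
    rows = db.execute(
        "SELECT tag_id, ts FROM tag_events WHERE reader_id = ? ORDER BY ts DESC",
        ("reader-lobby",),
    ).fetchall()
    print(rows)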
4.4 APPLICATIONS
Basic operations such as opening the clock, the inbox, the browser, the gallery, the calendar or the contact list are performed on the mobile phone all the time. These operations can be stored as commands in the IC and then accessed on the screen, or over any surface, within fractions of a second.

4.4.1 MAKE A CALL
Sixth Sense can project a keypad onto one's hand, which can then be used as a virtual keypad to make a call. No mobile device is required: the number is typed with the palm acting as the virtual keypad, the keys appear on the fingers, and the fingers of the other hand are used to key in the number and place the call.
Figure 4.4.1: Making a call

4.4.2 CHECK THE TIME
When the user draws a circle on the wrist with the index finger, a virtual watch appears that gives the correct time. The computer tracks the red marker cap or piece of tape, recognizes the gesture, and instructs the projector to flash the image of a watch onto the wrist.
Figure 4.4.2: Checking the time

4.4.3 ZOOMING FEATURES
The user can zoom in or zoom out using intuitive hand movements.
Figure 4.4.3: Zooming
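A minimal sketch of how the zoom gesture could be interpreted from the tracked fingertips: the ratio of the distance between the thumb and index markers across two frames gives a zoom factor, with values above one meaning zoom in. The coordinates are illustrative.

    import math

    def zoom_factor(thumb_prev, index_prev, thumb_now, index_now):
        """Ratio of fingertip spread between two frames; >1 zoom in, <1 zoom out.
        Each argument is an (x, y) fingertip position from the marker tracker."""
        d_prev = math.dist(thumb_prev, index_prev)
        d_now = math.dist(thumb_now, index_now)
        return d_now / max(d_prev, 1e-6)

    # fingers moving apart -> zoom in
    print(zoom_factor((100, 200), (140, 200), (90, 200), (170, 200)))   # = 2.0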
4.4.4 GET PRODUCT INFORMATION
Sixth Sense uses image recognition or marker technology to recognize the products we pick up and gives us information about them. For example, if you are trying to shop "green" and are looking for the paper towels with the least amount of bleach in them, the system will scan the product you pick up off the shelf and tell you whether it is a good choice for you.
Figure 4.4.4: Product information

4.4.5 GET FLIGHT UPDATES
The system recognizes your boarding pass and lets you know whether your flight is on time and whether the gate has changed.
Figure 4.4.5: Flight updates

4.4.6 TAKE A PICTURE
If you fashion your index fingers and thumbs into a square (the 'framing' gesture), the system snaps a photo at that moment. After taking the desired number of photos, we can project them onto a surface and view them.
Figure 4.4.6: Taking pictures
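One plausible way to detect the 'framing' gesture from the four tracked fingertips (both thumbs and both index fingers) is to check that they sit roughly at the corners of a sufficiently large rectangle; the geometric test and tolerance below are assumptions for illustration, not the published detection method.

    def is_framing(points, tolerance=30):
        """True if the four fingertip positions (two thumbs, two index fingers)
        sit roughly at the four corners of a rectangle. `points` is a list of
        four (x, y) tuples from the marker tracker; tolerance is in pixels."""
        xs = sorted(p[0] for p in points)
        ys = sorted(p[1] for p in points)
        # the two left points share an x "column", the two right points another;
        # likewise for the top and bottom y "rows"
        if xs[1] - xs[0] > tolerance or xs[3] - xs[2] > tolerance:
            return False
        if ys[1] - ys[0] > tolerance or ys[3] - ys[2] > tolerance:
            return False
        # and the frame must be large enough to count as a deliberate gesture
        return (xs[2] - xs[1]) > 2 * tolerance and (ys[2] - ys[1]) > 2 * tolerance

    print(is_framing([(100, 100), (300, 105), (95, 260), (305, 255)]))   # True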
4.4.7 CALL UP A MAP
Sixth Sense can display a map of our choice on any physical surface; the user can then use thumbs and index fingers to navigate it, for example to zoom in, zoom out and pan to find a destination.
Figure 4.4.7: Virtual map

4.4.8 CHECK THE EMAIL
Drawing an @ symbol on any surface lets the user check email using the web service.

4.4.9 CREATE MULTIMEDIA READING EXPERIENCES
The Sixth Sense system also augments the physical objects the user is interacting with by projecting additional information about those objects onto them. For example, a newspaper can show live video news, or dynamic information can be provided on a regular piece of paper; a piece of paper thus turns into a video display.
Figure 4.4.9: Multimedia reading

4.4.10 FEED INFORMATION ON PEOPLE
Sixth Sense can display relevant information about a person we are looking at. It is also capable of a more controversial use: when you go out and meet someone, it can project relevant information such as what they do and where they work, and it could display tags about the person floating on their shirt. It could even display their Facebook relationship status so that you knew not to waste your time.
Figure 4.4.10: Information about a person

4.5 ADVANTAGES
The device is portable and cost effective; it supports multi-touch and multi-user interaction; it provides connectedness between the world and information, with data accessed directly from the machine in real time; it lets the user mind-map an idea anywhere; it assists us in making the right decisions; and it serves the purpose of a computer while saving the time spent on searching for information.
Sixth Sense also recognizes the user's freehand gestures (postures) and saves electricity.

4.6 FUTURE ENHANCEMENTS
Imagine a world where Sixth Sense Technology is applied everywhere. In the educational field, the number of hardware components could be reduced and the use of paper and electricity could decrease, with students using any wall or surface, wherever they are, to carry out activities that are normally done on a PC. Security could be assured for everyone, and the technology could help in rendering defence services. In the medical field, it could be applied to check the genuineness of drugs. It could be used to monitor agricultural land. Blind people could read books and recognize objects, and it could be used for the betterment of handicapped people. Sixth Sense could make the world magical.
CHAPTER 5
CONCLUSION
Sixth Sense technology recognizes the objects around us, displays information automatically and lets us access it in any way we need, allowing us to interact with that information through natural hand gestures. The prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system, and it has the potential to become the ultimate "transparent" user interface for accessing information about everything around us. The prototype of the device currently costs around $350 to build. It could change the way we interact with the real world and truly give everyone complete awareness of the environment around them; it may well revolutionize the world. The Sixth Sense software will be open source. Since the hardware is a small set of items, there will not be elaborate user interfaces or highly advanced programs for end users, but the device will need robust, secure coding to ensure the security of the software, and it will be interesting to see what programming environment emerges for coding Sixth Sense devices.