Google’s Project Tango
Department of CSE
1. INTRODUCTION
3D models represent a 3D object using a collection of points in a given 3D space, connected by
various entities such as curved surfaces, triangles, lines, etc. Being a collection of data which
includes points and other information, 3D models can be created by hand, scanned (procedural
modeling), or algorithmically. The "Project Tango" prototype is an Android smartphone-like device that tracks its own 3D motion in full and creates a 3D model of the environment around it.
Google first introduced Project Tango in early 2013, describing it as a Simultaneous Localization and Mapping (SLAM) system capable of operating in real time on a phone. Google's ATAP group then teamed up with a number of organizations to build Project Tango from that description.
The team at Google's Advanced Technology and Projects group (ATAP) has been working with various universities and research labs to harvest ten years of research in robotics and computer vision and concentrate that technology into a unique mobile phone. We are physical beings that live in a 3D world, yet today's mobile devices assume that the physical world ends at the boundaries of the screen. Project Tango's goal is to give mobile devices a human-scale understanding of space and motion. The project will help people interact with the environment in a fundamentally different way: with this technology, we can prototype in a couple of hours something that would previously have taken months or even years, simply because the technology was not readily available. Imagine having all this in a smartphone, and consider how things would change.
The first product to emerge from Google's ATAP skunkworks group,[1] Project Tango was
developed by a team led by computer scientist Johnny Lee, a core contributor
to Microsoft's Kinect. In an interview in June 2015, Lee said, "We're developing the hardware and
software technologies to help everything and everyone understand precisely where they are,
anywhere."[2]
The device runs Android and includes development APIs that provide position, orientation, and depth data to regular Android apps written in C/C++ or Java, as well as to the Unity Game Engine. These early algorithms, prototypes, and APIs are still in active development, so these are experimental devices intended only for the exploratory and adventurous; they are not a final shipping product.
Project Tango technology gives a mobile device the ability to navigate the physical world
similar to how we do as humans.
Project Tango brings a new kind of spatial perception to the Android device platform by
adding advanced computer vision, image processing, and special vision sensors.
Project Tango is a prototype phone containing highly customized hardware and software
designed to allow the phone to track its motion in full 3D in real time. The sensors make over a quarter of a million 3D measurements every second, updating the position and rotation of the phone and blending this data into a single 3D model of the environment. The device tracks your position as you move through the world and builds a map of it as it goes. It can scan a small section of your room and then generate a little game world in it. It is an open-source technology, and ATAP has already distributed around 200 development kits among developers.
Google has produced two devices to demonstrate the Project Tango technology: the Peanut
phone (no longer available) and the Yellowstone 7-inch tablet. More than 3,000 of these devices
had been sold as of June 2015,[3] chiefly to researchers and software developers interested in
building applications for the platform. In the summer of 2015, Qualcomm and Intel both announced that they were developing Project Tango reference devices as models for device manufacturers who use their mobile chipsets.[4][5]
At CES in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone featuring Project Tango technology in the summer of 2016, noting a sub-$500 price point and a small form factor below 6.5 inches. At the same time, the two companies announced an application incubator so that applications would be ready for the device at launch.
Fig (1) Google’s Project Tango Logo
Which companies are behind Project Tango?
A number of companies came together to develop Project Tango. All of these are listed in the
credits of the Google Project Tango introduction video called “Say hello to Project Tango!” Each
company has had a different amount of involvement. The participating companies listed in that video are:
· Bosch
· BSquare
· CompalComm
· ETH Zürich
· Flyby Media
· George Washington University
· HiDOF
· MMSolutions
· Movidius
· University of Minnesota
· NASA JPL
· Ologic
· OmniVision
· Open Source Robotics Foundation
· ParaCosm
· Sunny Optical tech
· Speck Design
2. OVERVIEW
WHAT IS PROJECT TANGO?
Tango allows a device to build up an accurate 3D model of its immediate surroundings, which
Google says will be useful for everything from AR gaming to navigating large shopping centres.
Fig (2) A view of Google's Project Tango 3D Model Mapping
Google isn't content with making software for phones that can merely capture 2D photos and
videos. Nor does it just want to take stereoscopic 3D snaps. Instead, Project Tango is a bid to equip
every mobile device with a powerful suite of software and sensors that can capture a complete 3D
picture of the world around it, in real-time. Why? So you can map your house, furniture and all,
simply by walking around it. Bingo - no more measuring up before going shopping for a new
wardrobe. Or so you can avoid getting lost next time you go to the hospital - you'll have instant
access to a 3D plan of its labyrinthine corridors. Or so you can easily find the 'unhealthy snacks'
section in your local megamart. Or so you can play amazing augmented reality games. Or so that the
visually impaired can receive extra help in getting around. In fact, as with most Google projects,
the ways in which Tango could prove useful are only limited by our imagination.
WHAT DOES THE PHONE LOOK LIKE?
There are two Tango prototypes so far: a 7-inch tablet and a 5-inch phone.
Fig (3) Prototype 1
It's a fairly standard 7in slate with a slight wedge at the back to accommodate the extra sensors.
As far as we can tell, it has three cameras including the webcam. Inside, it has one of Nvidia's
so-far-untested Tegra K1 mobile processors with a beefy 4GB of RAM and a 128GB
SSD. Google is at pains to point out that it's not a consumer device, but one is supposedly on
the way. The depth-sensing array consists of an infrared projector, a 4MP rear camera, and a front-facing fisheye lens with a 180-degree field of vision. Physically, it's a standard phone shape but rather chunky compared to the class of 2014 – more like something from about 2010.
Fig (4) Prototype 2
Prototype 2 is a 5-inch Android smartphone with the same Tango hardware as the tablet.
Fig (5) A simple Overview of Components of Tango Phone
Google's Project Tango is a smartphone equipped with a variety of cameras and vision sensors that provide a whole new perspective on the world around it. The Tango smartphone can capture a
wealth of data never before available to application developers, including depth and
object-tracking and instantaneous 3D mapping. And it is almost as powerful and as big as a typical
smartphone.
Project Tango is different from other emerging 3D-sensing computer vision products, such
as Microsoft Hololens, in that it's designed to run on a standalone mobile phone or tablet and is
chiefly concerned with determining the device's position and orientation within the environment.
The high-end Android tablet has a 7-inch HD display, 4GB of RAM, 128GB of internal SSD storage, and an NVIDIA Tegra K1 graphics chip (the first such device in the US and the second in the world) featuring a desktop GPU architecture. It also has a distinctive design that consists of an array of cameras and sensors near the top and a couple of subtle grips on the sides. Movidius, the company that developed some of the technology used in Tango, has been working on computer vision technology for the past seven years; it developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone a level of computer vision and tracking that formerly required much larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor and a lower-resolution image-tracking camera. These combinations of image sensors give the smartphone a perspective on the world similar to our own, complete with 3D awareness and an awareness of depth. They supply information to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The phone also contains a motion-tracking camera used to keep track of all the motions made by the user.
3. SMARTPHONE SPECIFICATION
Tango wants to deconstruct reality, taking a quarter million 3D measurements each second to
create a real-time 3D model that describes the physical depth of its surroundings.
The smartphone's specs include a Snapdragon 800 quad-core CPU running at up to 2.3 GHz per core, 2GB or 4GB of memory, 64GB or 128GB of expandable internal storage, and a nine-axis accelerometer/gyroscope/compass. There are also Mini-USB, Micro-USB, and USB 3.0 ports.
In addition, Tango's specs include a rear-facing four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor, plus a vision processor with one teraflop of compute power. Project Tango uses a 3,000 mAh battery.
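A quick sanity check on the quarter-million figure, assuming the depth sensor updates at roughly 5 Hz (an assumed rate; Tango documentation quotes depth output at a few frames per second): the 320 x 180 sensor yields 320 × 180 = 57,600 depth points per frame, and 57,600 × 5 ≈ 288,000 measurements per second, consistent with the "over a quarter million 3D measurements every second" claim.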
4. HARDWARE
Project Tango is basically a camera and sensor array that happens to run on an Android phone.
The smartphone is equipped with a variety of cameras and vision sensors that provide a whole
new perspective on the world around it. The Tango smartphone can capture a wealth of data never
before available to application developers, including depth and object-tracking and instantaneous
3D mapping. And it is almost as powerful and as big as a typical smartphone. The Front View and
Back View of a Tango Phone is shown below.
It looks much like other phones, but it carries the variety of cameras and sensors that make 3D modelling of the environment possible.
Fig (6) Tango Phone Front View
The device tracks the 3D motion and creates a 3D model of the environment around it by using
the array of cameras and sensors. The phone emits pulses of infrared light from the IR projector
and records how they are reflected back, allowing it to build a detailed depth map of the surrounding
space.
There are three cameras that capture a 120-degree wide-angle field of view. A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they record a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension in addition to the standard 2D data. A rear-facing four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor are the components at the rear of the phone that work together to recover the 3D structure of the scene.
Fig (7) Tango Phone Back View
Google paired Movidius's chips with sensors and cameras to give the smartphone a level of computer vision and tracking that formerly required much larger equipment. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor and a lower-resolution image-tracking camera. These combinations of image sensors give the smartphone a perspective on the world similar to our own, complete with 3D awareness and an awareness of depth. They supply information to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The phone also contains a motion-tracking camera used to keep track of all the motions made by the user. The motherboard that contains all of these components is shown below.
Fig (8) Tango phone Motherboard
 Elpida FA164A1PB 2 GB LPDDR3 RAM, layered above a Qualcomm 8974 (Snapdragon 800) processor. (RED)
 Two Movidius Myriad 1 computer vision co-processors. (ORANGE)
 Two AMIC A25L016 16 Mbit low voltage serial flash memory ICs. (YELLOW)
 InvenSense MPU-9150 9-axis gyroscope/accelerometer/compass MEMS motion tracking
device. (GREEN)
 Skyworks 77629 multimode multiband power amplifier module for quad-band GSM/EDGE.
(BLUE)
 PrimeSense PSX1200 Capri PS1200 3D sensor SoC. (VIOLET)
The figure above shows the motherboard, keyed by the colors in the list above. The board also carries a Winbond W25Q16CV 16 Mbit SPI flash memory. Internally, the Myriad 2 consists of twelve 128-bit vector processors called Streaming Hybrid Architecture Vector Engines, or SHAVEs, running at 600MHz. The Myriad 2 chip gives five times the SHAVE performance of the Myriad 1, and its SIPP imaging engines are 15x to 25x more powerful than those of the first-generation chip.
The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor and a lower-resolution image-tracking camera. As its main camera, the Tango uses OmniVision's OV4682, the eye of Project Tango's
mobile device. The OV4682 is a 4MP RGB IR image sensor that captures
high-resolution images and video as well as IR information, enabling depth analysis.
Fig (9) Front and Rear Camera Fig (10) Fisheye Camera
Fig (11) IR Projector
Integrated Depth Sensor
5. TECHNOLOGY BEHIND TANGO
5.1 TANGO’S SENSOR
Tango's sensing is built around the Myriad 1 vision-processor platform developed by Movidius. The sensors allow the device to make "over a quarter million 3D measurements every second, updating its position and orientation in real time, combining that data into a single 3D model of the space around you." Movidius, which has been working on computer vision technology for the past seven years, developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone a level of computer vision and tracking that formerly required much larger equipment.
5.2 IMAGE SENSORS
The image sensors give the smartphone a perspective on the world similar to our own, complete with 3D awareness and an awareness of depth; the information they gather is supplied to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The motion-tracking camera keeps track of all the motions made by the user. Three cameras capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back. The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor and a lower-resolution image-tracking camera. Its depth-sensing array consists of an infrared projector, the 4MP rear camera, and a front-facing fisheye lens with a 180-degree field of vision. The phone emits pulses of infrared light from the IR projector and records how they are reflected back, allowing it to build a detailed depth map of the surrounding space. The data collected from the sensors and cameras is processed by the Myriad vision processor to deliver the 3D structure of the view to apps.
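To make the depth pipeline concrete, a depth map can be turned into a 3D point cloud with the standard pinhole camera model. The following is a minimal sketch in Java; the intrinsics FX, FY, CX, and CY are illustrative placeholder values, not the Tango sensor's actual calibration.

    // Sketch: back-projecting a depth image into 3D points with a pinhole
    // camera model. The intrinsics below are assumed placeholders.
    public final class DepthBackProjection {
        static final double FX = 230.0, FY = 230.0;  // focal lengths, pixels
        static final double CX = 160.0, CY = 90.0;   // principal point, pixels

        /** Converts a 320 x 180 depth map (metres) into an array of XYZ points. */
        static double[][] toPointCloud(float[][] depth) {
            int h = depth.length, w = depth[0].length;
            double[][] points = new double[h * w][];
            int n = 0;
            for (int v = 0; v < h; v++) {
                for (int u = 0; u < w; u++) {
                    double z = depth[v][u];
                    if (z <= 0) continue;  // no IR return at this pixel
                    points[n++] = new double[] {
                        (u - CX) * z / FX,  // X: right of the optical centre
                        (v - CY) * z / FY,  // Y: below the optical centre
                        z                   // Z: along the optical axis
                    };
                }
            }
            return java.util.Arrays.copyOf(points, n);
        }
    }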
6. WORKING CONCEPT
Project Tango devices combine the camera, gyroscope, and accelerometer to estimate six-degree-of-freedom motion, giving developers the ability to track the 3D motion of a device while simultaneously creating a map of the environment.
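To make "six degrees of freedom" concrete, the sketch below (plain Java, with illustrative names) represents a pose as three degrees of translation plus a rotation quaternion, and shows how such a pose carries a point from the device frame into the world frame.

    // Sketch: a 6DoF pose = position (3 DoF) + orientation quaternion (3 DoF).
    public final class Pose {
        final double tx, ty, tz;      // translation, metres
        final double qx, qy, qz, qw;  // unit quaternion

        Pose(double tx, double ty, double tz,
             double qx, double qy, double qz, double qw) {
            this.tx = tx; this.ty = ty; this.tz = tz;
            this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
        }

        /** Maps a device-frame point into the world frame. */
        double[] apply(double[] p) {
            // Rotate by the quaternion: v' = v + 2w(u x v) + 2(u x (u x v)).
            double[] u = {qx, qy, qz};
            double[] t = cross(u, p);
            t[0] *= 2; t[1] *= 2; t[2] *= 2;
            double[] ut = cross(u, t);
            return new double[] {
                p[0] + qw * t[0] + ut[0] + tx,
                p[1] + qw * t[1] + ut[1] + ty,
                p[2] + qw * t[2] + ut[2] + tz };
        }

        private static double[] cross(double[] a, double[] b) {
            return new double[] {
                a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0] };
        }
    }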
An IR projector provides infrared light that other (non-RGB) cameras can use to get a sense of an
area in 3D space. The phone emits pulses of infrared light from the IR projector and records how
they are reflected back, allowing it to build a detailed depth map of the surrounding space. Three cameras capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back. A 4MP color camera sensor can also be used for snapping regular
pics. A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they are a
projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast,
a 3D camera also captures the depth dimension (in addition to the standard 2D data).
As its main camera, the Tango uses OmniVision's OV4682, the eye of Project Tango's mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures
high-resolution images and video as well as IR information, enabling depth analysis. The sensor
features a 2um OmniBSI-2 pixel and records 4MP images and video in a 16:9 format at 90fps. The
sensor's 2-micron OmniBSI-2 pixel delivers excellent signal-to-noise ratio and IR sensitivity, and
offers best-in-class low-light sensitivity. The OV4682's unique architecture and pixel optimization
bring not only the best IR performance but also best-in-class image quality. The OV4682 records
full-resolution 4-megapixel video in a native 16:9 format at 90 frames per second (fps), with a
quarter of the pixels dedicated to capturing IR. The 1/3-inch sensor can also record 1080p high
definition (HD) video at 120 fps with electronic image stabilization (EIS), or 720p HD at 180 fps.
The OV7251 camera chip sensor is capable of capturing VGA-resolution video at 100fps using a global shutter. The OV4682 is a single RGB-infrared (IR) sensor that captures high-resolution images and video as well as IR information; its dual RGB and IR capabilities allow it to bring a host of additional features to mobile and machine vision applications, including gesture sensing, depth analysis, iris detection, and eye tracking.
Another camera uses a fisheye lens that enables a 180º field of view, while its sensor balances resolution and frame rate to record black-and-white images for motion tracking. If the user moves the device left or right, the path the device follows is drawn in real time, as shown in the image on the right. Through this, the device gains motion-capture capabilities. The device also has a depth sensor.
Fig (12) The image represents the feed from the fish-eye lens. Fig (13) Computer Vision
The figure above illustrates depth sensing by displaying a distance heat map on top of what the camera sees, with blue colors on distant objects and red colors on nearby objects. The device also takes the data from the image sensors, pairs it with the device's standard motion sensors and gyroscopes to map out paths of movement to within 1 percent accuracy, and plots the result onto an interactive 3D map. It uses sensor fusion, which combines sensory data, or data derived from sensory data, from disparate sources such that the resulting information is in some sense better than would be possible if those sources were used separately: more precise, more comprehensive, or more reliable, or the product of an emergent view such as stereoscopic vision.
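The simplest illustration of the sensor-fusion idea is a complementary filter, which blends a gyroscope's fast but drifting integration with an accelerometer's noisy but drift-free gravity reference. This toy sketch shows only the principle; Tango's real pipeline also fuses visual features from the cameras.

    // Toy sensor fusion: a complementary filter on one axis.
    public final class ComplementaryFilter {
        private static final double ALPHA = 0.98;  // trust placed in the gyro
        private double pitch;                      // fused estimate, radians

        /**
         * @param gyroRate   angular rate about the pitch axis, rad/s
         * @param accelPitch pitch implied by the measured gravity vector, rad
         * @param dt         time since the previous sample, seconds
         */
        double update(double gyroRate, double accelPitch, double dt) {
            // Integrate the gyro for responsiveness, then pull gently
            // toward the accelerometer reading to cancel long-term drift.
            pitch = ALPHA * (pitch + gyroRate * dt) + (1 - ALPHA) * accelPitch;
            return pitch;
        }
    }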
Mantis Vision, a developer of some of the world's most advanced 3D-enabling technologies, supplies the MV4D technology platform that serves as the core 3D engine behind Google's Project Tango. Mantis Vision provides the 3D sensing platform, consisting of flash projector hardware components and Mantis Vision's core MV4D technology, which includes structured-light-based depth-sensing algorithms that generate realistic, dense maps of the world. It focuses on providing reliable estimates of the pose of the phone, i.e. its position and alignment, relative to its environment.
7. PROJECT TANGO CONCEPTS
Project Tango is different from other emerging 3D-sensing computer vision products, such as
Microsoft Hololens, in that it's designed to run on a standalone mobile phone or tablet and is chiefly
concerned with determining the device's position and orientation within the environment.
The software works by integrating three types of functionality:
7.1 Motion Tracking:
Motion tracking allows a device to understand its position and orientation using Project Tango's custom sensors, giving you real-time information about the 3D motion of the device. Motion tracking uses visual features of the environment, in combination with accelerometer and gyroscope data, to closely track the device's movements in space. Project Tango's core functionality is measuring movement through space and understanding the area moved through. Google's APIs provide the position and orientation of the user's device in full six degrees of freedom, referred to as its pose; a code sketch of consuming this pose follows the figure below.
Fig (14) Motion Tracking
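As a hedged illustration of how an app consumes the pose, the sketch below follows the published Tango Java samples (Tango, TangoConfig, TangoPoseData, TangoCoordinateFramePair); exact names and signatures varied across SDK releases, so treat it as a sketch rather than the definitive API.

    import java.util.ArrayList;
    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.TangoConfig;
    import com.google.atap.tangoservice.TangoCoordinateFramePair;
    import com.google.atap.tangoservice.TangoEvent;
    import com.google.atap.tangoservice.TangoPoseData;
    import com.google.atap.tangoservice.TangoXyzIjData;

    final class MotionTrackingSketch {
        static void start(Tango tango) {
            TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
            config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
            tango.connect(config);

            // Ask for the device's pose relative to where the service started.
            ArrayList<TangoCoordinateFramePair> frames = new ArrayList<>();
            frames.add(new TangoCoordinateFramePair(
                    TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                    TangoPoseData.COORDINATE_FRAME_DEVICE));

            tango.connectListener(frames, new Tango.OnTangoUpdateListener() {
                @Override public void onPoseAvailable(TangoPoseData pose) {
                    // pose.translation = {x, y, z} in metres;
                    // pose.rotation = {x, y, z, w} quaternion: the full 6DoF pose.
                }
                @Override public void onXyzIjAvailable(TangoXyzIjData cloud) { }
                @Override public void onTangoEvent(TangoEvent event) { }
            });
        }
    }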
7.2 Area Learning:
Using area learning, a Project Tango device can remember the visual features of
the area it is moving through and recognize when it sees those features again. These
features can be saved in an Area Description File (ADF) to use again later.
Project Tango devices can use visual cues to help recognize the world around them. They can self-correct errors in motion tracking and relocalize in areas they've seen before. With an ADF loaded, Project Tango devices gain drift correction, that is, improved motion tracking.
Area learning is the way of storing environment data in a map that can be re-used later, shared with other Project Tango devices, and enhanced with metadata such as notes, instructions, or points of interest. A sketch of this workflow follows the figure below.
Fig (15) Area Learning
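The hedged sketch below mirrors the area-learning workflow in Google's published Tango Java samples (KEY_BOOLEAN_LEARNINGMODE, saveAreaDescription, KEY_STRING_AREADESCRIPTION); exact key and method names may differ between SDK releases.

    import com.google.atap.tangoservice.Tango;
    import com.google.atap.tangoservice.TangoConfig;

    final class AreaLearningSketch {
        /** Connects in learning mode; after walking the space, saves an ADF. */
        static String learnAndSave(Tango tango) {
            TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
            config.putBoolean(TangoConfig.KEY_BOOLEAN_LEARNINGMODE, true);
            tango.connect(config);
            // ... move the device through the area so it records visual features ...
            return tango.saveAreaDescription();  // returns the ADF's UUID
        }

        /** Reconnects with a saved ADF so the device can relocalize in it. */
        static void reuse(Tango tango, String adfUuid) {
            TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
            config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION, adfUuid);
            tango.connect(config);
        }
    }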
7.3 Depth Perception:
Project Tango devices are equipped with integrated 3D sensors that measure the distance from the device to objects in the real world. This configuration gives good depth at a distance while balancing the power requirements of infrared illumination and depth processing. Current devices are designed to work best indoors at moderate distances (0.5 to 4 meters), so they may not be ideal for close-range object scanning. Because the technology relies on viewing infrared light with the device's camera, there are some situations where accurate depth perception is difficult: areas lit by light sources high in IR, such as sunlight or incandescent bulbs, and objects that do not reflect IR light cannot be scanned well.
The depth data allows an application to understand the distance of visible objects from the device. By combining depth perception with motion tracking, you can also measure distances between points in an area that aren't in the same frame, as the sketch after the figure below shows.
Fig (16) Depth Perception
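Once each depth point has been carried into the shared world frame using the device pose at its capture time (as in the Pose sketch in section 6), measuring between points seen in different frames reduces to plain Euclidean distance. A minimal sketch:

    // Sketch: distance between two points already expressed in the world frame.
    final class CrossFrameMeasure {
        static double distance(double[] worldA, double[] worldB) {
            double dx = worldA[0] - worldB[0];
            double dy = worldA[1] - worldB[1];
            double dz = worldA[2] - worldB[2];
            return Math.sqrt(dx * dx + dy * dy + dz * dz);
        }
    }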
Together, these generate data about the device in "six degrees of freedom"
(3 axes of orientation plus 3 axes of motion) and detailed three-dimensional information about the
environment.
Applications on mobile devices use Project Tango's C and Java APIs to access this data in
real time. In addition, an API is also provided for integrating Project Tango with the Unity game
engine; this enables the rapid conversion or creation of games that allow the user to interact and
navigate in the game space by moving and rotating a Project Tango device in real space. These
APIs are documented on the Google developer website.
8. DEVICES DEVELOPED SO FAR
As a platform for software developers and a model for device manufacturers, Google has
created two Project Tango devices to date.
The Yellowstone tablet
Google's Project Tango tablet, 2014
"Yellowstone" is a 7-inch tablet with full Project
Tango functionality, released in June 2014, and sold as the
Project Tango Tablet Development Kit.[7] It features a
2.3 GHz quad-core Nvidia Tegra K1 processor, 128GB flash
memory, 1920x1200-pixel touchscreen, 4MP color
camera, fisheye-lens (motion-tracking) camera, integrated
depth sensing, and 4G LTE connectivity. The device is sold
through the official Project Tango website [8] and the Google Play Store.
The Peanut phone
"Peanut" was the first production Project Tango device, released in the first quarter of 2014.
It was a small Android phone with a Qualcomm MSM8974 quad-core processor and additional
special hardware including a fisheye-lens camera (for motion tracking), "RGB-IR" camera (for
color images and infrared depth detection), and Movidius image-processing chips. A high-
performance accelerometer and gyroscope were added after testing several competing models in
the MARS lab at the University of Minnesota.
Several hundred Peanut devices were distributed to early-access partners, including university researchers in computer vision and robotics, as well as application developers and technology companies. Google stopped supporting the Peanut device in September 2015, as by then the Project Tango software stack had evolved beyond the versions of Android that run on the device.
Testing by NASA
In May 2014, two Peanut phones were delivered to the International Space Station to be
part of a NASA project to develop autonomous robots that navigate in a variety of environments,
including outer space. The soccer-ball-sized, 18-sided polyhedral SPHERES robots were
developed at the NASA Ames Research Center, adjacent to the Google campus in Mountain View,
California. Andres Martinez, SPHERES manager at NASA, said, "We are researching how effective Project Tango's vision-based navigation abilities are for performing localization and navigation of a mobile free flyer on ISS."
9. FUTURE SCOPE
Project Tango seeks to take the next step in this mapping evolution. Instead of depending
on the infrastructure, expertise, and tools of others to provide maps of the world, Tango empowers
users to build their own understanding, all with a phone. Imagine knowing your exact position to
within inches. Imagine building 3D maps of the world in parallel with other users around you.
Imagine being able to track not just the top-down location of a device, but also its full 3D position and alignment. The technology is ambitious, and the potential applications are powerful. The Tango device enables true augmented reality, which opens a whole frontier for playing games in the scenery around you. You can capture the room and then render a scene that includes the room but also adds characters and objects, so you can create games that operate in your natural environment. The applications go beyond gaming, too. Imagine seeing what a room would look like decorated with different furniture and walls, rendered as a very realistic scene. The technology can be used to guide the visually impaired, giving them auditory cues about where they are going. It can even be used by soldiers to replicate a war zone and prepare for combat, or to live out one's own creative fantasies. The possibilities for this amazing technology are endless, and the future is looking very bright.
Things Project Tango can do
DIRECTIONS: For when you need directions inside a building or structure that current mapping solutions just don't provide. For shoppers who like to get in and out as quickly as possible, having an indoor map of the store in hand could make trips more efficient by leading you directly to the shelf you want.
EMERGENCY RESPONSE: To help emergency response workers such as firefighters find their
way through buildings by projecting the blueprints onto the screen.
It has the potential to provide valuable information in situations where knowing the exact layout
of a room can be a matter of life or death.
AUGMENTED REALITY GAMING: It could combine room-mapping with augmented reality. Imagine competing against a friend for control over territories in your own home with your own miniature army. Mapping in-game textures onto your real walls through the smartphone would arguably produce the best game of Cops and Robbers in history.
MODELLING OBJECTS: Fig (17) shows a simple example of object modelling using Project Tango.
10.CONCLUSION
Project Tango enables apps to track a device's position and orientation within a detailed 3D
environment, and to recognize known environments. This makes possible applications such as in-
store navigation, visual measurement and mapping utilities, presentation and design tools, and a
variety of immersive games.
At this moment, Tango is just a project, but it is developing quite rapidly, with early prototypes and development kits already distributed among many developers. It is up to the developers now to create more clever and innovative apps that take advantage of this technology. It is just the beginning, and there is a lot of work to do to fine-tune this amazing technology. Thus, if Project Tango works – and we've no reason to suspect it won't – it could prove every bit as revolutionary as Maps, Earth, or Android. It just might take a while for its true genius to become clear.
11. REFERENCES
[1] Announcement on the ATAP Google+ site, 30 January 2015.
[2] "Future Phones Will Understand, See the World", 3 June 2015. Retrieved 4 November 2015.
[3] "Slamdance: inside the weird virtual reality of Google's Project Tango", 29 May 2015.
[4] "Qualcomm Powers Next Generation Project Tango Development Platform", 29 May 2015.
[5] "IDF 2015: Intel teams with Google to bring RealSense to Project Tango", 18 August 2015.
[6] Google developer website: https://developers.google.com/project-tango/
[7] Product announcement on the ATAP Google+ page, 5 June 2014. Retrieved 4 November 2015.
[8] Project Tango website: https://www.google.com/atap/project-tango/

More Related Content

What's hot (20)

Microsoft Hololens Seminar Report
Microsoft Hololens Seminar ReportMicrosoft Hololens Seminar Report
Microsoft Hololens Seminar Report
 
Google glass ppt
Google glass pptGoogle glass ppt
Google glass ppt
 
Ppt on Google glass
Ppt on Google glassPpt on Google glass
Ppt on Google glass
 
GOOGLE GLASS
GOOGLE GLASSGOOGLE GLASS
GOOGLE GLASS
 
Seminar ppt on google cardboard
Seminar ppt on google cardboardSeminar ppt on google cardboard
Seminar ppt on google cardboard
 
Google glass documentation
Google glass documentationGoogle glass documentation
Google glass documentation
 
Google Glass
Google GlassGoogle Glass
Google Glass
 
Google glass
Google glassGoogle glass
Google glass
 
powerpoint presentation on Google glass
powerpoint presentation on Google glasspowerpoint presentation on Google glass
powerpoint presentation on Google glass
 
Hololens
HololensHololens
Hololens
 
Project soli
Project soliProject soli
Project soli
 
Google glass
Google glassGoogle glass
Google glass
 
screen less display
screen less displayscreen less display
screen less display
 
Technical Seminar Topic on Google glass
Technical Seminar Topic on Google glassTechnical Seminar Topic on Google glass
Technical Seminar Topic on Google glass
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
Google Glass
Google GlassGoogle Glass
Google Glass
 
Google glass
Google glassGoogle glass
Google glass
 
visual Positioning System
visual Positioning Systemvisual Positioning System
visual Positioning System
 
Google Glass
Google Glass  Google Glass
Google Glass
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 

Viewers also liked

Google & gaming, IGDA - Helsinki
Google & gaming, IGDA - HelsinkiGoogle & gaming, IGDA - Helsinki
Google & gaming, IGDA - HelsinkiRobert Nyman
 
Project Tango
Project TangoProject Tango
Project Tangotechugo
 
Google project tango - Giving mobile devices a human scale understanding of s...
Google project tango - Giving mobile devices a human scale understanding of s...Google project tango - Giving mobile devices a human scale understanding of s...
Google project tango - Giving mobile devices a human scale understanding of s...Harsha Madusankha
 
Online jobportal
Online jobportalOnline jobportal
Online jobportalteriwoja
 
google project tango
google project tango google project tango
google project tango Sonu S Kumar
 
Online Job Portal SnapShots
Online Job Portal SnapShots Online Job Portal SnapShots
Online Job Portal SnapShots Aj Maurya
 
Exploring Raspberry Pi
Exploring Raspberry PiExploring Raspberry Pi
Exploring Raspberry PiLentin Joseph
 
Internet (Intelligence) of Things (IOT) with Drupal
Internet (Intelligence) of Things (IOT) with DrupalInternet (Intelligence) of Things (IOT) with Drupal
Internet (Intelligence) of Things (IOT) with DrupalPrateek Jain
 
Java Online Job Portal Presentation
Java Online Job Portal PresentationJava Online Job Portal Presentation
Java Online Job Portal Presentationtanmanrai
 
A seminar report on Raspberry Pi
A seminar report on Raspberry PiA seminar report on Raspberry Pi
A seminar report on Raspberry Pinipunmaster
 
Online job portal
Online job portal Online job portal
Online job portal Aj Maurya
 
Job Portal
Job PortalJob Portal
Job Portalbijendra
 

Viewers also liked (13)

Google & gaming, IGDA - Helsinki
Google & gaming, IGDA - HelsinkiGoogle & gaming, IGDA - Helsinki
Google & gaming, IGDA - Helsinki
 
Project Tango
Project TangoProject Tango
Project Tango
 
Google project tango - Giving mobile devices a human scale understanding of s...
Google project tango - Giving mobile devices a human scale understanding of s...Google project tango - Giving mobile devices a human scale understanding of s...
Google project tango - Giving mobile devices a human scale understanding of s...
 
Online jobportal
Online jobportalOnline jobportal
Online jobportal
 
google project tango
google project tango google project tango
google project tango
 
Online Job Portal SnapShots
Online Job Portal SnapShots Online Job Portal SnapShots
Online Job Portal SnapShots
 
Exploring Raspberry Pi
Exploring Raspberry PiExploring Raspberry Pi
Exploring Raspberry Pi
 
Internet (Intelligence) of Things (IOT) with Drupal
Internet (Intelligence) of Things (IOT) with DrupalInternet (Intelligence) of Things (IOT) with Drupal
Internet (Intelligence) of Things (IOT) with Drupal
 
Java Online Job Portal Presentation
Java Online Job Portal PresentationJava Online Job Portal Presentation
Java Online Job Portal Presentation
 
A seminar report on Raspberry Pi
A seminar report on Raspberry PiA seminar report on Raspberry Pi
A seminar report on Raspberry Pi
 
Online job portal
Online job portal Online job portal
Online job portal
 
Job Portal
Job PortalJob Portal
Job Portal
 
Online Job Portal (UML Diagrams)
Online Job Portal (UML Diagrams)Online Job Portal (UML Diagrams)
Online Job Portal (UML Diagrams)
 

Similar to Google project tango seminar report

Google''s Project Tango
Google''s Project TangoGoogle''s Project Tango
Google''s Project TangoShone Mathew
 
Google cardboard the most cost effective virtual reality technology by google
Google cardboard the most cost effective virtual reality technology by googleGoogle cardboard the most cost effective virtual reality technology by google
Google cardboard the most cost effective virtual reality technology by googleAzilen Technologies Pvt. Ltd.
 
Seminar report on google glass
Seminar report on google glassSeminar report on google glass
Seminar report on google glassGhanshyam Devra
 
Seminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green ITSeminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green ITAnjali Agrawal
 
introduction and abstract on Google Glass Major report
introduction and abstract on  Google Glass Major reportintroduction and abstract on  Google Glass Major report
introduction and abstract on Google Glass Major reportJawhar Ali
 
Google Project Tango
Google Project TangoGoogle Project Tango
Google Project TangoAkhil Nair
 
Report on google glass(in pdf)
Report on google glass(in pdf)Report on google glass(in pdf)
Report on google glass(in pdf)Prakhar Gupta
 
Google Glass: A Futuristic Fashion Failure Gadget
Google Glass: A Futuristic Fashion Failure  GadgetGoogle Glass: A Futuristic Fashion Failure  Gadget
Google Glass: A Futuristic Fashion Failure GadgetMd. Salim Reza Jony
 
Presentation on Google Tango By Atharva Jawalkar
Presentation on Google Tango By Atharva Jawalkar Presentation on Google Tango By Atharva Jawalkar
Presentation on Google Tango By Atharva Jawalkar Atharva Jawalkar
 
Latest technology
Latest technologyLatest technology
Latest technologylodripas
 
Google glass
Google glass Google glass
Google glass Amith
 
Projectglassppt 130418102721-phpapp01
Projectglassppt 130418102721-phpapp01Projectglassppt 130418102721-phpapp01
Projectglassppt 130418102721-phpapp01vivek chandel
 
Google Glass seminar complete
Google Glass seminar completeGoogle Glass seminar complete
Google Glass seminar completeRaju kumar
 
Google cardboard final
Google cardboard finalGoogle cardboard final
Google cardboard finalpkholkute
 
Project glass ieee document
Project glass ieee documentProject glass ieee document
Project glass ieee documentbhavyakishore
 
Google Glass Seminar Report
Google Glass  Seminar ReportGoogle Glass  Seminar Report
Google Glass Seminar ReportHit Esh
 
Mobile Augmented Reality Development tools
Mobile Augmented Reality Development toolsMobile Augmented Reality Development tools
Mobile Augmented Reality Development toolsThiwanka Makumburage
 

Similar to Google project tango seminar report (20)

Google''s Project Tango
Google''s Project TangoGoogle''s Project Tango
Google''s Project Tango
 
Tango
TangoTango
Tango
 
Google cardboard the most cost effective virtual reality technology by google
Google cardboard the most cost effective virtual reality technology by googleGoogle cardboard the most cost effective virtual reality technology by google
Google cardboard the most cost effective virtual reality technology by google
 
Seminar report on google glass
Seminar report on google glassSeminar report on google glass
Seminar report on google glass
 
Seminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green ITSeminar report on Google Glass, Blu-ray & Green IT
Seminar report on Google Glass, Blu-ray & Green IT
 
Aijaz tango
Aijaz tangoAijaz tango
Aijaz tango
 
introduction and abstract on Google Glass Major report
introduction and abstract on  Google Glass Major reportintroduction and abstract on  Google Glass Major report
introduction and abstract on Google Glass Major report
 
Google Project Tango
Google Project TangoGoogle Project Tango
Google Project Tango
 
Report on google glass(in pdf)
Report on google glass(in pdf)Report on google glass(in pdf)
Report on google glass(in pdf)
 
Google Glass: A Futuristic Fashion Failure Gadget
Google Glass: A Futuristic Fashion Failure  GadgetGoogle Glass: A Futuristic Fashion Failure  Gadget
Google Glass: A Futuristic Fashion Failure Gadget
 
Presentation on Google Tango By Atharva Jawalkar
Presentation on Google Tango By Atharva Jawalkar Presentation on Google Tango By Atharva Jawalkar
Presentation on Google Tango By Atharva Jawalkar
 
Latest technology
Latest technologyLatest technology
Latest technology
 
CMPE- 280-Research_paper
CMPE- 280-Research_paperCMPE- 280-Research_paper
CMPE- 280-Research_paper
 
Google glass
Google glass Google glass
Google glass
 
Projectglassppt 130418102721-phpapp01
Projectglassppt 130418102721-phpapp01Projectglassppt 130418102721-phpapp01
Projectglassppt 130418102721-phpapp01
 
Google Glass seminar complete
Google Glass seminar completeGoogle Glass seminar complete
Google Glass seminar complete
 
Google cardboard final
Google cardboard finalGoogle cardboard final
Google cardboard final
 
Project glass ieee document
Project glass ieee documentProject glass ieee document
Project glass ieee document
 
Google Glass Seminar Report
Google Glass  Seminar ReportGoogle Glass  Seminar Report
Google Glass Seminar Report
 
Mobile Augmented Reality Development tools
Mobile Augmented Reality Development toolsMobile Augmented Reality Development tools
Mobile Augmented Reality Development tools
 

Recently uploaded

THE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTION
THE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTIONTHE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTION
THE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTIONjhunlian
 
Module-1-(Building Acoustics) Noise Control (Unit-3). pdf
Module-1-(Building Acoustics) Noise Control (Unit-3). pdfModule-1-(Building Acoustics) Noise Control (Unit-3). pdf
Module-1-(Building Acoustics) Noise Control (Unit-3). pdfManish Kumar
 
Virtual memory management in Operating System
Virtual memory management in Operating SystemVirtual memory management in Operating System
Virtual memory management in Operating SystemRashmi Bhat
 
『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书
『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书
『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书rnrncn29
 
CS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdfCS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdfBalamuruganV28
 
DEVICE DRIVERS AND INTERRUPTS SERVICE MECHANISM.pdf
DEVICE DRIVERS AND INTERRUPTS  SERVICE MECHANISM.pdfDEVICE DRIVERS AND INTERRUPTS  SERVICE MECHANISM.pdf
DEVICE DRIVERS AND INTERRUPTS SERVICE MECHANISM.pdfAkritiPradhan2
 
Research Methodology for Engineering pdf
Research Methodology for Engineering pdfResearch Methodology for Engineering pdf
Research Methodology for Engineering pdfCaalaaAbdulkerim
 
Katarzyna Lipka-Sidor - BIM School Course
Katarzyna Lipka-Sidor - BIM School CourseKatarzyna Lipka-Sidor - BIM School Course
Katarzyna Lipka-Sidor - BIM School Coursebim.edu.pl
 
Cost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based questionCost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based questionSneha Padhiar
 
Artificial Intelligence in Power System overview
Artificial Intelligence in Power System overviewArtificial Intelligence in Power System overview
Artificial Intelligence in Power System overviewsandhya757531
 
KCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosKCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosVictor Morales
 
System Simulation and Modelling with types and Event Scheduling
System Simulation and Modelling with types and Event SchedulingSystem Simulation and Modelling with types and Event Scheduling
System Simulation and Modelling with types and Event SchedulingBootNeck1
 
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENTFUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENTSneha Padhiar
 
Engineering Drawing section of solid
Engineering Drawing     section of solidEngineering Drawing     section of solid
Engineering Drawing section of solidnamansinghjarodiya
 
Prach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism CommunityPrach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism Communityprachaibot
 
Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________Romil Mishra
 
Python Programming for basic beginners.pptx
Python Programming for basic beginners.pptxPython Programming for basic beginners.pptx
Python Programming for basic beginners.pptxmohitesoham12
 
11. Properties of Liquid Fuels in Energy Engineering.pdf
11. Properties of Liquid Fuels in Energy Engineering.pdf11. Properties of Liquid Fuels in Energy Engineering.pdf
11. Properties of Liquid Fuels in Energy Engineering.pdfHafizMudaserAhmad
 
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.elesangwon
 
OOP concepts -in-Python programming language
OOP concepts -in-Python programming languageOOP concepts -in-Python programming language
OOP concepts -in-Python programming languageSmritiSharma901052
 

Recently uploaded (20)

THE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTION
THE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTIONTHE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTION
THE SENDAI FRAMEWORK FOR DISASTER RISK REDUCTION
 
Module-1-(Building Acoustics) Noise Control (Unit-3). pdf
Module-1-(Building Acoustics) Noise Control (Unit-3). pdfModule-1-(Building Acoustics) Noise Control (Unit-3). pdf
Module-1-(Building Acoustics) Noise Control (Unit-3). pdf
 
Virtual memory management in Operating System
Virtual memory management in Operating SystemVirtual memory management in Operating System
Virtual memory management in Operating System
 
『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书
『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书
『澳洲文凭』买麦考瑞大学毕业证书成绩单办理澳洲Macquarie文凭学位证书
 
CS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdfCS 3251 Programming in c all unit notes pdf
CS 3251 Programming in c all unit notes pdf
 
DEVICE DRIVERS AND INTERRUPTS SERVICE MECHANISM.pdf
DEVICE DRIVERS AND INTERRUPTS  SERVICE MECHANISM.pdfDEVICE DRIVERS AND INTERRUPTS  SERVICE MECHANISM.pdf
DEVICE DRIVERS AND INTERRUPTS SERVICE MECHANISM.pdf
 
Research Methodology for Engineering pdf
Research Methodology for Engineering pdfResearch Methodology for Engineering pdf
Research Methodology for Engineering pdf
 
Katarzyna Lipka-Sidor - BIM School Course
Katarzyna Lipka-Sidor - BIM School CourseKatarzyna Lipka-Sidor - BIM School Course
Katarzyna Lipka-Sidor - BIM School Course
 
Cost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based questionCost estimation approach: FP to COCOMO scenario based question
Cost estimation approach: FP to COCOMO scenario based question
 
Artificial Intelligence in Power System overview
Artificial Intelligence in Power System overviewArtificial Intelligence in Power System overview
Artificial Intelligence in Power System overview
 
KCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitosKCD Costa Rica 2024 - Nephio para parvulitos
KCD Costa Rica 2024 - Nephio para parvulitos
 
System Simulation and Modelling with types and Event Scheduling
System Simulation and Modelling with types and Event SchedulingSystem Simulation and Modelling with types and Event Scheduling
System Simulation and Modelling with types and Event Scheduling
 
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENTFUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
FUNCTIONAL AND NON FUNCTIONAL REQUIREMENT
 
Engineering Drawing section of solid
Engineering Drawing     section of solidEngineering Drawing     section of solid
Engineering Drawing section of solid
 
Prach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism CommunityPrach: A Feature-Rich Platform Empowering the Autism Community
Prach: A Feature-Rich Platform Empowering the Autism Community
 
Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________Gravity concentration_MI20612MI_________
Gravity concentration_MI20612MI_________
 
Python Programming for basic beginners.pptx
Python Programming for basic beginners.pptxPython Programming for basic beginners.pptx
Python Programming for basic beginners.pptx
 
11. Properties of Liquid Fuels in Energy Engineering.pdf
11. Properties of Liquid Fuels in Energy Engineering.pdf11. Properties of Liquid Fuels in Energy Engineering.pdf
11. Properties of Liquid Fuels in Energy Engineering.pdf
 
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
2022 AWS DNA Hackathon 장애 대응 솔루션 jarvis.
 
OOP concepts -in-Python programming language
OOP concepts -in-Python programming languageOOP concepts -in-Python programming language
OOP concepts -in-Python programming language
 

Google project tango seminar report

  • 1. Google’s Project Tango Department of CSE 1 | P a g e 1. INTRODUCTION 3D models represent a 3D object using a collection of points in a given 3D space, connected by various entities such as curved surfaces, triangles, lines, etc. Being a collection of data which includes points and other information, 3D models can be created by hand, scanned (procedural modeling), or algorithmically. The "Project Tango" prototype is an Android smartphone- like device which tracks the 3D motion of particular device, and creates a 3D model of the environment around it. Project Tango was introduced by Google initially in early 2013, they described this as a Simultaneous Localization and Mapping (SLAM) system capable of operating in real-time on a phone. Google’s ATAP teamed up with a number of organizations to create Project Tango from this description. The team at Google’s Advanced Technology and Projects Group (ATAP) has been working with various Universities and Research labs to harvest ten years of research in Robotics and Computer Vision to concentrate that technology into a very unique mobile phone. We are physical being that live in a 3D world yet the mobile devices today assume that the physical world ends the boundaries of the screen. Project Tango’s goal is to give mobile devices a human-scale understanding of space and motion. This project will help people interact with the environment in a fundamentally different way and using this technology we can prototype in a couple of hours something that would take us months or even years before because we did not have this technology readily available. Imagine having all this in a smartphone and see how things would change. The first product to emerge from Google's ATAP skunkworks group,[1] Project Tango was developed by a team led by computer scientist Johnny Lee, a core contributor to Microsoft's Kinect. In an interview in June 2015, Lee said, "We're developing the hardware and software technologies to help everything and everyone understand precisely where they are, anywhere."[2] This device runs Android and includes development APIs to provide alignment, position or location, and depth data to regular Android apps written in C/C++, Java as well as the Unity Game Engine (UGE). These early algorithms, prototypes, and APIs are still in active development. So, these are experimental devices and are intended only for the exploratory and adventurous are not a final shipping product. Project Tango technology gives a mobile device the ability to navigate the physical world similar to how we do as humans. Project Tango brings a new kind of spatial perception to the Android device platform by adding advanced computer vision, image processing, and special vision sensors. Project Tango is a prototype phone containing highly customized hardware and software designed to allow the phone to track its motion in full 3D in real-time. The sensors make over a quarter million 3D measurements every single second updating the position and rotation of the
  • 2. Google’s Project Tango Department of CSE 2 | P a g e phone, blending this data into a single 3D model of the environment. It tracks ones position as one goes around the world and also makes a map of that. It can scan a small section of your room and then are able to generate a little game world in it. It is an open source technology. ATAP has around 200 development kits which has already been distributed among the developers. Google has produced two devices to demonstrate the Project Tango technology: the Peanut phone (no longer available) and the Yellowstone 7-inch tablet. More than 3,000 of these devices had been sold as of June 2015,[3] chiefly to researchers and software developers interested in building applications for the platform. In the summer of 2015, Qualcomm and Intel both announced that they are developing Project Tango reference devices as models for device manufacturers who use their mobile chipsets.[4][5 At CES, in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone during the summer of 2016 to feature Project Tango technology marketed at consumers, noting a less than $500 price-point and a small form factor below 6.5 inches. At the same time, both companies also announced an application incubator to get applications developed to be on the device on launch. Fig (1) Google’s Project Tango Logo
  • 3. Google’s Project Tango Department of CSE 3 | P a g e Which companies are behind Project Tango? A number of companies came together to develop Project Tango. All of these are listed in the credits of the Google Project Tango introduction video called “Say hello to Project Tango!” Each company has had a different amount of involvement. The following are the list of participating companies listed in that video: · Bosch · BSquare · CompalComm · ETH Zürich · Flyby Media · George Washington University · HiDOF · MMSolutions · Movidius · University of Minnesota · NASA JPL · Ologic · OmniVision · Open Source Robotics Foundation · ParaCosm · Sunny Optical tech · Speck Design
  • 4. Google’s Project Tango Department of CSE 4 | P a g e 2. OVERVIEW WHAT IS PROJECT TANGO? Tango allows a device to build up an accurate 3D model of its immediate surroundings, which Google says will be useful for everything from AR gaming to navigating large shopping centres. Fig (2) A view of Googles Project Tango 3D Model Mapping Google isn't content with making software for phones that can merely capture 2D photos and videos. Nor does it just want to take stereoscopic 3D snaps. Instead, Project Tango is a bid to equip every mobile device with a powerful suite of software and sensors that can capture a complete 3D picture of the world around it, in real-time. Why? So you can map your house, furniture and all, simply by walking around it. Bingo - no more measuring up before going shopping for a new wardrobe. Or so you can avoid getting lost next time you go to the hospital - you'll have instant access to a 3D plan of its labyrinthine corridors. Or so you can easily find the 'unhealthy snacks' section in your local megamart. Or so can play amazing augmented reality games. Or so that the visually impaired can receive extra help in getting around. In fact, as with most Google projects, the ways in which Tango could prove useful are only limited by our imagination.
  • 5. Google’s Project Tango Department of CSE 5 | P a g e WHAT DOES THE PHONE LOOK LIKE? There are two prototypes of Tango phone yet. A 7inch tablet and another prototype of a 5 inch phone. Fig (3) Prototype 1 It's a fairly standard 7in slate with a slight wedge at the back to accommodate the extra sensors. As far as we can tell, it has three cameras including the webcam. Inside, it has one of Nvidia's so-far-untested Tegra K1 mobile processors with a beefy 4GB of RAM and a 128GB SSD. Google is at pains to point out that it's not a consumer device, but one is supposedly on the way. The depth-sensing array consists of an infrared projector, 4MP rear camera and front- facing fisheye view lens with 180-degree field of vision. Physically, it's a standard phone shape but rather chunky compared to the class of 2014. More like something from about 2010
Fig (4) Prototype 2
Prototype 2 is a 5-inch Android smartphone with the same Tango hardware as the tablet.
Fig (5) A simple overview of the components of a Tango phone
Google's Project Tango is a smartphone equipped with a variety of cameras and vision sensors that provides a whole new perspective on the world around it. The Tango smartphone can capture a
wealth of data never before available to application developers, including depth, object tracking, and instantaneous 3D mapping - and it is about as powerful and as big as a typical smartphone.
Project Tango is different from other emerging 3D-sensing computer vision products, such as Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or tablet and is chiefly concerned with determining the device's position and orientation within the environment.
The high-end Android tablet has a 7-inch HD display, 4GB of RAM, 128GB of internal SSD storage and an NVIDIA Tegra K1 graphics chip (the first device in the US, and the second in the world, to carry that chip) featuring a desktop GPU architecture. It also has a distinctive design, with an array of cameras and sensors near the top and a couple of subtle grips on the sides.
Movidius, the company that developed some of the technology used in Tango, has been working on computer vision technology for the past seven years. It developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone a level of computer vision and tracking that formerly required much larger equipment.
The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB/IR sensor, plus a lower-resolution image-tracking camera. Together, these image sensors give the smartphone a human-like perspective on the world, complete with 3D awareness and an awareness of depth. They supply information to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The phone also contains a motion-tracking camera, which is used to keep track of all the motions made by the user.
3. SMARTPHONE SPECIFICATION
Tango sets out to deconstruct reality, taking a quarter million 3D measurements each second to create a real-time 3D model that describes the physical depth of its surroundings. The smartphone's specifications include:
· a Snapdragon 800 quad-core CPU running at up to 2.3 GHz per core
· 2GB or 4GB of memory
· expandable 64GB or 128GB of internal storage
· a nine-axis accelerometer/gyroscope/compass
· Mini-USB, Micro-USB, and USB 3.0 ports
In addition, Tango's specs include a rear-facing 4-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-of-view front-facing camera, and a 320 x 180 depth sensor - plus a vision processor with one teraflop of compute power. Project Tango uses a 3000 mAh battery.
4. HARDWARE
Project Tango is basically a camera and sensor array that happens to run on an Android phone. The smartphone is equipped with a variety of cameras and vision sensors that provide a whole new perspective on the world around it, capturing a wealth of data never before available to application developers, including depth, object tracking, and instantaneous 3D mapping - all in a device about as powerful and as big as a typical smartphone.
The front and back views of a Tango phone are shown below. It looks much like any other phone, but its array of cameras and sensors is what makes 3D modelling of the environment possible.
Fig (6) Tango Phone Front View
The device tracks its 3D motion and creates a 3D model of the environment around it using this array of cameras and sensors. The phone emits pulses of infrared light from its IR projector and records how they are reflected back, allowing it to build a detailed depth map of the surrounding space. Three cameras together capture a 120-degree wide-angle field of view.
A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they record a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension in addition to the standard 2D data. A rear-facing four-megapixel RGB/infrared camera, a 180-degree field-of-view fisheye rear-facing camera, a 120-degree field-
of-view front-facing camera, and a 320 x 180 depth sensor together make up the components at the rear of the phone that work together to recover the 3D structure of the scene.
Fig (7) Tango Phone Back View
As described above, these sensors and cameras - the standard 4-megapixel camera with its combination RGB/IR sensor, the lower-resolution image-tracking camera, and the motion-tracking camera that follows the user's movements - supply their data to Movidius's custom Myriad 1 low-power computer-vision processor, which processes it and feeds it to apps through a set of APIs. The motherboard that carries all of these components is shown below.
Fig (8) Tango phone Motherboard
· Elpida FA164A1PB 2 GB LPDDR3 RAM, layered above a Qualcomm 8974 (Snapdragon 800) processor. (RED)
· Two Movidius Myriad 1 computer vision co-processors. (ORANGE)
· Two AMIC A25L016 16 Mbit low-voltage serial flash memory ICs. (YELLOW)
· InvenSense MPU-9150 9-axis gyroscope/accelerometer/compass MEMS motion-tracking device. (GREEN)
· Skyworks 77629 multimode multiband power amplifier module for quad-band GSM/EDGE. (BLUE)
· PrimeSense PSX1200 Capri PS1200 3D sensor SoC. (VIOLET)
The board also carries a Winbond W25Q16CV 16 Mbit SPI flash memory.
Internally, the Myriad 2 - the successor to the Myriad 1 used here - consists of 12 128-bit vector processors called Streaming Hybrid Architecture Vector Engines (SHAVEs), which run at 600MHz. The Myriad 2 chip gives five times the SHAVE performance of the Myriad 1, and its SIPP engines are 15x to 25x more powerful than the first-generation chip's.
As its main camera, the Tango phone uses OmniVision's OV4682 - the eye of Project Tango's
mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures high-resolution images and video as well as IR information, enabling depth analysis.
Fig (9) Front and Rear Camera
Fig (10) Fisheye Camera
Fig (11) IR Projector and Integrated Depth Sensor
5. TECHNOLOGY BEHIND TANGO
5.1 TANGO'S SENSORS
Tango's sensing is built around the Myriad 1 vision processor platform developed by Movidius. The sensors allow the device to make over a quarter million 3D measurements every second, updating its position and orientation in real time and combining that data into a single 3D model of the space around you. Movidius, which has been working on computer vision technology for the past seven years, developed the processing chips used in Project Tango; Google paired them with sensors and cameras to give the smartphone a level of computer vision and tracking that formerly required much larger equipment.
5.2 IMAGE SENSORS
The image sensors give the smartphone a human-like perspective on the world, complete with 3D awareness and an awareness of depth. The information they gather is supplied to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The motion-tracking camera keeps track of all the motions made by the user. Three cameras capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back.
The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB/IR sensor and a lower-resolution image-tracking camera. Its depth-sensing array consists of an infrared projector, the 4MP rear camera and a fisheye lens with a 180-degree field of view. The phone emits pulses of infrared light from the IR projector and records how they are reflected back, allowing it to build a detailed depth map of the surrounding space. The data collected from the sensors and cameras is processed by the Myriad vision processor to deliver the 3D structure of the view to apps.
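To make the depth-mapping idea concrete, the sketch below shows the classic structured-light triangulation that sensors of this kind rely on: a projected IR pattern is observed by a camera offset from the projector by a known baseline, and the apparent shift (disparity) of each pattern element gives its depth as z = f · b / d. This is a minimal illustration of the general technique under assumed numbers, not Tango's actual proprietary pipeline; the focal length, baseline, and disparities are hypothetical.

// Illustrative structured-light depth recovery: depth from disparity.
// All numbers are hypothetical; Tango's real pipeline is proprietary.
public class StructuredLightDepth {
    static final double FOCAL_PX = 580.0;   // assumed focal length in pixels
    static final double BASELINE_M = 0.075; // assumed projector-camera baseline in metres

    // Depth (metres) of a pattern element whose projected position
    // shifted by 'disparityPx' pixels between projector and camera views.
    static double depthFromDisparity(double disparityPx) {
        return (FOCAL_PX * BASELINE_M) / disparityPx;
    }

    public static void main(String[] args) {
        // Larger disparity -> closer object; smaller disparity -> farther.
        for (double d : new double[] {60.0, 30.0, 12.0}) {
            System.out.printf("disparity %.0f px -> depth %.2f m%n",
                    d, depthFromDisparity(d));
        }
    }
}

Running the triangulation over every element of the projected pattern is what yields the dense depth map described above.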
6. WORKING CONCEPT
Project Tango devices combine the camera, gyroscope and accelerometer to estimate motion in six degrees of freedom, giving developers the ability to track the 3D motion of a device while simultaneously creating a map of the environment.
An IR projector provides infrared light that the other (non-RGB) cameras can use to get a sense of an area in 3D space. The phone emits pulses of infrared light from the IR projector and records how they are reflected back, allowing it to build a detailed depth map of the surrounding space. Three cameras capture a 120-degree wide-angle field of view from the front, and an even wider 180-degree span from the back. A 4MP color camera sensor can also be used for snapping regular pictures.
A 3D camera captures the 3D structure of a scene. Most cameras are 2D, meaning they record a projection of the scene onto the camera's imaging plane; any depth information is lost. In contrast, a 3D camera also captures the depth dimension in addition to the standard 2D data.
As its main camera, the Tango uses OmniVision's OV4682, the eye of Project Tango's mobile device. The OV4682 is a 4MP RGB-IR image sensor that captures high-resolution images and video as well as IR information, enabling depth analysis. Its 2-micron OmniBSI-2 pixel delivers excellent signal-to-noise ratio, IR sensitivity, and best-in-class low-light sensitivity, and the sensor records full-resolution 4MP images and video in a native 16:9 format at 90 frames per second (fps), with a quarter of the pixels dedicated to capturing IR. The 1/3-inch sensor can also record 1080p high-definition (HD) video at 120 fps with electronic image stabilization (EIS), or 720p HD at 180 fps. The OV4682's architecture and pixel optimization bring not only strong IR performance but also best-in-class image quality. Its dual RGB and IR capabilities enable a host of additional features for mobile and machine vision applications, including gesture sensing, depth analysis, iris detection and eye tracking. A second sensor, the OV7251 camera chip, captures VGA-resolution video at 100 fps using a global shutter.
The other camera's fisheye lens enables a 180° field of view, while its sensor balances resolution and frame rate to record black-and-white images for motion tracking. If the user moves the device left or right, the device draws the path it has travelled and displays it in real time, giving the device motion-capture capabilities. The device also has a depth sensor.
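The six degrees of freedom mentioned above are commonly represented as a 3-vector translation plus a unit quaternion for orientation. The sketch below is a minimal, hypothetical illustration of that representation - the class and field names are ours, not Tango's - showing how a pose maps a point from the device frame into the world frame.

// Hypothetical 6-DOF pose: translation (x, y, z) + orientation quaternion.
// Names are illustrative; they are not the Tango API's own types.
public class Pose6Dof {
    final double tx, ty, tz;      // translation in metres
    final double qx, qy, qz, qw;  // unit quaternion (orientation)

    Pose6Dof(double tx, double ty, double tz,
             double qx, double qy, double qz, double qw) {
        this.tx = tx; this.ty = ty; this.tz = tz;
        this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
    }

    // Transform a point from the device frame to the world frame:
    // rotate by the quaternion, then add the translation.
    double[] transform(double px, double py, double pz) {
        double x = (1 - 2*qy*qy - 2*qz*qz)*px + (2*qx*qy - 2*qz*qw)*py + (2*qx*qz + 2*qy*qw)*pz;
        double y = (2*qx*qy + 2*qz*qw)*px + (1 - 2*qx*qx - 2*qz*qz)*py + (2*qy*qz - 2*qx*qw)*pz;
        double z = (2*qx*qz - 2*qy*qw)*px + (2*qy*qz + 2*qx*qw)*py + (1 - 2*qx*qx - 2*qy*qy)*pz;
        return new double[] { x + tx, y + ty, z + tz };
    }
}

Tracking is then a matter of updating this pose a few hundred times per second as new sensor measurements arrive.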
Fig (12) The image represents the feed from the fisheye lens.
Fig (13) Computer Vision
The figure above illustrates depth sensing by displaying a distance heat map on top of what the camera sees, showing blue colors on distant objects and red colors on nearby objects.
The device also pairs the data from the image sensors with its standard motion sensors and gyroscopes to map out paths of movement to within 1 percent accuracy, and then plots them onto an interactive 3D map. It uses sensor fusion, which combines sensory data (or data derived from sensory data) from disparate sources such that the resulting information is in some sense better than would be possible if those sources were used separately - more precise, more comprehensive, or more reliable, as in stereoscopic vision.
These image sensors give the smartphone its 3D awareness and awareness of depth, supplying information to Movidius's custom Myriad 1 low-power computer-vision processor, which processes the data and feeds it to apps through a set of APIs. The motion-tracking camera keeps track of all the motions made by the user.
Mantis Vision, a developer of some of the world's most advanced 3D-enabling technologies, supplies the MV4D technology platform that is the core 3D engine behind Google's Project Tango. Mantis Vision provides the 3D sensing platform, consisting of flash projector hardware components and Mantis Vision's core MV4D technology, which includes structured-light-based depth-sensing algorithms that generate realistic, dense maps of the world. It aims to provide reliable estimates of the pose of the phone - its position and alignment - relative to its environment.
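As a concrete, deliberately simplified illustration of sensor fusion, the sketch below blends a fast-but-drifting gyroscope integration with a slow-but-absolute vision-based heading estimate using a complementary filter. This is a generic textbook technique under assumed inputs, not the fusion algorithm Tango actually ships.

// Minimal complementary-filter sensor fusion for a single heading angle.
// gyroRate: angular rate from the gyroscope (rad/s) - fast but drifts.
// visionHeading: absolute heading from visual tracking (rad) - slow but stable.
public class HeadingFusion {
    private double heading = 0.0;             // fused estimate, radians
    private static final double ALPHA = 0.98; // trust placed in the gyro path

    double update(double gyroRate, double visionHeading, double dtSeconds) {
        double gyroHeading = heading + gyroRate * dtSeconds; // integrate the gyro
        // Blend: mostly gyro (smooth), pulled slowly toward vision (no drift).
        heading = ALPHA * gyroHeading + (1.0 - ALPHA) * visionHeading;
        return heading;
    }

    public static void main(String[] args) {
        HeadingFusion fusion = new HeadingFusion();
        // Simulated input: the gyro reports a small bias, vision says heading is 0.
        for (int i = 0; i < 100; i++) {
            fusion.update(0.01 /* biased rate */, 0.0, 0.02 /* 50 Hz */);
        }
        System.out.printf("fused heading after 2 s: %.4f rad%n", fusion.heading);
    }
}

The fused estimate stays close to the vision reference even though the raw gyro path would drift, which is exactly the "better than either source alone" property the paragraph above describes.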
7. PROJECT TANGO CONCEPTS
Project Tango differs from other emerging 3D-sensing computer vision products, such as Microsoft HoloLens, in that it's designed to run on a standalone mobile phone or tablet and is chiefly concerned with determining the device's position and orientation within the environment. The software works by integrating three types of functionality:
7.1 Motion Tracking
Motion tracking allows a device to understand its position and orientation using Project Tango's custom sensors, giving you real-time information about the 3D motion of the device. Visual features of the environment are used, in combination with accelerometer and gyroscope data, to closely track the device's movements in space. Project Tango's core functionality is measuring movement through space and understanding the area moved through. Google's APIs provide the position and orientation of the user's device in full six degrees of freedom, referred to as its pose; a minimal sketch of requesting poses through the Java API follows below.
Fig (14) Motion Tracking
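The sketch below shows how an app might request pose updates through the Tango Java API as it appeared in Google's developer documentation. It is reconstructed from memory of that documentation, so treat the class and constant names (Tango, TangoConfig, TangoPoseData, TangoCoordinateFramePair, OnTangoUpdateListener) as approximate rather than authoritative, and verify them against the docs before use.

// Sketch: requesting 6-DOF poses via the Tango Java API.
import java.util.ArrayList;
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import com.google.atap.tangoservice.TangoCoordinateFramePair;
import com.google.atap.tangoservice.TangoPoseData;

public class MotionTrackingSketch {
    void startTracking(android.content.Context context) {
        Tango tango = new Tango(context);
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        tango.connect(config);

        // Ask for the device's pose relative to where the service started.
        ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
        framePairs.add(new TangoCoordinateFramePair(
                TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                TangoPoseData.COORDINATE_FRAME_DEVICE));

        tango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
            @Override public void onPoseAvailable(TangoPoseData pose) {
                // pose.translation = {x, y, z}; pose.rotation = quaternion {x, y, z, w}
                android.util.Log.d("Tango", "x=" + pose.translation[0]
                        + " y=" + pose.translation[1] + " z=" + pose.translation[2]);
            }
            @Override public void onXyzIjAvailable(
                    com.google.atap.tangoservice.TangoXyzIjData xyzIj) { /* unused here */ }
            @Override public void onFrameAvailable(int cameraId) { /* unused here */ }
            @Override public void onTangoEvent(
                    com.google.atap.tangoservice.TangoEvent event) { /* unused here */ }
        });
    }
}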
7.2 Area Learning
Using area learning, a Project Tango device can remember the visual features of the area it is moving through and recognize when it sees those features again. These features can be saved in an Area Description File (ADF) for later use, shared with other Project Tango devices, and enhanced with metadata such as notes, instructions, or points of interest. Using these visual cues, Project Tango devices can self-correct errors in motion tracking and relocalize in areas they have seen before; with an ADF loaded, a device gains drift correction, also described as improved motion tracking. A sketch of enabling area learning through the Java API follows below.
Fig (15) Area Learning
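Here is a sketch of how area learning might be switched on and an ADF saved through the Java API, again reconstructed from the public developer documentation of the time. KEY_BOOLEAN_LEARNINGMODE, KEY_STRING_AREADESCRIPTION, and saveAreaDescription are recalled names, so treat them as assumptions to verify rather than a definitive implementation.

// Sketch: enable area learning, save an ADF, and reload it on a later run.
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;

public class AreaLearningSketch {
    String learnAndSave(Tango tango) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_LEARNINGMODE, true); // record features
        tango.connect(config);
        // ... walk the device around the space so it can learn the area ...
        return tango.saveAreaDescription(); // returns the new ADF's UUID
    }

    void reload(Tango tango, String adfUuid) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        // Load the saved ADF so the device can relocalize and correct drift.
        config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION, adfUuid);
        tango.connect(config);
    }
}

The returned UUID is what an app stores (or shares with another device) so that the same area description can be reloaded later.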
7.3 Depth Perception
Project Tango devices are equipped with integrated 3D sensors that measure the distance from the device to objects in the real world. This configuration gives good depth readings at a distance while balancing the power requirements of infrared illumination and depth processing, and the depth data allows an application to understand the distance of visible objects from the device. Current devices are designed to work best indoors at moderate distances (0.5 to 4 metres), so they may not be ideal for close-range object scanning. Because the technology relies on viewing infrared light with the device's camera, there are situations where accurate depth perception is difficult: areas lit with light sources high in IR, such as sunlight or incandescent bulbs, or objects that do not reflect IR light, cannot be scanned well. By combining depth perception with motion tracking, you can also measure distances between points in an area that aren't in the same frame.
Fig (16) Depth Perception
Together, these three capabilities generate data about the device in "six degrees of freedom" (3 axes of orientation plus 3 axes of motion) and detailed three-dimensional information about the environment. Applications on mobile devices use Project Tango's C and Java APIs to access this data in real time. In addition, an API is provided for integrating Project Tango with the Unity game engine; this enables the rapid conversion or creation of games that allow the user to interact with and navigate the game space by moving and rotating a Project Tango device in real space. These APIs are documented on the Google developer website.[6] A sketch of receiving depth data through the listener interface follows below.
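To round out the three concepts, this sketch reads the depth sensor's point cloud through the same listener interface used for poses above. In early releases the callback was onXyzIjAvailable and delivered packed XYZ triples in a float buffer; the field and constant names are again recalled from the developer docs and should be treated as approximate.

// Sketch: reading the depth point cloud (early "XyzIj" form of the API).
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import com.google.atap.tangoservice.TangoXyzIjData;

public class DepthSketch {
    void enableDepth(Tango tango) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_DEPTH, true); // turn on the depth sensor
        tango.connect(config);
    }

    // Called from OnTangoUpdateListener.onXyzIjAvailable(...)
    void handleDepth(TangoXyzIjData xyzIj) {
        // xyz is a FloatBuffer of packed (x, y, z) points in metres,
        // expressed in the depth camera's frame.
        for (int i = 0; i < xyzIj.xyzCount; i++) {
            float x = xyzIj.xyz.get(3 * i);
            float y = xyzIj.xyz.get(3 * i + 1);
            float z = xyzIj.xyz.get(3 * i + 2);
            // e.g. keep points within 4 m, the sensor's nominal useful range
            if (z < 4.0f) { /* use the point */ }
        }
    }
}

Transforming each point by the device pose current at capture time (as in the pose sketch earlier) is what lets points from different frames be merged into one model, enabling the cross-frame measurements described above.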
8. DEVICES DEVELOPED SO FAR
As a platform for software developers and a model for device manufacturers, Google has created two Project Tango devices to date.
The Yellowstone tablet
"Yellowstone" is a 7-inch tablet with full Project Tango functionality, released in June 2014 and sold as the Project Tango Tablet Development Kit.[7] It features a 2.3 GHz quad-core Nvidia Tegra K1 processor, 128GB of flash memory, a 1920x1200-pixel touchscreen, a 4MP color camera, a fisheye-lens (motion-tracking) camera, integrated depth sensing, and 4G LTE connectivity. The device is sold through the official Project Tango website[8] and the Google Play Store.
The Peanut phone
"Peanut" was the first production Project Tango device, released in the first quarter of 2014. It was a small Android phone with a Qualcomm MSM8974 quad-core processor and additional special hardware, including a fisheye-lens camera (for motion tracking), an "RGB-IR" camera (for color images and infrared depth detection), and Movidius image-processing chips. A high-performance accelerometer and gyroscope were added after testing several competing models in the MARS lab at the University of Minnesota.
Several hundred Peanut devices were distributed to early-access partners, including university researchers in computer vision and robotics as well as application developers and technology companies. Google stopped supporting the Peanut device in September 2015, as by then the Project Tango software stack had evolved beyond the versions of Android that run on the device.
Testing by NASA
In May 2014, two Peanut phones were delivered to the International Space Station as part of a NASA project to develop autonomous robots that navigate in a variety of environments, including outer space. The soccer-ball-sized, 18-sided polyhedral SPHERES robots were developed at the NASA Ames Research Center, adjacent to the Google campus in Mountain View, California. Andres Martinez, SPHERES manager at NASA, said, "We are researching how effective Project Tango's vision-based navigation abilities are for performing localization and navigation of a mobile free flyer on ISS."
9. FUTURE SCOPE
Project Tango seeks to take the next step in this mapping evolution. Instead of depending on the infrastructure, expertise, and tools of others to provide maps of the world, Tango empowers users to build their own understanding, all with a phone. Imagine knowing your exact position to within inches. Imagine building 3D maps of the world in parallel with other users around you. Imagine being able to track not just the top-down location of a device, but its full 3D position and alignment. The technology is ambitious, and the potential applications are powerful.
The Tango device truly enables augmented reality, which opens a whole frontier for playing games in the scenery around you. You can capture the room and then render a scene that includes the room but also adds characters and objects, creating games that operate in your natural environment. The applications go beyond gaming: imagine seeing what a room would look like decorated with different types of furniture and walls, rendered as a very realistic scene. The technology could guide the visually impaired with auditory cues about where they are going, help soldiers replicate a war zone and prepare for combat, or simply let people live out their own creative fantasies. The possibilities for this amazing technology are endless, and its future is looking very bright.
Things Project Tango can do:
DIRECTIONS: Provide directions inside a building or structure where current mapping solutions simply fall short. For shoppers who like to get in and out as quickly as possible, an indoor map of the store in hand could make trips more efficient by leading them directly to the right shelf.
EMERGENCY RESPONSE: Help emergency response workers such as firefighters find their way through buildings by projecting the blueprints onto the screen. It has the potential to provide valuable information in situations where knowing the exact layout of a room can be a matter of life or death.
AUGMENTED REALITY GAMING: Combine room mapping with augmented reality. Imagine competing against a friend for control over territories in your own home with your own miniature army. Mapping in-game textures onto your real walls through the smartphone would arguably produce the best game of Cops and Robbers in history.
MODELLING OBJECTS: Scan real-world objects into 3D models.
Fig (17) Modelling objects using Project Tango
10. CONCLUSION
Project Tango enables apps to track a device's position and orientation within a detailed 3D environment, and to recognize known environments. This makes possible applications such as in-store navigation, visual measurement and mapping utilities, presentation and design tools, and a variety of immersive games.
At this moment Tango is just a project, but it is developing rapidly, with early prototypes and development kits already distributed among many developers. It is now up to developers to create clever and innovative apps that take advantage of this technology. This is just the beginning, and there is a lot of work to do to fine-tune this amazing technology. Thus, if Project Tango works - and we've no reason to suspect it won't - it could prove every bit as revolutionary as Maps, Earth, or Android. It just might take a while for its true genius to become clear.
11. REFERENCES
[1] Announcement on the ATAP Google+ site, 30 January 2015.
[2] "Future Phones Will Understand, See the World", 3 June 2015. Retrieved 4 November 2015.
[3] "Slamdance: inside the weird virtual reality of Google's Project Tango", 29 May 2015.
[4] "Qualcomm Powers Next Generation Project Tango Development Platform", 29 May 2015.
[5] "IDF 2015: Intel teams with Google to bring RealSense to Project Tango", 18 August 2015.
[6] Google developer website: https://developers.google.com/project-tango/
[7] Product announcement on the ATAP Google+ page, 5 June 2014. Retrieved 4 November 2015.
[8] Project Tango website: https://www.google.com/atap/project-tango/