Unii-3D-Input-Device Date:5/2/2016
1. Field of Technology: 3D Tracking/Gesture Device
In short, we are confident that we can develop, at relatively low cost, a new 3D input device (man-machine interface) that delivers the same kind of performance shown in the film Minority Report (http://tw.youtube.com/watch?v=Kz_vWtAvk1k).
2. Current Status (A Lab Setup Comprising Existing Commercial Components)
(1) Current Setup
• Line cameras (USB): each line camera uses one line CMOS/CCD sensor
• Algorithm (Windows XP): C++ code for data acquisition, data processing, lens calibration, image correspondence, positioning calculations and gesture recognition
• Light source, rackets (3 IR LEDs): a general simulator for racket-based sports
• Light source, glove (3 IR LEDs): a glove for gesture applications
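As a rough sketch of the positioning calculation, two line cameras separated by a known baseline can triangulate a point light source from the parallax between their 1D measurements. The pinhole geometry, parameter names and values below are illustrative assumptions, not the actual Unii implementation:

```cpp
#include <cmath>

struct Point2D { double x; double z; };

// Triangulate a point light source seen by two parallel line cameras.
// xl, xr: spot positions on the left/right sensors (mm, measured from
// each camera's optical axis); f: focal length (mm); baseline: camera
// spacing (mm).  Depth follows from similar triangles.
Point2D triangulate(double xl, double xr, double f, double baseline) {
    double disparity = xl - xr;           // parallax between the two views
    double z = f * baseline / disparity;  // depth of the source
    double x = xl * z / f;                // lateral position of the source
    return {x, z};
}
```

For example, an LED at (50, 500) mm with f = 10 mm and a 100 mm baseline appears at +1 mm on the left sensor and −1 mm on the right, and the function recovers (50, 500).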
(2) Number of Patent Applications: 6 (filed and pending)
• Claims cover: the optical principle; the core implementation for simultaneous positioning and tracking of multiple points; optimization of key components; the 3D gesture architecture; applications as a virtual input device and simulator; and interface protocols
• Filing regions: mainly Taiwan, the USA and Japan
(3) Current Performance
For details on the current performance of our 3D input device, please refer to the following web pages:
• Major features and specifications: http://www.unii.com.tw/3d_input_device/3id_product.htm
• Video demo: http://www.unii.com.tw/3d_input_device/3id_demo.htm
• Comparison with other 3D solutions: http://www.unii.com.tw/3d_input_device/3id_comparison.htm
3. Future Performance (Key Components Being Optimized)
We believe our upcoming 3D input device is currently the only one that can satisfy all of the following requirements (our conditions for commercialization):
(1) Multiple-finger (or point) tracking in 3D space with absolute position: up to 3 points per hand (3 × 2 points)
(2) Fast sampling rate: up to 1 kHz
(3) Large detectable volume: larger than 2 m × 2 m × 2 m
(4) High spatial resolution: better than 0.1 mm throughout the detectable space
(5) Low manufacturing cost: as low as a few tens of US dollars
4. More Resources for Commercialization and Advanced Research
We need additional resources and financial support to commercialize the current results and to pursue more advanced research on this tracking technology. The final goal is an ultimate 3D capturing and positioning device that requires the user to wear no lighting components (LEDs) at all.
Patent #1
Filing Date: March 12, 2008
Application No.: 12/047,159
METHOD OF RECOGNIZING AND TRACKING A SPATIAL POINT
FIELD OF THE INVENTION
[0001] The present invention relates to a method of recognizing and tracking a
spatial point, and more particularly to a method of using a point light source and a
spatial point recognition device to measure the coordinates of the point light source
and the coordinates of the convergent point of the spatial point recognition device
based on the principle of parallax of human eyes, so as to achieve the purpose of
recognizing the position of a spatial point. Further, the spatial point recognition
device is capable of moving the convergent point, such that the coordinates of the
convergent point are superimposed onto the coordinates of the point light source,
so as to achieve the purpose of tracking a spatial point automatically. At the same
time, the spatial point recognition device can also receive the coordinates of a new
convergent point to reset the position of the convergent point, so as to achieve the
purpose of resetting the convergent point.
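The tracking behavior described in this abstract can be sketched as a simple servo loop: on each cycle the device moves its convergent point toward the last measured source coordinates, and with unit gain the convergent point is superimposed exactly onto the source. This is an illustrative control sketch, not the patented mechanism:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// One tracking update: move the convergent point a fraction `gain` of
// the way toward the measured point-light-source position.  gain = 1
// superimposes the convergent point onto the source in a single step;
// smaller gains give a smoother, filtered pursuit.
Vec3 track_step(const Vec3& convergent, const Vec3& source, double gain) {
    return { convergent.x + gain * (source.x - convergent.x),
             convergent.y + gain * (source.y - convergent.y),
             convergent.z + gain * (source.z - convergent.z) };
}
```

Resetting the convergent point, as the abstract describes, simply amounts to overwriting the current convergent coordinates with externally supplied ones before the next update.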
Patent #2
Filing Date: April 18, 2008
Application No.: 12/105,630
Method of recognizing and tracking multiple spatial points
FIELD OF THE INVENTION
[0002] The present invention relates to a method of recognizing and tracking multiple
spatial points, and more particularly to a method of measuring coordinates of a plurality
of point light sources by an optical system comprised of a plurality of 1D optical lens
modules and a logical analysis method to achieve the purpose of recognizing and tracking
multiple spatial points.
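With only two 1D views, N light sources can produce up to N² candidate intersections, most of them "ghost" points; an additional 1D module can reject candidates whose predicted sensor coordinate matches no measured spot. The sketch below assumes a simple pinhole model with line cameras along one axis; it illustrates the idea, not the patented logical analysis method:

```cpp
#include <vector>
#include <cmath>

struct Candidate { double x; double z; };  // candidate position in the camera plane (mm)

// Keep only candidates whose projection into a third line camera
// (focal length f, mounted at lateral offset `offset` from the origin)
// lands within `tol` of some actually measured spot coordinate.
std::vector<Candidate> disambiguate(const std::vector<Candidate>& candidates,
                                    const std::vector<double>& measured,
                                    double f, double offset, double tol) {
    std::vector<Candidate> confirmed;
    for (const auto& c : candidates) {
        double predicted = f * (c.x - offset) / c.z;  // pinhole projection
        for (double m : measured) {
            if (std::fabs(m - predicted) < tol) {
                confirmed.push_back(c);
                break;  // one matching spot is enough to keep the candidate
            }
        }
    }
    return confirmed;
}
```

For two real sources at (±50, 500) mm and one ghost at (0, 500) mm, a third camera at a 200 mm offset with f = 10 mm sees spots only at the coordinates predicted for the real sources, so the ghost is discarded.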
Patent #3
Filing Date: May 8, 2008
Application No.: 12/117,124
THREE-DIMENSIONAL MOUSE APPARATUS
BACKGROUND OF THE INVENTION
1. Field of the invention
The present invention relates to a 3D mouse apparatus, and more particularly to an apparatus comprising a point light source device installed on the user's hands or fingers, together with a 3D gesture reading and recognition device that calculates, analyzes, recognizes and outputs 3D gestures, including physical quantities such as the 3D position coordinates, displacement, velocity and acceleration of the point light source device and the moving behavior of the human hand, so as to realize a 3D mouse apparatus.
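Given positions sampled at a fixed rate, the displacement, velocity and acceleration quantities listed above can be estimated by finite differences. A minimal sketch, assuming evenly spaced samples (e.g. 1 ms apart at the 1 kHz rate mentioned earlier):

```cpp
#include <cmath>

// Backward-difference velocity between two consecutive position samples
// taken dt seconds apart.
double velocity(double x_prev, double x_curr, double dt) {
    return (x_curr - x_prev) / dt;
}

// Central-difference acceleration from three consecutive samples.
double acceleration(double x_prev, double x_curr, double x_next, double dt) {
    return (x_next - 2.0 * x_curr + x_prev) / (dt * dt);
}
```

For samples of x(t) = t² taken 1 ms apart, the acceleration estimate returns 2, matching the exact second derivative; in practice the raw differences would be low-pass filtered to suppress sensor noise.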
Patent #4
Filing Date: Aug. 1, 2008
Application No.: 12/184,475
Three-dimensional virtual input and simulation apparatus
FIELD OF THE INVENTION
[0003] The present invention relates to a three-dimensional virtual input and simulation
apparatus, and more particularly to an apparatus comprising a plurality of point light sources, a
plurality of optical positioning devices with a visual axis tracking function and a control analysis
procedure. The invention is characterized in that the plurality of optical positioning devices with
the visual axis tracking function are provided for measuring and analyzing 3D movements of the
plurality of point light sources to achieve the effect of a virtual input and simulator.
Patents #5 and #6 were filed in Taiwan only.
Possible Applications