Vipul divyanshu documentation  on Kinect and Motion Tracking

Skeletal Tracking and Facial Tracking and Animation
Project Documentation
Vipul Divyanshu, IIL/2012/14, Summer Internship
Mentor: Imagineer India Innovation Labs

Tasks at hand:
  - Skeletal tracking
  - Facial tracking
  - Skeletal animation and integration with the model (additional)

Tools explored: OpenNI, Microsoft Kinect SDK, Ogre 3D, Unity, Blender

Analysis of the tools and what was explored:

OpenNI:
OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications utilizing Natural Interaction. The OpenNI APIs are composed of a set of interfaces for writing NI applications. The main purpose of OpenNI is to form a standard API that enables communication with both:
  - Vision and audio sensors (the devices that "see" and "hear" the figures and their surroundings).
  - Vision and audio perception middleware (the software components that analyse the audio and visual data recorded from the scene and make sense of it). For example, software that receives visual data, such as an image, and returns the location of a palm detected within the image.
OpenNI supplies one set of APIs to be implemented by the sensor devices and another set to be implemented by the middleware components.
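The split between the two API layers can be pictured with a small, purely illustrative sketch (Python here for brevity; this is not the real OpenNI API, and every class and method name below is invented): a sensor-side object produces raw frames, and a middleware-side object analyses them, e.g. locating a palm.

```python
# Illustrative only: stand-ins for OpenNI's two API layers.
# None of these names exist in OpenNI itself.

class DepthSensor:
    """Stands in for a device-side producer (e.g. a depth camera)."""
    def read_frame(self):
        # A real sensor would return a depth map; a tiny stub is used here.
        return [[0, 1], [2, 3]]

class HandDetector:
    """Stands in for a middleware component analysing the frame."""
    def locate_palm(self, frame):
        # Pretend the palm sits at the largest depth value in the frame.
        best = max((v, (r, c)) for r, row in enumerate(frame)
                               for c, v in enumerate(row))
        return best[1]

sensor = DepthSensor()
palm = HandDetector().locate_palm(sensor.read_frame())
```

The point of the split is that either side can be swapped out independently, which is exactly what the standard API is for.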
A major plus point of OpenNI is that skeletal tracking can also be run on recorded Kinect video in the .oni format.

Constraints of OpenNI:
For an easier start to the project, the Microsoft Kinect SDK proved more useful for full-body skeletal tracking and its animation in C#. OpenNI cannot easily load a model directly from Maya or Blender; an engine such as Ogre 3D must be used instead.

Microsoft Kinect SDK:
The Microsoft Kinect SDK comes with classes for skeletal tracking, and it is much easier to initialise the Kinect and get the skeletal joint points in C# simply by reading the joints of the tracked skeleton. For development in C++, the Microsoft Kinect SDK and OpenNI stand at the same level. I developed in C# and found XNA and the Microsoft Kinect SDK very useful both for getting the skeletal data and for interacting with a 3D model in .fbx format. Exploring the Kinect SDK for facial animation is still left, as it was not part of my work package.

FBX:
The .fbx format used with C# is very flexible and has no problems with object collision.

Constraints:
  - Working directly with data recorded by Kinect Studio (.xed) has not been well explored, owing to the lack of material online, and that can be restricting.
  - In facial tracking it cannot be detected whether the eyes are open or not; all other features are detected properly.

Ogre 3D:
Ogre can be used for animation in sync with OpenNI or the Kinect SDK. I did not explore it much, but it has a vast amount of potential for facial animation.

Constraints:
It requires a fairly detailed study of the model and the class hierarchy to perform the animation; it is easier for an experienced person.

Things learnt and coded:

OpenNI:
I learnt the basics of extracting the skeletal joints, how to work with the recorded data, and the class hierarchy of OpenNI and its attributes, and understood the data flow. I wrote and analysed the code for getting the skeletal data of the user.
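In both SDKs, a tracked joint eventually has to be mapped from camera space onto the screen. A minimal sketch of that mapping, assuming the joint arrives as a normalized (x, y) in [-1, 1] (the real project code was C# and used the SDK's own mapping helpers, so this Python function is only an illustration of the idea):

```python
# Hypothetical sketch; the actual project code was C# with the Kinect SDK.
# A tracked joint is assumed to arrive as a normalized (x, y) in [-1, 1].

def joint_to_screen(x, y, width, height):
    """Map a normalized joint coordinate in [-1, 1] to pixel space."""
    px = int((x + 1.0) / 2.0 * (width - 1))
    # Camera y grows upward while screen y grows downward, so flip it.
    py = int((1.0 - y) / 2.0 * (height - 1))
    return px, py
```

For example, a joint at the centre of the camera's view lands near the centre of a 640x480 window.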
Microsoft Kinect SDK:
I learnt the basics of extracting the skeletal joints, how to work with the recorded data, and the class hierarchy of the Microsoft Kinect SDK and its attributes, and understood the data flow; I then learnt C#. Using the Kinect SDK I wrote a few code blocks to extend the samples, and three complete programs for skeletal tracking and user interface. I wrote the code for controlling the cursor and for colouring the detected player in a different colour; screenshots can be found below. I understood how to load a 3D model into C# and control it with user motion, and I locked the default pose at the knee so the model does not slip and has a more natural motion flow. I also learnt the sample for facial tracking and understood the data structure for getting the facial points.

Ogre:
I learnt the data flow of Ogre and explored just the tip of the iceberg, as my part was only the tracking.

Initial software development kits (SDKs) suggested: OpenNI, Ogre and Maya.

Installation of OpenNI

On Ubuntu:
First, the following requirements need to be installed.

Requirements:
1. GCC 4.x ( http://gcc.gnu.org/releases.html )
     sudo apt-get install g++
2. Python 2.6+/3.x ( http://www.python.org/download/ ) - this may already be installed, depending on the Linux distro being used.
     sudo apt-get install python
3. LibUSB 1.0.8 ( http://sourceforge.net/projects/libusb/ )
     sudo apt-get install libusb-1.0-0-dev
4. FreeGLUT3 ( http://freeglut.sourceforge.net/index.php#download )
     sudo apt-get install freeglut3-dev
5. JDK 6.0 ( http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u26-download-400750.html )
     sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner"
     sudo apt-get update
     sudo apt-get install sun-java6-jdk

Optional requirements (to build the documentation):
1. Doxygen ( http://www.stack.nl/~dimitri/doxygen/download.html#latestsrc )
     sudo apt-get install doxygen
2. GraphViz ( http://www.graphviz.org/Download_linux_ubuntu.php )
     sudo apt-get install graphviz

Optional requirements (to build the Mono wrapper):
1. Mono ( http://www.go-mono.com/mono-downloads/download.html )
     sudo apt-get install mono-complete

Download OpenNI modules:
Download the OpenNI modules appropriate for your operating system from http:// - the latest unstable binaries for these:
  - OpenNI binaries
  - OpenNI-compliant middleware binaries
  - OpenNI-compliant hardware binaries

For Ubuntu Linux 12.04 64-bit, the files will be (as of 19 June 2012):
  - openni-bin-dev-linux-x64-v1.5.4.0.tar
  - nite-bin-linux-x64-v1.5.2.21.tar.bz2
  - sensor-bin-linux-x64-v5.1.2.1.tar.bz2

Make a new folder called kinect:
     mkdir kinect
     cd kinect

Extract the downloaded files into it; three folders are now created:
  1. OpenNI-Bin-Linux64-v1.5.4.0
  2. Sensor-Bin-Linux64-v5.1.2.1
  3. Nite-
Rename the folders OpenNI, Sensor and Nite respectively.

Install OpenNI and Sensor Kinect (run sudo ./install.sh in each):
     cd kinect
     cd OpenNI
     sudo ./install.sh
(Every step should now show OK!)
     cd ../Sensor
     sudo ./install.sh
(Every step should now show OK!)

Install NITE:
     cd ../Nite
Enter the kinect/Nite/Data folder and edit each of the three XML files in there (use gedit), changing the key data from
     key=""
to
     key="0KOIk2JeIBYClPWVnMoRKn5cdY4="
then run:
     sudo ./install.sh
(Every step should now show OK!)

Test whether the install succeeded:
Try out some samples from OpenNI. Run NiViewer:
     cd ~/kinect/OpenNI/Samples/Bin/x64-Release/
     ./NiViewer
(If a Kinect is connected, this shows the depth map and image stream in a window.)

Download sample streams from OpenNI:
If a Kinect is not connected, you can run NiViewer on some pre-recorded .oni files from OpenNI (OpenNI sample streams recorded onto a file; extract them as skeletonrec.oni and MultipleHands.oni). Now run NiViewer from the ~/kinect/OpenNI/Samples/Bin/x64-Release/ folder with the .oni file as argument.
     ./NiViewer ~/skeletonrec.oni
(This shows a window with the sample: skeletonrec.oni in NiViewer.)

Sample program (Sample-NiUserTracker):
Run it in the same way as NiViewer:
     ./Sample-NiUserTracker ~/skeletonrec.oni
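As an aside to the NITE step above: instead of editing the three XML files with gedit, the key edit can also be scripted. A possible sketch, assuming the stock files really do contain key="" exactly as shipped:

```shell
# Sketch: insert the free PrimeSense key into each NITE data XML file.
# Assumes the files contain key="" before the edit.
KEY='0KOIk2JeIBYClPWVnMoRKn5cdY4='
for f in kinect/Nite/Data/*.xml; do
    [ -e "$f" ] || continue            # skip cleanly if nothing matches
    sed -i "s/key=\"\"/key=\"$KEY\"/" "$f"
done
```

Re-running the script is harmless, since the pattern key="" no longer matches once the key is in place.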
(skeletonrec.oni in Sample-NiUserTracker)

Installation on Windows:
OpenNI and NITE installation can be painful if not done properly. Let's do it step by step.

Step 0
Uninstall any previous drivers, such as CLNUI. See the end of this post for how to keep multiple drivers installed.

Step 1
Download the Kinect drivers and unzip the archive. Open the unzipped folder and navigate to the Bin folder. Run the .msi Windows file. The drivers are now installed on your PC.

Step 2
Download and install the latest stable or unstable OpenNI binaries from the OpenNI website.

Step 3
Download and install the latest stable or unstable OpenNI-compliant middleware binaries (NITE) from the OpenNI website. During installation, provide the following (free) PrimeSense key: 0KOIk2JeIBYClPWVnMoRKn5cdY4=

Step 4
Download and install the latest stable or unstable OpenNI-compliant hardware binaries from the OpenNI website. Both stable and unstable releases have worked for me. If you have trouble installing the unstable releases, just try the stable ones.

Step 5
Plug in your Kinect device and connect its USB port to your PC. Wait until the driver software is found and applied. Navigate to the Device Manager (Control Panel). You should see something like the following:

Step 6
Navigate to C:\Program Files\OpenNI\Samples\Bin\Release (or C:\Program Files (x86)\OpenNI\Samples\Bin\Release) and try out the existing demo applications. Try the demos found in C:\Program Files\PrimeSense\NITE\Samples\Bin\Release (or C:\Program Files (x86)\PrimeSense\NITE\Samples\Bin\Release), too. If they work properly, then you are done. Congratulations!

Step 7
You have successfully installed Kinect on your Windows PC! Read the documentation and familiarize yourself with the OpenNI and NITE APIs. You'll find the proper assemblies in C:\Program Files\OpenNI\Bin (or C:\Program Files (x86)\OpenNI\Bin) and C:\Program Files\Prime Sense\NITE\Bin (or C:\Program Files (x86)\Prime Sense\NITE\Bin). OpenNI is the primary assembly you'll need when developing natural-user-interface applications.

Installation of Ogre:
For installing Ogre on Linux and Windows, the following link can be very useful:
http://www.ogre3d.org/tikiwiki/Installing+the+Ogre+SDK

With these suggested tools the Kinect can be exploited very well for skeletal tracking and its animation; as far as I understood, these are limited to C++. For starters, the Microsoft Kinect SDK is also very useful and a well-developed tool. To use the Kinect SDK, just follow the link:
http://www.microsoft.com/en-us/kinectforwindows/develop/developer-downloads.aspx

In the toolkit I went through the different examples and understood them and the code flow. I decided to do the project in C# rather than in C++, because in C# it is easily possible to import a Maya or Blender model in .fbx format, and the model can easily be transformed in terms of orientation and bone movement. I used Visual Studio with C# to track the skeleton. The model was imported and animated with the help of the Microsoft XNA Game Studio SDK, which makes importing the model easy. This was done using the SDK library called Microsoft.Kinect.
Other tools and SDKs installed and used:
  - Blender: http://www.blender.org/download/get-blender/
  - FBX Converter: http://usa.autodesk.com/adsk/servlet/pc/item?id=10775855&siteID=123112
  - Unity: http://unity3d.com/unity/download/

The first skeletal-tracking code I wrote, after going through the documentation and understanding the class hierarchy, was a simple head-and-hand tracker in C#. Here is a snapshot of it in action.

Using C#, I then built on it the code for tracking a full player and colouring him in a specific colour when detected. Note the different colours for the user and the background.
Using the Avateering sample in the Kinect SDK as a base, I built on it the C# code to track the model and animate it with the motion of the person in front of the Kinect. Here are a few snapshots. This one is the default model provided in the toolkit.
Notice that it has been locked at the knee for a better and more natural flow. This snapshot shows the model's natural, life-like walking with respect to the user.
This one depicts the neck and hand movement of the model following the user's movement. This picture shows the animation of a different character (a zombie).
The picture below is an improved version of my earlier simple head-and-hand tracker. In this improved code I can control the cursor with my hand movement: I use the Euclidean distance between the head joint and the right-hand joint to move the cursor left or right.

For facial tracking the Kinect SDK has been used. The samples were studied thoroughly and are quite simple to begin with. Here is a facial-tracking snapshot.

For facial animation, the Ogre3D SDK has been studied, and the facial points returned by the tracker are intended to animate the corresponding character in the model.
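The cursor control described above can be sketched as follows. This is a hypothetical Python rendering of the idea (the actual code was C#), and the threshold and speed values are invented for illustration: the Euclidean head-to-hand distance gates whether the cursor moves at all, and the sign of the horizontal offset picks the direction.

```python
import math

# Hypothetical sketch of the head-to-hand cursor idea (the project used C#).
# Joints are (x, y, z) positions in metres, skeleton space.

def euclidean(a, b):
    """Straight-line distance between two joint positions."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def cursor_step(head, right_hand, threshold=0.45, speed=10):
    """Return a signed pixel delta: negative = left, positive = right, 0 = idle.

    The threshold and speed values are illustrative, not from the original code.
    """
    if euclidean(head, right_hand) < threshold:
        return 0                      # hand close to the head: no movement
    dx = right_hand[0] - head[0]      # horizontal offset decides direction
    return speed if dx > 0 else -speed
```

Called once per skeleton frame, this nudges the cursor only while the hand is held well away from the head.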
I have not yet been able to match the face points with the vector position values of the actual face. This is still left to work on and can be done with a while of work and some time in hand.

Vipul Divyanshu
IIL/2012/14