Using the Kinect for Fun and Profit by Tam Hanna

Very few devices offer features as fascinating as the Microsoft Kinect. This seminar teaches you what the Kinect can do and how you can develop for it.
Attendees are advised to bring a notebook with Visual C# 2010 Express Edition and the latest Kinect SDK installed so that they can profit fully from the talk. A sensor will be available for testing your own applications.

Published in: Technology


  1. Using the Kinect for fun and profit
  2. About /me
     • Tam HANNA
       – Director, Tamoggemon Holding k.s.
       – Runs web sites about mobile computing
       – Writes scientific books
  3. Agenda
     • Kinect – what is that?
     • Streams
     • Skeletons
     • Facial tracking
     • libfreenect
     • OpenNI
  4. Slide download
     • http://www.tamoggemon.com/test/Codemotion-Kinect.ppt
     • URL is case sensitive
  5. Kinect – what is that?
  6. History - I
     • Depth: PrimeSense technology
       – Not from Redmond
     • First public mention: 2007
       – Bill Gates, D3 conference
       – "Camera for game control"
  7. Contrast detection
     • Where does the shirt end?
  8. Dot matrix
  9. Shadows / dead areas
  10. Shadows / dead areas - II
  11. History - II
     • 2008: Wii ships
       – Best-selling console of its generation
     • 2009: E3 conference
       – Announcement of "Project Natal"
     • 2010: no CPU in sensor
       – Takes 10% of the XBox 360's CPU
  12. History - III
     • 4 November 2010
       – First shipment
       – "We will sue anyone who reverse engineers"
     • June 2011
       – Official SDK
  13. System overview
  14. Kinect provides
     • Video stream
     • Depth stream
       – (IR stream)
     • Accelerometer data
     • Rest: computed
  15. Family tree
     • Kinect for XBOX
       – Normal USB
     • Kinect bundle
       – MS-Fucked USB
       – Needs PSU
     • Kinect for Windows
       – Costs more
       – Legal to deploy
  16. Cheap from China
  17. Streams
  18. Kinect provides "streams"
     • Repeatedly updated bitmaps
     • Push or pull processing possible
       – Attention: processing time!
  19. Color stream
     • Two modes
       – VGA@30fps
       – 1280x960@12fps
     • Simple data format
       – 8 bits / component
       – R / G / B / A components
  20. Depth stream
     • Two modes
       – Unlimited range
       – Reduced range, with player indexing
  21. Depth stream - II
     • 16-bit words
     • Special encoding for the limited range
  22. Depth stream - III
  23. IR stream
     • Instead of color data
     • 640x480@30fps
     • 16-bit words
     • IR data in the 10 MSBs
  24. Finding the Kinect
     • SDK supports multiple sensors per PC
     • Find one
     • Microsoft.Kinect.Toolkit
  25. XAML part

     <Window x:Class="KinectWPFD2.MainWindow"
         xmlns:toolkit="clr-namespace:Microsoft.Kinect.Toolkit;assembly=Microsoft.Kinect.Toolkit"
         xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
         xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
         Title="MainWindow" Height="759" Width="704">
       <Grid>
         <Image Height="480" HorizontalAlignment="Left" Name="image1"
             Stretch="Fill" VerticalAlignment="Top" Width="640" />
         <toolkit:KinectSensorChooserUI x:Name="SensorChooserUI"
             IsListening="True" HorizontalAlignment="Center"
             VerticalAlignment="Top" />
         <CheckBox Content="Overlay rendern" Height="16"
             HorizontalAlignment="Left" Margin="267,500,0,0" Name="ChkRender"
             VerticalAlignment="Top" />
       </Grid>
     </Window>
  26. Code - I

     public partial class MainWindow : Window
     {
         KinectSensor mySensor;
         KinectSensorChooser myChooser;

         public MainWindow()
         {
             InitializeComponent();
             myChooser = new KinectSensorChooser();
             myChooser.KinectChanged +=
                 new EventHandler<KinectChangedEventArgs>(myChooser_KinectChanged);
             this.SensorChooserUI.KinectSensorChooser = myChooser;
             myChooser.Start();
  27. Code - II

     void myChooser_KinectChanged(object sender, KinectChangedEventArgs e)
     {
         if (null != e.OldSensor)
         {
             if (mySensor != null)
             {
                 mySensor.Dispose();
             }
         }
         if (null != e.NewSensor)
         {
             mySensor = e.NewSensor;
  28. Initialize stream

     mySensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
     mySensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30);
     myArray = new short[this.mySensor.DepthStream.FramePixelDataLength];
     myColorArray = new byte[this.mySensor.ColorStream.FramePixelDataLength];
     mySensor.AllFramesReady +=
         new EventHandler<AllFramesReadyEventArgs>(mySensor_AllFramesReady);
     try
     {
         this.mySensor.Start();
         SensorChooserUI.Visibility = Visibility.Hidden;
     }
  29. Process stream

     void mySensor_AllFramesReady(object sender, AllFramesReadyEventArgs e)
     {
         ColorImageFrame c = e.OpenColorImageFrame();
         DepthImageFrame d = e.OpenDepthImageFrame();
         if (c == null || d == null) return;
         c.CopyPixelDataTo(myColorArray);
         d.CopyPixelDataTo(myArray);
  30. Problem: calibration
     • Depth and color sensors are not aligned
     • Position of data in one array does not match the other
  31. Solution
     • CoordinateMapper class
     • Maps between various frame types
       – Depth and color
       – Skeleton and color
  32. On push mode
     • Kinect can push data to the application
     • Preferred mode of operation
     • But: sensitive to processing time
     • If the handler takes too long, the app stops
  33. Skeletons
  34. What is tracked?
     • Data format
       – Real-life coordinates
     • Color-mappable
  35. Initialize stream

     if (null != e.NewSensor)
     {
         mySensor = e.NewSensor;
         mySensor.SkeletonStream.Enable();
  36. Get joints

     void mySensor_AllFramesReady(object sender, AllFramesReadyEventArgs e)
     {
         ColorImageFrame c = e.OpenColorImageFrame();
         SkeletonFrame s = e.OpenSkeletonFrame();
         if (c == null || s == null) return;
         c.CopyPixelDataTo(myColorArray);
         s.CopySkeletonDataTo(mySkeletonArray);
         foreach (Skeleton aSkeleton in mySkeletonArray)
         {
             DrawBone(aSkeleton.Joints[JointType.HandLeft],
                 aSkeleton.Joints[JointType.WristLeft], armPen, drawingContext);
  37. Use joints

     private void DrawBone(Joint jointFrom, Joint jointTo, Pen aPen,
         DrawingContext aContext)
     {
         if (jointFrom.TrackingState == JointTrackingState.NotTracked ||
             jointTo.TrackingState == JointTrackingState.NotTracked)
         {
         }
         if (jointFrom.TrackingState == JointTrackingState.Inferred ||
             jointTo.TrackingState == JointTrackingState.Inferred)
         {
             ColorImagePoint p1 = mySensor.CoordinateMapper.MapSkeletonPointToColorPoint(
                 jointFrom.Position, ColorImageFormat.RgbResolution640x480Fps30);
         }
         if (jointFrom.TrackingState == JointTrackingState.Tracked ||
             jointTo.TrackingState == JointTrackingState.Tracked)
  38. Facial tracking
  39. What is tracked - I
  40. What is tracked - II
  41. What is tracked - III
  42. AUs?
     • Research by Paul EKMAN
     • Quantify facial motion
  43. Structure
     • C++ library with the algorithms
     • Basic .net wrapper provided
       – Incomplete
       – Might change!
  44. Initialize face tracker

     myFaceTracker = new FaceTracker(mySensor);
  45. Feed face tracker

     FaceTrackFrame myFrame = null;
     foreach (Skeleton aSkeleton in mySkeletonArray)
     {
         if (aSkeleton.TrackingState == SkeletonTrackingState.Tracked)
         {
             myFrame = myFaceTracker.Track(ColorImageFormat.RgbResolution640x480Fps30,
                 myColorArray, DepthImageFormat.Resolution640x480Fps30, myArray,
                 aSkeleton);
             if (myFrame.TrackSuccessful == true)
             {
                 break;
             }
         }
     }
  46. Calibration
     • OUCH!
       – Not all snouts are equal
     • Maximums vary
  47. libfreenect
  48. What is it?
     • Result of the Kinect hacking competition
     • Bundled with most Linux distributions
     • "Basic Kinect data parser"
  49. Set-up
     • /etc/udev/rules.d/66-kinect.rules

     # Rules for Kinect
     SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02ae", MODE="0660", GROUP="video"
     SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02ad", MODE="0660", GROUP="video"
     SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02b0", MODE="0660", GROUP="video"
     # END
  50. Set-up II
     • sudo adduser $USER plugdev
     • sudo usermod -a -G video tamhan

     tamhan@tamhan-X360:~$ freenect-glview
     Kinect camera test
     Number of devices found: 1
     Could not claim interface on camera: -6
     Could not open device
  51. Set-up III
  52. Problems
     • gspca-kinect
       – Kernel module, uses the Kinect as a webcam
       – Blocks other libraries
       – sudo modprobe -r gspca_kinect
     • Outdated version widely deployed
       – API not compatible
  53. Update library
     • sudo add-apt-repository ppa:floe/libtisch
     • sudo apt-get update
     • sudo apt-get install libfreenect libfreenect-dev libfreenect-demos
  54. libfreenect - II: color stream
  55. Implementing it
     • libfreenect: C++ library
     • Question: which framework?
     • Answer: Qt (what else ;) )
  56. The .pro file

     QT += core gui
     TARGET = QtDepthFrame
     CONFIG += i386
     DEFINES += USE_FREENECT
     LIBS += -lfreenect
  57. The freenect thread
     • Library needs processing time
       – Does not multithread itself
     • Should be provided outside of the main app
  58.
     class QFreenectThread : public QThread
     {
         Q_OBJECT
     public:
         explicit QFreenectThread(QObject *parent = 0);
         void run();
     signals:
     public slots:
     public:
         bool myActive;
         freenect_context *myContext;
     };
  59.
     QFreenectThread::QFreenectThread(QObject *parent) :
         QThread(parent)
     {
     }

     void QFreenectThread::run()
     {
         while(myActive)
         {
             if(freenect_process_events(myContext) < 0)
             {
                 qDebug("Cannot process events!");
                 QApplication::exit(1);
             }
         }
     }
  60. QFreenect
     • Main engine module
       – Contact point between Kinect and app
     • Fires off signals on frame availability
  61.
     class QFreenect : public QObject
     {
         Q_OBJECT
     public:
         explicit QFreenect(QObject *parent = 0);
         ~QFreenect();
         void processVideo(void *myVideo, uint32_t myTimestamp=0);
         void processDepth(void *myDepth, uint32_t myTimestamp=0);
     signals:
         void videoDataReady(uint8_t* myRGBBuffer);
         void depthDataReady(uint16_t* myDepthBuffer);
     public slots:
  62.
     private:
         freenect_context *myContext;
         freenect_device *myDevice;
         QFreenectThread *myWorker;
         uint8_t* myRGBBuffer;
         uint16_t* myDepthBuffer;
         QMutex* myMutex;
     public:
         bool myWantDataFlag;
         bool myFlagFrameTaken;
         bool myFlagDFrameTaken;
         static QFreenect* mySelf;
     };
  63. Some C++

     QFreenect* QFreenect::mySelf;

     static inline void videoCallback(freenect_device *myDevice,
         void *myVideo, uint32_t myTimestamp=0)
     {
         QFreenect::mySelf->processVideo(myVideo, myTimestamp);
     }

     static inline void depthCallback(freenect_device *myDevice,
         void *myVideo, uint32_t myTimestamp=0)
     {
         QFreenect::mySelf->processDepth(myVideo, myTimestamp);
     }
  64. Bring-up

     QFreenect::QFreenect(QObject *parent) :
         QObject(parent)
     {
         myMutex=NULL;
         myRGBBuffer=NULL;
         myMutex=new QMutex();
         myWantDataFlag=false;
         myFlagFrameTaken=true;
         mySelf=this;
         if (freenect_init(&myContext, NULL) < 0)
         {
             qDebug("init failed");
             QApplication::exit(1);
         }
  65. Bring-up - II

     freenect_set_log_level(myContext, FREENECT_LOG_FATAL);
     int nr_devices = freenect_num_devices(myContext);
     if (nr_devices < 1)
     {
         freenect_shutdown(myContext);
         qDebug("No Kinect found!");
         QApplication::exit(1);
     }
     if (freenect_open_device(myContext, &myDevice, 0) < 0)
     {
         qDebug("Open Device Failed!");
         freenect_shutdown(myContext);
         QApplication::exit(1);
     }
  66.
     myRGBBuffer = (uint8_t*)malloc(640*480*3);
     freenect_set_video_callback(myDevice, videoCallback);
     freenect_set_video_buffer(myDevice, myRGBBuffer);
     freenect_frame_mode vFrame = freenect_find_video_mode(
         FREENECT_RESOLUTION_MEDIUM, FREENECT_VIDEO_RGB);
     freenect_set_video_mode(myDevice, vFrame);
     freenect_start_video(myDevice);
  67.
     myWorker = new QFreenectThread(this);
     myWorker->myActive = true;
     myWorker->myContext = myContext;
     myWorker->start();
  68. Shut-down

     QFreenect::~QFreenect()
     {
         freenect_close_device(myDevice);
         freenect_shutdown(myContext);
         if(myRGBBuffer!=NULL) free(myRGBBuffer);
         if(myMutex!=NULL) delete myMutex;
     }
  69. Data passing

     void QFreenect::processVideo(void *myVideo, uint32_t myTimestamp)
     {
         QMutexLocker locker(myMutex);
         if(myWantDataFlag && myFlagFrameTaken)
         {
             uint8_t* mySecondBuffer=(uint8_t*)malloc(640*480*3);
             memcpy(mySecondBuffer, myVideo, 640*480*3);
             myFlagFrameTaken=false;
             emit videoDataReady(mySecondBuffer);
         }
     }
  70. Format of data word
     • Array of bytes
     • Three bytes = one pixel
  71. Format of data word - II

     for(int x=2; x<640; x++)
     {
         for(int y=0; y<480; y++)
         {
             r=(myRGBBuffer[3*(x+y*640)+0]);
             g=(myRGBBuffer[3*(x+y*640)+1]);
             b=(myRGBBuffer[3*(x+y*640)+2]);
             myVideoImage->setPixel(x, y, qRgb(r, g, b));
         }
     }
  72. libfreenect - III: depth stream
  73. Extra bring-up

     myDepthBuffer = (uint16_t*)malloc(640*480*2);
     freenect_set_depth_callback(myDevice, depthCallback);
     freenect_set_depth_buffer(myDevice, myDepthBuffer);
     freenect_frame_mode aFrame = freenect_find_depth_mode(
         FREENECT_RESOLUTION_MEDIUM, FREENECT_DEPTH_REGISTERED);
     freenect_set_depth_mode(myDevice, aFrame);
     freenect_start_depth(myDevice);
  74. Extra processing

     void QFreenect::processDepth(void *myDepth, uint32_t myTimestamp)
     {
         QMutexLocker locker(myMutex);
         if(myWantDataFlag && myFlagDFrameTaken)
         {
             uint16_t* mySecondBuffer=(uint16_t*)malloc(640*480*2);
             memcpy(mySecondBuffer, myDepth, 640*480*2);
             myFlagDFrameTaken=false;
             emit depthDataReady(mySecondBuffer);
         }
     }
  75. Data extraction

     void MainWindow::depthDataReady(uint16_t* myDepthBuffer)
     {
         if(myDepthImage!=NULL) delete myDepthImage;
         myDepthImage = new QImage(640, 480, QImage::Format_RGB32);
         unsigned char r, g, b;
         for(int x=2; x<640; x++)
         {
             for(int y=0; y<480; y++)
             {
                 int calcval=(myDepthBuffer[(x+y*640)]);
  76. Data is in millimetres

     if(calcval==FREENECT_DEPTH_MM_NO_VALUE)
     {
         r=255; g=0; b=0;
     }
     else if(calcval>1000 && calcval<2000)
     {
         QRgb aVal=myVideoImage->pixel(x, y);
         r=qRed(aVal);
         g=qGreen(aVal);
         b=qBlue(aVal);
     }
     else
     {
         r=0; g=0; b=0;
     }
     myDepthImage->setPixel(x, y, qRgb(r, g, b));
  77. Example
  78. OpenNI
  79. What is OpenNI?
     • Open standard for Natural Interfaces
       – Very Asus-centric
     • Provides a generic NI framework
     • VERY complex API
  80. Version 1.5 vs Version 2.0
  81. Supported platforms
     • Linux
     • Windows
       – 32-bit only
  82. Want more?
     • Book
       – German language
       – 30 euros
     • Launch
       – When it's done!
  83. ?!?
     • tamhan@tamoggemon.com
     • @tamhanna
     • Images: pedroserafin, mattbuck
