Object tracking

Its applications, how it works, algorithms and methods.

Slide notes
  • Multi-object (people) tracking within a video of a pedestrian passageway. The dynamic motion vectors attached to each individual represent direction of movement and speed.
  • Here, we can see how a mobile robot can detect and track this red ball. It moves according to the red ball's movement.
  • http://www.princeton.edu/~xm/vision05/Assignment3/Tracking/tracking.htm
  • The first term is proportional to the density estimate at x computed with the kernel G; the second term is the mean shift. This part is the mean of the window, calculated using a kernel function that gives different weights to the points inside the window. The mean shift vector thus always points toward the direction of maximum increase in the density. A standard way to write this vector is sketched below.
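    For reference, the mean shift vector described in this note is commonly written as follows (the bandwidth h and kernel profile g are standard mean shift notation, not symbols taken from the slides, so take this as a sketch rather than the presentation's own formula):

    m_{h,G}(x) = \frac{\sum_{i=1}^{n} x_i \, g\!\left(\left\lVert \frac{x - x_i}{h} \right\rVert^2\right)}{\sum_{i=1}^{n} g\!\left(\left\lVert \frac{x - x_i}{h} \right\rVert^2\right)} - x

    The fraction is the kernel-weighted mean of the points inside the window, so the vector points from x toward the region of higher point density, matching the description above.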
  • The absolute difference and census transform are easy to implement but computationally expensive and slow. The feature-based method can track multiple objects, but it is also slow.
  • The KLT algorithm can detect objects quickly and accurately and is robust to noise and dynamic scenes, but it requires large memory when the search window size is large. Mean shift has low computation cost, but it might fail under heavy occlusion and can only detect a single object. This can be addressed by combining different algorithms, for example a SIFT feature descriptor with a Kalman filter.

Transcript

  • 1. Presentation on Object Tracking By Sri Vidhya.K
  • 2.  Introduction to Object tracking  Applications of Object tracking  Object Representation  Object Detection  Steps in Object tracking  Object tracking Algorithms  Methodologies  Comparison  Working example of Object tracking in MATLAB  Conclusion
  • 3.  To track an object over a sequence of images.  A method of following an object through successive image frames to determine its relative movement with respect to other objects.
  • 4.  Traffic Information
  • 5.  In a tracking scenario, an object can be defined as anything that is of interest for further analysis. Objects can be represented by their shapes. Object shape representations commonly employed for tracking are:  Points: The object is represented by a point (the centroid) or by a set of points. Point representation is suitable for tracking objects that occupy small regions in an image.  Primitive geometric shapes: Object shape is represented by a rectangle, ellipse, etc. These are suitable for representing simple rigid objects and non-rigid objects.
  • 6.  Object silhouette and contour: Contour representation defines the boundary of an object. The region inside the contour is called the silhouette of the object. These are suitable for tracking complex non-rigid shapes.  Articulated shape models: These objects are composed of body parts that are held together with joints.  Skeletal models: The object skeleton can be extracted by applying a medial axis transform to the object silhouette. This can be used to model both articulated and rigid objects.
  • 7.  Visual input is usually achieved through digitized images obtained from a camera connected to a digital computer.  This camera can be either stationary or moving depending on the application.  Beyond image acquisition, the computer performs the necessary tracking and any higher-level tasks using the tracking result.
  • 8.  Every tracking method requires an object detection mechanism, either in every frame or when the object first appears in the video.  Challenges of moving object detection: • Loss of information caused by projection of the 3D world onto a 2D image • Noise in images • Complex object motion • Non-rigid or articulated nature of objects • Partial or full object occlusions • Complex object shapes • Scene illumination changes
  • 9.  Point Tracking: Objects detected in consecutive frames are represented by points, and the association of the points is based on the previous object state which can include object position and motion. This approach requires an external mechanism to detect the objects in every frame.  Kernel Tracking: Kernel refers to the object shape and appearance  Silhouette Tracking: Tracking is performed by estimating the object region in each frame. Silhouette tracking methods use the information encoded inside the object region.
  • 10. Steps in object tracking: segmentation → foreground/background extraction → useful feature extraction/calculation → tracking.
  • 11. Figures: the color image segmentation result with the edge image, and the final detection result of joint color image segmentation and the background model.
  • 12.  Segmentation is the process of identifying components of the image. It involves operations such as boundary detection, connected component labeling and thresholding. Boundary detection finds the edges in the image; thresholding is the process of reducing the grey levels in the image. A minimal sketch of these operations follows.
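    A minimal MATLAB sketch of these segmentation operations (the image file name and the threshold value are assumptions; requires the Image Processing Toolbox):

    img   = imread('frame.png');   % input image (file name assumed)
    gray  = rgb2gray(img);         % grey-level image
    bw    = im2bw(gray, 0.5);      % thresholding: reduce the grey levels to two
    edges = edge(gray, 'canny');   % boundary detection: find edges in the image
    cc    = bwconncomp(bw);        % connected component labeling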
  • 13.  As the name suggests, this is the process of separating the foreground and background of the image. Here it is assumed that the foreground contains the objects of interest.
  • 14. Background extraction  Once the foreground is extracted, a simple subtraction operation can be used to extract the background; the following figure illustrates this operation, and a short sketch is given below.
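    A hedged sketch of that subtraction, assuming a foreground mask produced by the earlier segmentation step (file and variable names are assumptions):

    frame      = imread('frame.png');                      % original frame (name assumed)
    fgMask     = im2bw(rgb2gray(frame), 0.5);              % foreground mask from segmentation
    foreground = frame .* uint8(repmat(fgMask, [1 1 3]));  % keep only the foreground pixels
    background = frame - foreground;                       % simple subtraction leaves the background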
  • 15.  The camera model is an important aspect of any object-tracking algorithm. All existing object-tracking systems use a preset camera model; in other words, the camera model is derived directly from domain knowledge. Some of the common camera models are: 1. Single fixed camera Example: Road traffic tracking system 2. Multiple fixed cameras Example: Simple surveillance system 3. Single moving camera Example: Animation and video compression systems 4. Multiple moving cameras Example: Robot navigation system
  • 16.  Different motion analysis (DMA) method ◦ SAD of consecutive frames ◦ A threshold is set to detect the moving object (figure annotation: "The motion object is here!")
  • 17.  Disadvantage of the DMA method ◦ It may include covered or covering background, so the size of the tracking area is not the same as the size of the tracked object.
  • 18.  Solution: Block-Matching Algorithm (BMA) ◦ A motion vector is used to compensate for the redundant part of the tracking area; SAD is selected to measure how well two blocks match each other. A sketch is given below.
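    A minimal sketch of SAD-based block matching for one block (the frame names, block position, block size and search range are assumptions):

    f1  = double(rgb2gray(imread('frame1.png')));  % previous frame (names assumed)
    f2  = double(rgb2gray(imread('frame2.png')));  % current frame
    B   = 16;  R = 8;                              % block size and search range (assumed)
    blk = f1(101:100+B, 101:100+B);                % one block of the tracking area, at (101,101)
    best = inf;  mv = [0 0];
    for dy = -R:R
        for dx = -R:R
            r = 101 + dy;  c = 101 + dx;
            if r < 1 || c < 1 || r+B-1 > size(f2,1) || c+B-1 > size(f2,2), continue; end
            cand = f2(r:r+B-1, c:c+B-1);
            sad  = sum(abs(blk(:) - cand(:)));     % SAD measures how well the two blocks match
            if sad < best, best = sad; mv = [dx dy]; end
        end
    end
    % mv is the motion vector that best compensates this block between the two frames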
  • 19. Image subtraction: D(t) = I(ti) - I(tj)  This gives an image frame with changed and unchanged regions. Ideal case for no motion: I(ti) = I(tj), so D(t) = 0. A short sketch follows.
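    The same differencing in MATLAB, written with the absolute difference so changes in either direction are kept (frame names and the threshold are assumptions):

    Ii = rgb2gray(imread('frame_i.png'));  % frame at time ti (name assumed)
    Ij = rgb2gray(imread('frame_j.png'));  % frame at time tj
    D  = imabsdiff(Ii, Ij);                % changed regions take large values, unchanged stay near 0
    motionMask = D > 25;                   % threshold value is an assumption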
  • 20. Moving objects are detected
  • 21. Methods for Motion Detection  Frame Differencing  Background Subtraction Drawbacks:  They involve a lot of computation  Not feasible for DSP implementation
  • 22. Figure: Frame 1, Frame 10, and the difference of the two frames.
  • 23. Signature Vector Generation. Example 3x3 window: [124 74 32; 124 64 18; 157 116 184]. Rule: if (center pixel < neighbor pixel) then that neighbor's bit = 1, giving the binary mask [1 1 0; 1 x 0; 1 1 1] and the signature vector 11001111. A sketch follows.
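    A sketch of that signature computation for the window above (the bit ordering is an assumption and may differ from the ordering used on the slide):

    win    = [124 74 32; 124 64 18; 157 116 184];  % 3x3 neighborhood from the slide
    center = win(2, 2);
    winT   = win';                                 % transpose so linear indexing reads row by row
    nbrs   = winT([1 2 3 4 6 7 8 9]);              % the 8 neighbors, center excluded
    sig    = center < nbrs;                        % 1 where center pixel < neighbor pixel
    sigStr = char(sig + '0');                      % signature vector as a string of 0s and 1s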
  • 24. List Generation. Figure: signature vectors are generated for all pixels of the image and used to populate the generated list.
  • 25. Advantages:  Only two values, 0 or 1, need to be compared.  The pixel and its neighbouring pixels see similar illumination variation. Drawbacks:  Because we deal only with 0s and 1s, this method is sensitive to noise.  The calculate, store and match process is computationally expensive.
  • 26. Method 3: Morphology-Based Object Tracking. Stages: background estimation, frame differencing, object registration.
  • 27. Background Estimation • Image differencing • Thresholding Frame Differencing • Each object is represented by a feature vector (the length, width, area and histogram of the object) Object Registration • Contours are registered • Width, height and histogram are recorded for each contour. A sketch of the registration stage follows.
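    A hedged sketch of the registration stage, assuming a binary motion mask from the differencing step (frame names, threshold and structuring-element sizes are assumptions; requires the Image Processing Toolbox):

    d     = imabsdiff(rgb2gray(imread('frame2.png')), rgb2gray(imread('frame1.png')));
    mask  = d > 25;                                  % frame differencing + thresholding
    mask  = imopen(mask, strel('square', 3));        % morphological cleanup of small noise
    mask  = imclose(mask, strel('square', 7));       % close gaps inside the objects
    stats = regionprops(mask, 'BoundingBox', 'Area', 'Centroid');
    % each entry of stats registers one object; its bounding box gives the width and height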
  • 28.  The visual motion pattern of objects and surfaces in a scene, estimated by optical flow between Frame 1 and Frame 2. A sketch is given below.
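    A short sketch using the Computer Vision Toolbox optical flow objects (the parameter value and frame names are assumptions, and the interface may differ between releases):

    opticFlow = opticalFlowLK('NoiseThreshold', 0.009);  % Lucas-Kanade flow estimator
    f1 = rgb2gray(imread('frame1.png'));                 % Frame 1 (names assumed)
    f2 = rgb2gray(imread('frame2.png'));                 % Frame 2
    estimateFlow(opticFlow, f1);                         % prime the estimator with Frame 1
    flow = estimateFlow(opticFlow, f2);                  % flow.Vx and flow.Vy hold the motion field
    imshow(f2); hold on;
    plot(flow, 'DecimationFactor', [10 10], 'ScaleFactor', 10);  % overlay the flow vectors
    hold off;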
  • 29.  A method that iteratively shifts a data point to the average of the data points in its neighborhood: choose a search window size and an initial location, compute the MEAN location in the search window, center the search window at the mean, and repeat until convergence. A minimal sketch follows.
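    A minimal sketch of that iteration on 2-D points (the data, window radius and starting location are assumptions, and a flat window is used instead of a weighted kernel):

    pts = randn(200, 2);                                  % data points (assumed)
    x   = pts(1, :);                                      % initial search window location (assumed)
    h   = 1.0;                                            % search window radius (assumed)
    for it = 1:100
        inWin = sum(bsxfun(@minus, pts, x).^2, 2) < h^2;  % points inside the current window
        newX  = mean(pts(inWin, :), 1);                   % mean location in the search window
        if norm(newX - x) < 1e-4, break; end              % repeat until convergence
        x = newX;                                         % center the window at the mean
    end
    % x has shifted toward a local maximum of the point density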
  • 30. Figure sequence (slides 30-36): distribution of identical balls, a region of interest, its center of mass, and the mean shift vector; the window is shifted step by step until it settles on the densest region. Objective: find the densest region.
  • 37. Mean Shift Algorithm (kernel density estimation view): • compute the mean shift vector • translate the kernel (window) by the mean shift vector.
  • 38. Comparison: Absolute Differences is easy to implement and allows continuous tracking, but is computationally expensive, slow and of low accuracy. Census Transform is immune to noise and illumination changes, but becomes complex with multiple objects per frame and is computationally expensive. Feature Based methods can track multiple objects well, but consume large memory and are slow.
  • 39. Comparison: KLT gives high accuracy, less execution time and robustness to noise and dynamic scenes, but needs large memory. MeanShift & CAMShift are computationally less expensive but ineffective if there is heavy occlusion.
  • 40.  The problem of motion-based object tracking can be divided into two parts:  Detecting moving objects in each frame  Associating the detections corresponding to the same object over time  How to detect the red color:  Suppose our input video stream is handled by the vidDevice object.
  • 41.  Step 1: First acquire an RGB Frame from the Video. MATLAB Code: rgbFrame = step(vidDevice);
  • 42.  Step 2: Extract the Red Layer Matrix from the RGB frame.  MATLAB Code: redFrame = rgbFrame(:,:,1);
  • 43.  Step 3: Get the grey image of the RGB frame.  MATLAB Code: grayFrame = rgb2gray(rgbFrame);
  • 44.  Step 4: Subtract the grayFrame from the redFrame.  MATLAB Code: diffFrame = imsubtract(redFrame, grayFrame);
  • 45.  Step 5: Filter out unwanted noise using a median filter.  MATLAB Code: diffFrame = medfilt2(diffFrame, [3 3]);  Step 6: Now convert diffFrame into the corresponding binary image using a proper threshold value. Change this value for different lighting conditions; in this code a value of 0.15 is used.  MATLAB Code: binFrame = im2bw(diffFrame, 0.15);
  • 46.  Step 7: You are all done; the red color has been detected. A consolidated sketch of the full loop is given below.
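    Putting the steps together, a consolidated sketch (the video device configuration and loop length are assumptions; requires the Image Acquisition and Image Processing toolboxes):

    vidDevice = imaq.VideoDevice('winvideo', 1, 'RGB24_640x480');  % device settings are assumed
    for k = 1:200
        rgbFrame  = step(vidDevice);                  % Step 1: acquire an RGB frame
        redFrame  = rgbFrame(:, :, 1);                % Step 2: extract the red layer
        grayFrame = rgb2gray(rgbFrame);               % Step 3: grey image of the RGB frame
        diffFrame = imsubtract(redFrame, grayFrame);  % Step 4: red minus grey
        diffFrame = medfilt2(diffFrame, [3 3]);       % Step 5: median filter
        binFrame  = im2bw(diffFrame, 0.15);           % Step 6: threshold to a binary image
        imshow(binFrame); drawnow;                    % Step 7: red regions appear white
    end
    release(vidDevice);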
  • 47. Object tracking means tracing the progress of objects as they move about in a visual scene; it thus involves processing spatial as well as temporal changes. Significant progress has been made in object tracking. A taxonomy of moving object detection has been presented, and the performance of various object detection methods has been compared.
  • 48.  http://www.mathworks.in/help/vision/examples/motion-based-multiple-object-tracking.html  http://opencv-srf.blogspot.in/  http://unoccio.blogspot.in/2009/03/fast-color-based-object-tracking-using.html  www.slideshare.com  http://www.mathworks.in/help/vision/ug/track-an-object-using-correlation.html  http://www.codeproject.com/Articles/139628/Detect-and-Track-Objects-in-Live-Webcam-Video-base  http://scien.stanford.edu/pages/labsite/2002/ee392j/sebe_report.pdf