# Lath


*Object Detection and Tracking - MATLAB & Simuli... http://www.mathworks.in/help/vision/gs/object-dete... 04/24/2014 07:10 PM*

## Object Detection In A Cluttered Scene Using Point Feature Matching

This example shows how to detect a particular object in a cluttered scene, given a reference image of the object.

Contents:

- Overview
- Step 1: Read Images
- Step 2: Detect Feature Points
- Step 3: Extract Feature Descriptors
- Step 4: Find Putative Point Matches
- Step 5: Locate the Object in the Scene Using Putative Matches
- Step 7: Detect Another Object

### Overview

This example presents an algorithm for detecting a specific object based on finding point correspondences between the reference image and the target image. It can detect objects despite a scale change or in-plane rotation, and it is also robust to a small amount of out-of-plane rotation and occlusion.

This method of object detection works best for objects that exhibit non-repeating texture patterns, which give rise to unique feature matches. The technique is not likely to work well for uniformly-colored objects or for objects containing repeating patterns. Note that this algorithm is designed for detecting a specific object, for example, the elephant in the reference image, rather than any elephant. For detecting objects of a particular category, such as people or faces, see vision.PeopleDetector and vision.CascadeObjectDetector.

### Step 1: Read Images

Read the reference image containing the object of interest.

```matlab
boxImage = imread('stapleRemover.jpg');
figure;
imshow(boxImage);
title('Image of a Box');
```
Read the target image containing a cluttered scene.

```matlab
sceneImage = imread('clutteredDesk.jpg');
figure;
imshow(sceneImage);
title('Image of a Cluttered Scene');
```

### Step 2: Detect Feature Points

Detect feature points in both images.
```matlab
boxPoints = detectSURFFeatures(boxImage);
scenePoints = detectSURFFeatures(sceneImage);
```

Visualize the strongest feature points found in the reference image.

```matlab
figure;
imshow(boxImage);
title('100 Strongest Feature Points from Box Image');
hold on;
plot(boxPoints.selectStrongest(100));
```

Visualize the strongest feature points found in the target image.

```matlab
figure;
imshow(sceneImage);
title('300 Strongest Feature Points from Scene Image');
hold on;
plot(scenePoints.selectStrongest(300));
```
### Step 3: Extract Feature Descriptors

Extract feature descriptors at the interest points in both images.

```matlab
[boxFeatures, boxPoints] = extractFeatures(boxImage, boxPoints);
[sceneFeatures, scenePoints] = extractFeatures(sceneImage, scenePoints);
```

### Step 4: Find Putative Point Matches

Match the features using their descriptors.

```matlab
boxPairs = matchFeatures(boxFeatures, sceneFeatures);
```

Display the putatively matched features.

```matlab
matchedBoxPoints = boxPoints(boxPairs(:, 1), :);
matchedScenePoints = scenePoints(boxPairs(:, 2), :);
figure;
showMatchedFeatures(boxImage, sceneImage, matchedBoxPoints, ...
    matchedScenePoints, 'montage');
title('Putatively Matched Points (Including Outliers)');
```
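Under the hood, matchFeatures pairs each box descriptor with its nearest scene descriptor and applies a ratio test (controlled by the MaxRatio parameter) to discard ambiguous matches. As a rough illustration of that idea with made-up toy descriptors (a sketch, not the MATLAB internals):

```python
import numpy as np

def ratio_test_match(desc_a, desc_b, max_ratio=0.6):
    """Match rows of desc_a to rows of desc_b by nearest neighbor,
    keeping a match only when the best distance is clearly smaller
    than the second best (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < max_ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

# Toy descriptors: row 0 of A matches row 1 of B unambiguously;
# row 1 of A is ambiguous and gets rejected by the ratio test.
A = np.array([[1.0, 0.0], [5.0, 5.0]])
B = np.array([[10.0, 10.0], [1.0, 0.1], [9.0, 9.0]])
print(ratio_test_match(A, B))  # -> [(0, 1)]
```

A lower `max_ratio` rejects more ambiguous matches; the elephant example later in this document loosens it to 0.9 to keep more candidates.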
### Step 5: Locate the Object in the Scene Using Putative Matches

estimateGeometricTransform calculates the transformation relating the matched points, while eliminating outliers. This transformation allows us to localize the object in the scene.

```matlab
[tform, inlierBoxPoints, inlierScenePoints] = ...
    estimateGeometricTransform(matchedBoxPoints, matchedScenePoints, 'affine');
```

Display the matching point pairs with the outliers removed.

```matlab
figure;
showMatchedFeatures(boxImage, sceneImage, inlierBoxPoints, ...
    inlierScenePoints, 'montage');
title('Matched Points (Inliers Only)');
```
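estimateGeometricTransform fits the affine model inside a RANSAC-style loop, repeatedly fitting candidate transforms to random point subsets and keeping the one with the most inliers. The inner fitting step is an ordinary least-squares problem; a minimal NumPy sketch on synthetic, outlier-free points (illustrative only, not the toolbox implementation):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src (N x 2) to dst (N x 2).
    Returns a 3 x 2 matrix T such that [x, y, 1] @ T ~= [x', y']."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])   # homogeneous coordinates
    T, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return T

def transform_points(T, pts):
    """Forward-map points through the fitted affine transform
    (the role transformPointsForward plays in the MATLAB example)."""
    A = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return A @ T

# Synthetic data: rotate by 30 degrees, scale by 2, translate by (5, -3).
theta = np.deg2rad(30)
R = 2 * np.array([[np.cos(theta), np.sin(theta)],
                  [-np.sin(theta), np.cos(theta)]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src @ R + np.array([5.0, -3.0])

T = fit_affine(src, dst)
print(np.allclose(transform_points(T, src), dst))  # True: transform recovered
```

With real putative matches the data contains outliers, which is why the robust RANSAC-style loop around this fit is essential.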
Get the bounding polygon of the reference image.

```matlab
boxPolygon = [1, 1;...                              % top-left
    size(boxImage, 2), 1;...                        % top-right
    size(boxImage, 2), size(boxImage, 1);...        % bottom-right
    1, size(boxImage, 1);...                        % bottom-left
    1, 1];                   % top-left again to close the polygon
```

Transform the polygon into the coordinate system of the target image. The transformed polygon indicates the location of the object in the scene.

```matlab
newBoxPolygon = transformPointsForward(tform, boxPolygon);
```

Display the detected object.

```matlab
figure;
imshow(sceneImage);
hold on;
line(newBoxPolygon(:, 1), newBoxPolygon(:, 2), 'Color', 'y');
title('Detected Box');
```
### Step 7: Detect Another Object

Detect a second object by using the same steps as before. Read an image containing the second object of interest.

```matlab
elephantImage = imread('elephant.jpg');
figure;
imshow(elephantImage);
title('Image of an Elephant');
```
Detect and visualize the feature points.

```matlab
elephantPoints = detectSURFFeatures(elephantImage);
figure;
imshow(elephantImage);
hold on;
plot(elephantPoints.selectStrongest(100));
title('100 Strongest Feature Points from Elephant Image');
```
Extract the feature descriptors.

```matlab
[elephantFeatures, elephantPoints] = extractFeatures(elephantImage, elephantPoints);
```

Match the features.

```matlab
elephantPairs = matchFeatures(elephantFeatures, sceneFeatures, 'MaxRatio', 0.9);
```

Display the putatively matched features.

```matlab
matchedElephantPoints = elephantPoints(elephantPairs(:, 1), :);
matchedScenePoints = scenePoints(elephantPairs(:, 2), :);
figure;
showMatchedFeatures(elephantImage, sceneImage, matchedElephantPoints, ...
    matchedScenePoints, 'montage');
title('Putatively Matched Points (Including Outliers)');
```

Estimate the geometric transformation and eliminate outliers.

```matlab
[tform, inlierElephantPoints, inlierScenePoints] = ...
    estimateGeometricTransform(matchedElephantPoints, matchedScenePoints, 'affine');
figure;
showMatchedFeatures(elephantImage, sceneImage, inlierElephantPoints, ...
    inlierScenePoints, 'montage');
title('Matched Points (Inliers Only)');
```
Display both objects.

```matlab
elephantPolygon = [1, 1;...                                   % top-left
    size(elephantImage, 2), 1;...                             % top-right
    size(elephantImage, 2), size(elephantImage, 1);...        % bottom-right
    1, size(elephantImage, 1);...                             % bottom-left
    1, 1];                         % top-left again to close the polygon

newElephantPolygon = transformPointsForward(tform, elephantPolygon);

figure;
imshow(sceneImage);
hold on;
line(newBoxPolygon(:, 1), newBoxPolygon(:, 2), 'Color', 'y');
line(newElephantPolygon(:, 1), newElephantPolygon(:, 2), 'Color', 'g');
title('Detected Elephant and Box');
```
## Face Detection and Tracking Using CAMShift

This example shows how to automatically detect and track a face.

Contents:

- Introduction
- Step 1: Detect a Face To Track
- Step 2: Identify Facial Features To Track
- Step 3: Track the Face
- Summary
- Reference

### Introduction

Object detection and tracking are important in many computer vision applications, including activity recognition, automotive safety, and surveillance. In this example, you will develop a simple face tracking system by dividing the tracking problem into three separate problems:

1. Detect a face to track
2. Identify facial features to track
3. Track the face

### Step 1: Detect a Face To Track

Before you begin tracking a face, you need to first detect it. Use the vision.CascadeObjectDetector to detect the location of a face in a video frame. The cascade object detector uses the Viola-Jones detection algorithm and a trained classification model for detection. By default, the detector is configured to detect faces, but it can be configured for other object types.

```matlab
% Create a cascade detector object.
faceDetector = vision.CascadeObjectDetector();

% Read a video frame and run the detector.
videoFileReader = vision.VideoFileReader('visionface.avi');
videoFrame = step(videoFileReader);
bbox = step(faceDetector, videoFrame);

% Draw the returned bounding box around the detected face.
videoOut = insertObjectAnnotation(videoFrame, 'rectangle', bbox, 'Face');
figure, imshow(videoOut), title('Detected face');
```
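The Viola-Jones detector owes its speed to the integral image, which lets it evaluate rectangular Haar-like features in constant time regardless of their size. A small NumPy sketch of that trick (illustrative only, not the vision.CascadeObjectDetector internals):

```python
import numpy as np

def integral_image(img):
    """Integral image with a zero border: ii[r, c] = sum of img[:r, :c]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] from just four integral-image lookups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

img = np.arange(16, dtype=float).reshape(4, 4)
ii = integral_image(img)
print(rect_sum(ii, 1, 1, 3, 3))  # 30.0, equal to img[1:3, 1:3].sum()
```

A Haar-like feature is just the difference of two or three such rectangle sums, so each feature costs a handful of lookups no matter how large the detection window is.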
You can use the cascade object detector to track a face across successive video frames. However, when the face tilts or the person turns their head, you may lose tracking. This limitation is due to the type of trained classification model used for detection. To avoid this issue, and because performing face detection on every video frame is computationally intensive, this example uses a simple facial feature for tracking.

### Step 2: Identify Facial Features To Track

Once the face is located in the video, the next step is to identify a feature that will help you track the face. For example, you can use shape, texture, or color. Choose a feature that is unique to the object and remains invariant even when the object moves.

In this example, you use skin tone as the feature to track. Skin tone provides a good deal of contrast between the face and the background and does not change as the face rotates or moves.

```matlab
% Get the skin tone information by extracting the Hue from the video frame
% converted to the HSV color space.
[hueChannel,~,~] = rgb2hsv(videoFrame);

% Display the Hue channel data and draw the bounding box around the face.
figure, imshow(hueChannel), title('Hue channel data');
rectangle('Position', bbox(1,:), 'LineWidth', 2, 'EdgeColor', [1 1 0])
```
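rgb2hsv maps each RGB pixel into hue, saturation, and value, and only the hue plane is kept for tracking. Python's standard library exposes the same per-pixel conversion, which makes the idea easy to check (all channels and the resulting hue lie in [0, 1], as in MATLAB):

```python
import colorsys

def hue(r, g, b):
    """Hue component of an RGB pixel, all channels in [0, 1]."""
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    return h

print(hue(1.0, 0.0, 0.0))  # pure red   -> 0.0
print(hue(0.0, 1.0, 0.0))  # pure green -> 0.3333...
```

Because hue encodes only the dominant wavelength, it stays roughly constant as lighting intensity changes, which is what makes it a usable tracking feature here.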
### Step 3: Track the Face

With the skin tone selected as the feature to track, you can now use the vision.HistogramBasedTracker for tracking. The histogram-based tracker uses the CAMShift algorithm, which provides the capability to track an object using a histogram of pixel values. In this example, the Hue channel pixels are extracted from the nose region of the detected face. These pixels are used to initialize the histogram for the tracker. The example tracks the object over successive video frames using this histogram.

```matlab
% Detect the nose within the face region. The nose provides a more accurate
% measure of the skin tone because it does not contain any background
% pixels.
noseDetector = vision.CascadeObjectDetector('Nose');
faceImage = imcrop(videoFrame, bbox(1,:));
noseBBox = step(noseDetector, faceImage);

% The nose bounding box is defined relative to the cropped face image.
% Adjust the nose bounding box so that it is relative to the original video
% frame.
noseBBox(1,1:2) = noseBBox(1,1:2) + bbox(1,1:2);
```
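The core of CAMShift is histogram backprojection: every pixel is replaced by the histogram weight of its hue, and the search window then shifts toward the weighted centroid of the result. A minimal NumPy sketch of one such step on a synthetic hue image (a conceptual sketch, not the vision.HistogramBasedTracker implementation):

```python
import numpy as np

def backproject(hue_img, hist, nbins):
    """Replace each hue value with the weight of its histogram bin."""
    bins = np.minimum((hue_img * nbins).astype(int), nbins - 1)
    return hist[bins]

nbins = 16
# Synthetic hue image: background hue 0.7, a 'skin' patch of hue 0.1.
frame = np.full((60, 60), 0.7)
frame[20:30, 35:45] = 0.1

# Histogram learned from the object region (here, simply the patch itself,
# playing the role of the nose region in the MATLAB example).
hist, _ = np.histogram(frame[20:30, 35:45], bins=nbins, range=(0, 1), density=True)

# Backproject and shift toward the weighted centroid (one mean-shift step).
weights = backproject(frame, hist, nbins)
rows, cols = np.indices(frame.shape)
centroid = (np.sum(rows * weights) / weights.sum(),
            np.sum(cols * weights) / weights.sum())
print(centroid)  # lands on the patch center, (24.5, 39.5)
```

Iterating this shift on each new frame, and additionally adapting the window size and orientation, is what turns plain mean shift into CAMShift.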
```matlab
% Create a tracker object.
tracker = vision.HistogramBasedTracker;

% Initialize the tracker histogram using the Hue channel pixels from the
% nose.
initializeObject(tracker, hueChannel, noseBBox(1,:));

% Create a video player object for displaying video frames.
videoInfo = info(videoFileReader);
videoPlayer = vision.VideoPlayer('Position', [300 300 videoInfo.VideoSize+30]);

% Track the face over successive video frames until the video is finished.
while ~isDone(videoFileReader)

    % Extract the next video frame.
    videoFrame = step(videoFileReader);

    % RGB -> HSV
    [hueChannel,~,~] = rgb2hsv(videoFrame);

    % Track using the Hue channel data.
    bbox = step(tracker, hueChannel);

    % Insert a bounding box around the object being tracked.
    videoOut = insertObjectAnnotation(videoFrame, 'rectangle', bbox, 'Face');

    % Display the annotated video frame using the video player object.
    step(videoPlayer, videoOut);
end

% Release resources.
release(videoFileReader);
```
```matlab
release(videoPlayer);
```

### Summary

In this example, you created a simple face tracking system that automatically detects and tracks a single face. Try changing the input video and see if you are still able to track a face. If you notice poor tracking results, check the Hue channel data to see if there is enough contrast between the face region and the background.

### Reference

[1] G.R. Bradski. "Real Time Face and Object Tracking as a Component of a Perceptual User Interface." Proceedings of the 4th IEEE Workshop on Applications of Computer Vision, 1998.

[2] Viola, Paul A. and Jones, Michael J. "Rapid Object Detection using a Boosted Cascade of Simple Features." IEEE CVPR, 2001.

## Motion-Based Multiple Object Tracking

This example shows how to perform automatic detection and motion-based tracking of moving objects in a video from a stationary camera.

Detection of moving objects and motion-based tracking are important components of many computer vision applications, including activity recognition, traffic monitoring, and automotive safety. The problem of motion-based object tracking can be divided into two parts:

1. detecting moving objects in each frame
2. associating the detections corresponding to the same object over time

The detection of moving objects uses a background subtraction algorithm based on Gaussian mixture models. Morphological operations are applied to the resulting foreground mask to eliminate noise. Finally, blob analysis detects groups of connected pixels, which are likely to correspond to
moving objects.

The association of detections to the same object is based solely on motion. The motion of each track is estimated by a Kalman filter. The filter is used to predict the track's location in each frame and to determine the likelihood of each detection being assigned to each track.

Track maintenance becomes an important aspect of this example. In any given frame, some detections may be assigned to tracks, while other detections and tracks may remain unassigned. The assigned tracks are updated using the corresponding detections. The unassigned tracks are marked invisible. An unassigned detection begins a new track.

Each track keeps count of the number of consecutive frames where it remained unassigned. If the count exceeds a specified threshold, the example assumes that the object left the field of view and deletes the track.

This example is a function, with the main body at the top and helper routines in the form of nested functions below.

```matlab
function multiObjectTracking()

% Create System objects used for reading video, detecting moving objects,
% and displaying the results.
obj = setupSystemObjects();

tracks = initializeTracks(); % Create an empty array of tracks.

nextId = 1; % ID of the next track

% Detect moving objects, and track them across video frames.
while ~isDone(obj.reader)
    frame = readFrame();
    [centroids, bboxes, mask] = detectObjects(frame);
    predictNewLocationsOfTracks();
    [assignments, unassignedTracks, unassignedDetections] = ...
        detectionToTrackAssignment();

    updateAssignedTracks();
    updateUnassignedTracks();
    deleteLostTracks();
    createNewTracks();
```
```matlab
    displayTrackingResults();
end
```

Contents:

- Create System Objects
- Initialize Tracks
- Read a Video Frame
- Detect Objects
- Predict New Locations of Existing Tracks
- Assign Detections to Tracks
- Update Assigned Tracks
- Update Unassigned Tracks
- Delete Lost Tracks
- Create New Tracks
- Display Tracking Results
- Summary

### Create System Objects

Create System objects used for reading the video frames, detecting foreground objects, and displaying results.

```matlab
function obj = setupSystemObjects()
    % Initialize Video I/O
    % Create objects for reading a video from a file, drawing the tracked
    % objects in each frame, and playing the video.

    % Create a video file reader.
    obj.reader = vision.VideoFileReader('atrium.avi');

    % Create two video players, one to display the video,
    % and one to display the foreground mask.
    obj.videoPlayer = vision.VideoPlayer('Position', [20, 400, 700, 400]);
    obj.maskPlayer = vision.VideoPlayer('Position', [740, 400, 700, 400]);

    % Create System objects for foreground detection and blob analysis.

    % The foreground detector is used to segment moving objects from
    % the background. It outputs a binary mask, where the pixel value
    % of 1 corresponds to the foreground and the value of 0 corresponds
    % to the background.
```
```matlab
    obj.detector = vision.ForegroundDetector('NumGaussians', 3, ...
        'NumTrainingFrames', 40, 'MinimumBackgroundRatio', 0.7);

    % Connected groups of foreground pixels are likely to correspond to
    % moving objects. The blob analysis System object is used to find such
    % groups (called 'blobs' or 'connected components'), and compute their
    % characteristics, such as area, centroid, and the bounding box.
    obj.blobAnalyser = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
        'AreaOutputPort', true, 'CentroidOutputPort', true, ...
        'MinimumBlobArea', 400);
end
```

### Initialize Tracks

The initializeTracks function creates an array of tracks, where each track is a structure representing a moving object in the video. The purpose of the structure is to maintain the state of a tracked object. The state consists of information used for detection-to-track assignment, track termination, and display.

The structure contains the following fields:

- id: the integer ID of the track
- bbox: the current bounding box of the object; used for display
- kalmanFilter: a Kalman filter object used for motion-based tracking
- age: the number of frames since the track was first detected
- totalVisibleCount: the total number of frames in which the track was detected (visible)
- consecutiveInvisibleCount: the number of consecutive frames for which the track was not detected (invisible)

Noisy detections tend to result in short-lived tracks. For this reason, the example only displays an object after it has been tracked for some number of frames. This happens when totalVisibleCount exceeds a specified threshold.
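The track structure above maps naturally onto a small record type. A hedged Python sketch of the same state, with a toy constant-velocity predict step standing in for the Kalman filter object (the field names come from the example; the predict logic is illustrative only):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Track:
    id: int
    bbox: tuple                        # current bounding box, used for display
    state: np.ndarray                  # [x, y, vx, vy] stand-in for kalmanFilter
    age: int = 1
    totalVisibleCount: int = 1
    consecutiveInvisibleCount: int = 0

    def predict(self):
        """Constant-velocity prediction: position advances by velocity.
        A real Kalman filter also propagates its error covariance and
        blends the prediction with the next measurement."""
        F = np.array([[1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        self.state = F @ self.state

t = Track(id=1, bbox=(10, 10, 20, 20), state=np.array([10.0, 10.0, 2.0, -1.0]))
t.predict()
print(t.state[:2])  # predicted centroid: [12.  9.]
```

The maintenance counters behave just as described in the text: a frame with a matched detection increments totalVisibleCount and resets consecutiveInvisibleCount, while an unmatched frame increments consecutiveInvisibleCount until the deletion threshold is reached.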