FINAL REPORT OF DIGITAL VIDEO PROCESSING
FACE TRACKING
I. INTRODUCTION
The work in this report consists of capturing a video in which my face moves in and out of the frame, developing a technique to track the face in that video, and finally presenting the technique in detail together with pictures of the final results. The tracking uses techniques implemented in the Matlab programming language, as described in detail below; the results were obtained by running the combined tracking code given in the appendices.
II. PROJECT IN DETAIL
II.1. Capturing a video with my face
Although the project as a whole uses techniques implemented in Matlab, this first step does not involve Matlab at all. The video was captured with an ordinary digital camera and saved on my computer in 3gp format, in a file named gashema.
II.2. Playing the captured video in Matlab
The captured video was played back in Matlab, using the Matlab function listed in the appendices of this report, in order to test it before tracking. Since the original video was in 3gp format and the playback function reads avi files, the video was first converted to avi with a suitable converter tool. It was then played with the call play_video('filename.avi').
III. TRACKING MY FACE
To track my face I used a technique based on the Kalman filter, a method useful for tracking many kinds of moving objects, invented by Rudolf Kalman and famously applied at NASA (National Aeronautics and Space Administration) to track spacecraft trajectories. At its heart, the Kalman filter is a method of combining noisy (and possibly missing) measurements with predictions of the state of an object to achieve an estimate of its true current state. Since the aim of this project is to apply the Kalman filter rather than to explain or develop it, these few remarks are included only so that a reader of this report has some idea of what the technique is about.
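As a toy illustration of how prediction and measurement are combined, the following Python sketch (not part of the Matlab project; the noise values q and r here are made up for illustration) runs a one-dimensional Kalman filter with a constant-position model:

```python
# Toy 1-D Kalman filter: estimate a scalar position from noisy measurements.
# The process noise q and measurement noise r are illustrative values only,
# not taken from the report's tracker.

def kalman_1d(measurements, q=0.01, r=1.0):
    m, p = measurements[0], 1.0      # initial state estimate and its variance
    estimates = [m]
    for z in measurements[1:]:
        p = p + q                    # predict: uncertainty grows by q
        k = p / (p + r)              # Kalman gain: weight on the measurement
        m = m + k * (z - m)          # update: blend prediction and measurement
        p = (1.0 - k) * p            # uncertainty shrinks after the update
        estimates.append(m)
    return estimates

smoothed = kalman_1d([1.0, 1.2, 0.9, 1.1, 1.0])
```

With q small relative to r, the filter trusts its model more than any single measurement, so the estimates jitter much less than the raw measurements do.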
a. Running the Kalman filter tracker
The basic Kalman filter tracker was run on my video with the following Matlab call:
    kalman_tracker('gashema.avi', 0.05, 50, 3);
The Matlab code for this function is found in the appendices; it is the function that combines all of the module codes listed there.
While the tracker is running in Matlab, pressing any key pauses it until another key is pressed.
b. Implementation of the project in detail
To implement the tracking, four processing steps were implemented using the Matlab programming language. The figure below shows what the process looks like.
[Figure: the basic processing pipeline for object tracking]
In this section we describe each of the processing modules and the implementation provided. First we describe the top-level function that strings all processing modules together into a complete tracker.
1. The top-level function
A tracker in the practicum software framework consists of:
• A segmenter that segments foreground objects from the background;
• a recognizer that identifies which foreground objects are being (or should be)
tracked;
• a representer that represents each tracked object in terms of some abstract feature
space;
• a tracker that actually does the tracking (i.e. estimates the position of each tracked
target in the current frame); and
• an optional visualizer that displays a visualization of each frame of the tracker.
As mentioned at the beginning of this report, tracking the face in the video requires a Matlab function implementing each of these modules. That is, a tracker (a collection of processing modules) is modeled in Matlab as a structure containing a reference to each module, namely to the Matlab function implementing it. This structure also contains all state variables that must be passed down the pipeline to other modules. The structure T must contain the following substructures:
1. T.segmenter, which must contain a function reference T.segmenter.segment to a function taking the structure T and the current frame.
2. T.recognizer, which must contain a function reference T.recognizer.recognize to a function taking the structure T and the current frame.
3. T.representer, which must contain a function reference T.representer.represent to a function taking the structure T and the current frame.
4. T.tracker, which must contain a function reference T.tracker.track to a function taking the structure T and the current frame.
5. Optionally, T.visualizer, which must contain a function reference T.visualizer.visualize to a function taking the structure T and the current frame.
For example, you can run the tracker on a video by calling run_tracker(fname, T), where T is the structure set up as described above with references to all of the tracking modules and fname is the filename of the video to run the tracker on. Typically you then write a wrapper function around run_tracker that sets up the processing pipeline and sets any needed parameters. All of these Matlab functions appear in the appendices at the end of this report.
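The run_tracker function itself is not reproduced in the appendices; its core is presumably just a loop over frames that calls each module in pipeline order. A hypothetical Python sketch of such a loop (with a plain dict standing in for the Matlab structure T):

```python
# Hypothetical sketch of a run_tracker loop: T is a dict of modules, each
# module a function (T, frame) -> T, mirroring the Matlab structure above.

def run_tracker(frames, T):
    for frame in frames:
        T = T["segmenter"]["segment"](T, frame)
        T = T["recognizer"]["recognize"](T, frame)
        T = T["representer"]["represent"](T, frame)
        T = T["tracker"]["track"](T, frame)
        if "visualizer" in T:            # the visualizer is optional
            T = T["visualizer"]["visualize"](T, frame)
    return T
```

Because each module takes and returns the whole structure T, any state a module stores is visible to every module downstream, which is exactly the state-passing convention the framework relies on.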
2. The segmenter
The segmenter provided performs simple background subtraction based on a rolling-average background model and temporal differencing. Writing I_t for the current grayscale frame and B_t for the background model, the background model is updated like this:

    B_t = gamma * I_t + (1 - gamma) * B_{t-1}

where gamma is a learning-rate parameter that controls how quickly the background model incorporates new information and how quickly it forgets older observations. Every pixel in every frame is then classified as either background (0) or foreground (1) using a simple threshold:

    F_t(x, y) = 1 if |B_t(x, y) - I_t(x, y)| > tau, and 0 otherwise

where tau is a threshold on the difference between a pixel in the current frame and the background model. This threshold controls which pixels are classified as foreground. Finally, the foreground image is cleaned a little to close small gaps:

    F_t <- close(F_t, D_r)

where close is the morphological closing operation and D_r is a disk of radius r. The Matlab implementation of this segmenter (file background_subtractor.m in the framework directory) is given in the appendices at the end of this report.
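The same update and threshold can be sketched in Python with NumPy (an illustrative translation of one segmentation step; the morphological closing cleanup is omitted here to stay dependency-free, and the default gamma and tau simply echo the values used in the example tracker call):

```python
import numpy as np

# One segmentation step: rolling-average background update followed by a
# threshold on the absolute difference, mirroring background_subtractor.m
# except for the imclose cleanup.

def segment_step(frame_grey, background, gamma=0.05, tau=50.0):
    background = gamma * frame_grey + (1.0 - gamma) * background
    foreground = np.abs(background - frame_grey) > tau
    return foreground, background
```

A pixel that suddenly jumps in brightness (a moving face entering) differs sharply from the slowly adapting background, so it is marked foreground; once it stays put long enough, the rolling average absorbs it.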
3. The recognizer
The next stage in the pipeline is the recognizer. The basic recognizer provided with the
tracking framework is based on simple blob detection. Connected components are found
in the segmented image, and the whole set of components is retained as the "recognized"
object to be tracked.
Referring to the Matlab code of the recognizer, you can see that it assumes the segmented image from the previous stage is passed in the T.segmenter.segmented variable. This is an example of how state is passed between processing modules. The recognizer then puts its result in the variable T.recognizer.blobs for subsequent modules to use.
It is important to note that the framework does not enforce any particular protocol for
passing state variables from module to module. You are responsible for establishing how
each module receives its parameters and how it stores its results.
4. The representer
The next module is the representer, which takes the results of recognition and computes
a representation of each recognized object to be tracked. In the basic tracker, the
representer extracts the BoundingBox and size of each blob and only keeps the biggest
one as the single object being tracked. See the matlab code in the appendices.
In a more realistic tracker, the representer module would be responsible for maintaining a
list of objects being tracked and establishing correspondences between recognized objects
and those currently being tracked.
5. The tracker
A simple Kalman filter tracker has been provided in the practicum framework. It uses the results of the representer to track the position and extent of the object being tracked. The Kalman filter works by estimating an unobservable state which is updated in time with a linear state update and additive Gaussian noise. A measurement is provided at each time step, which is assumed to be a linear function of the state plus Gaussian noise. This is the most complex module in the framework, and it receives and stores a large amount of state information. The Matlab code for this step is at the end of this report in the appendices.
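The predict/update cycle performed by this module can be written compactly in Python with NumPy (an illustrative translation, using the matrices set up in the kalman_tracker code in the appendices: the state is the 4-D BoundingBox, and F = H = I because the box is measured directly and is assumed to stay where it was between frames):

```python
import numpy as np

# One Kalman predict/update step for the bounding-box state [x, y, w, h].
# F, H, Q, R match the values assigned in kalman_tracker in the appendices.

F = np.eye(4)            # system model: the box stays where it was
H = np.eye(4)            # measurement model: the box is measured directly
Q = 0.5 * np.eye(4)      # system (process) noise
R = 5.0 * np.eye(4)      # measurement noise

def kalman_step(m, P, z):
    m_pred = F @ m                           # project the state forward
    P_pred = F @ P @ F.T + Q                 # project the covariance forward
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    m_new = m_pred + K @ (z - H @ m_pred)    # correct with the measurement
    P_new = P_pred - K @ H @ P_pred          # updated covariance
    return m_new, P_new
```

Because R is much larger than Q here, the gain stays small and the estimated box moves only a fraction of the way toward each new detection, which is what smooths the tracker's motion.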
6. The visualizer
The final module in the processing pipeline is the visualizer. It is not mandatory, but it is always a good idea to provide a minimal visualizer to see how your tracker is performing. The visualizer provided simply displays the current frame with rectangles indicating the current detection and the current estimate (file visualize_kalman.m in the framework directory). See the Matlab code in the appendices.
IV. RESULTS AND DISCUSSION
The tracked video shows that the tracker follows the face. When the face moves, the tracker moves with it; when the face leaves the frame, the tracker disappears, and as soon as the face comes back in, the tracker reappears. When the face is stationary, the tracker stops moving, i.e. it tracks the face without moving.
The results of the work:
a. The face is in the frame and moving
[Figure: two output frames showing the tracker following the moving face]
c. The face is out
[Figure: output frame with the face out of the frame; the tracker has disappeared]
V. APPENDICES
1. Matlab code to read my video
function play_video(fname)
% PLAY_VIDEO - simple function to play a video in a Matlab figure.
% PLAY_VIDEO(fname) will play the video in the file specified by 'fname'.
% See also: videoReader, nextFrame, getFrame
vr = videoReader(fname);
while (nextFrame(vr))
  img = getFrame(vr);
  imshow(img);
  pause(0.01);
end
return
2. Matlab code for the tracking framework
a. Matlab code for the segmenter
function T = background_subtractor(T, frame)
% Do everything in grayscale.
frame_grey = double(rgb2gray(frame));
% Check to see if we're initialized.
if ~isfield(T.segmenter, 'background')
  T.segmenter.background = frame_grey;
end
% Pull local state out.
gamma = T.segmenter.gamma;
tau = T.segmenter.tau;
radius = T.segmenter.radius;
% Rolling average update.
T.segmenter.background = gamma * frame_grey + (1 - gamma) * T.segmenter.background;
% And threshold to get the foreground.
T.segmenter.segmented = abs(T.segmenter.background - frame_grey) > tau;
% Close small gaps in the foreground mask.
T.segmenter.segmented = imclose(T.segmenter.segmented, strel('disk', radius));
return
b. Matlab code for the recognizer
function T = find_blob(T, frame)
T.recognizer.blobs = bwlabel(T.segmenter.segmented);
return
c. Matlab code for the representer
function T = filter_blobs(T, frame)
% Make sure at least one blob was recognized.
if sum(sum(T.recognizer.blobs))
  % Extract the BoundingBox and Area of all blobs.
  R = regionprops(T.recognizer.blobs, 'BoundingBox', 'Area');
  % And only keep the biggest one.
  [I, IX] = max([R.Area]);
  T.representer.BoundingBox = R(IX).BoundingBox;
end
return
d. Matlab code for the tracker
function T = kalman_step(T, frame)
% Get the current filter state.
K = T.tracker;
% Don't do anything unless we're initialized.
if isfield(K, 'm_k1k1') && isfield(T.representer, 'BoundingBox')
  % Get the current measurement out of the representer.
  z_k = T.representer.BoundingBox';
  % Project the state forward m_{k|k-1}.
  m_kk1 = K.F * K.m_k1k1;
  % Partial state covariance update.
  P_kk1 = K.Q + K.F * K.P_k1k1 * K.F';
  % Innovation is disparity in actual versus predicted measurement.
  innovation = z_k - K.H * m_kk1;
  % The innovation covariance.
  S_k = K.H * P_kk1 * K.H' + K.R;
  % The Kalman gain.
  K_k = P_kk1 * K.H' * inv(S_k);
  % The new state prediction.
  m_kk = m_kk1 + K_k * innovation;
  % And the new state covariance.
  P_kk = P_kk1 - K_k * K.H * P_kk1;
  % Smoothed innovation magnitude.
  K.innovation = 0.2 * sqrt(innovation' * innovation) + 0.8 * K.innovation;
  % And store the current filter state for next iteration.
  K.m_k1k1 = m_kk;
  K.P_k1k1 = P_kk;
else
  if isfield(T.representer, 'BoundingBox')
    K.m_k1k1 = T.representer.BoundingBox';
    K.P_k1k1 = eye(4);
  end
end
% Make sure we stuff the filter state back in.
T.tracker = K;
return
e. Matlab code for the visualizer
function T = visualize_kalman(T, frame)
% Display the current frame.
imshow(frame);
% Draw the current measurement in red.
if isfield(T.representer, 'BoundingBox')
  rectangle('Position', T.representer.BoundingBox, 'EdgeColor', 'r');
end
% And the current prediction in green.
if isfield(T.tracker, 'm_k1k1')
  rectangle('Position', T.tracker.m_k1k1, 'EdgeColor', 'g');
end
drawnow;
return
f. Matlab code for the combination of all of the above codes
function T = kalman_tracker(fname, gamma, tau, radius)
% Initialize background model parameters.
Segmenter.gamma = gamma;
Segmenter.tau = tau;
Segmenter.radius = radius;
Segmenter.segment = @background_subtractor;
% Recognizer and representer form a simple blob finder.
Recognizer.recognize = @find_blob;
Representer.represent = @filter_blobs;
% The tracker module.
Tracker.F = eye(4);       % System model
Tracker.Q = 0.5 * eye(4); % System noise
Tracker.H = eye(4);       % Measurement model
Tracker.R = 5 * eye(4);   % Measurement noise
Tracker.innovation = 0;
Tracker.track = @kalman_step;
% A custom visualizer for the Kalman state.
Visualizer.visualize = @visualize_kalman;
% Set up the global tracking system.
T.segmenter = Segmenter;
T.recognizer = Recognizer;
T.representer = Representer;
T.tracker = Tracker;
T.visualizer = Visualizer;
% And run the tracker on the video.
run_tracker(fname, T);
return
11
12

More Related Content

What's hot

Instrumentation and measurements
Instrumentation and measurementsInstrumentation and measurements
Instrumentation and measurementsTuba Tanveer
 
Template Method Pattern
Template Method PatternTemplate Method Pattern
Template Method Patternmonisiqbal
 
Symbolic Execution (introduction and hands-on)
Symbolic Execution (introduction and hands-on)Symbolic Execution (introduction and hands-on)
Symbolic Execution (introduction and hands-on)Emilio Coppa
 
[Steven karris] introduction_to_simulink_with_engi
[Steven karris] introduction_to_simulink_with_engi[Steven karris] introduction_to_simulink_with_engi
[Steven karris] introduction_to_simulink_with_engiStiedy Jocky
 
Introduction to simulink (1)
Introduction to simulink (1)Introduction to simulink (1)
Introduction to simulink (1)Memo Love
 
System software - macro expansion,nested macro calls
System software - macro expansion,nested macro callsSystem software - macro expansion,nested macro calls
System software - macro expansion,nested macro callsSARASWATHI S
 
Implicit and explicit sequence control with exception handling
Implicit and explicit sequence control with exception handlingImplicit and explicit sequence control with exception handling
Implicit and explicit sequence control with exception handlingVIKASH MAINANWAL
 
Which is not a step in the problem
Which is not a step in the problemWhich is not a step in the problem
Which is not a step in the problemkasguest
 
Circuit analysis i with matlab computing and simulink sim powersystems modeling
Circuit analysis i with matlab computing and simulink sim powersystems modelingCircuit analysis i with matlab computing and simulink sim powersystems modeling
Circuit analysis i with matlab computing and simulink sim powersystems modelingIndra S Wahyudi
 

What's hot (18)

Matlab 1 level_1
Matlab 1 level_1Matlab 1 level_1
Matlab 1 level_1
 
Adsa u4 ver 1.0
Adsa u4 ver 1.0Adsa u4 ver 1.0
Adsa u4 ver 1.0
 
Ppt chapter05
Ppt chapter05Ppt chapter05
Ppt chapter05
 
Matlab tutorial
Matlab tutorialMatlab tutorial
Matlab tutorial
 
Instrumentation and measurements
Instrumentation and measurementsInstrumentation and measurements
Instrumentation and measurements
 
Static analysis
Static analysisStatic analysis
Static analysis
 
Template Method Pattern
Template Method PatternTemplate Method Pattern
Template Method Pattern
 
6. Compile And Run
6. Compile And Run6. Compile And Run
6. Compile And Run
 
Symbolic Execution (introduction and hands-on)
Symbolic Execution (introduction and hands-on)Symbolic Execution (introduction and hands-on)
Symbolic Execution (introduction and hands-on)
 
Matlab brochure
Matlab  brochureMatlab  brochure
Matlab brochure
 
[Steven karris] introduction_to_simulink_with_engi
[Steven karris] introduction_to_simulink_with_engi[Steven karris] introduction_to_simulink_with_engi
[Steven karris] introduction_to_simulink_with_engi
 
Introduction to simulink (1)
Introduction to simulink (1)Introduction to simulink (1)
Introduction to simulink (1)
 
System software - macro expansion,nested macro calls
System software - macro expansion,nested macro callsSystem software - macro expansion,nested macro calls
System software - macro expansion,nested macro calls
 
Implicit and explicit sequence control with exception handling
Implicit and explicit sequence control with exception handlingImplicit and explicit sequence control with exception handling
Implicit and explicit sequence control with exception handling
 
Which is not a step in the problem
Which is not a step in the problemWhich is not a step in the problem
Which is not a step in the problem
 
Ppt chapter03
Ppt chapter03Ppt chapter03
Ppt chapter03
 
Circuit analysis i with matlab computing and simulink sim powersystems modeling
Circuit analysis i with matlab computing and simulink sim powersystems modelingCircuit analysis i with matlab computing and simulink sim powersystems modeling
Circuit analysis i with matlab computing and simulink sim powersystems modeling
 
Using matlab simulink
Using matlab simulinkUsing matlab simulink
Using matlab simulink
 

Similar to Tracking my face with matlab ws word format

GDE Lab 1 – Traffic Light Pg. 1 Lab 1 Traffic L.docx
GDE Lab 1 – Traffic Light  Pg. 1     Lab 1 Traffic L.docxGDE Lab 1 – Traffic Light  Pg. 1     Lab 1 Traffic L.docx
GDE Lab 1 – Traffic Light Pg. 1 Lab 1 Traffic L.docxbudbarber38650
 
Monte -- machine learning in Python
Monte -- machine learning in PythonMonte -- machine learning in Python
Monte -- machine learning in Pythonbutest
 
Monte -- machine learning in Python
Monte -- machine learning in PythonMonte -- machine learning in Python
Monte -- machine learning in Pythonbutest
 
UML Foundation for C Self Trimming
UML Foundation for C Self TrimmingUML Foundation for C Self Trimming
UML Foundation for C Self TrimmingPathfinder Solutions
 
Web-Based Online Embedded Security System And Alertness Via Social Media
Web-Based Online Embedded Security System And Alertness Via Social MediaWeb-Based Online Embedded Security System And Alertness Via Social Media
Web-Based Online Embedded Security System And Alertness Via Social MediaIRJET Journal
 
Roboconf Detailed Presentation
Roboconf Detailed PresentationRoboconf Detailed Presentation
Roboconf Detailed PresentationVincent Zurczak
 
MicroManager_MATLAB_Implementation
MicroManager_MATLAB_ImplementationMicroManager_MATLAB_Implementation
MicroManager_MATLAB_ImplementationPhilip Mohun
 
Face detection and tracking in a video sequence
Face detection and tracking in a video sequenceFace detection and tracking in a video sequence
Face detection and tracking in a video sequenceKarthik G N
 
Hybrid test automation frameworks implementation using qtp
Hybrid test automation frameworks implementation using qtpHybrid test automation frameworks implementation using qtp
Hybrid test automation frameworks implementation using qtpabhijob
 
Presentation1.2.pptx
Presentation1.2.pptxPresentation1.2.pptx
Presentation1.2.pptxpranaykusuma
 
Mastercam basics-tutorial
Mastercam basics-tutorialMastercam basics-tutorial
Mastercam basics-tutorialssuserf5e931
 
License plate extraction of overspeeding vehicles
License plate extraction of overspeeding vehiclesLicense plate extraction of overspeeding vehicles
License plate extraction of overspeeding vehicleslambanaveen
 
Real time Traffic Signs Recognition using Deep Learning
Real time Traffic Signs Recognition using Deep LearningReal time Traffic Signs Recognition using Deep Learning
Real time Traffic Signs Recognition using Deep LearningIRJET Journal
 
Model Execution and System Simulation
Model Execution and System SimulationModel Execution and System Simulation
Model Execution and System SimulationObeo
 
[Capella Day 2019] Model execution and system simulation in Capella
[Capella Day 2019] Model execution and system simulation in Capella[Capella Day 2019] Model execution and system simulation in Capella
[Capella Day 2019] Model execution and system simulation in CapellaObeo
 

Similar to Tracking my face with matlab ws word format (20)

GDE Lab 1 – Traffic Light Pg. 1 Lab 1 Traffic L.docx
GDE Lab 1 – Traffic Light  Pg. 1     Lab 1 Traffic L.docxGDE Lab 1 – Traffic Light  Pg. 1     Lab 1 Traffic L.docx
GDE Lab 1 – Traffic Light Pg. 1 Lab 1 Traffic L.docx
 
Calfem34
Calfem34Calfem34
Calfem34
 
Monte -- machine learning in Python
Monte -- machine learning in PythonMonte -- machine learning in Python
Monte -- machine learning in Python
 
Monte -- machine learning in Python
Monte -- machine learning in PythonMonte -- machine learning in Python
Monte -- machine learning in Python
 
UML Foundation for C Self Trimming
UML Foundation for C Self TrimmingUML Foundation for C Self Trimming
UML Foundation for C Self Trimming
 
ES-CH5.ppt
ES-CH5.pptES-CH5.ppt
ES-CH5.ppt
 
Web-Based Online Embedded Security System And Alertness Via Social Media
Web-Based Online Embedded Security System And Alertness Via Social MediaWeb-Based Online Embedded Security System And Alertness Via Social Media
Web-Based Online Embedded Security System And Alertness Via Social Media
 
Roboconf Detailed Presentation
Roboconf Detailed PresentationRoboconf Detailed Presentation
Roboconf Detailed Presentation
 
MicroManager_MATLAB_Implementation
MicroManager_MATLAB_ImplementationMicroManager_MATLAB_Implementation
MicroManager_MATLAB_Implementation
 
Face detection and tracking in a video sequence
Face detection and tracking in a video sequenceFace detection and tracking in a video sequence
Face detection and tracking in a video sequence
 
Hybrid test automation frameworks implementation using qtp
Hybrid test automation frameworks implementation using qtpHybrid test automation frameworks implementation using qtp
Hybrid test automation frameworks implementation using qtp
 
Malab tutorial
Malab tutorialMalab tutorial
Malab tutorial
 
Presentation1.2.pptx
Presentation1.2.pptxPresentation1.2.pptx
Presentation1.2.pptx
 
Mastercam basics-tutorial
Mastercam basics-tutorialMastercam basics-tutorial
Mastercam basics-tutorial
 
Appletjava
AppletjavaAppletjava
Appletjava
 
Lbc data reduction
Lbc data reductionLbc data reduction
Lbc data reduction
 
License plate extraction of overspeeding vehicles
License plate extraction of overspeeding vehiclesLicense plate extraction of overspeeding vehicles
License plate extraction of overspeeding vehicles
 
Real time Traffic Signs Recognition using Deep Learning
Real time Traffic Signs Recognition using Deep LearningReal time Traffic Signs Recognition using Deep Learning
Real time Traffic Signs Recognition using Deep Learning
 
Model Execution and System Simulation
Model Execution and System SimulationModel Execution and System Simulation
Model Execution and System Simulation
 
[Capella Day 2019] Model execution and system simulation in Capella
[Capella Day 2019] Model execution and system simulation in Capella[Capella Day 2019] Model execution and system simulation in Capella
[Capella Day 2019] Model execution and system simulation in Capella
 

Tracking my face with matlab ws word format

  • 1. FINAL REPORT OF DIGITAL VIDEO PROCESSING FACE TRACKING I.INTRODUCTION In this report, the work consists of capturing a video with my face moving in and out; developing a technique to track my face(in video) ;finally I will show the details the technique I used to do this by showing final results in pictures. To do this some techniques related to matlab programming language was used as we will see it in details bellow. The result of this report was obtained with the help of the combination all code for the tracking in appendences. II. PROJECT IN DETAILS II.1.Capturing a video with my face Although I have mentioned in introduction do my project I used technique related to matlab programming language, but this step (Capturing a video with my face) is not related to mat lab programming language. To capture a video, I only used a camera(digital camera).After capturing this camera ,I saved it in my computer in 3gp format in file as gashema. II.2.Playing the captured video in matlab. To play the captured video in matlab ,matlab function was used as will see it in the indices of this report. The objective of playing this video in matlab was for testing it before tracking it. As the original video was in 3gp format and as I used matlab function to play this video as avi format, before playing it in matlab I first converted it in avi format by using one of suitable software to tthis work(avi converter).To do it I used play_video (‘filename.avi’) as function to read this video in matlab. III.TRACKING MY FACE. To track my face ,I used technical based on Kalman filter i.e. useful technique for tracking different types of moving objects which was invented by Rudolf Kalman at NASA( National Aeronautics and Space Administration) to track the trajectory of spacecraft. 
Normally, at its heart, the Kalman filter is a method of combining noisy (and possibly missing) measurements and predictions of the state of an object to achieve an estimate of its true current state.As the aim of this project is not to explain or develop this technique(Kalman filter) applied to track my face. I inserted those few explanations of Kalman filter in order to allow someone who reads my report can have some ideas of this technique i.e what this technique is about. a. Running Kalman filter tracker. 1
  • 2. To run the basic Kalman filter tracker for my video,the following matlab function was used: kalman_tracher(gashema,0.05,50,3); Matlab code for this function is found in appendices.The code is tittles as matlab code for combination of all above codes. When you are tracking the face in matlab, pressing any key while the tracker is running will pause the tracker until you press another key. b. Implementation of the project in tetails To implement this project on tracking, four steps were implemented using matlab programming language. Figure bellow shows how the process looks like. The basic processing pipeline for object tracking. In this section we will describe each of the processing modules and the implementation provided. First we describe the toplevel function that strings together all processing modules into a complete tracker. 1 .The toplevel function A tracker in the practicum software framework consists of: • A segmenter that segments foreground objects from the background; • a recognizer that identifies which foreground objects are being (or should be) tracked; • a representer that represents each tracked object in terms of some abstract feature space; • a tracker that actually does the tracking (i.e. estimates the position of each tracked target in the current frame); and • an optional visualizer that displays a visualization of each frame of the tracker. 2
  • 3. As I have mentioned at the beginning of this report,a to track my face in video, Matlab function implementing each of these modules in order to be complete is required .i.e a tracker (i a collection of processing modules) is modeled in Matlab as structure containing references to each module (that is the Matlab function implementing the module). This structure also contains all state variables that must be passed down the pipeline to other modules. The structure T must contain the following substructures: 1. T.segmenter which must contain a function reference T.segmenter.segment to a function taking the structure T and the current frame. 2. T.recognizer which must contain a function reference T.recognizer.recognize to a function taking the structure T and the current frame. 3. T.representer which must contain a function reference T.representer.represent to a function taking the structure T and the current frame. 4. T.tracker which must contain a function reference T.tracker.track to a function taking the structure T and the current frame. 5. And optionally T.visualizer which must contain a function reference T.visualizer.visualize to a function taking the structure T and the current frame. For example, you can run the tracker on a video using the run_tracker function. where T is the structure setup previously with references to all of the tracking modules and fname is the filename of the video to run the tracker on. Also,you will write a wrapper function around run_tracker that sets up the processing pipeline and sets any needed parameters We will see all these matlab functions in matlab codes at the end of this report in appendices . 2.Segmenter The segmenter provided performs simple background subtraction based on a rolling- average background model and temporal differencing. 
The background model is updated like this: where is a learning rate parameter that controls how quickly the background model incorporates new information and how quickly it forgets older observations. Every pixel in every frame is classified as either background (0) or foreground (1) using a simple threshold: 3
  • 4. where is a threshold on the difference between a pixel in the current frame and background model. This threshold controls which pixels are classified as foreground. Finally, the foreground image is cleaned a little to close small gaps: where is the morphological closing operation and is a disk of radius . The Matlab implementation of this segmenter looks like this (file background_subtractor.m in the framework directory).See matlab code for this at the end of this report in appandices. 3. The recognizer The next stage in the pipeline is the recognizer. The basic recognizer provided with the tracking framework is based on simple blob detection. Connected components are found in the segmented image, and the whole set of components is retained as the "recognized" object to be tracked. By referring to the matlab code of recognizer, you can see how the recognizer assumes the segmented image from the previous stage is passed in the T.segmenter.segmentedvariable. This is an example of how state is passed between processing modules. The recognizer then puts its result in the variableT.recognizer.blob for subsequent modules to use. It is important to note that the framework does not enforce any particular protocol for passing state variables from module to module. You are responsible for establishing how each module receives its parameters and how it stores its results. 4. The representer The next module is the representer, which takes the results of recognition and computes a representation of each recognized object to be tracked. In the basic tracker, the representer extracts the BoundingBox and size of each blob and only keeps the biggest one as the single object being tracked. See the matlab code in the appendices. In a more realistic tracker, the representer module would be responsible for maintaining a list of objects being tracked and establishing correspondences between recognized objects and those currently being tracked. 5 The tracker 4
  • 5. A simple Kalman filter tracker has been provided in the practicum framework. It uses the results of the representer to track the position and extent of the object being tracked. The Kalman filter works by estimating an unobservable state which is updated in time with alinear state update and additive Gaussian noise. A measurement is provided at each time step which is assumed to be a linear function of the state plus Gaussian noise. It is the most complex module in the framework, and receives and stores a large amount of state information .Matlab code of this step is at the end of this report in appendices. 3.6 The visualizer The final module in the processing pipeline is the visualizer. It is not mandatory, but it is always a good idea to provide a minimal visualizer to see how your tracker is performing. The visualizer provided simply displays the current frame with rectangles indicating the current detection and current estimate (file visualize_kalman.m in the framework directory).See matlab codes in appendences. III.THE RESULT AND DISCUSSIONS ON MY WORK When my face in the video is tracked,it was shown that the tracker is tracking the face.When the face is moving,the tracker also will move to where the face moves in order to track it(the face).When the face is out, the tracker disappears.If the face comes in again ,immediately the tracker also comes in.If the face is stationary, the tracker stops moving i.e it track the face without movement. The result of the work. a.the face is in and moving 5
[Figures: two 640x480 video frames showing the tracker's rectangles following the moving face]
c. The face is out

[Figure: a 640x480 video frame with the face out of view; the tracker's rectangle disappears]

V. REFERENCES

1. Yao-Jiunn Chen, Yen-Chun Lin. A Simple Face-detection Algorithm Based on Minimum Facial Features.
2. Sanjay Kr. Singh, D. S. Chauhan, Mayank Vatsa, Richa Singh. A Robust Skin Color Based Face Detection Algorithm.
3. Eli Brookner. Tracking and Kalman Filtering Made Easy. Wiley-Interscience, 1st edition, 1998.
4. http://en.wikipedia.org/wiki/Face_detection
5. http://www.micc.unifi.it/bagdanov/tracking/
VI. APPENDICES

1. Matlab code to read my video

function play_video(fname)
% PLAY_VIDEO - simple function to play a video in a Matlab figure.
%   PLAY_VIDEO(fname) will play the video in the file specified by 'fname'.
%   See also: videoReader, nextFrame, getFrame
vr = videoReader(fname);
while (nextFrame(vr))
    img = getFrame(vr);
    imshow(img);
    pause(0.01);
end
return

2. Matlab code for the tracking framework

a. Matlab code for the segmenter

function T = background_subtractor(T, frame)
% Do everything in grayscale.
frame_grey = double(rgb2gray(frame));
% Check to see if we're initialized; if not, use the first frame as the background.
if ~isfield(T.segmenter, 'background')
    T.segmenter.background = frame_grey;
end
% Pull local state out.
gamma = T.segmenter.gamma;
tau = T.segmenter.tau;
radius = T.segmenter.radius;
% Rolling average update of the background model.
T.segmenter.background = gamma * frame_grey + (1 - gamma) * ...
    T.segmenter.background;
% And threshold to get the foreground.
T.segmenter.segmented = abs(T.segmenter.background - frame_grey) > tau;
% Close small gaps in the foreground mask.
T.segmenter.segmented = imclose(T.segmenter.segmented, strel('disk', radius));
return

b. Matlab code for the recognizer

function T = find_blob(T, frame)
T.recognizer.blobs = bwlabel(T.segmenter.segmented);
return

c. Matlab code for the representer

function T = filter_blobs(T, frame)
% Make sure at least one blob was recognized.
if sum(sum(T.recognizer.blobs))
    % Extract the BoundingBox and Area of all blobs.
    R = regionprops(T.recognizer.blobs, 'BoundingBox', 'Area');
    % And only keep the biggest one.
    [I, IX] = max([R.Area]);
    T.representer.BoundingBox = R(IX).BoundingBox;
end
return

d. Matlab code for the tracker

function T = kalman_step(T, frame)
% Get the current filter state.
K = T.tracker;
% Don't do anything unless we're initialized.
if isfield(K, 'm_k1k1') && isfield(T.representer, 'BoundingBox')
    % Get the current measurement out of the representer.
    z_k = T.representer.BoundingBox';
    % Project the state forward: m_{k|k-1}.
    m_kk1 = K.F * K.m_k1k1;
    % Partial state covariance update.
    P_kk1 = K.Q + K.F * K.P_k1k1 * K.F';
    % Innovation is the disparity between actual and predicted measurement.
    innovation = z_k - K.H * m_kk1;
    % The innovation covariance.
    S_k = K.H * P_kk1 * K.H' + K.R;
    % The Kalman gain.
    K_k = P_kk1 * K.H' * inv(S_k);
    % The new state estimate.
    m_kk = m_kk1 + K_k * innovation;
    % And the new state covariance.
    P_kk = P_kk1 - K_k * K.H * P_kk1;
    % Running average of the innovation magnitude.
    K.innovation = 0.2 * sqrt(innovation' * innovation) + 0.8 ...
        * K.innovation;
    % And store the current filter state for the next iteration.
    K.m_k1k1 = m_kk;
    K.P_k1k1 = P_kk;
elseif isfield(T.representer, 'BoundingBox')
    % Initialize the filter state from the first detection.
    K.m_k1k1 = T.representer.BoundingBox';
    K.P_k1k1 = eye(4);
end
% Make sure we stuff the filter state back in.
T.tracker = K;
return

e. Matlab code for the visualizer

function T = visualize_kalman(T, frame)
% Display the current frame.
imshow(frame);
% Draw the current measurement in red.
if isfield(T.representer, 'BoundingBox')
    rectangle('Position', T.representer.BoundingBox, 'EdgeColor', 'r');
end
% And the current prediction in green.
if isfield(T.tracker, 'm_k1k1')
    rectangle('Position', T.tracker.m_k1k1, 'EdgeColor', 'g');
end
drawnow;
return

f. Matlab code combining all of the modules above

function T = kalman_tracker(fname, gamma, tau, radius)
% Initialize background model parameters.
Segmenter.gamma = gamma;
Segmenter.tau = tau;
Segmenter.radius = radius;
Segmenter.segment = @background_subtractor;
% The recognizer and representer form a simple blob finder.
Recognizer.recognize = @find_blob;
Representer.represent = @filter_blobs;
% The tracker module.
Tracker.H = eye(4);       % Measurement model
Tracker.Q = 0.5 * eye(4); % System noise
Tracker.F = eye(4);       % System (state transition) model
Tracker.R = 5 * eye(4);   % Measurement noise
Tracker.innovation = 0;
Tracker.track = @kalman_step;
% A custom visualizer for the Kalman state.
Visualizer.visualize = @visualize_kalman;
% Set up the global tracking system.
T.segmenter = Segmenter;
T.recognizer = Recognizer;
T.representer = Representer;
T.tracker = Tracker;
T.visualizer = Visualizer;
% And run the tracker on the video.
run_tracker(fname, T);
return
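A side note on the tracker configuration in kalman_tracker: because F and H are both identity matrices and Q and R are scalar multiples of the identity, the 4-dimensional Kalman filter decouples into four independent scalar filters, one per bounding-box component. The following pure-Python sketch (the helper names are mine, not part of the framework) carries out one predict/update cycle with exactly those parameters:

```python
def kalman_step_1d(m, P, z, q=0.5, r=5.0):
    """One scalar Kalman predict/update step with F = H = 1,
    matching the tracker's Q = 0.5*I and R = 5*I settings."""
    # Predict: m_{k|k-1} = F*m, P_{k|k-1} = F*P*F' + Q
    m_pred = m
    P_pred = P + q
    # Innovation and its covariance: S_k = H*P_{k|k-1}*H' + R
    innovation = z - m_pred
    S = P_pred + r
    # Kalman gain: K_k = P_{k|k-1}*H' / S_k
    gain = P_pred / S
    # Updated state estimate and covariance
    m_new = m_pred + gain * innovation
    P_new = P_pred - gain * P_pred
    return m_new, P_new

def kalman_step_box(m, P, z, q=0.5, r=5.0):
    """Apply the scalar filter to each of the four bounding-box components."""
    out = [kalman_step_1d(mi, Pi, zi, q, r) for mi, Pi, zi in zip(m, P, z)]
    return [s for s, _ in out], [p for _, p in out]
```

For example, with a prior estimate of 100, prior variance 1 and a measurement of 104, the predicted variance is 1.5, the gain is 1.5/6.5 (about 0.23), and the estimate moves to roughly 100.92, i.e. about 23% of the way toward the measurement.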