International Journal of Computer Engineering and Technology (IJCET)
ISSN 0976-6367 (Print), ISSN 0976-6375 (Online)
Volume 4, Issue 5, September-October (2013), pp. 172-181
© IAEME: www.iaeme.com/ijcet.asp
Journal Impact Factor (2013): 6.1302 (Calculated by GISI)
www.jifactor.com
ESTIMATE VELOCITY AND TRACK OF OBJECTS BASED ON
PROBABILISTIC DISTRIBUTION FUNCTION

R. F. Mansour (1), Abdul Samad A. Marghilani (2)

(1) Department of Computer Science, Faculty of Science, Northern Border University, KSA
(2) Dean of Scientific Research, Northern Border University, KSA
ABSTRACT
The estimation of velocity and tracking of objects is one of the most difficult problems in computer vision. This paper introduces a multiple-human-object tracking system to estimate the velocity of, and track, multiple objects in crowded scenes in which occlusions occur. We create random particles to represent the positions of the objects, use maximum likelihood to compute the parameters of the chosen distribution, resample according to the maximum likelihood and the histogram of the frame being processed, display the particles on the target objects in each frame, and finally draw the trajectory of each object in the video. Results show that the proposed method can effectively improve tracking precision, especially for multiple moving objects.
Keywords: Estimation of velocity, generalized distribution, particle filter, maximum likelihood, resampled particles
1. INTRODUCTION
The estimation of velocity and tracking of objects is one of the most difficult problems in computer vision. It is used in automated surveillance and in many other applications. Most human-object tracking systems are bottom-up processes based on existing techniques such as change detection, frame differencing, or background subtraction. Pfinder [1] utilizes stochastic, region-based features, such as blobs and 2D contours, for person tracking, and proposes a novel model-based tracking system; however, it cannot handle multiple moving objects with occlusion. Traditional tracking approaches can be divided into data-driven and model-driven methods. Among the data-driven methods, the mean-shift algorithm is one of the most popular, utilizing the color distribution as a tracking cue [2], but it does not work correctly in the presence of partial occlusion. During occlusion, the model-driven methods estimate and predict the positions of the tracked objects. The particle filter has shown overwhelming performance in object tracking
systems in which the posterior density and the observation density are often non-linear and non-Gaussian [3-7]. However, both mean shift and the particle filter are based on a similarity measurement of color distributions, which leads to missed tracking when the background has a color distribution similar to that of the tracked object. To improve tracking stability and precision, introducing other features besides the color cue seems to be a preferable scheme. Tracking multiple interacting objects is more complex than single-object tracking: when tracking people, each person can change their movement at random as partial occlusions occur, and although many researchers have studied such monitoring, many problems remain when tracking multiple objects that interact under partial occlusion. There are different approaches dealing with object tracking in video images [8-9]; these approaches are strongly affected by non-linearity, non-Gaussianity and partial occlusion. Methods for monitoring objects may be divided into two categories, bottom-up and top-down: tracking objects based on edge detection and the object contour relates to the bottom-up category, while tracking objects based on color detection and particle filters relates to the top-down category [10-11]. The estimation of grey-level derivatives is of great importance in the analysis of image sequences for velocity estimation. The approach used here begins with the problem where the apparent velocity of intensity patterns can be directly identified with the movement of surfaces in the scene. In our approach we track each person with a separate tracker, which reduces the size of the state space and removes the need to know the relationship of one person to the others [12]. We use particle filters to distinguish the target person. In this approach, we use probabilistic distributions to predict the person's position, so the computation of the likelihood of the particle filters for the used distribution function is required. The method can be described as in Figure 1.
Figure 1 Outline of the method
2. PROPOSED METHOD
We apply our method in the following steps:
1. Initialization of parameters: we initialize the kernel matrix that restores the particle positions to their first positions when we start to process a frame. A suitable kernel is F_update = [1 0 1 0; 0 1 0 1; 0 0 1 0; 0 0 0 1]. We then initialize the number of particles representing the target color to track, here 2000 particles; increasing the number of particles gives better results. We access the color of the object as RGB. We initialize the standard deviation for the color, e.g. xstd_rgb = 50, the standard deviation for the particle positions, xstd_pos = 25, the standard deviation for the direction of particle movement, xstd_vec = 5, and the means of the colors to track, Xrgb_trgt_red = [95; 2; 10], Xrgb_trgt_green = [30; 85; 80], Xrgb_trgt_blue = [5; 26; 101].
2. Loading the video: read the recorded video and store it as a matrix of pixels, e.g. vr = videoreader(). The number of frames must be computed, i.e. the video is converted to frames so that each frame can be processed singly to detect the target object, and the frame height and width must also be computed. In our approach we use an AVI video of 300 frames showing three persons in a closed room, with one static camera and a clear background.
3. Create the particles as randomly distributed points; in this method we use a separate particle color for each target object. The particle distribution depends on the frame resolution and the number of particles, so we compute (X, S, G) to track three persons distinguished by three colors (red, green and blue). (X, S, G) are the random integer positions of the particles.
4. We process each frame by the following steps:
4.1. Store the current frame as a variable matrix, Y_k = read(vr, k).
4.2. Update the particle positions, which depend on the particle update kernel, the standard deviation of the particle positions, the standard deviation of the direction of particle movement, and the particle positions (X, S, G) in the previous frame. This stage returns the positions of the particles after adding the position standard deviation multiplied by draws from the standard normal distribution, and the movement-direction standard deviation multiplied by further standard normal draws, giving new values for the particle positions (X, S, G).
4.3. Maximum likelihood estimation (MLE): a method for estimating the parameters of the proposed distribution. MLE differs from least squares, which is primarily a descriptive tool and consumes many complex calculations; MLE saves time [13].
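The steps above can be sketched in NumPy as follows; variable names (F_update, xstd_pos, xstd_vec) follow the paper's MATLAB-style pseudocode, while the seeding, array shapes and helper names are illustrative assumptions, not the authors' code.

```python
import numpy as np

N_PART = 2000                       # number of particles per tracked object
F_update = np.array([[1, 0, 1, 0],  # constant-velocity update kernel (step 1)
                     [0, 1, 0, 1],
                     [0, 0, 1, 0],
                     [0, 0, 0, 1]], dtype=float)
xstd_pos = 25.0                     # std. dev. of particle position noise
xstd_vec = 5.0                      # std. dev. of particle movement-direction noise

def init_particles(height, width, n=N_PART, rng=None):
    """Step 3: random integer positions (x, y) plus zero initial velocities."""
    rng = rng or np.random.default_rng(0)
    X = np.zeros((4, n))
    X[0] = rng.integers(0, width, n)    # x position
    X[1] = rng.integers(0, height, n)   # y position
    return X

def update_particles(X, rng=None):
    """Step 4.2: propagate particles through the kernel, then diffuse."""
    rng = rng or np.random.default_rng(1)
    X = F_update @ X
    X[0:2] += xstd_pos * rng.standard_normal(X[0:2].shape)  # position noise
    X[2:4] += xstd_vec * rng.standard_normal(X[2:4].shape)  # velocity noise
    return X

X = init_particles(480, 720)   # frame height/width from the experiments section
X = update_particles(X)
```

One such particle set would be kept per tracked person, matching the separate-tracker design described in the introduction.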
3. LIKELIHOOD
Likelihood is a useful method for parameter estimation in statistics. Many applications use likelihood techniques, especially for non-linear models with non-normal data [14-16]. In our approach we use the likelihood technique to estimate the parameters of the proposed distribution for the observations obtained from the processed state. There are two common methods for parameter estimation: least squares and maximum likelihood. Maximum likelihood estimation (MLE) is better suited to our approach than least-squares estimation (LSE), because least squares assumes linear regression and requires the sum of squared errors, the proportion of variance and the mean squared deviation, whereas the data here are non-normal. MLE does not require linear regression, gives complete information about the parameters, and gives their true values. In our approach we model the hypothesis of the observations P(y_it | X_it), where the observations conform to the model, so the model is defined as a group of probability distributions indexed by the model's parameters. The observations are Y_i^t = y_i^0, y_i^1, ..., y_i^N, and the parameters of the proposed distribution function, for example the beta distribution, are given by
\[ f(y \mid \alpha, \beta) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}\, y^{\alpha-1} (1-y)^{\beta-1} \tag{1} \]

where 0 < y < 1, α, β > 0, and Γ(·) denotes the gamma function.

\[ E(Y) = \frac{\alpha}{\alpha+\beta} \tag{2} \]

\[ \mathrm{Var}(Y) = \frac{\alpha\beta}{(\alpha+\beta)^2 (\alpha+\beta+1)} \tag{3} \]

\[ \mathrm{Var}(Y) = E(Y)\,[1 - E(Y)]\,\frac{1}{\alpha+\beta+1} \tag{4} \]

Eq. (4) expresses the variance of the beta distribution using the beta distribution parameters and the expectation of the observations. In this approach Y is an N×1 vector of the dependent variable, and α, β are specified as functions of X, an N×K matrix, where Ψ, Φ are K×1 vectors of parameters representing the effects of the covariates upon compliance and defection.
\[ \alpha = \exp(X\Psi), \qquad \beta = \exp(X\Phi) \]

\[ \log L = \sum_{i=1}^{N} \ln\Gamma(\alpha+\beta) - [\ln\Gamma(\alpha) + \ln\Gamma(\beta)] + (\alpha-1)\ln(y_i) + (\beta-1)\ln(1-y_i) \tag{5} \]
Eq. (5) represents the log-beta maximum likelihood estimation (log-BMLE). When we use the Gaussian distribution, the distribution depends on two parameters, the mean and the variance:
\[ f(x_i \mid \alpha, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x_i-\alpha)^2}{2\sigma^2}} \tag{6} \]

where σ > 0 and −∞ < α < +∞. The log-Gaussian maximum likelihood estimation (log-GMLE) is

\[ \ln L(\alpha, \sigma^2) = \sum_{i=1}^{n} \ln f(x_i; \alpha, \sigma^2) = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln\sigma^2 - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\alpha)^2 \tag{7} \]
where σ represents the standard deviation and α represents the mean. When we use the gamma distribution, the distribution depends on two parameters:
\[ f(x_i; k, \theta) = \frac{1}{\theta^{k}}\,\frac{1}{\Gamma(k)}\, x_i^{\,k-1} e^{-x_i/\theta} \tag{8} \]

for x_i > 0 and k, θ > 0.
The likelihood function for N observations (x1, ..., xN) is
\[ \ln L(k, \theta) = (k-1)\sum_{i=1}^{N}\ln(x_i) - \frac{1}{\theta}\sum_{i=1}^{N} x_i - Nk\ln(\theta) - N\ln\Gamma(k) \tag{9} \]

where Eq. (9) represents the log-gamma maximum likelihood estimation.
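The three log-likelihoods (5), (7) and (9) can be written with the standard library alone; the sketch below is a hedged illustration using math.lgamma, not the authors' code, and the sample data and function names are assumptions. For the Gaussian case the MLE has the closed form used at the end.

```python
import math

def log_beta_lik(y, a, b):
    """Eq. (5): log-likelihood of data y under Beta(a, b)."""
    c = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return sum(c + (a - 1) * math.log(v) + (b - 1) * math.log(1 - v) for v in y)

def log_gauss_lik(x, mu, var):
    """Eq. (7): log-likelihood of data x under N(mu, var)."""
    n = len(x)
    return (-n / 2 * math.log(2 * math.pi) - n / 2 * math.log(var)
            - sum((v - mu) ** 2 for v in x) / (2 * var))

def log_gamma_lik(x, k, theta):
    """Eq. (9): log-likelihood of data x under Gamma(k, theta)."""
    n = len(x)
    return ((k - 1) * sum(math.log(v) for v in x) - sum(x) / theta
            - n * k * math.log(theta) - n * math.lgamma(k))

# For the Gaussian, the MLE is available in closed form:
x = [1.0, 2.0, 3.0, 4.0]
mu = sum(x) / len(x)                           # MLE for the mean
var = sum((v - mu) ** 2 for v in x) / len(x)   # MLE for the variance
```

For the beta and gamma cases the maximizers have no closed form and would be found numerically by maximizing the functions above.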
4. SAMPLING REPRESENTATION OF PROBABILITY DISTRIBUTION
The main function of the particle filter method is the sampling representation of a probability distribution, here the likelihood. Suppose we have a collection of M points u_i^k and a collection of weights w_i^k, k = 1, 2, ..., M, where i indexes the state. These points are independent samples drawn from a probability distribution function P(U), and w_i^k = f(u_i^k) / P(u_i^k) for some function f. The expectation of the weighted samples under the proposed distribution function can then be written as:

\[ E\left[\frac{1}{M}\sum_{k} g(u_i^k)\, w_i^k\right] = \int g(U)\, f(U)\, dU \tag{10} \]

where g(U) is a proposed distribution, which may be the Gaussian, beta or gamma distribution. This representation consists of a set of weighted points. Assume that f is non-negative and that ∫ f(U) dU exists and is finite. Then f(X) / ∫ f(U) dU is a probability density function representing the distribution of interest, written p_f(X). Now we have a collection of M points u_i^k ~ P(U) and a collection of weights
w_i^k = f(u_i^k) / P(u_i^k), so we have

\[ E\left[\frac{1}{M}\sum_{k} w_i^k\right] = \int \frac{f(U)}{P(U)}\, P(U)\, dU = \int f(U)\, dU \]

This means

\[ E_{p_f}[g] = \int g(U)\, p_f(U)\, dU = E\left[\frac{\sum_k g(u^k)\, w^k}{\sum_k w^k}\right] \tag{11} \]

\[ E_{p_f}[g] \approx \frac{\sum_k g(u^k)\, w^k}{\sum_k w^k} \tag{12} \]
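The self-normalized estimate of Eq. (12) can be demonstrated on a toy problem; the target f (an unnormalized Gaussian) and the uniform proposal below are illustrative assumptions, not part of the paper.

```python
import math
import random

random.seed(0)
M = 100_000
f = lambda u: math.exp(-0.5 * (u - 1.0) ** 2)   # unnormalized N(1, 1) target
p_density = 1.0 / 12.0                           # uniform proposal on [-5, 7]

# Draw u_k ~ P and form the importance weights w_k = f(u_k) / P(u_k):
samples = [random.uniform(-5.0, 7.0) for _ in range(M)]
weights = [f(u) / p_density for u in samples]

# Eq. (12): E_pf[g] ~= (sum g(u_k) w_k) / (sum w_k), here with g(u) = u,
# so the estimate should approach the target mean 1.0.
g = lambda u: u
est = sum(g(u) * w for u, w in zip(samples, weights)) / sum(weights)
```

Because the estimator divides by the weight sum, f only needs to be known up to a constant, which is exactly the situation for the likelihoods used as tracking cues here.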
In our approach we compute the resampling of the likelihood for the used distribution function and compare it with the histogram of the frame being processed. The representation of the posterior state conditioned on the observations for each object, p(X_{i,t} | Y_{i,0:N}), takes a simple form when represented by a collection of M weighted samples or particles, {x_{i,t}^k, w_{i,t}^k}_{k=1}^{M}, where i is the number of the object to track, t is the frame number, and k indexes the particles. The particle representation of this density is

\[ p(X_{i,t} \mid Y_{i,0:N}) = \sum_{k} w_{i,t}^{k}\, \delta(x_{i,t} - x_{i,t-1}^{k}) \tag{13} \]

where

\[ p(x_{i,t} \mid y_{i,0:N}) = \alpha P(y_{i,0:N} \mid x_{i,t}) \int P(x_{i,t-1} \mid y_{i,0:t-1})\, P(x_{i,t} \mid x_{i,t-1})\, dx_{i,t-1} \tag{14} \]

so we approximate the formula of (14) as

\[ p(x_{i,t} \mid y_{i,0:N}) \approx \alpha P(y_{i,0:N} \mid x_{i,t}) \sum_{k} w_{i,t}^{k}\, P(x_{i,t} \mid x_{i,t-1}^{k}) \tag{15} \]

The particle filter can thus be viewed as operating as an importance sampler on this distribution. Importance sampling is a method of generating samples from p(x_{i,t}), the density function, which can be evaluated by

\[ p(x) \approx \sum_{k}^{M} w^{k}\, \delta(x - x^{k}) \tag{16} \]

where w^k = p(x) / g(x^k) and g(x^k) is any proposed distribution used for the particles. Drawing M samples x_{i,t}^j from the proposed distribution g(x^k) for object i gives

\[ x_{i,t}^{j} \sim g(x^{k}) = \sum_{k} w_{i,t-1}^{k}\, P(x_{i,t} \mid x_{i,t-1}^{k}) \tag{17} \]
Then set the weight w_{i,t}^j to the likelihood:

\[ w_{i,t}^{j} = P(y_{i,t} \mid x_{i,t}^{j}) \tag{18} \]

After that, normalize the weights:

\[ w_{i,t}^{j} \leftarrow \frac{w_{i,t}^{j}}{\sum_{k} w_{i,t}^{k}} \tag{19} \]
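The weight normalization and the subsequent resampling step can be sketched as below. The paper does not name its resampling scheme, so the systematic resampler here is an assumed, commonly used choice, and all names are illustrative.

```python
import random

def normalize(weights):
    """Eq. (19): divide each weight by the sum of all weights."""
    s = sum(weights)
    return [w / s for w in weights]

def systematic_resample(particles, weights, rng=None):
    """Draw len(particles) particles with probability proportional to the
    normalized weights, using one random offset and evenly spaced points."""
    rng = rng or random.Random(0)
    n = len(particles)
    u0 = rng.random() / n
    out, i, c = [], 0, weights[0]
    for j in range(n):
        u = u0 + j / n
        while u > c:            # advance to the particle whose cumulative
            i += 1              # weight interval contains u
            c += weights[i]
        out.append(particles[i])
    return out

# A particle with overwhelming likelihood dominates the resampled set:
w = normalize([0.1, 0.1, 0.1, 9.7])
resampled = systematic_resample([0, 1, 2, 3], w)
```

Resampling after Eq. (19) concentrates the particle set on high-likelihood states, which is what keeps the tracker locked onto the target between frames.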
5. EXPERIMENTS
We have used the algorithm with the generalized probability distributions under different parameters, which leads to different results, as shown in Figures 2-9; for example, the case α = 1 and γ = 3 gives the beta distribution.

Figure 2 Tracking using the beta distribution function and 2000 particles, with α = 1 and γ = 3

Figure 3 Tracking using the Gaussian distribution function and 2000 particles, with α = 1 and γ = 0.5

Figure 4 Tracking using the gamma distribution function and 2000 particles, with α = 0.25 and γ = 5
Figure 5 Tracking using the Poisson distribution function and 2000 particles, with α = 5 and γ = 3

Figure 6 Trajectory using the beta distribution function and 2000 particles

Figure 7 Trajectory using the Gaussian distribution function and 2000 particles

Figure 8 Trajectory using the gamma distribution function and 2000 particles
Figure 9 Trajectory using the Poisson distribution function and 2000 particles
To determine the speed of a moving object while tracking it, the video is processed to determine the number of video frames and the video time. The positions of the required object are found, and its speed is then determined using the particle filter method with several different distribution functions (Gaussian, beta, gamma and Poisson) to compare the results. The centers of the particle-filter points are determined using the K-means method, as shown in Figures 10 and 11, and the trajectory of the tracked object is then specified as shown in Figure 12. First we compute the center of the object at each frame and accumulate the distances using the Euclidean distance:

\[ \mathrm{Dist} = \sum_{i,j=1}^{n} \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2} \tag{20} \]

where n is the number of frames and (i, j) index the centers of the object at consecutive frames, so we can compute the velocity of the object. Table 1 shows the center of the object (a balloon) at some frames. The parameters used are the following: BitsPerPixel = 24, FrameRate = 29.9700, Height = 480, Number of frames = 106, VideoFormat = RGB24, Width = 720 and Time = 3.27 sec.
The real velocity is 0.36 m/sec but our approach gives 0.358 m/sec.
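Eq. (20) applied to the centers listed in Table 1 can be checked directly; the sketch below is an illustration, the time value is the one reported in the text, and the resulting speed is in pixels per second, since converting to m/s requires a pixel-to-metre calibration the text does not give.

```python
import math

# Object centers (x, y) from Table 1, frames P1..P8:
centers = [(572.5, 395.2), (399.6, 399.7), (254.7, 308.0), (237.9, 291.0),
           (433.6, 284.0), (691.3, 330.3), (672.7, 336.3), (672.8, 351.2)]

# Eq. (20): accumulate Euclidean distances between consecutive centers.
dist_px = sum(math.dist(a, b) for a, b in zip(centers, centers[1:]))
# dist_px reproduces the Total = 860.43 row of Table 1.

time_s = 3.27                  # video time reported in the text
speed_px = dist_px / time_s    # average speed in pixels per second
```

Multiplying speed_px by the scene's metres-per-pixel factor would yield the m/s figure quoted above.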
Figure 10 Tracking using the beta distribution function and 2000 particles, with α = 1 and γ = 3, at frame 1
Figure 11 Tracking using the beta distribution function and 2000 particles, with α = 1 and γ = 3, over frames 1 to 106
Figure 12 Trajectory using Beta distribution function and 2000 particles
Table 1: Center of the object at some frames (dist is the distance to the next center)

    #      x       y       dist
    P1     572.5   395.2   172.9585
    P2     399.6   399.7   171.4786
    P3     254.7   308.0    23.90063
    P4     237.9   291.0   195.8252
    P5     433.6   284.0   261.8262
    P6     691.3   330.3    19.5438
    P7     672.7   336.3    14.90034
    P8     672.8   351.2    -
    Total                  860.43
6. CONCLUSIONS
This paper presented a new technique for tracking multiple interacting objects based on a generalized probabilistic particle filter, and for determining the velocity of an object. In this method, a sample set of the tracked objects is constructed at the beginning of the tracking process. We then predict the prior representation and position of the tracked objects based on the minimization of the parameters of the proposed generalized probabilistic distribution. Experimental results show that the proposed method can effectively improve tracking precision, especially for multiple moving objects. The proposed approach is an effective technique for tracking multiple interacting objects in the presence of partial occlusion compared with other techniques.
REFERENCES
[1] C. Wren, A. Azarbayejani, T. Darrell, and A. Pentland, Pfinder: real-time tracking of the human body, IEEE Trans. on PAMI, 19(7), 1997, pp. 780-785.
[2] D. Comaniciu, V. Ramesh, and P. Meer, Real-time tracking of non-rigid objects using mean shift, Proceedings of IEEE International Conference on Computer Vision and Pattern Recognition, Hilton Head, 2000, pp. 142-149.
[3] M. Isard and A. Blake, Condensation - conditional density propagation for visual tracking, International Journal on Computer Vision, 29(1), 1998, pp. 5-28.
[4] K. Nummiaro, E. Koller-Meier and L. Van Gool, An adaptive color-based particle filter, Image and Vision Computing, 21(1), 2003, pp. 99-110.
[5] M. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking, IEEE Transactions on Signal Processing, 50(2), 2002, pp. 174-188.
[6] F. Huo and E. Hendriks, Multiple people tracking and pose estimation with occlusion estimation, Computer Vision and Image Understanding, 116, 2012, pp. 634-647.
[7] N. Choa, A. L. Yuille and S. Lee, Adaptive occlusion state estimation for human pose tracking under self-occlusions, Pattern Recognition, 46, 2013, pp. 649-661.
[8] M. Kristan, J. Perš, M. Perše and S. Kovačič, Closed-world tracking of multiple interacting targets for indoor-sports applications, Computer Vision and Image Understanding, 113, 2009, pp. 598-611.
[9] X. Zhao, Y. Satoh, H. Takauji, S. Kaneko, K. Iwata and R. Ozaki, Object detection based on a robust and accurate statistical multi-point-pair model, Pattern Recognition, 44, 2011, pp. 1296-1311.
[10] M. Roha, T. Kima, J. Parkb and S. Leea, Accurate object contour tracking based on boundary edge selection, Pattern Recognition, 40, 2007, pp. 931-943.
[11] P. KaewTraKulPong and R. Bowden, A real time adaptive visual surveillance system for tracking low-resolution color targets in dynamically changing scenes, Image and Vision Computing, 21, 2003, pp. 913-929.
[12] Z. Han, J. Jiao, B. Zhang, Q. Ye and J. Liu, Visual object tracking via sample-based Adaptive Sparse Representation (AdaSR), Pattern Recognition, 44(9), 2011, pp. 2170-2183.
[13] Z. Feng, B. Yang, Y. Li, Y. Zheng, X. Zhao, J. Yin and Q. Meng, Real-time oriented behavior-driven 3D freehand tracking for direct interaction, Pattern Recognition, 46, 2013, pp. 590-608.
[14] F. Huo and E. A. Hendriks, Multiple people tracking and pose estimation with occlusion estimation, Computer Vision and Image Understanding, 116, 2012, pp. 634-647.
[15] A. Yilmaz, O. Javed and M. Shah, Object Tracking: A Survey, ACM Computing Surveys, 38(4), 2006, pp. 1-41.
[16] P. Paolino, Maximum Likelihood Estimation of Models with Beta-distributed Dependent Variables, Political Analysis, 9(4), 2010.
[17] V. Purandhar Reddy, Object Tracking by DTCWT Feature Vectors, International Journal of Computer Engineering & Technology (IJCET), 4(2), 2013, pp. 73-78.
[18] R. R. Gulwani and S. D. Sawarkar, Video Indexing using Shot Boundary Detection Approach and Search Tracks in Video, International Journal of Computer Engineering & Technology (IJCET), 4(3), 2013, pp. 432-440.