Direction-Based Modified Particle Filter for Vehicle Tracking
ETRI Journal. 2016. Apr, 38(2): 356-365
Copyright © 2016, Electronics and Telecommunications Research Institute (ETRI)
  • Received : February 25, 2015
  • Accepted : November 16, 2015
  • Published : April 01, 2016
Mustafa Eren Yildirim
Ibrahim Furkan Ince
Yucel Batu Salman
Jong Kwan Song
Jang Sik Park
Byung Woo Yoon

Abstract
This research proposes a modified particle filter to increase the accuracy of vehicle tracking in a noisy and occluded medium. In our proposed method, the direction angle of the target vehicle is calculated, and the angular difference between the motion direction of the target vehicle and each particle of the particle filter is observed. Particles are filtered and weighted depending on their angular distance to the motion direction: those particles moving in a direction similar to that of the target vehicle are assigned larger weights, which in turn increases their probability in the likelihood function used to estimate the target's state parameters. The proposed method is compared against a condensation algorithm. Our results show that the proposed method improves the stability of a particle filter tracker and decreases particle consumption.
I. Introduction
Recent innovations in high-speed computers, object detection techniques, and high-definition cameras, as well as advances in multisensory technology, are of increasing interest to researchers in the field of video-based object tracking. Visual object tracking is an essential task in applications such as surveillance systems [1] , robotics [2] , traffic control, and human–computer interaction [3] . However, it is a challenging task because tracked objects are always subject to deformation, varying illumination, and collisions [4] . Researchers have proposed several methods to overcome these handicaps; among them, the particle filter is one of the most outstanding [5] . Because it can process nonlinear and non-Gaussian systems [6] , it is widely used for object tracking and handles both gray-level changes of an object and occlusions between objects well [7]–[11] . A particle filter recursively constructs posterior probability distributions over a state space. It is essentially a Bayesian filtering method that uses the sequential Monte Carlo method to simulate the posterior distribution of a system state. A state model representing characteristics such as position, speed, and scale should be defined in a two-dimensional image space [12] , [13] . Additionally, an observation, made using texture, shape, and color information, is conducted to establish a likelihood model [12] , [14] , [15] . A conventional particle filter, also known as a condensation algorithm [16] , is highly robust to clutter. Although it is a robust tracker, it suffers from a high computational burden and inefficient particle distribution [17] . A high computational burden does not affect the accuracy of tracking but can exceed a system's capacity; thus, the default number of particles in a particle filter needs to be reduced. However, a reduction of this kind should be made in such a way that it does not interfere with the accuracy of tracking. A particle filter is expected to perform with high accuracy using the minimum number of particles necessary. A robust tracker based on sparse representation, combining a generative model with a discriminative model, was introduced in [18] . This method, however, failed to track under occlusion and background clutter. In [19] , the authors proposed a particle filter algorithm with an adaptive noise model. In [20] , a multifeature target representation including color, shape, and local binary pattern (LBP) features was used with a particle filter. It showed some increase in accuracy; however, the associated computational load was high. In [21] , a firefly algorithm–based particle filter was introduced that achieved robust tracking. In [17] , an iterative particle filter that increased tracking accuracy was presented; however, it did not reduce the computational burden.
In [22] and [23] , the authors proposed new architectures for the particle filter. However, these only affected the tracking speed, not the accuracy. A layered particle filter was introduced for vehicle tracking, but it produces errors in the case of occlusion [24] . Although there is a considerable number of studies on particle filter–based tracking, there is still no method that offers both high accuracy and low computational load.
This paper proposes a novel vehicle tracking method that redistributes and filters the particles of a particle filter depending on their angular differences to the target vehicle's direction. The conventional likelihood model is modified by the target vehicle's motion. This approach ensures that particles whose motion angle is close to that of the target vehicle gain a higher probability in the likelihood function used to estimate the target's state parameters.
The primary contributions of the proposed model are as follows: particle consumption is reduced and tracking accuracy is improved due to an increase in the probabilities pertaining to the particles. Additionally, the particle distribution is more efficient, which enables our model to be more stable and robust against noise.
This paper is organized as follows. In Section II, a conventional particle filter method is described. Section III introduces the proposed approach. The experimental results and considerations are presented in Section IV. Section V presents our conclusions and further study.
II. Conventional Particle Filtering
Particle filtering was originally developed to track objects in clutter. The state of a tracked object at any given time can be denoted by a vector, say x_t, where t represents time. Similarly, the vector z_t denotes the observations z_1, …, z_t up to time t. Particle filters are often used when the posterior density, p(x_t | z_t), and the observation density, p(z_t | x_t), are non-Gaussian.
The key function of a particle filter is to approximate a probability distribution by a set of samples with a series of appropriate weights. Each sample, s, in the set S = {(s^(n), π^(n)) | n = 1, …, N} represents a hypothetical state of the object being tracked and has a corresponding discrete sampling probability, π, where

$\sum_{n=1}^{N} \pi^{(n)} = 1$.
The evolution of the sample set is described by propagating each sample according to a system model. Then, each element of the set is weighted in terms of the observations, and N samples are drawn, with replacement, by choosing a particular sample with probability
$\pi_t^{(n)} = p(z_t \mid x_t = s_t^{(n)})$.
The mean state of an object being tracked is estimated at each time step by
(1) $E[S_t] = \sum_{n=1}^{N} s_t^{(n)} \pi_t^{(n)}$.
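As a concrete illustration, a single time step of this factored-sampling loop can be sketched in a few lines of Python. The random-walk dynamic model, the noise level, and the likelihood placeholder below are our assumptions for illustration, not details taken from the paper.

import numpy as np

def particle_filter_step(samples, weights, likelihood, noise_std=5.0, rng=None):
    # samples: (N, D) array of hypothetical states; weights: (N,) probabilities summing to 1.
    rng = rng or np.random.default_rng()
    n = len(samples)
    # 1. Draw N samples with replacement, choosing each with probability pi^(n).
    samples = samples[rng.choice(n, size=n, p=weights)]
    # 2. Propagate each sample with the system model (here, an assumed random walk).
    samples = samples + rng.normal(0.0, noise_std, size=samples.shape)
    # 3. Re-weight each sample from the observation density p(z_t | x_t = s_t^(n)).
    weights = np.array([likelihood(s) for s in samples])
    weights = weights / weights.sum()
    # 4. Mean state estimate E[S_t], as in (1).
    estimate = (samples * weights[:, None]).sum(axis=0)
    return samples, weights, estimate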
In a particle filter, several features, such as image contrast, frame difference, edge-detected silhouette, 2D or 3D contours, gray level, and RGB or HSV color spaces, can be used for the tracking process. The features for tracking are selected in relation to the type of moving target and the conditions of the tracking environment [25] . Color is one of the most commonly used features because it is robust to object deformation and partial occlusion within the tracking environment, is independent of rotation and scale variation, and can be easily obtained.
In our experiments, color histograms for measuring similarity (likelihood) are calculated in hue-saturation-value (HSV) space. To make the proposed method less sensitive to illumination conditions, we used an HSV color space of 8 × 8 × 4 bins (where value (V) is represented by four bins) [26] .
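As a sketch of how such a reference histogram could be computed with OpenCV, the snippet below follows the paper's 8 × 8 × 4 bin layout; the use of cv2.calcHist, the channel ranges, and the normalization are our assumptions about the implementation.

import cv2
import numpy as np

def hsv_histogram(bgr_patch):
    # Convert the target/particle image patch to HSV.
    hsv = cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2HSV)
    # 8 bins for hue, 8 for saturation, 4 for value (V), per the paper.
    hist = cv2.calcHist([hsv], [0, 1, 2], None, [8, 8, 4],
                        [0, 180, 0, 256, 0, 256])
    hist = hist.flatten()
    return hist / hist.sum()  # normalize to a discrete distribution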
In a tracking approach, the estimated state of the target vehicle is updated at each time step by taking any new observations into account. Therefore, the proposed method needs a similarity measure based on the distributions of the aforementioned color histograms. One such popular measure between two given distributions, p(u) and q(u), is the Bhattacharyya coefficient:

(2) $\rho[p, q] = \int \sqrt{p(u)\, q(u)}\, du$.
The associated Bhattacharyya distance is defined as follows:

(3) $d_B = \sqrt{1 - \rho[p, q]}$.
Finally, the likelihood weights are specified by the following Gaussian distribution with variance σ_c²:

(4) $\pi_{\mathrm{color}}^{(n)} = \frac{1}{\sqrt{2\pi}\,\sigma_c} \exp\!\left(-\frac{d_B^2}{2\sigma_c^2}\right), \quad n = 1, 2, \ldots, N,$
where N is the number of samples [27] .
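For discrete, normalized histograms p and q, (2)–(4) reduce to a few lines; the sketch below is a direct transcription, with the value of σ_c chosen arbitrarily for illustration.

import numpy as np

def color_likelihood(p, q, sigma_c=0.2):
    rho = np.sum(np.sqrt(p * q))        # Bhattacharyya coefficient, (2)
    d_b = np.sqrt(max(1.0 - rho, 0.0))  # Bhattacharyya distance, (3)
    # Gaussian likelihood weight, (4); normalization over n happens later.
    return np.exp(-d_b**2 / (2.0 * sigma_c**2)) / (np.sqrt(2.0 * np.pi) * sigma_c)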
We use an ellipse to model our target vehicle and particles. The following 4-D state model represents a dynamic model of such an ellipse:
(5) $s = \{x, y, H_x, H_y\},$
where x and y refer to the coordinates of the center of the ellipse, and H_x and H_y refer to the lengths of the major and minor axes of the ellipse, respectively.
These are the basic steps of a conventional particle filter algorithm using a single feature. When a target vehicle is occluded by another car or object, new measurements cannot be taken. Therefore, in such cases, the likelihood function cannot be utilized and tracking can be difficult.
In the proposed particle filter, the direction feature supports the tracker in cases where scalar features cannot be detected.
III. Directional Particle Filtering
- 1. Calculation of Motion Direction
The first step in our proposed method is to calculate the direction of motion of the target vehicle. This direction is defined in terms of an angle.
A schematic diagram of a target object's motion is shown in Fig. 1. The target vehicle moves from location A to B, and its motion angle relative to the horizontal axis is θ_{t−1}. Its direction can be denoted by a displacement vector, using the center point of the target vehicle in the current frame, (x_t, y_t), and its center point in the previous frame, (x_{t−1}, y_{t−1}). This is formulated as follows:
(6) $\theta_{t-1} = \arctan\!\left(\frac{\Delta y}{\Delta x}\right) = \arctan\!\left(\frac{y_t - y_{t-1}}{x_t - x_{t-1}}\right).$
Fig. 1. Schematic diagram of object motion.
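A minimal sketch of (6) follows. We use atan2 rather than a bare arctangent so that the angle is resolved over all four quadrants; the equation above leaves this detail implicit.

import math

def motion_angle(center_prev, center_curr):
    (x_prev, y_prev), (x_curr, y_curr) = center_prev, center_curr
    # theta_{t-1}: direction of the displacement vector, in radians.
    return math.atan2(y_curr - y_prev, x_curr - x_prev)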
The direction of the target may change; for example, if the target vehicle changes lanes. Therefore, the direction feature must be updated for accurate tracking. In this study, this is achieved through use of an approximate median method [11] , as follows:
(7) $H_{t+1} = \begin{cases} H_t + 1 & \text{if } M_t > H_t, \\ H_t - 1 & \text{if } M_t < H_t, \\ H_t & \text{if } M_t = H_t, \end{cases}$

where H represents the parameter being updated from time t to t + 1, and M_t denotes the new measurement at time t.
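As a sketch, the update in (7) is a single conditional; the function below assumes H and M are scalar measurements in consistent units (here, the direction angle).

def approx_median_update(h, m):
    # Move H one unit toward the new measurement M, as in (7).
    return h + 1 if m > h else h - 1 if m < h else h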
- 2. Particle Weighting
Particles are observed and weighted depending on the observations. In a conventional particle filter, particles are evenly distributed around the target. Unlike a conventional filter, the proposed filter does not use all of its particles for tracking. The angle between the target vehicle and each particle is calculated; the aim is to identify the angular distance of each particle to the direction of motion of the vehicle. The angular distance is used to weight the particles and renew their distribution around the vehicle. This calculation can be performed using the distributed particles of the current time step and the vehicle's estimated location in the previous time step, as illustrated in Fig. 2.
In Fig. 2, the red circles are the candidate particles at time step t. At time step t − 1, the state parameters, including the center of the target vehicle (x_{t−1}, y_{t−1}), are already estimated. Each blue line represents the location of a particular particle with reference to the target's position in the previous frame. These angles are used to determine the weighting coefficients of the corresponding particles. The calculation is given by
(8) $\beta_{t-1}^{(n)} = \arctan\!\left(\frac{\Delta y}{\Delta x}\right) = \arctan\!\left(\frac{y_{t-1} - y_t^{(n)}}{x_{t-1} - x_t^{(n)}}\right), \quad n = 1, 2, \ldots, N.$
Fig. 2. Illustration of particle angle observation.
After this calculation, the difference between the direction of motion and each value of β_{t−1}^{(n)} is obtained by subtracting the results of (6) and (8):

(9) $\psi_{t-1}^{(n)} = \left| \theta_{t-1} - \beta_{t-1}^{(n)} \right|, \quad n = 1, 2, \ldots, N.$
In (9), ψ_{t−1}^{(n)} is the angle of the n-th particle with reference to the direction of motion of the target vehicle; it has a range of [0, π].
The next step is to filter the particles. This filtering is done by applying a threshold control to the results of (9). This control assigns new weights to the particles according to the angle differences obtained in (9).
(10) $\pi_t^{(n)} = \begin{cases} 0 & \text{if } \psi_{t-1}^{(n)} \geq T, \\ \pi_t^{(n)} \times \left(1 - \dfrac{\psi_{t-1}^{(n)}}{T}\right) & \text{if } \psi_{t-1}^{(n)} < T. \end{cases}$

According to (10), if a particle is outside the region defined by T, then its weight is multiplied by 0 and the particle is not used. In contrast, if the particle is inside the region, then it is re-weighted depending on its angle difference with the target vehicle, obtained in (9). The advantage of this is that the number of particles used for tracking is reduced. In (10), T is a threshold value having the same unit as the motion and particle angles. The choice of T affects the performance of the method; this relation is shown in later sections. The expression reveals that as the angle difference between the target vehicle and a particle decreases, the weight of that particle increases; if the difference is equal to 0, then the weighting factor is equal to 1. After the decision on each particle is completed, the re-weighted values are normalized. In this way, the likelihood probabilities of those particles that are moving in the same or a similar direction to the target are increased.
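Putting (8)–(10) together, a vectorized sketch of the directional re-weighting might look as follows. The atan2-based angles, the wrap of ψ into [0, π], and the assumption that at least one particle survives the threshold are ours, not the paper's.

import numpy as np

def reweight_by_direction(weights, particles_xy, prev_center, theta, T=np.pi / 4):
    # (8): angle of each particle with reference to the previous target center.
    beta = np.arctan2(prev_center[1] - particles_xy[:, 1],
                      prev_center[0] - particles_xy[:, 0])
    # (9): absolute angular difference to the motion direction, wrapped into [0, pi].
    psi = np.abs(theta - beta)
    psi = np.minimum(psi, 2.0 * np.pi - psi)
    # (10): zero weight at or beyond threshold T, linear taper 1 - psi/T inside it.
    new_w = np.where(psi < T, weights * (1.0 - psi / T), 0.0)
    return new_w / new_w.sum()  # renormalize over the surviving particles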
Figure 3 shows the distribution of particles around the target vehicle and how the proposed method filters them. The angle threshold (blue dotted line) is shown next to the motion direction (solid red line). Green ellipses are eliminated; black ellipses increase their probabilities for estimation.
Fig. 3. Illustration of particle distribution and threshold.
Figure 4 shows that, in a conventional particle filter, the particles initially have a uniform distribution that is independent of angle information. The x-axis range is [0, 180°] because the result of (9) is always non-negative. In the proposed method, the maximum difference can be 180°, which means that the target and a particle are moving in opposite directions.
Fig. 4. Initial PDF of particles without angle observation.
After running for a period of time, particles begin to have different probabilities. Figure 5 shows that there is no deterministic relation between the probabilities and angle differences. This is because in a conventional algorithm the probability distribution of the likelihood model does not depend on angle differences.
Fig. 5. PDF of particles without angle observation.
Those particles to the left of the red line are the desired particles; they are weighted by the weighting function (10), whereas those to the right of the red line are eliminated. The total probability is therefore calculated over only the desired particles, yielding a probability distribution function similar to that shown in Fig. 7. The filter given in (10) is linear in the range [0, T], as shown in Fig. 6, where T is 45°.
Fig. 6. Weighting function of proposed particle filter.
After the filtering, weighting, and normalization processes, the new PDF of the remaining particles is as shown in Fig. 7, where the actual effect of the proposed method can be seen. The probabilities in Fig. 5 ranged from approximately 0.004 to 0.007 for 100 particles. In the proposed method, the new probability values of the desired particles are increased, the highest among them being 0.045.
Fig. 7. PDF of particles after filtering, weighting, and normalization processes.
IV. Experimental Results and Considerations
Experiments were conducted on three different videos to show the contribution of our method relative to the existing particle filter method. All videos contain frames of size 480 × 720 pixels. Each video shows a different highway in the Rep. of Korea and has different medium properties, such as view angle, illumination, and occlusion. One of the videos shows a bridge section of a highway; it differs from the others in terms of the stability of the camera, which vibrates at certain times in the video. Videos containing partially or fully occluded scenes were chosen with higher priority to demonstrate the performance of our method under occlusion. The findings of the proposed method are compared with the output of a condensation algorithm.
In the experiments, a particle filter was used with a single feature. Only a color histogram in the HSV domain was used for target recognition. Three different angle thresholds were used for tests to indicate the effect of the proposed method on particle number and tracking accuracy.
Figure 8 shows image stills taken from a video recording of a section of highway. The video shows a truck that is often occluded by trees. The quality of the video is poor, and the illumination is low due to the recording time. The frames in column (a) of Fig. 8 show the start of tracking, and the frames in column (c) show the end of tracking. Both the conventional method and all variations of the proposed method are able to track the target until it is occluded by trees. At this point, the conventional method loses the target, but the proposed method keeps tracking even though the target's motion is partially occluded.
Figure 9 illustrates the paths of the tracking shown in Fig. 8. The conventional method (green plot) loses the target, and its tracker shifts to a different location. The fluctuations within the plots represent small shifts of the target. At the upper end point of the ground-truth line, the target leaves the frame; the motion of the trackers beyond this point is not of concern.
Fig. 8. Tracking captures of proposed particle filter (from top to bottom) with T values of π/2, π/4, and π/8, and conventional particle filter.
Fig. 9. Comparison of tracking paths between proposed and conventional particle filters.
The Euclidean distance between the ground truth and each tracker path [17] is illustrated in Fig. 10 . It is calculated in terms of a distance between two vectors. In our case, the distances between the ground truth and each proposed method were calculated separately. The Euclidean distance in the case of the conventional method shows a dramatic increase when the tracker loses the target due to occlusion.
Fig. 10. Euclidean distance comparison of proposed and conventional particle filters.
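For reference, the per-frame error plotted here reduces to a one-line computation when the ground-truth and tracker paths are stored as arrays of per-frame (x, y) centers, which is our assumed representation.

import numpy as np

def per_frame_error(ground_truth_xy, tracker_xy):
    # Euclidean distance between corresponding (x, y) centers in each frame.
    return np.linalg.norm(ground_truth_xy - tracker_xy, axis=1)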
In Fig. 11, the target is occluded by the poles of sign posts. These poles cause the conventional method to lose the target, whereas all variations of the proposed method continue tracking the target until it disappears from view.
Fig. 11. Tracking captures of proposed particle filter (from top to bottom) with T values of π/2, π/4, and π/8, and conventional particle filter.
In Fig. 12, the proposed particle filter with T = π/8 can track the target to the end, with some fluctuations during tracking; this is because the target's apparent speed in the camera view is high and the number of particles used is very low.
Fig. 12. Comparison of tracking paths between proposed and conventional particle filters.
Figure 13 shows that the three different versions of the proposed method outperform the conventional method. The Euclidean distance of the conventional method increases when the size of the target decreases or when the target is occluded by poles.
Fig. 13. Euclidean distance comparison of proposed and conventional particle filters.
Figure 14 shows the captures of the third test, which is a more challenging medium than the previous cases. This test was conducted on a low-resolution video (of a bridge) that contains vibration and occlusion. The target in this video is a white SUV in the far-right lane. A car of similar color and shape enters the frame a little way into the video, as seen in Fig. 14(c). There are also other vehicles travelling at different angles to the motion of the target vehicle and at other locations on the bridge. The three versions of the proposed particle filter continue tracking even when the target car is occluded by poles, whereas Fig. 14(c) shows that the conventional method loses the target at such a point.
Fig. 14. Tracking captures of proposed particle filter (from top to bottom) with T values of π/2, π/4, and π/8, and conventional particle filter.
The tracking paths of the four trackers are illustrated in Fig. 15. The longest tracking duration belongs to π/2, because it retains the highest number of particles among the three versions of the proposed method. The motion of the tracker after the end of the ground truth is negligible.
Fig. 15. Comparison of tracking paths between proposed and conventional particle filters.
The trackers with π/4 and π/8 thresholds also stop tracking after the occlusion point. This is due to the low quality of the video and vibration of the camera on the bridge.
Figure 16 shows the Euclidean distances calculated for all methods with regard to the third video. All versions of the proposed particle filter outperform the conventional method.
Fig. 16. Euclidean distance comparison of proposed and conventional particle filters.
The proposed particle filter tracks with fewer particles. This means that the proposed filter has a lower computational load compared to that of the conventional method. The initial number of particles for all methods used in this study is 100. This is a commonly used value for particle-filter tracking systems. Table 1 shows the number of particles used in each of the three videos. When we consider the average number of particles used in the proposed particle filter, in the case of π/2, it varies from 77 to 85; in the case of π/4, 43 to 52; and in the case of π/8, 26 to 37. This means that in the case of π/2 the number of particles decreased by 20%; for π/4, by 50%; and for π/8, by 70%. As the threshold range narrows, the number of particles decreases.
Table 1. Number of particles used in tests.

Alg. type    1st video          2nd video          3rd video
             Min.     Avg.      Min.     Avg.      Min.     Avg.
Conv.        100      100       100      100       100      100
π/2          34       85        34       84        31       77
π/4          10       52        14       47        11       43
π/8          3        26        7        35        6        37

(Min. = minimum number of particles; Avg. = average number of particles.)
For π/2 and π/4, the proposed method enables the particle filter to continue tracking even with a very small number of particles: it can track a target using a minimum of 34 and 10 particles, respectively.
In the case of π/8, there is a risk to accuracy, since this threshold keeps the region between the direction of motion of the target and the threshold very narrow. Therefore, in some cases, only a few particles fall inside this region.
V. Conclusion
This study proposed a new approach for a particle filter in the field of object tracking. It presented a novel vehicle tracking method that redistributes and filters particles based on their angular differences to the direction of the target. Characteristics of the target's motion are used in addition to scalar features. Particles falling outside a predefined angle threshold are eliminated, whereas the others are weighted; this weighting increases the probabilities of those particles that are moving in a direction similar to that of the target. Experiments were conducted on three videos with different characteristics. The results showed that the proposed method improved the accuracy and duration of tracking under occlusion and vibration. The computational load was decreased as a result of the reduced consumption of particles. Furthermore, particle distributions became more efficient, which made our model more stable and robust against noise.
As a further study, the same method will be tested with different weighting functions. An automatic decision process should be implemented for the angular threshold instead of using a manually given value.
This work was supported by the Industrial Strategic Technology Development Program of the Ministry of Science, ICT and Future Planning, Rep. of Korea (10045260, Development of Context Awareness Monitoring and Search System Based on High Definition Multi-video), and Brain Busan, Rep. of Korea.
BIO
Corresponding Author mustafaeren.yildirim@eng.bahcesehir.edu.tr
Mustafa Eren Yildirım received his BS degree in electrical engineering from Bahcesehir University, Istanbul, Turkey, in 2008 and his MS and PhD degrees in electronics engineering from the Graduate School of Electrical and Electronic Engineering, Kyungsung University, Pusan, Rep. of Korea, in 2010 and 2014, respectively. He worked as a researcher and lecturer for Kyungsung University, until August 2015. His research interests include image processing, computer vision, recognition, and object tracking.
furkan.ince@gediz.edu.tr
Ibrahim Furkan Ince received his PhD degree in IT convergence design from the Graduate School of Digital Design, Kyungsung University, Pusan, Rep. of Korea, in 2010. For post-doctoral studies, he participated in research activities at the University of Tokyo, Japan, from 2010 to 2012. He worked as a chief research engineer at Hanwul Multimedia Communication Co. Ltd., Pusan, Rep. of Korea, from May 2012 to May 2014. Currently, he is working as an assistant professor of computer engineering at Gediz University, Izmir, Turkey. His research interests include image processing, computer vision, pattern recognition, and human–computer interaction.
batu.salman@eng.bahcesehir.edu.tr
Yucel Batu Salman received his BS and MS degrees in computer engineering from Bahcesehir University, Istanbul, Turkey, in 2003 and 2005, respectively. He received his PhD degree in IT convergence design from Kyungsung University, Pusan, Rep. of Korea, in 2010. Since 2010, he has been with the Department of Software Engineering, Bahcesehir University, Istanbul, Turkey, where he is an assistant professor. He is currently vice director of the Graduate School of Natural and Applied Sciences, Bahcesehir University. His research interests include human–computer interaction, mobile programming, and computer vision.
jsong@ks.ac.kr
Jong Kwan Song received his BS degree in electronics engineering from Pusan National University, Rep. of Korea, in 1989 and his MS and PhD degrees in electronics engineering from the Korea Advanced Institute of Science and Technology, Daejeon, Rep. of Korea, in 1991 and 1995, respectively. From 1995 to 1997, he was a researcher at Korea Mobile Telecom, Daejeon, Rep. of Korea. Since 1997, he has been a professor with the Department of Electronics Engineering, Kyungsung University, Pusan, Rep. of Korea. His research interests include video/image processing, nonlinear digital signal processing, and embedded systems.
jsipark@ks.ac.kr
Jang Sik Park received his BS, MS, and PhD degrees in electronic engineering from Pusan National University, Rep. of Korea, in 1992, 1994, and 1999, respectively. From 1997 to 2011, he was a professor at Dongeui Institute of Technology, Pusan, Rep. of Korea. Since 2011, he has been a professor with the Department of Electronics Engineering, Kyungsung University, Pusan, Rep. of Korea. His research interests include video/image processing and understanding; speech and audio signal processing; and embedded systems.
bwyoon@ks.ac.kr
Byung Woo Yoon received his BS, MS, and PhD degrees in electronic engineering from Pusan National University, Rep. of Korea, in 1987, 1989, and 1992, respectively. From 1993 to 1995, he was a senior researcher at ETRI, where he was involved in the development of the CDMA mobile communication system. He was a visiting scholar at the Department of Electrical Engineering, University of Colorado, Boulder, USA, from 2001 to 2002, and at the Department of Electrical Engineering, University of North Carolina, Chapel Hill, USA, from 2008 to 2009. Since 1995, he has been a professor with the Department of Electronics Engineering, Kyungsung University, Pusan, Rep. of Korea. His research interests include signal processing, design of VLSI, and development of digital systems.
References
Szpak Z.L. , Tapamo J.R. 2011 “Maritime Surveillance: Tracking Ships Inside a Dynamic Background Using a Fast Level-Set,” Expert Syst. Appl. 38 (6) 6669 - 6680    DOI : 10.1016/j.eswa.2010.11.068
Das Sharma K. , Chatterjee A. , Rakshit A. 2012 “A PSO-Lyapunov Hybrid Stable Adaptive Fuzzy Tracking Control Approach for Vision-Based Robot Navigation,” IEEE Trans. Instrum. Meas. 61 (7) 1908 - 1914    DOI : 10.1109/TIM.2012.2182868
Prisacariu V.A. , Reid I. 2012 “3D Hand Tracking for Human Computer Interaction,” Image Vis. Comput. 30 (3) 236 - 250    DOI : 10.1016/j.imavis.2012.01.003
Fei X. , Hashimoto K. “An Object-Tracking Algorithm Based on Particle Filtering with Region-Based Level Set Method,” Proc. Int. Conf. Intell. Robots Syst. Taipei, Taiwan Oct. 18–22, 2010 2908 - 2913
Walia G.S. , Kapoor R. 2014 “Human Detection in Video and Images - a State-of-the-Art Survey,” Int. J. Pattern Recog. Artif. Intell. 28 (3) 1 - 25
Dong C.L. , Dong Y.N. 2009 “Survey on Video Based Vehicle Detection and Tracking Algorithms,” J. Nanjing University Posts Telecommun. Natural Sci. 29 (2) 88 - 94
Hue C. , Le Cadre J.P. , Perez P. 2002 “Tracking Multiple Objects with Particle Filtering,” IEEE Trans. Aerosp. Electron. Syst. 38 (3) 791 - 812    DOI : 10.1109/TAES.2002.1039400
Vihola M. 2007 “Rao-Blackwellised Particle Filtering in Random Set Multitarget Tracking,” IEEE Trans. Aerosp. Electron. Syst. 43 (2) 689 - 705    DOI : 10.1109/TAES.2007.4285362
Bruno M.G.S. , Pavlov A. 2005 “Improved Sequential Monte Carlo Filtering for Ballistic Target Tracking,” IEEE Trans. Aerosp. Electron. Systems 41 (3) 1103 - 1108    DOI : 10.1109/TAES.2005.1541456
Comaniciu D. , Ramesh V. , Meer P. 2003 “Kernel-Based Object Tracking,” IEEE Trans. Pattern Anal. Mach. Intell. 25 (5) 564 - 577    DOI : 10.1109/TPAMI.2003.1195991
Zhou S.K. , Chellappa R. , Moghaddam B. 2004 “Visual Tracking and Recognition Using Appearance-Adaptive Models in Particle Filters,” IEEE Trans. Image Process. 13 (11) 1491 - 1506    DOI : 10.1109/TIP.2004.836152
Gao M. 2012 “Object Tracking Based on Harmony Search: Comparative Study,” J. Electron Imag. 21 (4) 1 - 13
Gao M.L. 2013 “Object Tracking Using Firefly Algorithm,” IET Comput. Vis. 7 (4) 227 - 237    DOI : 10.1049/iet-cvi.2012.0207
Zhang S. 2013 “Robust Visual Tracking Based on Online Learning Sparse Representation,” Neurocomput. 100 31 - 40    DOI : 10.1016/j.neucom.2011.11.031
Han H. 2011 “An Evolutionary Particle Filter with the Immune Genetic Algorithm for Intelligent Video Tracking,” Comput. Math. Appl. 62 (7) 2685 - 2695    DOI : 10.1016/j.camwa.2011.06.050
Isard M. , Blake A. 1998 “Condensation – Conditional Density Propagation for Visual Tracking,” Int. J. Comput. Vis. 29 (1) 5 - 28    DOI : 10.1023/A:1008078328650
Fan Z. , Ji H. , Zhang Y. 2015 “Iterative Particle Filter for Visual Tracking,” Signal Process.: Image Commun. 36 140 - 153    DOI : 10.1016/j.image.2015.07.001
Liu B. “Robust and Fast Collaborative Tracking with Two Stage Sparse Optimization,” European Conf. Comput. Vis. Heraklion, Greece Sept. 5–10, 2010 624 - 637
Bimbo D. , Dini F. 2011 “Particle Filter-Based Visual Tracking with a First Order Dynamic Model and Uncertainty Adaptation,” J. Comput. Vis. Image Understanding 115 (6) 771 - 786    DOI : 10.1016/j.cviu.2011.01.004
Dou J.F. , Li J.X. 2014 “Robust Visual Tracking Based on Adaptively Multi-feature Fusion and Particle Filter,” Optik – Int. J. Light Electron Optics 125 (5) 1680 - 1686    DOI : 10.1016/j.ijleo.2013.10.007
Gao M.L. 2015 “Firefly Algorithm (FA) Based Particle Filter Method for Visual Tracking,” Optik – Int. J. Light Electron Optics 126 (18) 1705 - 1711    DOI : 10.1016/j.ijleo.2015.05.028
Liu H. , Sun F. 2012 “Efficient Visual Tracking Using Particle Filter with Incremental Likelihood Calculation,” Inf. Sci. 195 141 - 153    DOI : 10.1016/j.ins.2012.01.033
El-Halym H.A.A. , Mahmoud I.I. , Habib S.E.D. 2012 “Proposed Hardware Architectures of Particle Filter for Object Tracking,” EURASIP J. Adv. Signal Process. 1 (17) 1 - 19
Qi W. 2011 “A Robust Approach for Multiple Vehicles Tracking Using Layered Particle Filter,” AEU – Int. J. Electron. Commun. 65 (7) 609 - 618    DOI : 10.1016/j.aeue.2010.06.006
Cho J.U. “Multiple Objects Tracking Circuit Using Particle Filters with Multiple Features,” IEEE Int. Conf. Robotics Autom. Roma, Italy Apr. 10–14, 2007 4639 - 4644
Cuevas E.J. , Zaldivar D.N. , Rojas R. 2007 “Particle Filter in Vision Tracking,” e-Gnosis 5 1 - 11
McFarlane N.J.B. , Schofield C.P. 1995 “Segmentation and Tracking of Piglets,” Mach. Vis. Appl. 8 (3) 187 - 193    DOI : 10.1007/BF01215814