$$\pi_t(n) = \begin{cases} 0 & \text{if } \psi_{t-1}(n) \geq T \\[4pt] \pi_t(n) \times \left(1 - \dfrac{\psi_{t-1}(n)}{T}\right) & \text{if } \psi_{t-1}(n) < T \end{cases} \tag{10}$$
According to (10), if a particle is outside the region bounded by the threshold T, it is multiplied by 0 and is not used. In contrast, if the particle is inside the region, it is re-weighted depending on its angle difference with the target vehicle, which is obtained in (9). The advantage of this is that the number of particles used for tracking is reduced. In (10), T is a threshold value having the same unit as that used for the motion and particle angles. The choice of T affects the performance of the method; this relation is shown in further sections. The above expression reveals that as the angle difference between the target vehicle and a particle decreases, the weight of that particle increases. If the difference is equal to 0, then the weighting factor is equal to 1. After the decision on each particle is completed, the re-weighted values are normalized. In this way, the likelihood probabilities of those particles that are moving in the same or a similar direction to the target are increased.
Figure 3 shows the distribution of particles around the target vehicle and how the proposed method filters them. The angle threshold (blue dotted line) is shown next to the motion direction (solid red line). Green ellipses are to be eliminated; black ellipses have their probabilities increased for estimation.
Illustration of particle distribution and threshold.
Figure 4 shows that, in a conventional particle filter, the particles initially have a uniform distribution that is independent of angle information. The x-axis range is [0°, 180°] because the result of (9) is always non-negative. According to the proposed method, the maximum difference can be 180°, which means that the target and a particle are moving in opposite directions.
Initial PDF of particles without angle observation.
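Since (9) always yields a non-negative difference in [0°, 180°], the comparison of two motion directions can be sketched with a wrapped absolute difference. The helper below is an illustrative assumption, not the paper's implementation:

```python
def angle_diff(a, b):
    """Absolute difference between two directions (degrees),
    wrapped into the range [0, 180]."""
    d = abs(a - b) % 360.0
    return 360.0 - d if d > 180.0 else d

print(angle_diff(10.0, 350.0))  # directions only 20 degrees apart
print(angle_diff(0.0, 180.0))   # opposite directions: the maximum, 180
```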
After running for a period of time, particles begin to have different probabilities.
Figure 5 shows that there is no deterministic relation between the probabilities and the angle differences. This is because, in a conventional algorithm, the probability distribution of the likelihood model does not depend on angle differences.
PDF of particles without angle observation.
Those particles that lie to the left of the red line are the so-called desired particles; these particles are weighted by the weighting function (10), whereas those to the right of the red line are eliminated. Thus, the total probability is calculated based on only the desired particles and has a probability distribution function similar to that shown in Fig. 7. The filter given in (10) is linear in the range [0, T], as shown in Fig. 6, where T is 45°.
Weighting function of proposed particle filter.
After the filtering, weighting, and normalization processes, the new PDF of the remaining particles is as shown in Fig. 7, where the actual effect of the proposed method can be seen. The probabilities in Fig. 5 were approximately 0.004 to 0.007 for 100 particles. In the proposed method, the new probability values of the desired particles are increased, the highest among them being 0.045.
PDF of particles after filtering, weighting, and normalization processes.
IV. Experimental Results and Considerations
Experiments were conducted on three different videos to show the contribution of our method to the existing particle filter method. All videos contain frames of size 480 × 720 pixels. Each video is of a different highway in the Rep. of Korea and has different recording conditions, such as view angle, illumination, and occlusion. One of the videos is of a bridge section of a highway; it differs from the others in terms of the stability of the camera, which vibrates at certain points in the video. Videos with partially or fully occluded scenes were chosen with higher priority to demonstrate the better performance of our method. The findings of the proposed method are compared with the output of a condensation algorithm.
In the experiments, a particle filter was used with a single feature. Only a color histogram in the HSV domain was used for target recognition. Three different angle thresholds were used for tests to indicate the effect of the proposed method on particle number and tracking accuracy.
Figure 8 shows image stills taken from a video recording of a section of highway. The related video shows a truck that is often occluded by trees. The quality of the video is poor, and the illumination is low due to the recording time. The frames in column Fig. 8(a) show the first location of tracking, and the frames in column Fig. 8(c) show the end of tracking. The conventional method and all variations of the proposed method are able to track the target until it is occluded by trees. At this point, the conventional method loses the target, but the proposed method keeps on tracking even though the target's motion is partially occluded.
Figure 9 illustrates the paths of the tracking shown in Fig. 8. The conventional method (green plot) loses the target, and its tracker shifts to a different location. The fluctuations within the plots represent minor shifts of the tracker around the target. At the upper end point of the ground-truth line, the target leaves the frame; the motion of the trackers beyond this point is not of our concern.
Tracking captures of proposed particle filter (from top to bottom) with T-values of π/2, π/4, and π/8, and conventional particle filter.
Comparison of tracking paths between proposed and conventional particle filter.
The Euclidean distance between the ground truth and each tracker path [17] is illustrated in Fig. 10. It is calculated as a distance between two vectors; in our case, the distances between the ground truth and each variation of the proposed method were calculated separately. The Euclidean distance of the conventional method shows a dramatic increase when the tracker loses the target due to occlusion.
Euclidean distance comparison of proposed and conventional particle filter.
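This per-frame distance can be sketched as follows; the function name and the hypothetical path arrays are assumptions for illustration:

```python
import numpy as np

def path_distance(ground_truth, tracker):
    """Frame-by-frame Euclidean distance between two (N, 2) pixel paths."""
    gt = np.asarray(ground_truth, dtype=float)
    tr = np.asarray(tracker, dtype=float)
    return np.linalg.norm(gt - tr, axis=1)

# Hypothetical three-frame paths: the tracker drifts in the middle frame.
gt = [[0, 0], [10, 10], [20, 20]]
tracker = [[0, 0], [13, 14], [20, 20]]
print(path_distance(gt, tracker))  # distance peaks where the tracker drifts
```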
In Fig. 11, the target is occluded by the poles of sign posts. These poles cause the conventional method to lose the target, whereas all variations of the proposed method continue tracking the target until it disappears from view.
Tracking captures of proposed particle filter (from top to bottom) with T-values of π/2, π/4, and π/8, and conventional particle filter.
In Fig. 12, the proposed particle filter with T = π/8 can track the target to the end, with some fluctuations during tracking. The reason for this is that the speed of the target relative to the camera view was high and the number of particles used was very low.
Comparison of tracking paths between proposed and conventional particle filter.
Figure 13 shows that the three different versions of the proposed method outperform the conventional method. The Euclidean distance of the conventional method increases when the size of the target decreases or when the target is occluded by poles.
Euclidean distance comparison of proposed and conventional particle filter.
Figure 14 shows the captures of the third test, which takes place in a more challenging medium than those of the previous cases. This test was conducted on a low-resolution video (of a bridge) that contained vibration and occlusion. The target in this video is a white SUV in the far-right lane. A car of similar color and shape enters the frame a little way into the video in Fig. 14(c). There are also other vehicles travelling at different angles to that of the motion of the target vehicle and at other locations on the bridge. The three versions of the proposed particle filter continue tracking even when the target car is occluded by poles, whereas Fig. 14(c) shows that the conventional method loses the target at such a point.
Tracking captures of proposed particle filter (from top to bottom) with T-values of π/2, π/4, and π/8, and conventional particle filter.
The tracking paths of the four trackers are illustrated in Fig. 15. The longest tracking duration belongs to T = π/2, because it has the highest number of particles among the three versions of the proposed method. The motion of the trackers after the end of the ground truth is negligible.
Comparison of tracking paths between proposed and conventional particle filter.
The trackers with π/4 and π/8 thresholds also stop tracking after the occlusion point. This is due to the low quality of the video and vibration of the camera on the bridge.
Figure 16 shows the Euclidean distances calculated for all methods with regard to the third video. All versions of the proposed particle filter outperform the conventional method.
Euclidean distance comparison of proposed and conventional particle filter.
The proposed particle filter tracks with fewer particles. This means that the proposed filter has a lower computational load compared to that of the conventional method. The initial number of particles for all methods used in this study is 100. This is a commonly used value for particle-filter tracking systems.
Table 1 shows the number of particles used in each of the three videos. When we consider the average number of particles used by the proposed particle filter, in the case of π/2 it varies from 77 to 85; in the case of π/4, from 43 to 52; and in the case of π/8, from 26 to 37. This means that in the case of π/2 the number of particles decreased by roughly 20%; for π/4, by roughly 50%; and for π/8, by roughly 70%. As the threshold range narrows, the number of particles decreases.
Number of particles used in tests.
| Alg. type | 1st video: Min. | 1st video: Avg. | 2nd video: Min. | 2nd video: Avg. | 3rd video: Min. | 3rd video: Avg. |
|---|---|---|---|---|---|---|
| Conv. | 100 | 100 | 100 | 100 | 100 | 100 |
| π/2 | 34 | 85 | 34 | 84 | 31 | 77 |
| π/4 | 10 | 52 | 14 | 47 | 11 | 43 |
| π/8 | 3 | 26 | 7 | 35 | 6 | 37 |
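As a quick arithmetic check, the quoted reductions follow from the averages in Table 1; the dictionary below simply re-enters the table's per-video average counts (the initial count is 100 for all methods):

```python
# Average particle counts per video, from Table 1.
averages = {"pi/2": [85, 84, 77], "pi/4": [52, 47, 43], "pi/8": [26, 35, 37]}

for threshold, counts in averages.items():
    mean_avg = sum(counts) / len(counts)
    reduction = 100.0 - mean_avg  # versus the 100 initial particles
    print(f"{threshold}: mean average {mean_avg:.1f} particles, "
          f"~{reduction:.0f}% fewer than the conventional filter")
```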
For π/2 and π/4, the proposed method enables the particle filter algorithm to continue tracking even with a very small number of particles. It can track a target using a minimum of 34 and 10 particles, respectively.
In the case of π/8, a risk factor (in terms of accuracy) arises, since this threshold keeps the region of particles (between the direction of motion of the target and the threshold) very narrow. Therefore, in some cases, only a few particles fall inside this region.
V. Conclusion
This study proposed a new approach for a particle filter in the field of object tracking. It presented a novel vehicle tracking method that redistributes and filters particles based on their angular differences from the direction of the target. Characteristics of the target's motion are used in addition to scalar features. Particles falling outside a predefined angle threshold are eliminated, whereas the others are weighted. This weighting increases the probabilities of those particles that are moving in a direction similar to that of the target. Experiments were conducted on three videos with different characteristics. The results showed that the proposed method improved the accuracy and duration of tracking under occlusion and vibration. The computational load was decreased as a result of the reduced consumption of particles. Furthermore, the particle distributions became more efficient, which made our model more stable and robust against noise.
In future work, the same method will be tested with different weighting functions, and an automatic decision process should be implemented for the angular threshold instead of a manually given value.
This work was supported by the Industrial Strategic Technology Development Program of the Ministry of Science, ICT and Future Planning, Rep. of Korea (10045260, Development of Context Awareness Monitoring and Search System Based on High Definition Multi-video), and Brain Busan, Rep. of Korea.
BIO
Corresponding Author mustafaeren.yildirim@eng.bahcesehir.edu.tr
Mustafa Eren Yildirım received his BS degree in electrical engineering from Bahcesehir University, Istanbul, Turkey, in 2008 and his MS and PhD degrees in electronics engineering from the Graduate School of Electrical and Electronic Engineering, Kyungsung University, Pusan, Rep. of Korea, in 2010 and 2014, respectively. He worked as a researcher and lecturer at Kyungsung University until August 2015. His research interests include image processing, computer vision, recognition, and object tracking.
furkan.ince@gediz.edu.tr
Ibrahim Furkan Ince received his PhD degree in IT convergence design from the Graduate School of Digital Design, Kyungsung University, Pusan, Rep. of Korea, in 2010. For post-doctoral studies, he participated in research activities at the University of Tokyo, Japan, from 2010 to 2012. He worked as a chief research engineer at Hanwul Multimedia Communication Co. Ltd., Pusan, Rep. of Korea, from May 2012 to May 2014. Currently, he is working as an assistant professor of computer engineering at Gediz University, Izmir, Turkey. His research interests include image processing, computer vision, pattern recognition, and human–computer interaction.
batu.salman@eng.bahcesehir.edu.tr
Yucel Batu Salman received his BS and MS degrees in computer engineering from Bahcesehir University, Istanbul, Turkey, in 2003 and 2005, respectively. He received his PhD degree in IT convergence design from Kyungsung University, Pusan, Rep. of Korea, in 2010. Since 2010, he has been with the Department of Software Engineering, Bahcesehir University, Istanbul, Turkey, where he is an assistant professor. He is currently vice director of the Graduate School of Natural and Applied Sciences, Bahcesehir University. His research interests include human–computer interaction, mobile programming, and computer vision.
jsong@ks.ac.kr
Jong Kwan Song received his BS degree in electronics engineering from Pusan National University, Rep. of Korea, in 1989 and his MS and PhD degrees in electronics engineering from the Korea Advanced Institute of Science and Technology, Daejeon, Rep. of Korea, in 1991 and 1995, respectively. From 1995 to 1997, he was a researcher at Korea Mobile Telecom, Daejeon, Rep. of Korea. Since 1997, he has been a professor with the Department of Electronics Engineering, Kyungsung University, Pusan, Rep. of Korea. His research interests include video/image processing, nonlinear digital signal processing, and embedded systems.
jsipark@ks.ac.kr
Jang Sik Park received his BS, MS, and PhD degrees in electronic engineering from Pusan National University, Rep. of Korea, in 1992, 1994, and 1999, respectively. From 1997 to 2011, he was a professor at Dongeui Institute of Technology, Pusan, Rep. of Korea. Since 2011, he has been a professor with the Department of Electronics Engineering, Kyungsung University, Pusan, Rep. of Korea. His research interests include video/image processing and understanding; speech and audio signal processing; and embedded systems.
bwyoon@ks.ac.kr
Byung Woo Yoon received his BS, MS, and PhD degrees in electronic engineering from Pusan National University, Rep. of Korea, in 1987, 1989, and 1992, respectively. From 1993 to 1995, he was a senior researcher at ETRI, where he was involved in the development of the CDMA Mobile Communication System Group. He was a visiting scholar at the Department of Electrical Engineering, University of Colorado, Boulder, USA, from 2001 to 2002, and at the Department of Electrical Engineering, University of North Carolina, Chapel Hill, USA, from 2008 to 2009. Since 1995, he has been a professor with the Department of Electronics Engineering, Kyungsung University, Pusan, Rep. of Korea. His research interests include signal processing, design of VLSI, and development of digital systems.
References
[1] Z.L. Szpak and J.R. Tapamo, "Maritime Surveillance: Tracking Ships Inside a Dynamic Background Using a Fast Level-Set," Expert Syst. Appl., vol. 38, no. 6, 2011, pp. 6669–6680. DOI: 10.1016/j.eswa.2010.11.068
[2] K. Das Sharma, A. Chatterjee, and A. Rakshit, "A PSO-Lyapunov Hybrid Stable Adaptive Fuzzy Tracking Control Approach for Vision-Based Robot Navigation," IEEE Trans. Instrum. Meas., vol. 61, no. 7, 2012, pp. 1908–1914. DOI: 10.1109/TIM.2012.2182868
[3] X. Fei and K. Hashimoto, "An Object-Tracking Algorithm Based on Particle Filtering with Region-Based Level Set Method," Proc. Int. Conf. Intell. Robots Syst., Taipei, Taiwan, Oct. 18–22, 2010, pp. 2908–2913.
[4] G.S. Walia and R. Kapoor, "Human Detection in Video and Images – a State-of-the-Art Survey," Int. J. Pattern Recog. Artif. Intell., vol. 28, no. 3, 2014, pp. 1–25.
[5] C.L. Dong and Y.N. Dong, "Survey on Video Based Vehicle Detection and Tracking Algorithms," J. Nanjing University Posts Telecommun. Natural Sci., vol. 29, no. 2, 2009, pp. 88–94.
[6] C. Hue, J.P. Le Cadre, and P. Perez, "Tracking Multiple Objects with Particle Filtering," IEEE Trans. Aerosp. Electron. Syst., vol. 38, no. 3, 2002, pp. 791–812. DOI: 10.1109/TAES.2002.1039400
[7] M. Vihola, "Rao-Blackwellised Particle Filtering in Random Set Multitarget Tracking," IEEE Trans. Aerosp. Electron. Syst., vol. 43, no. 2, 2007, pp. 689–705. DOI: 10.1109/TAES.2007.4285362
[8] M.G.S. Bruno and A. Pavlov, "Improved Sequential Monte Carlo Filtering for Ballistic Target Tracking," IEEE Trans. Aerosp. Electron. Syst., vol. 41, no. 3, 2005, pp. 1103–1108. DOI: 10.1109/TAES.2005.1541456
[9] D. Comaniciu, V. Ramesh, and P. Meer, "Kernel-Based Object Tracking," IEEE Trans. Pattern Anal. Mach. Intell., vol. 25, no. 5, 2003, pp. 564–577. DOI: 10.1109/TPAMI.2003.1195991
[10] S.K. Zhou, R. Chellappa, and B. Moghaddam, "Visual Tracking and Recognition Using Appearance-Adaptive Models in Particle Filters," IEEE Trans. Image Process., vol. 13, no. 11, 2004, pp. 1491–1506. DOI: 10.1109/TIP.2004.836152
[11] M. Gao, "Object Tracking Based on Harmony Search: Comparative Study," J. Electron. Imag., vol. 21, no. 4, 2012, pp. 1–13.
[12] H. Han, "An Evolutionary Particle Filter with the Immune Genetic Algorithm for Intelligent Video Tracking," Comput. Math. Appl., vol. 62, no. 7, 2011, pp. 2685–2695. DOI: 10.1016/j.camwa.2011.06.050
[13] M. Isard and A. Blake, "Condensation – Conditional Density Propagation for Visual Tracking," Int. J. Comput. Vis., vol. 29, no. 1, 1998, pp. 5–28. DOI: 10.1023/A:1008078328650
[14] B. Liu, "Robust and Fast Collaborative Tracking with Two Stage Sparse Optimization," European Conf. Comput. Vis., Heraklion, Greece, Sept. 5–10, 2010, pp. 624–637.
[15] D. Bimbo and F. Dini, "Particle Filter-Based Visual Tracking with a First Order Dynamic Model and Uncertainty Adaptation," J. Comput. Vis. Image Understanding, vol. 115, no. 6, 2011, pp. 771–786. DOI: 10.1016/j.cviu.2011.01.004
[16] J.F. Dou and J.X. Li, "Robust Visual Tracking Based on Adaptively Multi-feature Fusion and Particle Filter," Optik – Int. J. Light Electron Optics, vol. 125, no. 5, 2014, pp. 1680–1686. DOI: 10.1016/j.ijleo.2013.10.007
[17] M.L. Gao, "Firefly Algorithm (FA) Based Particle Filter Method for Visual Tracking," Optik – Int. J. Light Electron Optics, vol. 126, no. 18, 2015, pp. 1705–1711. DOI: 10.1016/j.ijleo.2015.05.028
[18] H. Liu and F. Sun, "Efficient Visual Tracking Using Particle Filter with Incremental Likelihood Calculation," Inf. Sci., vol. 195, 2012, pp. 141–153. DOI: 10.1016/j.ins.2012.01.033
[19] H.A.A. El-Halym, I.I. Mahmoud, and S.E.D. Habib, "Proposed Hardware Architectures of Particle Filter for Object Tracking," EURASIP J. Adv. Signal Process., vol. 1, no. 17, 2012, pp. 1–19.
[20] W. Qi, "A Robust Approach for Multiple Vehicles Tracking Using Layered Particle Filter," AEU – Int. J. Electron. Commun., vol. 65, no. 7, 2011, pp. 609–618. DOI: 10.1016/j.aeue.2010.06.006
[21] J.U. Cho, "Multiple Objects Tracking Circuit Using Particle Filters with Multiple Features," IEEE Int. Conf. Robotics Autom., Roma, Italy, Apr. 10–14, 2007, pp. 4639–4644.
[22] E.J. Cuevas, D.N. Zaldivar, and R. Rojas, "Particle Filter in Vision Tracking," e-Gnosis, vol. 5, 2007, pp. 1–11.
[23] N.J.B. McFarlane and C.P. Schofield, "Segmentation and Tracking of Piglets," Mach. Vis. Appl., vol. 8, no. 3, 1995, pp. 187–193. DOI: 10.1007/BF01215814