- Research Article
- Open access
Object Tracking in Crowded Video Scenes Based on the Undecimated Wavelet Features and Texture Analysis
EURASIP Journal on Advances in Signal Processing volume 2008, Article number: 243534 (2007)
Abstract
We propose a new algorithm for object tracking in crowded video scenes that exploits the properties of the undecimated wavelet packet transform (UWPT) and interframe texture analysis. The algorithm is initialized by the user, who specifies a region around the object of interest in the reference frame. Coefficients of the UWPT of that region are then used to construct a feature vector (FV) for every pixel in the region. The best match is found by searching over the generated FVs inside an adaptive search window. Adaptation of the search window is achieved through interframe texture analysis, which estimates the direction and speed of the object's motion. This temporal texture analysis also assists in tracking the object under partial or short-term full occlusion. Moreover, the tracking algorithm is robust to Gaussian and quantization noise. Experimental results show that the proposed algorithm performs well for object tracking in crowded scenes on stairs, in airports, and at train stations in the presence of object translation, rotation, small scaling, and occlusion.
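The sketch below illustrates the two core steps the abstract describes: building a per-pixel feature vector from undecimated wavelet coefficients and finding the best match inside a search window. It is a minimal illustration, not the authors' implementation: it assumes PyWavelets' 2-D stationary wavelet transform (`pywt.swt2`) as a stand-in for the full UWPT packet decomposition, a Haar wavelet, an L2 matching criterion, and a fixed (non-adaptive) window; all function names are hypothetical.

```python
# Minimal sketch, assuming pywt.swt2 as a stand-in for the paper's UWPT.
import numpy as np
import pywt

def pixel_feature_vectors(gray, wavelet="haar", level=2):
    """Stack undecimated wavelet coefficients into one feature vector per pixel.

    Because the transform is undecimated, every subband has the same shape
    as the input frame, so coefficients align pixel-for-pixel.
    """
    h, w = gray.shape
    # swt2 requires dimensions divisible by 2**level; crop if needed.
    h -= h % (1 << level)
    w -= w % (1 << level)
    coeffs = pywt.swt2(gray[:h, :w].astype(float), wavelet, level=level)
    bands = []
    for cA, (cH, cV, cD) in coeffs:
        bands.extend([cA, cH, cV, cD])
    return np.stack(bands, axis=-1)  # shape (h, w, 4 * level)

def best_match(ref_fv, cur_fv, ref_xy, search=8):
    """Return the pixel in the current frame whose feature vector is closest
    (in L2 distance) to the reference pixel's, within a square window."""
    y0, x0 = ref_xy
    target = ref_fv[y0, x0]
    best, best_d = ref_xy, np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if 0 <= y < cur_fv.shape[0] and 0 <= x < cur_fv.shape[1]:
                d = np.linalg.norm(cur_fv[y, x] - target)
                if d < best_d:
                    best, best_d = (y, x), d
    return best
```

In the paper, the size and position of the search window would additionally be adapted per frame from the interframe texture analysis, rather than kept fixed as in this sketch.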
Rights and permissions
Open Access. This article is distributed under the terms of the Creative Commons Attribution 2.0 Generic License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
About this article
Cite this article
Khansari, M., Rabiee, H.R., Asadi, M. et al. Object Tracking in Crowded Video Scenes Based on the Undecimated Wavelet Features and Texture Analysis. EURASIP J. Adv. Signal Process. 2008, 243534 (2007). https://doi.org/10.1155/2008/243534