
Object Tracking in Crowded Video Scenes Based on the Undecimated Wavelet Features and Texture Analysis

Abstract

We propose a new algorithm for object tracking in crowded video scenes that exploits the properties of the undecimated wavelet packet transform (UWPT) and interframe texture analysis. The algorithm is initialized by the user, who specifies a region around the object of interest in the reference frame. The UWPT coefficients of this region are then used to construct a feature vector (FV) for every pixel in the region. An optimal search for the best match is then performed with the generated FVs inside an adaptive search window. The search window is adapted through interframe texture analysis, which estimates the direction and speed of the object motion; this temporal texture analysis also helps track the object under partial or short-term full occlusion. Moreover, the tracking algorithm is robust to Gaussian and quantization noise. Experimental results show that the proposed algorithm performs well for object tracking in crowded scenes on stairs, in airports, or at train stations in the presence of object translation, rotation, small scaling, and occlusion.
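As a rough illustration of the matching step described in the abstract, the following Python sketch builds a per-pixel feature vector from undecimated wavelet coefficients and exhaustively searches a window in the next frame for the best-matching block. It is not the authors' implementation: PyWavelets' stationary wavelet transform (swt2) stands in for the UWPT, the texture-based adaptation of the search window is omitted, and all names and parameters (uwt_features, track_region, the db2 wavelet, the search radius) are illustrative assumptions.

```python
# Minimal sketch of undecimated-wavelet feature matching (not the paper's code).
# Assumptions: grayscale frames as 2-D NumPy arrays whose sides are multiples
# of 2**level, swt2 used in place of the undecimated wavelet packet transform.
import numpy as np
import pywt


def uwt_features(frame, wavelet="db2", level=2):
    """Stack undecimated wavelet subbands into one feature vector per pixel.

    Returns an array of shape (H, W, 4 * level).
    """
    coeffs = pywt.swt2(frame.astype(float), wavelet, level=level)
    bands = []
    for approx, (horiz, vert, diag) in coeffs:
        bands.extend([approx, horiz, vert, diag])
    return np.stack(bands, axis=-1)


def track_region(prev_frame, next_frame, top, left, h, w, search=16):
    """Locate the (top, left, h, w) region of prev_frame in next_frame.

    Exhaustive search over a (2*search + 1)^2 window using the sum of squared
    differences between feature vectors; returns the new (top, left).
    """
    f_prev = uwt_features(prev_frame)
    f_next = uwt_features(next_frame)
    template = f_prev[top:top + h, left:left + w]

    best_cost, best_pos = np.inf, (top, left)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > f_next.shape[0] or x + w > f_next.shape[1]:
                continue
            candidate = f_next[y:y + h, x:x + w]
            cost = np.sum((candidate - template) ** 2)
            if cost < best_cost:
                best_cost, best_pos = cost, (y, x)
    return best_pos
```

In use, a caller would convert two consecutive frames to grayscale arrays (with sides that are multiples of four for level=2) and pass the user-selected region of the reference frame to track_region; in the paper the search window itself is additionally resized and steered by the interframe texture analysis, which this sketch leaves out.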

Publisher note

To access the full article, please see PDF.

Author information

Corresponding author

Correspondence to H. R. Rabiee.

About this article

Cite this article

Khansari, M., Rabiee, H.R., Asadi, M. et al. Object Tracking in Crowded Video Scenes Based on the Undecimated Wavelet Features and Texture Analysis. EURASIP J. Adv. Signal Process. 2008, 243534 (2007). https://doi.org/10.1155/2008/243534

Keywords

  • Texture Analysis
  • Tracking Algorithm
  • Wavelet Packet
  • Object Tracking
  • Quantization Noise