

  • Research Article
  • Open Access

Robust Abandoned Object Detection Using Dual Foregrounds

EURASIP Journal on Advances in Signal Processing 2007, 2008:197875

  • Received: 25 January 2007
  • Accepted: 28 August 2007
  • Published:


As an alternative to tracking-based approaches, which depend heavily on accurate detection of moving objects and often fail in crowded scenarios, we present a pixelwise method that employs dual foregrounds to extract temporally static image regions. Depending on the application, these regions indicate objects that do not belong to the original background but were brought into the scene at a later time, such as abandoned and removed items or illegally parked vehicles. We construct separate long- and short-term backgrounds, each implemented as a pixelwise multivariate Gaussian model. Background parameters are adapted online using a Bayesian update mechanism imposed at different learning rates. By comparing each frame with these models, we estimate two foregrounds. We infer an evidence score at each pixel by applying a set of hypotheses to the foreground responses, and then aggregate the evidence over time to provide temporal consistency. Unlike optical flow-based approaches that smear boundaries, our method can accurately segment out objects even if they are fully occluded. It does not require on-site training to compensate for particular imaging conditions. While having a low computational load, it readily lends itself to parallelization if further speed improvement is necessary.
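The dual-foreground idea in the abstract can be sketched in a few lines. The following is a simplified illustration, not the paper's implementation: each background is reduced to a per-pixel running mean with a fixed variance (the paper uses full multivariate Gaussian models with Bayesian updates), and all parameter values are illustrative. A pixel accumulates evidence of being a temporally static object when it disagrees with the slowly adapting long-term background but agrees with the quickly adapting short-term one, i.e. a new object that has stopped moving.

```python
import numpy as np

def dual_foreground_evidence(frames, rho_long=0.01, rho_short=0.2,
                             var0=25.0, k=2.5, max_evidence=30):
    """Accumulate per-pixel evidence that a pixel is temporally static.

    Simplified sketch of the dual-foreground scheme: two running-mean
    backgrounds with fixed variance var0, updated at different learning
    rates (rho_long << rho_short). Parameter values are illustrative.
    """
    frames = np.asarray(frames, dtype=float)
    mean_long = frames[0].copy()               # slowly adapting background
    mean_short = frames[0].copy()              # quickly adapting background
    evidence = np.zeros_like(mean_long)
    thresh = (k ** 2) * var0                   # squared-distance foreground test
    for frame in frames[1:]:
        fg_long = (frame - mean_long) ** 2 > thresh    # long-term foreground
        fg_short = (frame - mean_short) ** 2 > thresh  # short-term foreground
        static = fg_long & ~fg_short           # hypothesis: object has stopped
        evidence = np.where(static,
                            np.minimum(evidence + 1, max_evidence),
                            np.maximum(evidence - 1, 0))
        mean_long += rho_long * (frame - mean_long)
        mean_short += rho_short * (frame - mean_short)
    return evidence

# Synthetic sequence: empty scene, then a small item dropped and left behind.
bg = np.zeros((8, 8))
obj = bg.copy()
obj[2:4, 2:4] = 100.0
frames = [bg] * 10 + [obj] * 50
ev = dual_foreground_evidence(frames)
# Evidence grows only where the static object sits; the long-term model
# still flags it as foreground while the short-term model has absorbed it.
```

The short-term background absorbs the stopped object within a few frames, while the long-term background (at a 20x smaller learning rate) keeps flagging it, so the evidence map isolates exactly the abandoned region; the temporal accumulation (with the `max_evidence` cap) provides the consistency described above.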


  • Learning Rate
  • Object Detection
  • Gaussian Model
  • Image Region
  • Accurate Detection

Publisher note

To access the full article, please see PDF.

Authors’ Affiliations

Mitsubishi Electric Research Labs (MERL), 201 Broadway, Cambridge, MA 02139, USA
Mitsubishi Electric Corp. Advanced Technology R&D Center, Amagasaki, Hyogo 661-8661, Japan


© Fatih Porikli et al. 2008

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.