People counting in low density video sequences

  • Authors:
  • J. D. Valle, Jr.; L. E. S. Oliveira; A. L. Koerich; A. S. Britto, Jr.

  • Affiliations:
  • Pontifical Catholic University of Parana, Curitiba, PR, Brazil (all authors)

  • Venue:
  • PSIVT'07: Proceedings of the 2nd Pacific Rim Conference on Advances in Image and Video Technology
  • Year:
  • 2007

Abstract

This paper presents a novel approach for automatic people counting in videos captured by a conventional closed-circuit television (CCTV) camera using computer vision techniques. The proposed approach detects and tracks moving objects in video scenes and counts them when they enter a virtual counting zone defined in the scene. One of the main problems with conventional CCTV cameras is that they are usually not placed in a position convenient for counting, which can cause frequent occlusions between persons walking very close together or in groups. To tackle this problem, two strategies are investigated. The first is based on two thresholds, related to the average width and to the average area of a blob's top zone, which represents a person's head. By matching the width and the head-region area of a current blob against these thresholds, it is possible to estimate whether the blob encloses one, two, or three persons. The second strategy is based on a zoning scheme and extracts low-level features from the top region of the blob, which is also related to a person's head. These feature vectors are used together with an instance-based classifier to estimate the number of persons enclosed by the blob. Experimental results on videos from two different databases show that the proposed approach counts the number of persons passing through the counting zone with accuracy higher than 85%.
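
The abstract only outlines the two counting strategies at a high level; the sketch below (Python with NumPy) illustrates one way they might be realized. The threshold values, the 4x4 zoning grid, the choice of k = 3 neighbours, and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# --- Strategy 1: threshold-based estimate -------------------------------
# The blob's width and head-zone area are compared against thresholds
# derived from the average single-person width and head area. The values
# and the max-of-ratios decision rule below are assumptions.
AVG_PERSON_WIDTH = 40.0  # pixels, assumed calibration value
AVG_HEAD_AREA = 300.0    # pixels^2, assumed calibration value

def estimate_count_thresholds(width: float, head_area: float) -> int:
    """Estimate whether a blob encloses one, two, or three persons."""
    ratio = max(width / AVG_PERSON_WIDTH, head_area / AVG_HEAD_AREA)
    return min(max(round(ratio), 1), 3)

# --- Strategy 2: zoning features + instance-based classifier ------------
def zoning_features(head_zone: np.ndarray, grid=(4, 4)) -> np.ndarray:
    """Split the binary head-zone mask into a grid of cells and use each
    cell's foreground density as a low-level feature (assumed zoning form)."""
    h, w = head_zone.shape
    gh, gw = grid
    return np.array([
        head_zone[i * h // gh:(i + 1) * h // gh,
                  j * w // gw:(j + 1) * w // gw].mean()
        for i in range(gh) for j in range(gw)
    ])

def estimate_count_knn(query: np.ndarray,
                       train_feats: np.ndarray,
                       train_labels: np.ndarray,
                       k: int = 3) -> int:
    """Instance-based (k-nearest-neighbour) vote on the person count."""
    dists = np.linalg.norm(train_feats - query, axis=1)
    votes = train_labels[np.argsort(dists)[:k]]
    return int(np.bincount(votes).argmax())

if __name__ == "__main__":
    # A blob roughly twice the single-person width with a double-sized
    # head zone should be counted as two persons by the first strategy.
    print(estimate_count_thresholds(width=85.0, head_area=610.0))  # -> 2
```

Either estimate would then be accumulated per tracked blob as it crosses the virtual counting zone; the tracking and blob-segmentation stages are outside the scope of this sketch.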