The use of camera systems for surveillance tasks (e.g. traffic monitoring) has become standard practice and has been established for over 20 years. However, most cameras are operated locally and their data are analyzed manually. Locally means here a limited field of view and image sequences that are processed independently of other cameras. To enlarge the observation area and to avoid occlusions and inaccessible areas, multi-camera systems with overlapping and non-overlapping views are used. The joint processing of image sequences from a multi-camera system is a scientific and technical challenge. Processing is traditionally divided into camera calibration, object detection, tracking, and interpretation. The fusion of information from different cameras is carried out in the world coordinate system. To reduce the network load, a distributed processing concept can be implemented. Object detection and tracking are fundamental image processing tasks for scene evaluation. Situation assessments are based mainly on characteristic local movement patterns (e.g. direction and speed), from which trajectories are derived. Atypical movement patterns of each detected object can be recognized by comparing local properties of the trajectories, and interactions between different objects can be predicted with an additional classification algorithm. This presentation discusses trajectory-based recognition algorithms for atypical event detection in multi-object scenes, which yield area-based types of information (e.g. maps of speed patterns, trajectory curvatures, or erratic movements), and shows that two-dimensional areal analysis of moving objects with multiple cameras offers new possibilities for situation analysis.
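To illustrate the idea of comparing local trajectory properties, the following is a minimal sketch, not the presented system: it assumes trajectories are given as (x, y, t) samples in world coordinates, derives per-segment speed and turning angle (a simple proxy for curvature and erratic movement), and flags a trajectory as atypical when either exceeds a hand-chosen threshold. All function names and threshold values here are illustrative assumptions.

```python
import math

def trajectory_features(points):
    """Per-segment speed and heading for a trajectory.

    `points` is a list of (x, y, t) tuples in world coordinates.
    Returns (features, turns): features is a list of (speed, heading)
    per segment; turns is the absolute turning angle between
    consecutive segments (a cue for erratic movement).
    This helper is illustrative, not taken from the presentation.
    """
    feats = []
    for i in range(1, len(points)):
        x0, y0, t0 = points[i - 1]
        x1, y1, t1 = points[i]
        dist = math.hypot(x1 - x0, y1 - y0)
        speed = dist / (t1 - t0)
        heading = math.atan2(y1 - y0, x1 - x0)
        feats.append((speed, heading))
    turns = []
    for i in range(1, len(feats)):
        d = feats[i][1] - feats[i - 1][1]
        d = (d + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
        turns.append(abs(d))
    return feats, turns

def is_atypical(points, max_speed=15.0, max_turn=math.radians(120)):
    """Flag a trajectory whose speed or turning exceeds simple thresholds.

    The thresholds are hypothetical; a real system would learn typical
    patterns per image region (speed maps, curvature maps) instead.
    """
    feats, turns = trajectory_features(points)
    return any(s > max_speed for s, _ in feats) or any(t > max_turn for t in turns)
```

A learned model (e.g. per-cell statistics of speed and direction over the observed area) would replace the fixed thresholds in practice, which is what area-based maps of movement patterns amount to.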