To monitor sufficiently large areas of interest for surveillance or event detection, we need to look beyond stationary cameras and employ an automatically configurable network of cameras. These cameras need not have overlapping fields of view and should be allowed to move freely in space. Moreover, zooming capabilities, readily available in modern security cameras, should be exploited to focus on any particular area of interest when needed. In this paper, a practical framework is proposed to self-calibrate dynamically moving and zooming cameras and to determine their absolute and relative orientations, assuming that their relative positions are known. A global linear solution is presented for self-calibrating each zooming/focusing camera in the network. After self-calibration, it is shown that a single automatically computed vanishing point and one line lying on any plane orthogonal to the vertical direction are sufficient to infer the dynamic network configuration. Our method generalizes previous work, which considers only restricted camera motions. Using minimal assumptions, we demonstrate promising results on both synthetic and real data.
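To illustrate why a vertical vanishing point constrains a calibrated camera's orientation, the following is a minimal sketch (not the paper's method): back-projecting the vanishing point through the inverse intrinsic matrix yields the scene's vertical direction in camera coordinates, from which the camera's tilt relative to the vertical can be read off. The intrinsic values and function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical intrinsics for one zooming camera (focal length f in pixels,
# principal point at the image center); the numbers are illustrative only.
f, cx, cy = 1200.0, 640.0, 360.0
K = np.array([[f, 0.0, cx],
              [0.0, f, cy],
              [0.0, 0.0, 1.0]])

def vertical_direction_from_vp(K, vp):
    """Back-project a vertical vanishing point (u, v) into a unit
    direction in camera coordinates: d ~ K^{-1} [u, v, 1]^T."""
    d = np.linalg.solve(K, np.array([vp[0], vp[1], 1.0]))
    return d / np.linalg.norm(d)

def tilt_angle_deg(K, vp):
    """Angle between the camera's optical axis (the z-axis in camera
    coordinates) and the scene's vertical direction, recovered from
    the vanishing point alone."""
    d = vertical_direction_from_vp(K, vp)
    cos_t = abs(d @ np.array([0.0, 0.0, 1.0]))
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

# A vanishing point at the principal point means the optical axis is
# aligned with the vertical (tilt = 0); one a focal length above the
# principal point corresponds to a 45-degree tilt.
print(tilt_angle_deg(K, (cx, cy)))      # 0.0
print(tilt_angle_deg(K, (cx, cy - f)))  # 45.0
```

This recovers only the two orientation degrees of freedom tied to the vertical; as the abstract notes, a line on a horizontal plane supplies the remaining rotation about the vertical axis.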