The registration of images from multiple sensor types, particularly infrared and visible color sensors, is a prerequisite for multi-sensor fusion. This paper proposes a registration method based on a novel error function. Infrared and visible color images are registered using the trajectories of moving objects, obtained through background subtraction and simple tracking. Trajectory points are matched with a RANSAC-based algorithm and a novel registration criterion based on the overlap of foreground pixels in composite foreground images. This criterion enables registration even when few trajectories are available and yields more stable results. The method was tested and its performance quantified on nine scenarios; it outperforms a related method based only on trajectory points in cases with few moving objects.
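The abstract's pipeline (RANSAC over matched trajectory points, scored by foreground-pixel overlap) can be sketched as follows. This is a minimal illustrative implementation under assumptions not stated in the abstract: an affine transform model, Jaccard overlap as the scoring criterion, and the function names `fit_affine`, `overlap_score`, and `ransac_register` are all hypothetical, not the authors' actual code.

```python
import numpy as np

def fit_affine(src, dst):
    # Least-squares fit of a 2x3 affine map: dst ≈ A @ [x, y, 1]^T.
    # src, dst: (N, 2) arrays of (x, y) trajectory points.
    M = np.hstack([src, np.ones((src.shape[0], 1))])
    params, *_ = np.linalg.lstsq(M, dst, rcond=None)  # (3, 2)
    return params.T  # (2, 3)

def overlap_score(affine, fg_src, fg_dst):
    # Warp the source foreground mask's pixel coordinates into the
    # destination frame and measure Jaccard overlap (assumed criterion;
    # the paper's composite-foreground score may differ in detail).
    ys, xs = np.nonzero(fg_src)
    pts = np.stack([xs, ys, np.ones_like(xs)])      # (3, K) homogeneous
    xw, yw = np.round(affine @ pts).astype(int)     # warped coordinates
    h, w = fg_dst.shape
    valid = (xw >= 0) & (xw < w) & (yw >= 0) & (yw < h)
    inter = np.count_nonzero(fg_dst[yw[valid], xw[valid]])
    union = len(xs) + np.count_nonzero(fg_dst) - inter
    return inter / union if union else 0.0

def ransac_register(src_pts, dst_pts, fg_src, fg_dst, iters=200, seed=0):
    # RANSAC: sample minimal point sets from the matched trajectories,
    # fit a candidate transform, and keep the one whose warped foreground
    # best overlaps the destination foreground.
    rng = np.random.default_rng(seed)
    best, best_score = None, -1.0
    for _ in range(iters):
        idx = rng.choice(len(src_pts), size=3, replace=False)
        A = fit_affine(src_pts[idx], dst_pts[idx])
        s = overlap_score(A, fg_src, fg_dst)
        if s > best_score:
            best, best_score = A, s
    return best, best_score
```

Scoring by mask overlap rather than point reprojection error is what lets the loop work with few trajectories: even a handful of matched points suffices to propose a transform, and the dense foreground masks then provide a stable acceptance criterion.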