Fast Line Detection Algorithms Based on Combinatorial Optimization
IWVF-4 Proceedings of the 4th International Workshop on Visual Form
In this paper we present a new class of algorithms for detecting lines in digital images. The approach is based on a general combinatorial optimization formulation aimed at estimating piecewise linear models. A linear system is constructed whose coefficients are the coordinates of all contour points in the image and whose unknowns are the line parameters. The resulting system is then partitioned into a close-to-minimum number of consistent subsystems using a greedy strategy based on a thermal variant of the perceptron algorithm; this partition directly yields the classification of the corresponding image points into a close-to-minimum number of lines. A comparison with the standard Hough Transform and the Randomized Hough Transform shows the considerable advantages of the combinatorial optimization approach in terms of memory requirements, time complexity, robustness to noise, the possibility of incorporating a priori knowledge, and quality of the solutions regardless of the algorithm parameter settings.
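The core idea — each contour point contributes one linear equation, and a consistent subsystem of those equations corresponds to one line — can be illustrated with a minimal sketch. Note the simplifying assumptions: the parameterization `a*x + b*y = 1` is used (it cannot represent lines through the origin), and the paper's thermal-perceptron solver is replaced here by a randomized two-point hypothesis step followed by a least-squares refit; the greedy removal of the largest consistent subsystem is the part that mirrors the described strategy.

```python
import numpy as np

def greedy_line_partition(points, tol=0.05, min_size=3, trials=200, seed=0):
    """Greedily partition 2D contour points into a small number of
    consistent subsystems, each corresponding to one line.

    Every point (x, y) contributes the equation a*x + b*y = 1 in the
    unknown line parameters (a, b); a subsystem is consistent when a
    single (a, b) satisfies all its equations within `tol`.

    Simplification vs. the paper: a randomized two-point hypothesis
    stands in for the thermal perceptron when searching for a large
    consistent subsystem.
    """
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    remaining = np.arange(len(pts))
    lines = []
    while len(remaining) >= min_size:
        best_inliers = None
        for _ in range(trials):
            # Hypothesize (a, b) from two random remaining points.
            i, j = rng.choice(remaining, size=2, replace=False)
            try:
                ab = np.linalg.solve(pts[[i, j]], np.ones(2))
            except np.linalg.LinAlgError:
                continue  # degenerate pair (e.g. line through origin)
            # Points whose equation a*x + b*y = 1 holds within tol.
            resid = np.abs(pts[remaining] @ ab - 1.0)
            inliers = remaining[resid < tol]
            if best_inliers is None or len(inliers) > len(best_inliers):
                best_inliers = inliers
        if best_inliers is None or len(best_inliers) < min_size:
            break
        # Refit the line parameters by least squares on the whole subsystem.
        ab, *_ = np.linalg.lstsq(pts[best_inliers],
                                 np.ones(len(best_inliers)), rcond=None)
        lines.append((ab, best_inliers))
        # Greedy step: remove the subsystem and repeat on the rest.
        remaining = np.setdiff1d(remaining, best_inliers)
    return lines
```

For example, eight points drawn from the two lines `y = 1` and `x = 2` are partitioned into exactly two subsystems, each carrying its fitted `(a, b)` and the indices of the points it classifies.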