We focus on the problem of graph matching, which is fundamental in computer vision and machine learning. Many state-of-the-art methods formulate it as an integer quadratic program (IQP) incorporating both unary and second-order terms. This formulation is in general NP-hard, so obtaining an exact solution is computationally intractable, and most algorithms instead seek an approximate optimum via relaxation techniques. This paper starts from the observation that the solution chain produced by iterative Gradient Assignment (discretized via the Hungarian method) in the discrete domain is "circular", and proposes a method that guides the solver to a fixed point, yielding a convergent graph matching algorithm in the discrete domain. We further extend these algorithms to their counterparts in the continuous domain, proving that the classical graduated assignment algorithm converges to a double-circular solution chain, while the proposed Soft Constrained Graduated Assignment (SCGA) method converges to a fixed (discrete) point, both under mild conditions. Competitive performance is reported in both synthetic and real-world experiments.
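To make the setting concrete, the classical graduated assignment baseline mentioned above can be sketched as follows. This is a minimal illustration, not the paper's SCGA method or a faithful reproduction of the original algorithm: it assumes a Koopmans-Beckmann-style pairwise affinity built as a Kronecker product of weighted adjacency matrices, and the parameter names and annealing schedule (`beta0`, `beta_rate`, the Sinkhorn iteration count) are illustrative choices.

```python
import numpy as np

def graduated_assignment(K, n1, n2, beta0=0.5, beta_max=10.0,
                         beta_rate=1.075, sinkhorn_iters=30):
    """Sketch of graduated assignment (softassign) for graph matching.

    K is an (n1*n2, n1*n2) affinity matrix for the IQP objective
    max_x x^T K x, where x is the vectorized n1-by-n2 soft assignment.
    Returns an approximately doubly-stochastic soft assignment; a final
    Hungarian discretization step (omitted here) would yield a permutation.
    """
    M = np.full((n1, n2), 1.0 / n2)  # uniform initial soft assignment
    beta = beta0
    while beta < beta_max:
        grad = K @ M.reshape(-1)                  # gradient of x^T K x (up to a factor)
        M = np.exp(beta * grad.reshape(n1, n2))   # softmax-like reassignment
        for _ in range(sinkhorn_iters):           # Sinkhorn row/column balancing
            M = M / M.sum(axis=1, keepdims=True)
            M = M / M.sum(axis=0, keepdims=True)
        beta *= beta_rate                         # anneal toward a discrete solution
    return M

# Toy usage: match a weighted triangle to a permuted copy of itself.
A1 = np.array([[0., 1., 3.],
               [1., 0., 2.],
               [3., 2., 0.]])
perm = [1, 2, 0]                                  # ground-truth node mapping
P = np.zeros((3, 3))
for i, a in enumerate(perm):
    P[a, i] = 1.0
A2 = P @ A1 @ P.T
K = np.kron(A1, A2)                               # K[(i,a),(j,b)] = A1[i,j] * A2[a,b]
M = graduated_assignment(K, 3, 3)
recovered = np.argmax(M, axis=1)                  # soft assignment peaks at the true mapping
```

Because the edge weights here are distinct, the correct permutation is the unique maximizer of the quadratic objective, and the annealed soft assignment concentrates on it as `beta` grows.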