This paper brings together a novel information representation model for use in signal processing and computer vision problems with a particular algorithmic development of the Landweber iterative algorithm. The information representation model allows a representation of multiple values for a variable as well as an expression of confidence. Both properties are important for effective computation using multi-level models, where a choice between models can be implemented as part of the optimization process. It is shown that in this way the algorithm can deal with a class of high-dimensional, sparse, and constrained least-squares problems that arise in various computer vision learning tasks, such as object recognition and object pose estimation. While the algorithm has been applied to the solution of such problems, it has so far been used heuristically. In this paper we describe the properties and some of the peculiarities of the channel representation and optimization, and put them on firm mathematical ground. We formulate the optimization as a convexly constrained weighted least-squares problem and propose for its solution a projected Landweber method which employs oblique projections onto the closed convex constraint set. We formulate the problem, present the algorithm, and work out its convergence properties, including a rate-of-convergence result. The results are put in perspective with currently available projected Landweber methods. An application to supervised learning is described, and the method is evaluated in an experiment involving function approximation, as well as an application to transient signals.
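To make the core iteration concrete, the following is a minimal sketch of a projected Landweber method for a constrained least-squares problem. It is not the paper's method: it uses an unweighted objective, a Euclidean (rather than oblique) projection, and takes the nonnegative orthant as a stand-in for the general closed convex constraint set; the function name and step-size choice are illustrative assumptions.

```python
import numpy as np

def projected_landweber(A, b, num_iters=500, tau=None):
    """Sketch: minimize ||A x - b||^2 subject to x >= 0.

    Iteration: x_{k+1} = P_C(x_k - tau * A^T (A x_k - b)),
    where P_C is here the Euclidean projection onto the
    nonnegative orthant (a simple stand-in for the oblique
    projection onto a general closed convex set).
    """
    if tau is None:
        # Convergence requires 0 < tau < 2 / ||A||_2^2.
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)           # gradient of 0.5 * ||A x - b||^2
        x = np.maximum(x - tau * grad, 0)  # project onto the constraint set
    return x

# Small sanity check on an overdetermined system with a
# nonnegative ground-truth solution.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = A @ np.array([2.0, 3.0])
x_hat = projected_landweber(A, b)
```

For a quadratic objective, the unprojected iteration is plain gradient descent, and the step-size bound `tau < 2 / ||A||_2^2` is what yields the linear rate of convergence; the projection at each step keeps the iterates feasible throughout.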