In this paper, we propose an extension of the self-organizing map, called the self-organizing multilayer perceptron (SOMLP), whose purpose is to quantize spaces of functions. Because it is built on multilayer perceptron networks, SOMLP supports both unsupervised and supervised learning algorithms. We demonstrate that the commonly used learning vector quantization (LVQ) algorithms can be adapted to build new algorithms, called learning functional quantization (LFQ) algorithms. SOMLP can be used to model nonlinear and/or nonstationary complex dynamic processes, such as speech signals. While most functional data analysis (FDA) research is based on B-splines or similar univariate functions, the SOMLP algorithm allows quantization of functions with high-dimensional input spaces. As a consequence, classical FDA methods can be outperformed by increasing the dimensionality of the input space of the functions under analysis. Experiments on artificial and real-world examples illustrate the potential of this approach.
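The core idea of the abstract — replacing the vector prototypes of a SOM/LVQ-style quantizer with small multilayer perceptrons, so that whole functions rather than vectors are quantized — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's algorithm: the one-hidden-layer prototype architecture, the winner-take-all update, the learning rate, and the synthetic two-frequency dataset are all choices made here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(hidden=6):
    """Random one-hidden-layer MLP prototype: f(x) = w2 . tanh(W1*x + b1) + b2."""
    return {"W1": rng.normal(scale=1.0, size=(hidden, 1)),
            "b1": rng.normal(scale=0.5, size=hidden),
            "w2": rng.normal(scale=0.5, size=hidden),
            "b2": 0.0}

def predict(p, x):
    """Evaluate the prototype function at sample points x (shape (n,))."""
    h = np.tanh(p["W1"] @ x[None, :] + p["b1"][:, None])  # (hidden, n)
    return p["w2"] @ h + p["b2"]                          # (n,)

def sgd_step(p, x, y, lr=0.1):
    """One gradient step on squared error, pulling the prototype toward curve y."""
    n = len(x)
    h = np.tanh(p["W1"] @ x[None, :] + p["b1"][:, None])
    err = p["w2"] @ h + p["b2"] - y                       # (n,)
    d_pre = (p["w2"][:, None] * err[None, :]) * (1.0 - h ** 2)
    p["w2"] -= lr * h @ err / n
    p["b2"] -= lr * err.mean()
    p["W1"] -= lr * d_pre @ x[:, None] / n
    p["b1"] -= lr * d_pre.mean(axis=1)

# Synthetic functional data: noisy sine curves from two distinct frequencies.
x = np.linspace(0.0, 1.0, 50)
curves = np.array([np.sin(2 * np.pi * f * x) + 0.05 * rng.normal(size=x.size)
                   for f in [1.0] * 20 + [2.0] * 20])

K = 2                                   # number of functional prototypes
prototypes = [make_mlp() for _ in range(K)]

def distortion():
    """Quantization error: each curve pays the squared distance to its winner."""
    return sum(min(np.mean((predict(p, x) - y) ** 2) for p in prototypes)
               for y in curves) / len(curves)

initial_distortion = distortion()
for _ in range(200):                    # competitive-learning epochs
    for y in curves:
        winner = min(prototypes, key=lambda p: np.mean((predict(p, x) - y) ** 2))
        sgd_step(winner, x, y)          # only the winning prototype moves
final_distortion = distortion()
```

Each prototype is a function, not a point, so the "distance" used for the competition is an empirical L2 distance between the prototype's output and the observed curve on the sampling grid; because the prototype is an MLP, the same construction extends to functions with high-dimensional input spaces, which is the advantage the abstract claims over B-spline-based FDA representations.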