Cascade correlation (CC) is a training method for neural networks that determines both the weights and the network architecture during training. Various extensions of CC to structured data have been proposed: recurrent cascade correlation (RCC) for sequences, recursive cascade correlation (RecCC) for tree structures with limited fan-out, and contextual recursive cascade correlation (CRecCC) for rooted directed positional acyclic graphs (DPAGs) with limited fan-in and fan-out. We show that these models possess the universal approximation property in the following sense: given a probability measure P on the input set, every measurable function from sequences into a real vector space can be approximated by a sigmoidal RCC to any desired degree of accuracy, up to inputs of arbitrarily small probability. Likewise, every measurable function from tree structures with limited fan-out into a real vector space can be approximated by a sigmoidal RecCC with multiplicative neurons to any desired degree of accuracy, up to inputs of arbitrarily small probability. For sigmoidal CRecCC networks with multiplicative neurons, we show the universal approximation capability for functions on an important subset of all DPAGs with limited fan-in and fan-out, namely those for which a specific linear representation yields unique codes. We give one sufficient structural condition for the latter property, which can easily be tested: the enumerations of incoming and outgoing edges must be compatible. This property can be fulfilled for every DPAG with fan-in and fan-out two via a reenumeration of children and parents, and for larger fan-in and fan-out via an expansion of the fan-in and fan-out together with a reenumeration of children and parents. In addition, the result generalizes to input-output isomorphic transductions of structures.
Thus, CRecCC networks are the first neural models for which the universal approximation capability for functions on fairly general acyclic graph structures has been proved.
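To make the base algorithm concrete, the following is a minimal pure-Python sketch of cascade correlation on a toy one-dimensional regression task. It is an illustration only, not taken from the paper: the function names, learning rates, and epoch counts are arbitrary choices, and the candidate step maximizes a simplified signed covariance with the residual error rather than the pooled-candidate criterion of the original CC algorithm. It shows the two properties the abstract relies on: the architecture grows during training (hidden units are installed one at a time), and each new unit receives the outputs of all previously frozen units (the cascade).

```python
import math
import random

random.seed(0)

def sigmoid(x):
    x = max(-60.0, min(60.0, x))          # clip to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-x))

# Toy 1-D regression task; the network grows until it fits the curve.
xs = [i / 20.0 for i in range(-20, 21)]
ys = [math.tanh(3.0 * x) for x in xs]

def features(x, hidden):
    # Bias, raw input, then each frozen hidden unit in creation order.
    # Every unit sees the bias, the input, and all *earlier* units
    # (the cascade); zip truncates to the unit's own weight length.
    feats = [1.0, x]
    for w in hidden:
        feats.append(sigmoid(sum(wi * fi for wi, fi in zip(w, feats))))
    return feats

def train_output(hidden, epochs=2000, lr=0.05):
    # Refit only the output weights (plain LMS); hidden weights stay frozen.
    n = 2 + len(hidden)
    w = [0.0] * n
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            feats = features(x, hidden)
            err = y - sum(wo * f for wo, f in zip(w, feats))
            for i in range(n):
                w[i] += lr * err * feats[i]
    return w

def train_candidate(hidden, w_out, epochs=300, lr=0.5):
    # Train one candidate unit to maximize the covariance between its
    # output and the (fixed) residual error of the current network.
    n = 2 + len(hidden)
    w = [random.uniform(-1.0, 1.0) for _ in range(n)]
    featlist = [features(x, hidden) for x in xs]
    errs = [y - sum(wo * f for wo, f in zip(w_out, feats))
            for feats, y in zip(featlist, ys)]
    ebar = sum(errs) / len(errs)
    for _ in range(epochs):
        grad = [0.0] * n
        for feats, e in zip(featlist, errs):
            v = sigmoid(sum(wi * fi for wi, fi in zip(w, feats)))
            g = (e - ebar) * v * (1.0 - v)    # d(covariance)/d(pre-activation)
            for i in range(n):
                grad[i] += g * feats[i]
        for i in range(n):
            w[i] += lr * grad[i]              # gradient *ascent* on covariance
    return w

hidden = []                                   # frozen hidden-unit weight vectors
w_out = train_output(hidden)                  # start from a purely linear model
for _ in range(2):                            # install two hidden units
    hidden.append(train_candidate(hidden, w_out))
    w_out = train_output(hidden)

mse = sum((y - sum(wo * f for wo, f in zip(w_out, features(x, hidden)))) ** 2
          for x, y in zip(xs, ys)) / len(xs)
print(f"final MSE with {len(hidden)} hidden units: {mse:.4f}")
```

The structured variants the abstract discusses (RCC, RecCC, CRecCC) keep this constructive scheme but change what each unit reads: RCC units additionally feed on their own previous output along a sequence, and RecCC/CRecCC units feed on the unit activations computed at the children (and, for CRecCC, context from parents) of a node in the tree or DPAG.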