In this paper, we consider the approximation properties of a recently introduced neural network model called the graph neural network (GNN), which can process structured data inputs, e.g., acyclic graphs, cyclic graphs, and directed or undirected graphs. This class of neural networks implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space. We characterize the functions that can be approximated by GNNs, in probability, up to any prescribed degree of precision. This set contains the maps that satisfy a property called preservation of the unfolding equivalence, and it includes most of the practically useful functions on graphs; the only known exception arises when the input graph contains particular patterns of symmetries, in which case unfolding equivalence may not be preserved. The result can be considered an extension of the universal approximation property established for classic feedforward neural networks (FNNs). Experimental examples illustrate the computational capabilities of the proposed model.
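To make the map τ(G, n) concrete, the following is a minimal sketch of the GNN computation scheme: each node holds a state updated from its neighbors' states until a fixed point is reached, after which an output function reads out the state of the queried node. The graph, the linear transition and output functions, and all names (`transition`, `gnn_forward`, `W_trans`, etc.) are illustrative assumptions, not taken from the paper; the weights are scaled down so the global update is a contraction and the fixed-point iteration converges.

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM, OUT_DIM = 4, 2  # node-state dimension and output dimension m

# Small weights keep the state update contractive, so repeated application
# converges to a unique fixed point (Banach fixed-point theorem).
W_trans = rng.standard_normal((STATE_DIM, STATE_DIM)) * 0.1
W_label = rng.standard_normal((STATE_DIM, 1))
W_out = rng.standard_normal((OUT_DIM, STATE_DIM))

def transition(neighbor_states, label):
    """Local transition: aggregate transformed neighbor states plus the node label."""
    agg = sum((W_trans @ s for s in neighbor_states), np.zeros(STATE_DIM))
    return np.tanh(agg + (W_label @ label).ravel())

def gnn_forward(adj, labels, n, iters=50):
    """Iterate the global transition to (approximate) fixed point, then read out node n."""
    states = {v: np.zeros(STATE_DIM) for v in adj}
    for _ in range(iters):
        states = {v: transition([states[u] for u in adj[v]], labels[v])
                  for v in adj}
    return W_out @ states[n]  # tau(G, n), a vector in R^m with m = OUT_DIM

# Toy cyclic graph: a triangle on nodes 0, 1, 2 with scalar labels.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
labels = {v: np.array([float(v)]) for v in adj}
out = gnn_forward(adj, labels, n=0)
assert out.shape == (OUT_DIM,)
```

Note that the graph here is cyclic: because the update is a contraction, the fixed-point iteration is well defined even on cyclic graphs, which is one feature distinguishing GNNs from recursive networks restricted to acyclic structures.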