Recursive neural networks and graphs: dealing with cycles
WIRN'05 Proceedings of the 16th Italian conference on Neural Nets
Recursive neural networks are a powerful tool for processing structured data. According to the recursive learning paradigm, the input information consists of directed positional acyclic graphs (DPAGs); recursive networks are fed following the partial order defined by the links of the graph. Unfortunately, the restriction to DPAGs is sometimes too severe, since some real-world problems are intrinsically cyclic in nature. In this paper, a methodology is proposed that allows recursive networks to process any cyclic directed graph. The computational power of recursive networks is thereby definitively established, which also clarifies the underlying limitations of the model.
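To make the recursive paradigm described above concrete, the following is a minimal sketch of how node states can be computed over a DPAG following its partial order: each node is visited only after all of its children, and child positions select distinct weight matrices. All names and signatures here (`recursive_states`, `W`, `U`, `b`) are illustrative assumptions, not the authors' implementation, and the sketch covers only the acyclic case, not the paper's extension to cyclic graphs.

```python
# Sketch of the recursive learning paradigm on a DPAG (assumed names,
# not the authors' code): states flow from the leaves toward the roots.
import numpy as np

def recursive_states(labels, children, W, U, b):
    """Compute a state vector for every node of a DPAG.

    labels:   dict node -> label vector of shape (label_dim,)
    children: dict node -> ordered list of children (position matters)
    W: (state_dim, label_dim) label weights
    U: list of (state_dim, state_dim) matrices, one per child position
    b: (state_dim,) bias
    """
    states = {}

    def visit(v):
        if v in states:
            return states[v]
        # Visit children first: this respects the partial order
        # defined by the links of the graph.
        kid_states = [visit(c) for c in children.get(v, [])]
        h = W @ labels[v] + b
        for i, s in enumerate(kid_states):
            h = h + U[i] @ s  # position-dependent child weights
        states[v] = np.tanh(h)
        return states[v]

    for v in labels:
        visit(v)
    return states

# Tiny example: a three-node DPAG with edges a -> b and a -> c.
rng = np.random.default_rng(0)
labels = {v: rng.standard_normal(4) for v in ("a", "b", "c")}
children = {"a": ["b", "c"], "b": [], "c": []}
W = rng.standard_normal((8, 4))
U = [rng.standard_normal((8, 8)) for _ in range(2)]
b = np.zeros(8)
print(recursive_states(labels, children, W, U, b)["a"].shape)  # (8,)
```

This bottom-up traversal is exactly what breaks on cyclic graphs, where no such partial order exists, which is the restriction the paper's methodology addresses.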