The classification of graphical patterns, i.e., data represented as labeled graphs, has received considerable attention from the machine learning community in recent years. Solutions to the problem would benefit a number of applications, ranging from bioinformatics and cheminformatics to Web-related tasks and structural pattern recognition for image processing. Several approaches have been proposed so far, e.g., inductive logic programming and kernels for graphs. Connectionist models have been introduced as well, namely recursive neural networks (RNNs) and graph neural networks (GNNs). Although their theoretical properties are sound and thoroughly understood, RNNs and GNNs suffer from drawbacks that may limit their applicability. This paper introduces an alternative connectionist framework for learning discriminant functions over graphical data. The approach is simple, is suited to maximum-a-posteriori classification of broad families of graphs, and overcomes some limitations of RNNs and GNNs. The idea is to describe a graph as an algebraic relation, i.e., as a subset of the Cartesian product of its vertex set with itself. The class posteriors given the relation are then reduced (under an i.i.d. assumption) to products of probabilistic quantities, which are estimated using a multilayer perceptron. Empirical evidence shows that, in spite of its simplicity, the technique compares favorably with established approaches on several tasks involving different graphical representations of the data. In particular, on the classification of molecules from the Mutagenesis dataset (friendly + unfriendly) it obtains the best result to date (93.91%).
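The core decision rule described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the MLP here has hypothetical, randomly initialized weights (no training loop), each edge is encoded as a hypothetical concatenation of its two endpoint labels, and class priors are assumed uniform, so the graph-level maximum-a-posteriori decision reduces to summing per-edge log-posteriors under the i.i.d. assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny MLP mapping an edge feature vector (concatenated
# labels of the edge's two endpoint vertices, 2 + 2 values here) to
# estimates of the class posteriors over 2 classes via softmax.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def edge_posterior(x):
    """Estimate P(class | edge) with one hidden tanh layer + softmax."""
    h = np.tanh(x @ W1 + b1)
    z = h @ W2 + b2
    e = np.exp(z - z.max())          # numerically stable softmax
    return e / e.sum()

def classify(edges):
    """MAP class for a graph given as a set of edges (a relation,
    i.e., a subset of the Cartesian product of the vertex set with
    itself). Under the i.i.d. assumption and uniform class priors,
    the product of per-edge posteriors becomes a sum of logs."""
    log_post = np.zeros(2)
    for x in edges:
        log_post += np.log(edge_posterior(np.asarray(x, dtype=float)))
    return int(np.argmax(log_post))

# Toy graph: two edges, each described by the labels of its endpoints.
g = [[0.1, 0.9, 0.2, 0.8], [0.3, 0.7, 0.5, 0.5]]
label = classify(g)
print(label)
```

In a real setting the MLP weights would be trained (e.g., by backpropagation on a cross-entropy loss over labeled edges), and non-uniform class priors would add a correction term to the log-posterior sum.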