Recent developments in statistical learning have focused on vector machines, which learn from examples described by vectors of features. In many fields, however, structured data must be handled; it would therefore be desirable to learn from examples described by graphs.

Graph machines learn real numbers from graphs: for each graph, a separate learning machine is built, whose algebraic structure contains the same information as the graph. We describe the training of such machines, and show that virtual leave-one-out, a powerful method for assessing the generalization capabilities of conventional vector machines, can be extended to graph machines. Academic examples are described, together with applications to the prediction of pharmaceutical activities of molecules and to the classification of properties; the potential of graph machines for computer-aided drug design is highlighted.
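The idea of a machine "whose algebraic structure contains the same information as the graph" can be illustrated with a minimal sketch: a single parametric node function, shared by every node of every graph, is unfolded recursively over each graph's structure. The node function used here (a tanh of a weighted sum of children outputs) and the adjacency-dict encoding are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def node_output(child_values, w, max_children=2):
    # Shared parametric node function: the same weight vector w is reused
    # at every node of every graph (weight sharing), so each graph yields
    # a distinct machine assembled from identical components.
    x = np.zeros(max_children)
    x[:len(child_values)] = child_values
    return np.tanh(w[0] + w[1:] @ x)

def graph_machine(children, node, w):
    # Unfold the shared node function recursively over an acyclic graph
    # given as an adjacency dict {node: [children]}; the resulting
    # composition mirrors the graph's structure.
    vals = [graph_machine(children, c, w) for c in children.get(node, [])]
    return node_output(vals, w)

# Two illustrative acyclic graphs with different structures
g1 = {"a": ["b", "c"], "b": [], "c": []}   # one root, two leaves
g2 = {"a": ["b"], "b": ["c"], "c": []}     # linear chain
w = np.array([0.1, 0.5, -0.3])             # bias + one weight per child slot
y1 = graph_machine(g1, "a", w)
y2 = graph_machine(g2, "a", w)
```

Because the weights are shared, training adjusts a single parameter set against all graphs at once, and isomorphic graphs are mapped to the same output by construction.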
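Virtual leave-one-out itself can be sketched in the simplest setting where it is exact: a model linear in its parameters. The leverage h_ii is the i-th diagonal element of the hat matrix Z(Z'Z)^{-1}Z', and the virtual leave-one-out residual r_i / (1 - h_ii) predicts, without any refitting, the error the model would make on example i had that example been left out of training. The design matrix and targets below are synthetic, for illustration only.

```python
import numpy as np

def virtual_loo_residuals(Z, y):
    # Z: design matrix (or, for a nonlinear model, the Jacobian of the
    # outputs with respect to the parameters); y: targets.
    theta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    r = y - Z @ theta                          # ordinary residuals
    M = np.linalg.solve(Z.T @ Z, Z.T)          # (Z'Z)^-1 Z'
    h = np.einsum("ij,ji->i", Z, M)            # leverages: diag of hat matrix
    return r / (1.0 - h)                       # virtual LOO residuals

# Synthetic example: 20 points, 3 parameters
rng = np.random.default_rng(0)
Z = rng.normal(size=(20, 3))
y = Z @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=20)
e_virtual = virtual_loo_residuals(Z, y)
```

The mean of `e_virtual**2` is the PRESS estimate of generalization error, obtained at the cost of a single fit instead of n refits; this is the quantity the paper extends from vector machines to graph machines.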