A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Diffusion Kernels on Graphs and Other Discrete Input Spaces
ICML '02 Proceedings of the Nineteenth International Conference on Machine Learning
A Support Vector Machine with a Hybrid Kernel and Minimal Vapnik-Chervonenkis Dimension
IEEE Transactions on Knowledge and Data Engineering
In this paper, we consider learning problems defined on graph-structured data. We propose an incremental supervised learning algorithm for network-based estimators that use diffusion kernels. Diffusion kernel nodes are added iteratively during training: for each new node, the kernel center and the output connection weight are chosen by an empirical-risk-driven rule based on an extended chained version of the Nadaraya-Watson estimator. The diffusion parameters are then determined by a genetic-like optimization technique.
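The two core ingredients of the abstract — a diffusion kernel over a graph and a Nadaraya-Watson estimator built on it — can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: the graph, the labelled nodes, and the fixed diffusion parameter `beta` are all assumptions for the example (the paper tunes the diffusion parameters by a genetic-like search, which is omitted here).

```python
import numpy as np

# Small undirected path graph on 4 nodes (assumed example data).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Negative graph Laplacian H = A - D generates the diffusion.
D = np.diag(A.sum(axis=1))
H = A - D

# Diffusion kernel K = exp(beta * H), via eigendecomposition of the
# symmetric matrix H. beta (fixed here) controls how far labels diffuse.
beta = 0.5
w, V = np.linalg.eigh(H)
K = V @ np.diag(np.exp(beta * w)) @ V.T

# Labelled training nodes and their outputs (assumed).
labelled = [0, 3]
y = {0: 1.0, 3: -1.0}

def nadaraya_watson(t):
    """Kernel-weighted average of the known labels, evaluated at node t."""
    weights = np.array([K[t, i] for i in labelled])
    values = np.array([y[i] for i in labelled])
    return float(weights @ values / weights.sum())

for t in range(4):
    print(t, round(nadaraya_watson(t), 3))
```

Nodes nearer the positively labelled end of the path receive positive predictions and those nearer the negative end receive negative ones, since the diffusion kernel decays with graph distance.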