The Support Kernel Machine (SKM) and the Relevance Kernel Machine (RKM) are two principles for selectively combining object-representation modalities of different kinds by incorporating supervised selectivity into the classical kernel-based SVM. The former principle rigidly selects a subset of presumably informative support kernels and excludes the others, whereas the latter assigns positive weights to all of them. The RKM algorithm was fully elaborated in previous publications; however, the earlier algorithm implementing the SKM principle of selectivity supervision is applicable only to real-valued features. The present paper fills this gap by harnessing the framework of subdifferential calculus to computationally solve the constrained nondifferentiable convex optimization problem that arises in the SKM training criterion, which is applicable to arbitrary kernel-based modalities of object representation.
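The computational core described above — combining several kernel-based modalities with nonnegative weights while minimizing a nondifferentiable convex training criterion — can be illustrated schematically. The sketch below is NOT the paper's SKM algorithm: the toy data, the two Gram matrices (a linear and an RBF kernel standing in for two modalities), the hinge-loss objective, and the step sizes are all invented for illustration. It only shows the generic projected-subgradient idea: take a subgradient step in the expansion coefficients and in the kernel weights, then project the weights back onto the unit simplex.

```python
import numpy as np

# Hypothetical toy data: two well-separated classes in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

def linear_kernel(A, B):
    return A @ B.T

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# One Gram matrix per "modality" (stand-ins for arbitrary kernel modalities).
Ks = [linear_kernel(X, X), rbf_kernel(X, X)]

def combined(beta):
    return sum(b * K for b, K in zip(beta, Ks))

def project_simplex(v):
    """Euclidean projection onto {beta : beta >= 0, sum(beta) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u - css / idx > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1), 0.0)

c = np.zeros(y.size)                    # kernel-expansion coefficients
beta = np.full(len(Ks), 1.0 / len(Ks))  # kernel weights on the simplex
lam = 0.1                               # regularization strength (invented)
for t in range(200):
    K = combined(beta)
    f = K @ c                           # decision values on training points
    viol = (1.0 - y * f) > 0            # points where the hinge loss is active
    # Subgradient of sum_j max(0, 1 - y_j f_j) + lam * c'Kc with respect to c:
    g_c = -(K[:, viol] * y[viol]).sum(axis=1) + 2.0 * lam * (K @ c)
    # Subgradient with respect to each kernel weight beta_m:
    g_b = np.array([-(y[viol] * (Km @ c)[viol]).sum() + lam * c @ Km @ c
                    for Km in Ks])
    step = 0.01 / np.sqrt(t + 1.0)      # diminishing step size
    c -= step * g_c
    beta = project_simplex(beta - 0.1 * step * g_b)

acc = float(np.mean(np.sign(combined(beta) @ c) == y))
```

The projection step keeps every weight nonnegative and the weights summing to one, so kernels whose weights are driven to zero are effectively excluded — a soft caricature of the rigid support-kernel selection the abstract contrasts with the RKM's all-positive weighting.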