Building on NmCCA, a newly proposed variant of Canonical Correlation Analysis (CCA) that generalizes CCA to more than two views of the same phenomenon, we develop a new and effective multiple kernel learning algorithm. First, we use empirical kernel maps to embed the input data into m different feature spaces, one per kernel. Then, by incorporating NmCCA into a regularization-based learning framework, we obtain a single joint learning process in which a dedicated term, the Inter-Function Similarity Loss RIFSL, enforces agreement among the outputs of the m views. In our implementation, we adopt the Modification of Ho-Kashyap algorithm with Squared approximation of the misclassification errors (MHKS) as the incorporated paradigm, and experimental results on benchmark datasets demonstrate the feasibility and effectiveness of the proposed algorithm, named MultiK-MHKS.
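The pipeline in the abstract — empirical kernel maps into m feature spaces, joint regularized training, and an agreement term across views — can be sketched as follows. This is a minimal illustration, not the paper's method: a pairwise squared-difference penalty stands in for RIFSL, ridge regression replaces the MHKS classifier, and the toy Gaussian data, kernel widths, and regularization constants are all assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def empirical_kernel_map(K, tol=1e-10):
    """Empirical kernel map x_i -> Lambda^{-1/2} Q^T K[:, i],
    where K = Q Lambda Q^T is the eigendecomposition of the kernel matrix."""
    vals, Q = np.linalg.eigh(K)
    keep = vals > tol                       # drop numerically null directions
    return K @ (Q[:, keep] / np.sqrt(vals[keep]))

# toy two-class data (assumption: stand-in for the benchmark datasets)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (40, 2)), rng.normal(2.0, 1.0, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

# m empirical feature spaces, one per kernel (here: RBF kernels of varying width)
gammas = [0.1, 0.5, 1.0]
views = [empirical_kernel_map(rbf_kernel(X, X, g)) for g in gammas]
m = len(views)

# jointly train one linear classifier per view; the pairwise squared-difference
# agreement penalty is a simplified stand-in for RIFSL, and ridge regression
# replaces MHKS (alternating closed-form updates over the views)
lam, gamma_agree = 1e-2, 0.1
w = [np.zeros(V.shape[1]) for V in views]
for _ in range(20):
    for c, Vc in enumerate(views):
        others = sum(views[v] @ w[v] for v in range(m) if v != c)
        A = (1 + gamma_agree * (m - 1)) * Vc.T @ Vc + lam * np.eye(Vc.shape[1])
        b = Vc.T @ (y + gamma_agree * others)
        w[c] = np.linalg.solve(A, b)

# final decision: sign of the averaged multi-view outputs
scores = sum(views[v] @ w[v] for v in range(m)) / m
acc = float(np.mean(np.sign(scores) == y))
print(f"training accuracy: {acc:.2f}")
```

Because the mapped features of each view are `Q @ sqrt(Lambda)`, each per-view subproblem is a well-conditioned ridge regression, and the agreement term couples the views so that their decision functions converge toward consistent outputs — the role RIFSL plays in the actual MultiK-MHKS formulation.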