Recent research indicates that the standard minimum enclosing ball (MEB) and the center-constrained MEB enable effective training on large datasets via the core vector machine (CVM) and the generalized CVM (GCVM), respectively. However, another widely used variant, the MEB with total soft margin (T-MEB for brevity), cannot be trained quickly on large datasets by applying CVM or GCVM directly, because its inequality constraint is violated. This paper presents FL-TMEB, a fast learning algorithm for scaling up T-MEB. First, FL-TMEB slightly relaxes the constraints of T-MEB so that the problem becomes equivalent to a center-constrained MEB, which CVM can solve with its associated core set (CS). Then, guided by the sub-optimal solution theorem for T-MEB, FL-TMEB builds an extended core set (ECS) by adding the neighbors of selected CS samples to the CS. Finally, FL-TMEB takes the optimal weights over the ECS as an approximate solution of T-MEB. Experimental results on UCI and USPS datasets demonstrate the effectiveness of the proposed method.
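The core-set idea underlying CVM-style training can be illustrated with the classical Bădoiu–Clarkson iteration for approximating an MEB: repeatedly pull the center toward the farthest point, and collect those farthest points as a core set. This is a minimal sketch of the generic (1+ε)-approximation scheme, not the authors' FL-TMEB algorithm; the function name and ε default are illustrative assumptions.

```python
import numpy as np

def meb_core_set(X, eps=0.1):
    """Badoiu-Clarkson (1+eps)-approximation of the minimum enclosing ball.

    Illustrative sketch (not FL-TMEB itself): each iteration moves the
    center a shrinking step toward the current farthest point; the
    farthest points gathered along the way form a core set whose size
    depends only on eps, not on the number of samples.
    """
    c = X[0].astype(float)          # start the center at an arbitrary sample
    core = {0}                       # indices of the core set
    n_iter = int(np.ceil(1.0 / eps**2))
    for i in range(1, n_iter + 1):
        d = np.linalg.norm(X - c, axis=1)
        far = int(np.argmax(d))      # farthest sample from the current center
        core.add(far)
        c = c + (X[far] - c) / (i + 1)   # step size 1/(i+1) guarantees convergence
    radius = np.linalg.norm(X - c, axis=1).max()
    return c, radius, sorted(core)

# Example: corners of the unit square; the exact MEB has radius sqrt(0.5).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
center, radius, core = meb_core_set(X, eps=0.1)
```

The returned radius is guaranteed to lie within a factor (1+ε) of the optimal radius while the core set stays small, which is what makes core-set-based solvers such as CVM scale to large datasets.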