Not only can different databases have different data structures, but so can two classes of data within the same database. SVM and LS-SVM typically minimize the empirical risk; regularized versions with a fixed penalty (an L2 or L1 penalty) are non-adaptive, since the penalty form is pre-determined, and they therefore tend to perform well only in certain situations. For example, LS-SVM with an L2 penalty is not preferred when the underlying model is sparse. This paper proposes an adaptive penalty learning procedure, the evolution strategies (ES) based adaptive Lp least squares support vector machine (ES-based Lp LS-SVM), to address this issue. By introducing multiple kernels, an Lp-penalty-based nonlinear objective function is derived, which is solved with the iterative re-weighted minimal solver (IRMS) algorithm. Evolution strategies are then used to solve the resulting multi-parameter optimization problem: the penalty parameter p, the kernel parameters and the regularization parameter are all selected adaptively by the proposed ES-based algorithm while the data are being trained, which makes it easier to reach the optimal solution. Numerical experiments are conducted on two artificial data sets and six real-world data sets. The results show that the proposed procedure offers better generalization performance than the standard SVM, the LS-SVM and other improved algorithms.
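The core computational idea, an iterative re-weighted solver for an Lp-penalized least-squares objective, can be illustrated with a minimal sketch. This is not the paper's IRMS algorithm itself (whose details are not given here); it is a generic iteratively re-weighted least squares (IRLS) scheme for min_w ||y - Xw||^2 + λ Σ_j |w_j|^p, in which each |w_j|^p term is majorized by a quadratic weighted by |w_j|^(p-2), so every iteration reduces to a ridge-type linear solve. The function name and clipping constant are illustrative assumptions.

```python
import numpy as np

def lp_penalized_ls(X, y, lam=1.0, p=1.0, n_iter=50, eps=1e-8):
    """IRLS sketch for min ||y - X w||^2 + lam * sum_j |w_j|^p.

    At each step |w_j|^p is replaced by a quadratic surrogate
    (p/2) * |w_j_prev|^(p-2) * w_j^2, turning the subproblem into
    a weighted ridge regression solved in closed form.
    """
    n, d = X.shape
    w = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    G = X.T @ X
    b = X.T @ y
    for _ in range(n_iter):
        # Clip |w_j| away from zero: for p < 2 the weight
        # |w_j|^(p-2) would otherwise blow up on small coefficients.
        weights = (p / 2.0) * np.maximum(np.abs(w), eps) ** (p - 2)
        w_new = np.linalg.solve(G + lam * np.diag(weights), b)
        if np.linalg.norm(w_new - w) < 1e-10:  # converged
            return w_new
        w = w_new
    return w
```

For p = 2 the weights are identically one, so the scheme collapses to ordinary ridge regression in a single solve; for p < 2 the re-weighting progressively shrinks small coefficients toward zero, which is the sparsity-promoting behaviour the adaptive penalty exploits. In the paper's procedure, p itself (together with kernel and regularization parameters) would additionally be tuned by an outer evolution-strategies search rather than fixed in advance.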