The nature of statistical learning theory
Analysis of Half-Quadratic Minimization Methods for Signal and Image Recovery
SIAM Journal on Scientific Computing
Manifold Regularization: A Geometric Framework for Learning from Labeled and Unlabeled Examples
The Journal of Machine Learning Research
Robust self-tuning semi-supervised learning
Neurocomputing
Label Propagation through Linear Neighborhoods
IEEE Transactions on Knowledge and Data Engineering
Robust feature extraction via information theoretic learning
ICML '09 Proceedings of the 26th Annual International Conference on Machine Learning
Variational Graph Embedding for Globally and Locally Consistent Feature Extraction
ECML PKDD '09 Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases: Part II
Robust Discriminant Analysis Based on Nonparametric Maximum Entropy
ACML '09 Proceedings of the 1st Asian Conference on Machine Learning: Advances in Machine Learning
Correntropy: Properties and Applications in Non-Gaussian Signal Processing
IEEE Transactions on Signal Processing
To address the sensitivity of semi-supervised learning to noise in biometrics, this paper proposes a robust Gaussian-Laplacian Regularized (GLR) framework based on the maximum correntropy criterion (MCC), called GLR-MCC, together with a convergence analysis. The half-quadratic (HQ) optimization technique reduces the correntropy optimization problem to a standard semi-supervised learning problem in each iteration. Experimental results show that the proposed GLR-MCC effectively improves semi-supervised learning performance and is more robust to mislabeling noise and occlusion than GLR.
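The core mechanism the abstract describes (HQ iterations turning an MCC objective into a sequence of weighted standard problems) can be sketched in a toy setting. The following is a minimal illustration on robust linear regression, not the paper's GLR-MCC formulation: each iteration solves a weighted least-squares problem, with weights given by a Gaussian kernel of the current residuals, so gross outliers are progressively downweighted. The function name, the regression setting, and all parameters are illustrative assumptions.

```python
import numpy as np

def mcc_hq_regression(X, y, sigma=0.5, n_iter=20):
    """Hypothetical sketch: robust regression via half-quadratic (HQ)
    iterations of the maximum correntropy criterion (MCC).

    Each iteration is a standard weighted least-squares problem; the HQ
    auxiliary weights w_i = exp(-r_i^2 / (2 sigma^2)) come from a Gaussian
    kernel of the residuals, so large-residual (outlier) points get
    near-zero weight in the next iteration.
    """
    n, d = X.shape
    w = np.ones(n)            # HQ auxiliary weights, start uniform
    beta = np.zeros(d)
    for _ in range(n_iter):
        # Weighted least-squares step: (X^T W X) beta = X^T W y
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X + 1e-8 * np.eye(d),
                               X.T @ W @ y)
        # HQ weight update from current residuals
        r = y - X @ beta
        w = np.exp(-r**2 / (2.0 * sigma**2))
    return beta

# Toy data: y = 2x with small noise plus one gross "mislabeled" point.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
y = 2.0 * X[:, 0] + 0.01 * rng.standard_normal(50)
y[-1] += 10.0                 # outlier at x = 1

beta = mcc_hq_regression(X, y)
print(beta)
```

On this toy data the recovered slope stays near 2 despite the outlier, whereas plain least squares would be pulled noticeably toward it, which mirrors the robustness-to-mislabeling claim in the abstract.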