Manifold regularization (MR) based semi-supervised learning can exploit structural relationships in both labeled and unlabeled data. However, the predictive performance of MR depends heavily on model selection, owing to the additional geometry regularizer built from the labeled and unlabeled data. In this paper, two continuous and two inherently discrete hyperparameters are selected as optimization variables, and a leave-one-out cross-validation (LOOCV) based Predicted REsidual Sum of Squares (PRESS) criterion is first presented for model selection of MR, so that appropriate regularization coefficients and kernel parameters can be chosen. To handle the discontinuity introduced by the two discrete hyperparameters, the minimization is carried out with an improved Nelder-Mead simplex algorithm that operates on the mixed discrete-continuous variable set. The manifold regularization and model selection algorithm are applied to six synthetic and real-life benchmark datasets. The proposed approach, which effectively exploits the embedded intrinsic geometric manifolds and the unbiased LOOCV estimate, outperforms the original MR and supervised learning approaches in the empirical study.
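To make the PRESS idea concrete, the following is a minimal, hedged sketch of a LOOCV-based PRESS criterion for a kernel ridge smoother (a supervised simplification of manifold regularization, omitting the graph-Laplacian term and the discrete hyperparameters), with the two continuous hyperparameters tuned by the Nelder-Mead simplex method via SciPy. All names and the data are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def press(log_params, X, y):
    """PRESS = sum of squared leave-one-out residuals.

    Uses the standard hat-matrix shortcut for linear smoothers,
    e_loo_i = e_i / (1 - H_ii), so no explicit n-fold refitting is needed.
    """
    lam, gamma = np.exp(log_params)  # regularization coefficient, RBF width
    sq = np.sum(X**2, axis=1)
    # RBF (Gaussian) kernel matrix
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(y)
    # Hat matrix of kernel ridge regression: H = K (K + lam I)^{-1}
    H = np.linalg.solve(K + lam * np.eye(n), K)
    resid = y - H @ y
    loo = resid / (1.0 - np.diag(H))
    return float(np.sum(loo**2))

# Illustrative synthetic data (not one of the paper's six benchmarks)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

# Minimize PRESS over (log lambda, log gamma) with Nelder-Mead.
res = minimize(press, x0=np.log([1.0, 1.0]), args=(X, y),
               method="Nelder-Mead")
best_lam, best_gamma = np.exp(res.x)
```

Optimizing in log space keeps both hyperparameters positive without constraints; the paper's improved simplex additionally rounds or snaps the two discrete hyperparameters at each simplex move, which is not reproduced here.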