A new nonlinear principal component analysis (PCA) method, hidden space principal component analysis (HSPCA), is presented in this paper. First, the data in the input space are mapped into a high-dimensional hidden space by a nonlinear function whose role is similar to that of the hidden neurons in an artificial neural network. Then feature extraction and data compression are carried out by performing PCA on the mapped data in the hidden space. Compared with linear PCA, our algorithm is essentially a nonlinear PCA method and can extract data features more effectively. Compared with the recently proposed kernel PCA method, the mapped samples are known explicitly, and the conditions imposed on the nonlinear mapping function are more relaxed: the only condition required of the kernel function in HSPCA is symmetry. Finally, experimental results on artificial and real-world data show the feasibility and validity of HSPCA.
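The procedure described in the abstract (nonlinearly map the inputs into a hidden space, then run ordinary linear PCA on the explicitly known mapped samples) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian hidden function, the choice of the training samples themselves as hidden-node centers, and the parameter names (`gamma`, `n_components`) are all assumptions for the sketch — HSPCA only requires the hidden function to be symmetric.

```python
import numpy as np

def hspca(X, centers=None, gamma=1.0, n_components=2):
    """Sketch of hidden space PCA.

    X          : (n_samples, n_features) input data.
    centers    : hidden-node centers; defaults to the training
                 samples themselves (an assumed choice).
    gamma      : width parameter of the assumed Gaussian hidden
                 function, which is symmetric as HSPCA requires.
    Returns the projected samples and the principal directions
    in the hidden space.
    """
    if centers is None:
        centers = X
    # Hidden-space representation: H[i, j] = k(x_i, c_j).
    # Unlike kernel PCA, these mapped samples are known explicitly.
    sq_dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    H = np.exp(-gamma * sq_dists)
    # Ordinary linear PCA on the mapped data: center, then take
    # the leading right singular vectors (eigenvectors of the
    # hidden-space covariance matrix).
    H_centered = H - H.mean(axis=0)
    _, _, Vt = np.linalg.svd(H_centered, full_matrices=False)
    components = Vt[:n_components]          # principal directions
    scores = H_centered @ components.T      # compressed features
    return scores, components
```

Because the mapped samples `H` are available explicitly, the eigenproblem is solved directly on the hidden-space data rather than on a Gram matrix, which is the practical distinction from kernel PCA that the abstract points out.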