With very noisy data, having plentiful samples eliminates overfitting in nonlinear regression, but not in nonlinear principal component analysis (NLPCA). To overcome this problem in NLPCA, a new information criterion (IC) is proposed for selecting the best model among multiple models with different complexity and regularization (i.e. weight penalty). This IC gauges the inconsistency I between the nonlinear principal components (u and u′) for every data point x and its nearest neighbour x′, with I = 1 − correlation(u, u′), where I tends to increase with overfitted solutions. Tests were performed using autoassociative neural networks for NLPCA on synthetic and real climate data (tropical Pacific sea surface temperatures and equatorial stratospheric winds), with the IC performing well in model selection and in deciding between an open curve or a closed curve solution.
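As a concrete illustration of the criterion, the following minimal sketch (not the authors' code; it assumes NumPy and scikit-learn, and the function name inconsistency_index is hypothetical) computes I from the nonlinear principal component (NLPC) values u produced by some already-fitted NLPCA model, pairing each data point with its nearest neighbour:

# Sketch of the inconsistency-based IC: given the NLPC value u for each
# data point (from some fitted NLPCA model, e.g. an autoassociative
# neural network, not reproduced here), compute
# I = 1 - correlation(u, u'), with u' the NLPC of the nearest neighbour.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def inconsistency_index(X, u):
    # n_neighbors=2 because each point's own nearest neighbour is itself
    nn = NearestNeighbors(n_neighbors=2).fit(X)
    _, idx = nn.kneighbors(X)
    u_prime = u[idx[:, 1]]              # NLPC u' of the nearest neighbour x'
    r = np.corrcoef(u, u_prime)[0, 1]   # correlation(u, u')
    return 1.0 - r                      # large I suggests overfitting

# Toy usage on a noisy open curve, with u standing in for the NLPC that a
# well-fitted model would recover (here the curve parameter t itself):
rng = np.random.default_rng(0)
t = rng.uniform(-1.0, 1.0, 500)
X = np.c_[t, t ** 2] + 0.05 * rng.standard_normal((500, 2))
print(inconsistency_index(X, t))        # near 0 for a consistent solution

In the model-selection setting described in the abstract, one would presumably evaluate I for each candidate NLPCA model (varying complexity and weight penalty) and favour the solution with the smallest inconsistency.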