Introduction to statistical pattern recognition (2nd ed.)
Pattern classification: a unified view of statistical and neural approaches
A Tutorial on Support Vector Machines for Pattern Recognition
Data Mining and Knowledge Discovery
Benchmarking a Reduced Multivariate Polynomial Pattern Classifier
IEEE Transactions on Pattern Analysis and Machine Intelligence
Linear Dimensionality Reduction via a Heteroscedastic Extension of LDA: The Chernoff Criterion
IEEE Transactions on Pattern Analysis and Machine Intelligence
Pattern recognition using discriminative feature extraction
IEEE Transactions on Signal Processing
Neural and statistical classifiers-taxonomy and two case studies
IEEE Transactions on Neural Networks
The polynomial neural network, also called the polynomial network classifier (PNC), is a powerful nonlinear classifier that can separate classes with complicated distributions. Expanding polynomial terms on the principal subspace of the input data has previously yielded superior performance. In this paper, we aim to further improve the subspace-feature-based PNC. Within the framework of discriminative feature extraction (DFE), we adjust the subspace parameters jointly with the network weights in supervised learning. Under a minimum squared error objective, all parameters can be updated efficiently by stochastic gradient descent. In experiments on 13 datasets from the UCI Machine Learning Repository, we show that DFE can either improve classification accuracy or reduce network complexity. On seven of the datasets, the accuracy of the PNC is competitive with support vector classifiers.