Maximum a posteriori based kernel classifier trained by linear programming
Proceedings of the 2010 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition (SSPR&SPR'10)
This paper presents a new approach to maximum a posteriori (MAP) classification: a MAP-based kernel classifier trained by linear programming (MAPLP). Unlike traditional MAP-based classifiers, MAPLP does not estimate a posterior probability directly for classification. Instead, it introduces a kernelized function into an objective function that behaves similarly to a MAP-based classifier. To evaluate MAPLP, binary classification experiments were performed on 13 datasets, and the results were compared with those of conventional MAP-based kernel classifiers as well as other state-of-the-art classification methods. MAPLP performs competitively against these methods. We argue that the proposed approach makes a significant contribution to MAP-based classification research: it widens the freedom to choose an objective function, is not constrained to the strict Bayesian sense, and can be solved by linear programming. A substantial advantage of the proposed approach is that the objective function is undemanding, having only a single parameter; this simplicity allows for further development in future research.
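The abstract does not give the MAPLP formulation itself, but the general idea of training a kernel classifier by linear programming can be illustrated with the standard 1-norm (LP) kernel machine: fit a decision function f(x) = Σ_j α_j K(x, x_j) + b by minimizing ‖α‖₁ plus a slack penalty under margin constraints, which is a linear program. The sketch below is this generic LP machine, not the authors' MAPLP objective; the RBF kernel, the parameter names, and the toy data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_lp_kernel_classifier(X, y, C=1.0, gamma=1.0):
    """Generic LP kernel machine (not the MAPLP objective):
    minimize ||alpha||_1 + C * sum(xi)
    subject to y_i * (K_i . alpha + b) >= 1 - xi_i, xi_i >= 0."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    # Nonnegative variables: [a_plus (n), a_minus (n), b_plus, b_minus, xi (n)],
    # with alpha = a_plus - a_minus and b = b_plus - b_minus.
    c = np.concatenate([np.ones(2 * n), [0.0, 0.0], C * np.ones(n)])
    yK = y[:, None] * K  # row i is y_i * K_i
    # Margin constraints rewritten as A_ub @ z <= b_ub.
    A_ub = np.hstack([-yK, yK, -y[:, None], y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    z = res.x
    alpha = z[:n] - z[n:2 * n]
    b = z[2 * n] - z[2 * n + 1]
    return alpha, b

def predict(alpha, b, X_train, X_new, gamma=1.0):
    # Sign of the kernel expansion evaluated at the new points.
    return np.sign(rbf_kernel(X_new, X_train, gamma) @ alpha + b)

# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
alpha, b = train_lp_kernel_classifier(X, y, C=1.0, gamma=0.5)
acc = (predict(alpha, b, X, X, gamma=0.5) == y).mean()
```

Because the 1-norm on α enters the objective linearly, the solver tends to return a sparse α (few kernel expansion terms), which is the usual motivation for LP-trained kernel machines over quadratic-programming SVMs.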