A General Probabilistic Formulation for Supervised Neural Classifiers
Journal of VLSI Signal Processing Systems
Partial likelihood (PL) provides a unified statistical framework for developing and studying adaptive techniques for nonlinear signal processing. In this paper, we present a general formulation for learning posterior probabilities with the PL cost for multiclass classifier design. We show that the fundamental information-theoretic relationship for learning with the PL cost, namely the equivalence of likelihood maximization and relative entropy minimization, holds in the multiclass case for the perceptron probability model with softmax normalization. Noting the inefficiency of training a softmax network, we propose an efficient multiclass equalizer structure based on binary coding of the output classes. We show that the well-formed property of the PL cost is satisfied for both the softmax network and the new multiclass classifier. Simulation results demonstrate that, although the traditional mean square error (MSE) cost uses the available information more efficiently than the PL cost in the multiclass case, the new binary-coded multiclass equalizer is much more effective in tracking abrupt changes, owing to the well-formed property of the cost it uses.
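To make the two cost formulations concrete, the sketch below (not the paper's implementation; all function names and the numpy setup are illustrative assumptions) contrasts the softmax PL cost, whose maximization is equivalent to minimizing relative entropy to the empirical class posteriors, with the binary-coded alternative, in which M classes are represented by ceil(log2 M) sigmoid outputs, each trained with a binary PL (log-loss) cost:

```python
import numpy as np

def softmax(z):
    # Shift for numerical stability before exponentiating.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def pl_cost(W, X, y):
    """Negative partial log-likelihood for a softmax perceptron.

    Minimizing this cross-entropy is equivalent (up to a constant)
    to minimizing the relative entropy between the empirical and
    modeled posteriors.
    """
    p = softmax(X @ W)                       # (N, M) posterior estimates
    return -np.mean(np.log(p[np.arange(len(y)), y]))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def binary_code(y, n_bits):
    # Encode class labels 0..M-1 as n_bits binary target bits.
    return ((y[:, None] >> np.arange(n_bits)) & 1).astype(float)

def binary_pl_cost(V, X, y, n_bits):
    """Binary PL (log-loss) cost for the binary-coded structure:
    one sigmoid output per code bit instead of one softmax output
    per class."""
    B = binary_code(y, n_bits)               # (N, n_bits) target bits
    p = sigmoid(X @ V)                       # (N, n_bits) per-bit posteriors
    eps = 1e-12                              # guard against log(0)
    return -np.mean(B * np.log(p + eps) + (1 - B) * np.log(1 - p + eps))
```

The practical appeal of the binary coding is dimensionality: a softmax network for M classes needs M output units, while the binary-coded structure needs only ceil(log2 M) sigmoid outputs, each trained with the well-formed binary PL cost.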