Many supervised learning algorithms assume that the training data set reflects the underlying statistical model of the real data. In practice, however, this stationarity assumption may be partially violated: for instance, if the cost of collecting data is class dependent, the class priors of the training set may differ from those of the test set. A robust solution to this problem is to select the classifier that minimizes the error probability under worst-case conditions, a strategy known as minimax. In this paper we propose a mechanism for training a neural network to estimate the minimax classifier, which is robust to changes in the class priors. The procedure is illustrated on a softmax-based neural network, although it can be applied to other architectures. Several experimental results show the advantages of the proposed method over other approaches.
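The minimax idea can be illustrated on a toy problem. The sketch below is not the paper's neural-network training procedure; it is a minimal NumPy illustration, under the assumption of two 1-D Gaussian classes, of the fact the abstract relies on: the decision rule depends on the assumed class priors, and the minimax prior is the one whose classifier equalizes the per-class error rates, thereby minimizing the worst-case error over any test priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (hypothetical): class 0 ~ N(0,1), class 1 ~ N(2,1).
# A plug-in Bayes classifier compares the log-likelihood ratio against
# the log prior ratio, so its decisions shift with the assumed priors.
def class_error_rates(prior1, n=50_000):
    """Monte Carlo per-class error rates of the Bayes rule that assumes P(y=1)=prior1."""
    x0 = rng.normal(0.0, 1.0, n)   # samples from class 0
    x1 = rng.normal(2.0, 1.0, n)   # samples from class 1

    def predict_one(x):
        llr = 2.0 * x - 2.0        # log N(x;2,1) - log N(x;0,1)
        return llr + np.log(prior1 / (1.0 - prior1)) > 0.0

    e0 = predict_one(x0).mean()        # P(error | class 0)
    e1 = 1.0 - predict_one(x1).mean()  # P(error | class 1)
    return e0, e1

# Grid search for the minimax prior: pick the prior whose classifier has
# the smallest worst-case (maximum) per-class error. That classifier's
# overall error is then insensitive to whatever priors appear at test time.
grid = np.linspace(0.05, 0.95, 91)
worst = [max(class_error_rates(p)) for p in grid]
p_minimax = grid[int(np.argmin(worst))]
e0, e1 = class_error_rates(p_minimax)
print(f"minimax prior ~ {p_minimax:.2f}, per-class errors {e0:.3f} / {e1:.3f}")
```

In this symmetric setup the minimax prior comes out near 0.5 and the two per-class errors are approximately equal, which is the defining property of the minimax rule; the paper's contribution is a mechanism for reaching this operating point by training a softmax-based neural network, rather than by the exhaustive search used in this toy example.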