Although conventional performance indexes such as accuracy are commonly used for classifier selection and evaluation, information-based criteria such as mutual information are becoming popular in feature and model selection. In this work, we analyze the classifier learning model under the criterion of maximizing normalized mutual information (NI), which is novel and well defined within a compact range for classifier evaluation. We derive closed-form relations between normalized mutual information and accuracy, precision, and recall in binary classification. By exploring the relations among these indexes, we reveal that NI is in fact a set of nonlinear functions, each with a concordant power-exponent form, of the individual performance indexes. The relations can also be expressed in terms of precision and recall, or of false-alarm rate and hit rate (recall).
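To make the quantities concrete, the following is a minimal sketch (not the paper's derivation) of computing NI and the conventional indexes from a binary confusion matrix. It assumes one common normalization, NI = I(T; Y) / H(T) with logarithms in base 2, where T is the true label and Y the predicted label; the paper's exact normalization may differ, and the confusion-matrix layout and example counts below are purely illustrative.

    import numpy as np

    def normalized_mutual_information(conf):
        """Mutual information I(T; Y) between true and predicted labels,
        normalized by the entropy H(T) of the true labels (one common
        convention; the paper may use a different normalization)."""
        conf = np.asarray(conf, dtype=float)
        joint = conf / conf.sum()                  # joint distribution p(t, y)
        p_t = joint.sum(axis=1, keepdims=True)     # marginal p(t), shape (2, 1)
        p_y = joint.sum(axis=0, keepdims=True)     # marginal p(y), shape (1, 2)
        nz = joint > 0                             # avoid log(0) terms
        mi = np.sum(joint[nz] * np.log2(joint[nz] / (p_t @ p_y)[nz]))
        h_t = -np.sum(p_t[p_t > 0] * np.log2(p_t[p_t > 0]))
        return mi / h_t

    # Illustrative binary confusion matrix: rows = true class, cols = predicted class
    #   [[TN, FP],
    #    [FN, TP]]
    conf = np.array([[50, 10],
                     [5, 35]])

    tn, fp = conf[0]
    fn, tp = conf[1]
    accuracy  = (tp + tn) / conf.sum()
    precision = tp / (tp + fp)
    recall    = tp / (tp + fn)        # also called hit rate
    ni        = normalized_mutual_information(conf)
    print(f"accuracy={accuracy:.3f} precision={precision:.3f} "
          f"recall={recall:.3f} NI={ni:.3f}")

With this normalization NI stays in [0, 1], which is the "compact range" property mentioned above; varying the confusion-matrix entries and plotting NI against accuracy, precision, or recall is one way to visualize the nonlinear relations the paper derives in closed form.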