This paper investigates the coding of change-points in the information-theoretic Minimum Message Length (MML) framework. How change-point regions are coded affects model selection and parameter estimation in problems such as time series segmentation and decision trees. Several authors have shown that the MML and Minimum Description Length (MDL78) approaches to change-point problems perform well. In this paper we compare some published MML and MDL78 methods and introduce two new MML approximations, 'MMLDc' and 'MMLDF'. These new approximations are empirically compared with Strict MML (SMML), Fairly Strict MML (FSMML), MML68, the Minimum Expected Kullback-Leibler Distance (MEKLD) loss function and MDL78 on a tractable binomial change-point problem.
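To make the flavour of such a comparison concrete, the following is a minimal sketch (not the paper's MMLDc/MMLDF approximations, and simpler than MDL78 proper) of a generic two-part code-length criterion for a single change-point in a binary sequence: state where the cut is, then encode each segment by stating its count of ones followed by which arrangement occurred. The function names and the example sequence are illustrative assumptions, not taken from the paper.

```python
import math

def segment_code_length(seq):
    """Two-part code length (bits) for one binary segment:
    log2(n+1) bits to state the number of ones k,
    plus log2(C(n, k)) bits to identify the arrangement."""
    n, k = len(seq), sum(seq)
    return math.log2(n + 1) + math.log2(math.comb(n, k))

def best_cutpoint(seq):
    """Choose the change-point minimizing total code length:
    log2(n-1) bits to state the cut position,
    plus the code length of each resulting segment."""
    n = len(seq)
    best_cut, best_bits = None, float("inf")
    for cut in range(1, n):
        bits = (math.log2(n - 1)
                + segment_code_length(seq[:cut])
                + segment_code_length(seq[cut:]))
        if bits < best_bits:
            best_cut, best_bits = cut, bits
    return best_cut, best_bits

# A noiseless sequence with an obvious change at position 20.
seq = [0] * 20 + [1] * 20
cut, bits = best_cutpoint(seq)  # cut == 20 here
```

On noisy sequences the minimizing cut trades off segment fit against the cost of stating the cut, which is exactly the regime where the MML and MDL78 variants compared in the paper diverge.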