The Journal of Machine Learning Research
Local asymptotic arguments imply that parameter selection via the minimum description length (MDL) principle resembles a traditional hypothesis test. A common approximation for MDL estimates the cost of adding a parameter at about (1/2)log n bits for a model fit to n observations. While accurate for parameters that are large on a standardized scale, this approximation overstates the parameter cost near zero. We find that encoding the parameter produces a shorter description length when the corresponding estimator is about two standard errors away from zero, as in a traditional statistical hypothesis test.
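To make the arithmetic behind the (1/2)log n approximation concrete, the following minimal sketch computes its implied inclusion threshold. It assumes (as an illustration, not a claim from the paper) that a standardized estimate z away from zero shortens the data code by roughly z²/2 nats, while the parameter itself costs about (1/2)·ln n nats, so the break-even point is z = √(ln n); the function name `breakeven_z` is hypothetical.

```python
import math

def breakeven_z(n: int) -> float:
    """Standardized estimate at which the (1/2)*ln(n)-nat parameter
    cost equals the assumed z^2/2-nat saving in the data code.
    Illustrative sketch only; solves z^2/2 = (1/2)*ln(n)."""
    return math.sqrt(math.log(n))

for n in (100, 1000, 10000):
    print(f"n = {n:6d}  break-even |z| ~ {breakeven_z(n):.2f}")
```

For moderate sample sizes the resulting threshold sits near two, which is consistent with the abstract's two-standard-error finding, though under the approximation it continues to grow slowly with n.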