The capacity of monotonic functions
Discrete Applied Mathematics - Special issue: Vapnik-Chervonenkis dimension
Let f be a function on ℝ^d that is monotonic in every variable. There are 2^d possible assignments of monotonicity directions (two per variable). We give sufficient conditions under which the optimal linear model obtained by least squares regression on f identifies the monotonicity directions correctly. We show that when the input dimensions are independent, the linear fit correctly identifies the directions. We give an example illustrating that in the general case, when the input dimensions are dependent, the linear fit may fail to identify the directions correctly. However, when the inputs are jointly Gaussian, as is often assumed in practice, the linear fit identifies the monotonicity directions correctly even if the input dimensions are dependent. Gaussian densities are a special case of a more general class of densities (Mahalanobis densities) for which the result holds. Our results apply whether f is a classification or a regression function. If a finite data set is sampled from the function, we show that whenever the exact linear regression would yield the correct monotonicity directions, the sample regression will also do so asymptotically, in a probabilistic sense; this holds even when the data are noisy.
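The independent-inputs case can be illustrated with a small numerical sketch: fit ordinary least squares to a nonlinear function that is monotonic in each coordinate, then read the monotonicity directions off the signs of the fitted coefficients. The specific function and sample sizes below are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# A function on R^3 that is increasing in x1, decreasing in x2,
# and increasing in x3 (an arbitrary example of such an f).
def f(X):
    return np.tanh(X[:, 0]) - X[:, 1] ** 3 + np.exp(X[:, 2])

true_directions = np.array([+1, -1, +1])

# Independent inputs (standard Gaussian), the case where the linear
# fit is guaranteed to recover the monotonicity directions.
X = rng.standard_normal((100_000, 3))
y = f(X)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Signs of the slope coefficients give the inferred directions.
fitted_directions = np.sign(coef[1:]).astype(int)
print(fitted_directions)  # → [ 1 -1  1]
```

With independent inputs, each slope coefficient is proportional to Cov(x_i, f(x)), which has the sign of f's monotonicity in x_i; at this sample size the empirical fit recovers the directions with high probability, consistent with the paper's asymptotic result.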