Motivated by causal inference problems, we propose a novel regression method that minimizes the statistical dependence between regressors and residuals. The key advantage of this approach is that it does not assume a particular noise distribution, i.e., it is non-parametric with respect to the noise. We argue that the proposed method is well suited to causal inference in additive noise models. A practical disadvantage is that the resulting optimization problem is generally non-convex and can be difficult to solve. Nevertheless, we report good results on one of the tasks of the NIPS 2008 Causality Challenge, where the goal is to distinguish causes from effects in pairs of statistically dependent variables. In addition, we propose an algorithm for efficiently inferring causal models over more than two variables from observational data. The required number of regressions and independence tests is quadratic in the number of variables, a substantial improvement over the brute-force approach of testing all possible DAGs.
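To make the idea concrete, a common dependence measure in this literature is the Hilbert-Schmidt Independence Criterion (HSIC; Gretton et al., 2005). The Python sketch below is an illustration rather than the authors' implementation: it fits a polynomial regression whose coefficients are chosen to minimize the HSIC between the regressor and the residuals. The degree-3 basis, the fixed RBF kernel bandwidth, the Nelder-Mead optimizer, and the toy data are all illustrative assumptions.

```python
# Sketch of dependence-minimizing regression (not the authors' code):
# fit a linear-in-parameters model whose residuals have minimal HSIC
# dependence on the regressor.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def rbf_gram(z, bandwidth=1.0):
    """Gaussian-kernel Gram matrix for a sample z of shape (n, 1)."""
    d2 = cdist(z, z, "sqeuclidean")
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def hsic(x, r, bandwidth=1.0):
    """Biased HSIC estimate between 1-D samples x and r."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    K = rbf_gram(x.reshape(-1, 1), bandwidth)
    L = rbf_gram(r.reshape(-1, 1), bandwidth)
    return np.trace(K @ H @ L @ H) / n ** 2

def fit_dependence_minimizing(x, y, degree=3):
    """Choose polynomial coefficients minimizing HSIC(x, residuals).
    The objective is non-convex, so we warm-start from least squares."""
    basis = np.vander(x, degree + 1)               # polynomial features
    theta0 = np.linalg.lstsq(basis, y, rcond=None)[0]
    obj = lambda theta: hsic(x, y - basis @ theta)
    res = minimize(obj, theta0, method="Nelder-Mead")
    return res.x, basis

# Toy additive-noise pair: y = x^3 + non-Gaussian (uniform) noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = x ** 3 + rng.uniform(-0.2, 0.2, 200)
theta, basis = fit_dependence_minimizing(x, y)
print("HSIC of residuals:", hsic(x, y - basis @ theta))
```

Under an additive noise model, the same objective yields the two-variable causal test described in the abstract: fit both x → y and y → x and prefer the direction whose residuals are less dependent on the putative cause. The non-convexity noted above shows up in practice as sensitivity to the optimizer's starting point, which the least-squares warm start in the sketch is meant to mitigate.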