Minimum Free Energies with "Data Temperature" for Parameter Learning of Bayesian Networks
ICTAI '08 Proceedings of the 2008 20th IEEE International Conference on Tools with Artificial Intelligence - Volume 01
Constraint-based search methods, a major approach to learning Bayesian networks, are expected to be effective in causal discovery tasks. However, such methods often suffer from the impracticality of classical hypothesis testing for conditional independence when the sample size is small. We propose a new conditional independence (CI) testing method that is effective for small samples. Our method uses the minimum free energy principle, which originates in thermodynamics, together with the recently proposed "Data Temperature" assumption, which relates probabilistic fluctuation to virtual thermal fluctuation. We define free energy using the Kullback–Leibler divergence, in a manner consistent with an information-geometric perspective. The resulting CI test incorporates the maximum entropy principle and converges to the classical hypothesis test in the asymptotic regime. A simulation study shows that our method improves the learning performance of the well-known PC algorithm in several respects.
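Since the proposed free-energy test is constructed to converge to the classical hypothesis test asymptotically, the baseline it generalizes is the standard likelihood-ratio (G²) test of conditional independence on discrete data. The sketch below is not the paper's method; it is a minimal illustration of that classical test, assuming a single discrete conditioning variable and samples given as dictionaries (the function name and data layout are illustrative choices, not from the paper). Note that G² equals 2N times the empirical conditional mutual information in nats, i.e. 2N times a Kullback–Leibler divergence, which is the quantity the paper's free-energy formulation builds on.

```python
import math
from collections import Counter

def g2_ci_test(samples, x, y, z):
    """Classical G^2 (likelihood-ratio) test statistic for X independent of Y given Z.

    samples: list of dicts mapping variable name -> discrete value.
    Returns (G2 statistic, degrees of freedom). G2 = 2N * I(X;Y|Z) in nats,
    i.e. 2N times the KL divergence between the empirical joint distribution
    and the closest distribution satisfying X _|_ Y | Z.
    """
    # Contingency counts over the observed configurations.
    n_xyz = Counter((s[x], s[y], s[z]) for s in samples)
    n_xz = Counter((s[x], s[z]) for s in samples)
    n_yz = Counter((s[y], s[z]) for s in samples)
    n_z = Counter(s[z] for s in samples)

    # G^2 = 2 * sum_cells observed * ln(observed / expected-under-CI).
    g2 = 0.0
    for (vx, vy, vz), n in n_xyz.items():
        expected = n_xz[(vx, vz)] * n_yz[(vy, vz)] / n_z[vz]
        g2 += 2.0 * n * math.log(n / expected)

    # Degrees of freedom: (|X|-1)(|Y|-1)|Z|.
    dx = len({s[x] for s in samples})
    dy = len({s[y] for s in samples})
    dof = (dx - 1) * (dy - 1) * len(n_z)
    return g2, dof
```

Under the null hypothesis G² is asymptotically chi-squared with the returned degrees of freedom (e.g., reject at the 5% level for df = 1 when G² > 3.841). The small-sample failure mode the abstract refers to appears here directly: with few samples, many contingency cells are empty or near-empty, making the chi-squared approximation unreliable.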