Continuous time Bayesian networks (CTBNs) describe structured stochastic processes with finitely many states that evolve over continuous time. A CTBN is a directed (possibly cyclic) dependency graph over a set of variables, each of which represents a finite-state, continuous-time Markov process whose transition model is a function of its parents. We address the problem of learning the parameters and structure of a CTBN from fully observed data. We define a conjugate prior for CTBNs and show how it can be used both for Bayesian parameter estimation and as the basis of a Bayesian score for structure learning. Because acyclicity is not a constraint in CTBNs, we can show that the structure learning problem is significantly easier, both in theory and in practice, than structure learning for dynamic Bayesian networks (DBNs). Furthermore, as CTBNs can tailor the parameters and dependency structure to the different time granularities of the evolution of different variables, they can provide a better fit to continuous-time processes than DBNs with a fixed time granularity.
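To make the parameter-estimation side concrete, here is a minimal sketch of Bayesian estimation for one CTBN variable from fully observed data. It assumes the standard conjugate choice for CTBNs — a Gamma prior over exit intensities and a Dirichlet prior over transition probabilities — and two sufficient-statistic arrays whose names (`T`, `M`) and the hyperparameters `alpha`, `tau` are illustrative, not taken from the paper:

```python
import numpy as np

def posterior_intensity_matrix(T, M, alpha=1.0, tau=1.0):
    """Posterior-mean conditional intensity matrices Q_{X|u} for one variable X.

    Sufficient statistics from fully observed trajectories (assumed layout):
      T[u, x]    -- total time X spent in state x while its parents were in state u
      M[u, x, y] -- number of observed x -> y transitions while parents were in u

    Hyperparameters (assumed, symmetric): exit rate q_{x|u} ~ Gamma(alpha, tau),
    transition distribution theta_{.|x,u} ~ Dirichlet(alpha, ..., alpha).
    """
    n_u, n_x = T.shape
    Q = np.zeros((n_u, n_x, n_x))
    for u in range(n_u):
        for x in range(n_x):
            # Total number of transitions out of state x under parent state u.
            m_x = M[u, x].sum() - M[u, x, x]
            # Posterior-mean exit rate: Gamma posterior (alpha + m_x, tau + T[u, x]).
            q = (alpha + m_x) / (tau + T[u, x])
            for y in range(n_x):
                if y == x:
                    continue
                # Posterior-mean probability of jumping to y given a jump out of x.
                theta = (alpha + M[u, x, y]) / ((n_x - 1) * alpha + m_x)
                Q[u, x, y] = q * theta
            # Diagonal makes each row of the intensity matrix sum to zero.
            Q[u, x, x] = -q
    return Q

# Example: one parent configuration, a binary variable X.
T = np.array([[2.0, 3.0]])                     # time in states 0 and 1
M = np.array([[[0, 4], [5, 0]]], dtype=float)  # 4 jumps 0->1, 5 jumps 1->0
Q = posterior_intensity_matrix(T, M)
```

Because the marginal likelihood built from these Gamma–Dirichlet posteriors factors over variables and their parent sets, and no acyclicity check couples the families, the structure score can be optimized independently for each variable — the source of the learning advantage over DBNs noted above.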