Foundations of Logic Programming (2nd extended ed.)
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
Relational Markov models and their application to adaptive web navigation
Proceedings of the Eighth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
The Journal of Machine Learning Research
ICML '04 Proceedings of the Twenty-First International Conference on Machine Learning
Training conditional random fields via gradient tree boosting
ICML '04 Proceedings of the Twenty-First International Conference on Machine Learning
Machine Learning
Journal of Artificial Intelligence Research
Dynamic probabilistic relational models
IJCAI'03 Proceedings of the 18th International Joint Conference on Artificial Intelligence
Top-down induction of first-order logical decision trees
Artificial Intelligence
Discriminative probabilistic models for relational data
UAI'02 Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence
Efficiently inducing features of conditional random fields
UAI'03 Proceedings of the Nineteenth Conference on Uncertainty in Artificial Intelligence
Non-parametric policy gradients: a unified treatment of propositional and relational domains
Proceedings of the 25th International Conference on Machine Learning
ALLPAD: Approximate Learning of Logic Programs with Annotated Disjunctions
Inductive Logic Programming
Multi-class Prediction Using Stochastic Logic Programs
Inductive Logic Programming
Relational Sequence Alignments and Logos
Inductive Logic Programming
Relational Transformation-based Tagging for Activity Recognition
Fundamenta Informaticae - Progress on Multi-Relational Data Mining
Relational Sequence Clustering for Aggregating Similar Agents
ISMIS '09 Proceedings of the 18th International Symposium on Foundations of Intelligent Systems
Lifted probabilistic inference with counting formulas
AAAI'08 Proceedings of the 23rd National Conference on Artificial Intelligence - Volume 2
Probabilistic inductive logic programming
Protein fold discovery using stochastic logic programs
Probabilistic inductive logic programming
Integrating knowledge capture and supervised learning through a human-computer interface
Proceedings of the Sixth International Conference on Knowledge Capture
Optimizing probabilistic models for relational sequence learning
ISMIS'11 Proceedings of the 19th International Conference on Foundations of Intelligent Systems
Enhancing activity recognition in smart homes using feature induction
DaWaK'11 Proceedings of the 13th International Conference on Data Warehousing and Knowledge Discovery
Spatial role labeling: Towards extraction of spatial relations from natural language
ACM Transactions on Speech and Language Processing (TSLP)
Imitation learning in relational domains: a functional-gradient boosting approach
IJCAI'11 Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume Two
Learning 3D geological structure from drill-rig sensors for automated mining
IJCAI'11 Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence - Volume Three
Location-based reasoning about complex multi-agent behavior
Journal of Artificial Intelligence Research
Conditional Random Fields (CRFs) provide a powerful tool for labeling sequence data. So far, however, CRFs have only been applied to sequences over flat, propositional alphabets. In this paper, we describe TildeCRF, the first method for training CRFs on logical sequences, i.e., sequences over an alphabet of logical atoms. TildeCRF's key idea is to use relational regression trees within Dietterich et al.'s gradient tree boosting approach, so that the CRF potential functions are represented as weighted sums of relational regression trees. Experiments show a significant improvement over established results achieved with hidden Markov models and Fisher kernels for logical sequences.
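The core idea from the abstract — representing CRF potential functions as a boosted sum of regression trees, trained by fitting each new tree to the functional gradient (observed minus expected pairwise label counts from forward-backward) — can be sketched as follows. This is a minimal propositional sketch, not TildeCRF itself: TildeCRF grows *relational* regression trees over logical atoms, whereas here a plain scikit-learn `DecisionTreeRegressor` over one-hot features stands in, and all class names and hyper-parameters are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def logsumexp(a):
    """Numerically stable log-sum-exp."""
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))


def pair_features(y_prev, y, x_t, n_labels):
    """Encode a (previous label, current label, observation) triple as a flat
    vector; a propositional stand-in for TildeCRF's logical atoms."""
    f = np.zeros(2 * n_labels + len(x_t))
    if y_prev >= 0:                       # y_prev == -1 marks the sequence start
        f[y_prev] = 1.0
    f[n_labels + y] = 1.0
    f[2 * n_labels:] = x_t
    return f


class BoostedCRF:
    """Chain CRF whose potential F(y_prev, y, x_t) is a weighted sum of
    regression trees, trained by functional gradient boosting (illustrative)."""

    def __init__(self, n_labels, n_iter=15, lr=0.5):
        self.n_labels, self.n_iter, self.lr = n_labels, n_iter, lr
        self.trees = []

    def _F(self, x_t):
        """Potential matrix for one observation, indexed [y_prev + 1, y]."""
        L = self.n_labels
        F = np.zeros((L + 1, L))
        for yp in range(-1, L):
            for y in range(L):
                phi = pair_features(yp, y, x_t, L).reshape(1, -1)
                F[yp + 1, y] = self.lr * sum(t.predict(phi)[0] for t in self.trees)
        return F

    def _pair_marginals(self, X):
        """Forward-backward pass yielding P(y_{t-1}=yp, y_t=y | X)."""
        T, L = len(X), self.n_labels
        Fs = [self._F(x) for x in X]
        alpha = np.zeros((T, L)); alpha[0] = Fs[0][0]
        for t in range(1, T):
            for y in range(L):
                alpha[t, y] = logsumexp(alpha[t - 1] + Fs[t][1:, y])
        beta = np.zeros((T, L))
        for t in range(T - 2, -1, -1):
            for y in range(L):
                beta[t, y] = logsumexp(Fs[t + 1][1 + y] + beta[t + 1])
        logZ = logsumexp(alpha[-1])
        pair = np.zeros((T, L + 1, L))
        pair[0, 0] = np.exp(Fs[0][0] + beta[0] - logZ)
        for t in range(1, T):
            for yp in range(L):
                pair[t, yp + 1] = np.exp(alpha[t - 1, yp] + Fs[t][1 + yp]
                                         + beta[t] - logZ)
        return pair

    def fit(self, X, y):
        for _ in range(self.n_iter):
            pair = self._pair_marginals(X)
            feats, grads = [], []
            for t in range(len(X)):
                yp_true = y[t - 1] if t > 0 else -1
                yps = [-1] if t == 0 else list(range(self.n_labels))
                for yp in yps:
                    for lab in range(self.n_labels):
                        # Functional gradient: observed count - expected count.
                        obs = float(yp == yp_true and lab == y[t])
                        feats.append(pair_features(yp, lab, X[t], self.n_labels))
                        grads.append(obs - pair[t, yp + 1, lab])
            tree = DecisionTreeRegressor(max_depth=3, random_state=0)
            tree.fit(np.array(feats), np.array(grads))
            self.trees.append(tree)       # potential grows by one tree per round
        return self

    def predict(self, X):
        """Viterbi decoding over the boosted potentials."""
        T, L = len(X), self.n_labels
        Fs = [self._F(x) for x in X]
        delta = np.zeros((T, L)); back = np.zeros((T, L), dtype=int)
        delta[0] = Fs[0][0]
        for t in range(1, T):
            for y in range(L):
                scores = delta[t - 1] + Fs[t][1:, y]
                back[t, y] = int(np.argmax(scores)); delta[t, y] = scores.max()
        path = [int(np.argmax(delta[-1]))]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]
```

Each boosting round fits one tree to the point-wise gradient of the conditional log-likelihood, so no global weight vector is ever optimized — the same property that lets TildeCRF swap in relational trees without changing the training loop.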