Hierarchical hidden conditional random fields for information extraction
LION'05 Proceedings of the 5th international conference on Learning and Intelligent Optimization
Hidden Markov Models (HMMs) are popular generative models for sequence data. Recent research has shown, however, that Conditional Random Fields (CRFs), a class of discriminative models, outperform HMMs on many tasks. We previously proposed Hierarchical Hidden Conditional Random Fields (HHCRFs), a discriminative model corresponding to hierarchical HMMs (HHMMs). Given observations, HHCRFs model the conditional probability of the states at the upper levels; the states at the lower levels are hidden and marginalized out in the model definition. In addition, we developed a parameter-learning algorithm that requires only the upper-level states in the training data. We previously applied HHCRFs to the segmentation of electroencephalographic (EEG) data for a Brain-Computer Interface and showed that HHCRFs outperform HHMMs. In this paper, we apply HHCRFs to labeling artificial data and to sports video segmentation.
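To make the marginalization concrete: in an HHCRF, the conditional probability of an upper-level state path is obtained by summing the unnormalized scores of all joint (upper, lower) paths that share that upper path, then normalizing over all upper paths. The following is a minimal brute-force sketch of this computation on a toy two-level model; the random potentials, state counts, and sequence length are illustrative assumptions, not the authors' parameterization (a real implementation would use a dynamic-programming forward pass rather than path enumeration).

```python
import itertools
import math

import numpy as np

rng = np.random.default_rng(0)

T = 4          # sequence length (toy choice)
n_upper = 2    # upper-level states (observed during training)
n_lower = 3    # lower-level states (hidden, marginalized out)

# Random scores standing in for learned feature potentials:
# emit[t, z, s]: score of (upper state z, lower state s) at time t
# trans[z, s, z2, s2]: score of a transition between joint states
emit = rng.normal(size=(T, n_upper, n_lower))
trans = rng.normal(size=(n_upper, n_lower, n_upper, n_lower))

def joint_score(z_seq, s_seq):
    """Unnormalized log-score of one joint (upper, lower) state path."""
    score = sum(emit[t, z_seq[t], s_seq[t]] for t in range(T))
    score += sum(trans[z_seq[t], s_seq[t], z_seq[t + 1], s_seq[t + 1]]
                 for t in range(T - 1))
    return score

def log_marginal(z_seq):
    """Log-sum over all hidden lower-level paths for a fixed upper path."""
    return math.log(sum(
        math.exp(joint_score(z_seq, s_seq))
        for s_seq in itertools.product(range(n_lower), repeat=T)))

# Conditional probability of each upper-level path: marginal / partition.
all_upper = list(itertools.product(range(n_upper), repeat=T))
log_Z = math.log(sum(math.exp(log_marginal(z)) for z in all_upper))
p = {z: math.exp(log_marginal(z) - log_Z) for z in all_upper}
```

Because the lower-level paths never appear in `p`, training needs labels only for the upper-level states, which is exactly the property the learning algorithm above exploits.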