A Bayesian/Information Theoretic Model of Learning to Learn via Multiple Task Sampling
Machine Learning - Special issue on inductive transfer
Learning to learn: introduction and overview
Learning to learn
On Bias, Variance, 0/1-Loss, and the Curse-of-Dimensionality
Data Mining and Knowledge Discovery
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
Learning to Decode Cognitive States from Brain Images
Machine Learning
ICML '06 Proceedings of the 23rd international conference on Machine learning
Exploiting parameter domain knowledge for learning in Bayesian networks
Bayesian Network Learning with Parameter Constraints
The Journal of Machine Learning Research
Exploiting temporal information in functional magnetic resonance imaging brain data
MICCAI'05 Proceedings of the 8th international conference on Medical Image Computing and Computer-Assisted Intervention - Volume Part I
A supervised clustering approach for fMRI-based inference of brain states
Pattern Recognition
Hybrid random subsample classifier ensemble for high dimensional data sets
International Journal of Hybrid Intelligent Systems
Modern classification techniques perform well when the number of training examples exceeds the number of features. If, however, the number of features greatly exceeds the number of training examples, then these same techniques can fail. To address this problem, we present a hierarchical Bayesian framework that shares information between features by modeling similarities between their parameters. We believe this approach is applicable to many sparse, high-dimensional problems, and it is especially relevant to those with both spatial and temporal components. One such problem is classifying fMRI time series, and we present a case study showing that we can classify successfully in this domain with 80,000 original features and only 2 training examples per class.
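To illustrate the general idea of sharing information between features through their parameters, here is a minimal sketch, not the paper's actual model: a naive-Bayes-style classifier whose per-feature class means are shrunk toward a shared hyperprior mean. All names (`fit_hierarchical_gnb`, `predict`), the Gaussian likelihood, and the shrinkage weight `tau2` are illustrative assumptions, chosen only to show how pooling across features can stabilize estimates when examples are scarce.

```python
import numpy as np

def fit_hierarchical_gnb(X, y, tau2=1.0):
    """Fit per-class, per-feature Gaussian means, shrunk toward a mean
    shared across all features (a toy stand-in for hierarchical sharing
    of parameters between features). tau2 is the assumed prior variance."""
    params = {}
    sigma2 = X.var() + 1e-6                  # pooled noise variance estimate
    for c in np.unique(y):
        Xc = X[y == c]
        n_c = len(Xc)
        raw_means = Xc.mean(axis=0)          # per-feature maximum-likelihood means
        shared_mean = raw_means.mean()       # hyperprior center shared by all features
        # Posterior mean of each feature's mean under a N(shared_mean, tau2) prior:
        shrink = (n_c / sigma2) / (n_c / sigma2 + 1.0 / tau2)
        params[c] = (shrink * raw_means + (1.0 - shrink) * shared_mean, sigma2)
    return params

def predict(params, X):
    """Assign each row of X to the class with the highest Gaussian
    log-likelihood, summing over features (naive independence)."""
    labels = sorted(params)
    scores = []
    for c in labels:
        mu, sigma2 = params[c]
        scores.append(-0.5 * np.sum((X - mu) ** 2, axis=1) / sigma2)
    return np.array(labels)[np.argmax(np.stack(scores), axis=0)]
```

Even with only 2 examples per class, summing weak per-feature evidence over many features can separate the classes, which is the regime the abstract describes; the shrinkage keeps the noisy per-feature estimates from dominating.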