Random forest-based manifold learning for classification of imaging data in dementia

  • Authors:
  • Katherine R. Gray; Paul Aljabar; Rolf A. Heckemann; Alexander Hammers; Daniel Rueckert

  • Affiliations:
  • Department of Computing, Imperial College London, United Kingdom (K. R. Gray, P. Aljabar, D. Rueckert); Fondation Neurodis, CERMEP-Imagerie du Vivant, Lyon, France and Faculty of Medicine, Imperial College London, United Kingdom (R. A. Heckemann, A. Hammers)

  • Venue:
  • MLMI'11: Proceedings of the Second International Conference on Machine Learning in Medical Imaging
  • Year:
  • 2011

Abstract

Neurodegenerative disorders are characterized by changes in multiple biomarkers, which may provide complementary information for diagnosis and prognosis. We present a framework in which proximities derived from random forests are used to learn a low-dimensional manifold from labelled training data and then to infer the clinical labels of test data mapped to this space. The proposed method facilitates the combination of embeddings from multiple datasets, yielding a joint embedding that simultaneously encodes information about all the available features. Different types of data can be combined without additional processing, and we demonstrate this key feature by application to voxel-based FDG-PET and region-based MR imaging data from the ADNI study. Classification based on the joint embedding coordinates outperforms classification based on either modality alone. Results compare favourably with those of other state-of-the-art machine learning techniques applied to multi-modality imaging data.
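
The abstract outlines a three-step pipeline: derive pairwise proximities from a random forest trained on labelled data, embed the samples in a low-dimensional space using those proximities, and classify test samples from their embedding coordinates. The sketch below is a minimal illustration of that idea only, not the authors' implementation: it uses synthetic data in place of the ADNI features, a single modality rather than the joint FDG-PET/MR embedding, scikit-learn's MDS as one possible proximity-based embedding, and a k-nearest-neighbour classifier in the embedding space.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.manifold import MDS
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for one imaging modality (the paper uses ADNI FDG-PET / MR features).
X, y = make_classification(n_samples=200, n_features=50, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
n_train = X_train.shape[0]

# 1. Train a random forest on the labelled training data.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

# 2. Random-forest proximity: the fraction of trees in which two samples
#    end up in the same terminal (leaf) node.
leaves = rf.apply(np.vstack([X_train, X_test]))            # (n_samples, n_trees) leaf indices
prox = (leaves[:, None, :] == leaves[None, :, :]).mean(-1)  # (n_samples, n_samples) proximities

# 3. Embed samples in a low-dimensional manifold from the proximity-derived
#    dissimilarities (MDS is one illustrative choice of embedding).
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(1.0 - prox)

# 4. Infer the labels of test samples from their embedding coordinates.
knn = KNeighborsClassifier(n_neighbors=5).fit(coords[:n_train], y_train)
print("test accuracy:", knn.score(coords[n_train:], y_test))
```

For simplicity the sketch embeds training and test samples together; the framework described in the abstract instead learns the manifold from labelled training data, maps test data into that space, and combines per-modality embeddings into a joint embedding before classification.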