Bayesian Reconstruction of Perceptual Experiences from Human Brain Activity

  • Authors:
  • Jack Gallant, Thomas Naselaris, Ryan Prenger, Kendrick Kay, Dustin Stansbury, Michael Oliver, An Vu, Shinji Nishimoto

  • Affiliations:
  • Program in Neuroscience and Departments of Psychology, Physics, Vision Science, and Bioengineering, University of California at Berkeley, Berkeley, CA 94720, USA

  • Venue:
  • FAC '09 Proceedings of the 5th International Conference on Foundations of Augmented Cognition. Neuroergonomics and Operational Neuroscience: Held as Part of HCI International 2009
  • Year:
  • 2009


Abstract

A method for decoding the subjective contents of perceptual systems in the human brain would have broad practical utility for communication and as a brain-machine interface. Previous approaches to this problem in vision have used linear classifiers to solve specific problems, but these approaches were not general enough to solve complex problems such as reconstructing subjective perceptual states. We have developed a new approach to these problems based on quantitative encoding models that explicitly describe how visual stimuli are (nonlinearly) transformed into brain activity. We then invert these encoding models in order to decode activity evoked by novel images or movies, providing reconstructions with unprecedented fidelity. Here we briefly review these results and the potential uses of perceptual decoding devices.
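The decoding approach the abstract describes, fitting an encoding model that nonlinearly maps stimuli to brain activity, then inverting it with Bayes' rule to infer the stimulus from measured activity, can be sketched in miniature. The toy nonlinearity, dimensions, noise model, and uniform prior over a candidate-image set below are illustrative assumptions, not the paper's actual Gabor-wavelet models or natural-image priors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear "encoding model": each voxel's response is a squared linear
# filter output. (Illustrative assumption; the paper's models are far richer.)
n_voxels, n_pixels = 20, 16
W = rng.normal(size=(n_voxels, n_pixels))

def encode(stimulus):
    """Predicted voxel responses for a stimulus (nonlinear encoding)."""
    return (W @ stimulus) ** 2

# Candidate stimuli; a uniform prior over this set stands in for an image prior.
candidates = [rng.normal(size=n_pixels) for _ in range(100)]

# Simulate activity evoked by one true image, with additive Gaussian noise.
true_index = 42
noise_sigma = 0.5
measured = encode(candidates[true_index]) + rng.normal(scale=noise_sigma,
                                                       size=n_voxels)

# Bayesian decoding: posterior over candidates is proportional to the Gaussian
# likelihood of the measured activity times the (uniform) prior, so the MAP
# estimate is the candidate whose predicted responses best match the data.
def log_likelihood(stimulus):
    resid = measured - encode(stimulus)
    return -0.5 * np.sum(resid ** 2) / noise_sigma ** 2

log_post = np.array([log_likelihood(s) for s in candidates])
decoded_index = int(np.argmax(log_post))
print(decoded_index)  # at this noise level the decoder should recover 42
```

The key point the sketch illustrates is that decoding reuses the forward encoding model: no separate classifier is trained, so the same fitted model generalizes to novel images outside any fixed label set.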