Nonlinear V1 responses to natural scenes revealed by neural network analysis

  • Authors:
  • Ryan Prenger, Michael C.-K. Wu, Stephen V. David, Jack L. Gallant

  • Affiliations:
  • Department of Physics, University of California Berkeley, Berkeley, CA; Biophysics Graduate Group, University of California Berkeley, Berkeley, CA; Department of Bioengineering, University of California Berkeley, Berkeley, CA; Helen Wills Neuroscience Institute, University of California Berkeley, Berkeley, CA; and Department of Psychology, University of California Berkeley, 3210 Tolman Hall #1650, Berkeley, CA

  • Venue:
  • Neural Networks - 2004 special issue: Vision and Brain
  • Year:
  • 2004


Abstract

A key goal in the study of visual processing is to obtain a comprehensive description of the relationship between visual stimuli and neuronal responses. One way to guide the search for models is to use a general nonparametric regression algorithm, such as a neural network. We have developed a multilayer feed-forward network algorithm that can be used to characterize nonlinear stimulus-response mapping functions of neurons in primary visual cortex (area V1) using natural image stimuli. The network can extract several known V1 response properties, including orientation and spatial frequency tuning, the spatial phase invariance of complex cells, and direction selectivity. We present details of a method for training networks and visualizing their properties. We also compare how well conventional explicit models and models developed using neural networks predict novel responses to natural scenes.
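The core idea in the abstract - fitting a multilayer feed-forward network by gradient descent so that it maps image patches to a neuron's firing rate - can be sketched in a few lines of NumPy. This is not the authors' implementation; the network size, learning rate, and the synthetic "complex cell" (a phase-invariant energy model used here as a stand-in for recorded V1 data) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 16x16 image patches and a simulated complex-cell response
# (phase-invariant energy of a quadrature filter pair) standing in for
# recorded V1 spike rates. Purely illustrative, not the paper's dataset.
n_samples, patch = 500, 16
X = rng.standard_normal((n_samples, patch * patch))
w_even = rng.standard_normal(patch * patch)
w_odd = rng.standard_normal(patch * patch)
y = (X @ w_even) ** 2 + (X @ w_odd) ** 2   # energy-model response
y = (y - y.mean()) / y.std()               # normalize for stable training

# One hidden layer of tanh units and a linear output (regression).
n_hidden = 8
W1 = rng.standard_normal((patch * patch, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal(n_hidden) * 0.1
b2 = 0.0

def predict(X):
    return np.tanh(X @ W1 + b1) @ W2 + b2

mse0 = np.mean((predict(X) - y) ** 2)      # loss before training

lr = 0.01
for epoch in range(200):
    H = np.tanh(X @ W1 + b1)               # hidden-layer activations
    err = (H @ W2 + b2) - y                # prediction error
    # Backpropagate mean-squared-error gradients through both layers.
    gW2 = H.T @ err / n_samples
    gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H ** 2)  # tanh derivative
    gW1 = X.T @ dH / n_samples
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean((predict(X) - y) ** 2)       # loss after training
```

After training, the hidden-unit weight vectors `W1[:, i]` can be reshaped to 16x16 and inspected as images, which is the spirit of the visualization method the abstract mentions: structure in those weights reveals the tuning properties the network has extracted.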