Electron tomography and multiscale biology

  • Authors:
  • Albert F. Lawrence; Sébastien Phan; Mark Ellisman

  • Affiliations:
  • National Center for Microscopy and Imaging Research, University of California, San Diego, California (all authors)

  • Venue:
  • TAMC'12: Proceedings of the 9th Annual International Conference on Theory and Applications of Models of Computation
  • Year:
  • 2012

Abstract

Electron tomography (ET) is an emerging technology for the three-dimensional imaging of cellular ultrastructure. In combination with other techniques, it can provide three-dimensional reconstructions of protein assemblies, correlate 3D structures with functional investigations at the light-microscope level, and provide structural information that extends the findings of genomics and molecular biology. Realistic physical detail is essential for the task of modeling over many spatial scales. While the resolution of the electron microscope can be as fine as a fraction of a nanometer, a typical 3D reconstruction may cover only about 1/10^15 of the volume of an optical-microscope reconstruction. To bridge the gap between these two approaches, the spatial range available to an ET reconstruction has been expanded by various techniques. Large sensor arrays and wide-field camera assemblies have increased the field dimensions by a factor of ten over the past decade, and new techniques for serial tomography and montaging make it possible to assemble many three-dimensional reconstructions. The number of tomographic volumes required to cover an average cell down to the level of protein assemblies is of the order of 10^4, and given the imaging and algorithmic requirements, the computational problem lies well within the exascale range. Tomographic reconstruction can be parallelized to a very high degree, and the associated algorithms can be mapped onto the simplified processors comprising, for example, a graphics processing unit (GPU). Programming the reconstruction on a GPU board yields a large speedup, but we expect that many more orders of magnitude of improvement in computational capability will be required in the coming decade. Exascale computing will raise a new set of problems associated with component energy requirements (cost per operation and cost of data transfer) and heat dissipation. As energy per operation is driven down, reliability decreases, which in turn raises difficult problems in the validation of computer models (is the algorithmic approach faithful to physical reality?) and the verification of codes (is the computation reliably correct and replicable?). Leaving aside the hardware issues, many of these problems will require new mathematical and algorithmic approaches, including, potentially, a re-evaluation of the Turing model of computation.
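
The scale estimates quoted in the abstract (roughly 10^4 tomographic volumes per cell and a total operation count in the exascale range) can be checked with a back-of-envelope calculation. The Python sketch below uses assumed field sizes, tilt counts, iteration counts and per-voxel costs that are purely illustrative and are not figures from the paper; it only shows how the orders of magnitude arise.

    # Back-of-envelope estimates behind the abstract's scale claims. All field
    # sizes, tilt counts and per-voxel costs below are illustrative assumptions,
    # not figures taken from the paper.

    cell_edge_um = 20.0                    # assumed cell diameter, ~20 micrometers
    et_field_um  = (2.0, 2.0, 0.25)        # assumed extent of one ET reconstruction

    cell_volume = cell_edge_um ** 3
    et_volume   = et_field_um[0] * et_field_um[1] * et_field_um[2]
    n_volumes   = cell_volume / et_volume
    print(f"ET volumes to tile one cell : ~{n_volumes:.0e}")    # order 10^4

    voxels_per_volume = 8192 * 8192 * 512  # assumed large-format reconstruction
    n_tilts           = 240                # assumed tilt-series length
    ops_per_update    = 10                 # assumed cost of one voxel/tilt update
    n_iterations      = 10                 # assumed iterative refinement passes

    total_ops = n_volumes * voxels_per_volume * n_tilts * ops_per_update * n_iterations
    print(f"operations for a whole cell : ~{total_ops:.0e}")    # ~10^18-10^19, exascale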
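
The claim that tomographic reconstruction parallelizes to a very high degree follows from the structure of backprojection: every voxel accumulates contributions independently of every other voxel. The minimal NumPy sketch below of unfiltered backprojection for a single-axis tilt series is illustrative only; the function name, array shapes and nearest-neighbour interpolation are assumptions, not the reconstruction method used at NCMIR. On a GPU, the per-voxel accumulation maps directly onto one thread per voxel.

    import numpy as np

    def backproject(projections, angles_deg, nx, nz):
        """Unfiltered backprojection of a single-axis tilt series.

        projections : (n_tilts, nu, ny) array -- one 2-D projection per tilt angle
        angles_deg  : tilt angles in degrees (rotation about the y axis)
        Returns an (nx, ny, nz) volume. Each voxel is accumulated independently
        of all others, so the computation is embarrassingly parallel.
        """
        n_tilts, nu, ny = projections.shape
        vol = np.zeros((nx, ny, nz), dtype=np.float32)
        xs = np.arange(nx) - nx / 2.0
        zs = np.arange(nz) - nz / 2.0
        X, Z = np.meshgrid(xs, zs, indexing="ij")            # (nx, nz)
        for proj, theta in zip(projections, np.deg2rad(angles_deg)):
            # Detector coordinate hit by each (x, z) voxel column at this tilt.
            u = X * np.cos(theta) + Z * np.sin(theta) + nu / 2.0
            ui = np.clip(np.rint(u).astype(int), 0, nu - 1)  # nearest neighbour
            vol += proj[ui].transpose(0, 2, 1)               # (nx, nz, ny) -> (nx, ny, nz)
        return vol

    # Tiny usage example with synthetic data.
    tilts = np.linspace(-60, 60, 61)
    series = np.random.rand(len(tilts), 64, 64).astype(np.float32)
    volume = backproject(series, tilts, nx=64, nz=32)
    print(volume.shape)                                      # (64, 64, 32)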