Enabling Very-Large Scale Earthquake Simulations on Parallel Machines

  • Authors:
  • Yifeng Cui, Reagan Moore, Kim Olsen, Amit Chourasia, Philip Maechling, Bernard Minster, Steven Day, Yuanfang Hu, Jing Zhu, Amitava Majumdar, Thomas Jordan

  • Affiliations:
  • San Diego Supercomputer Center, 9500 Gilman Drive, La Jolla, CA 92093-0505, USA
  • San Diego State University, 5500 Campanile Drive, San Diego, CA 92182, USA
  • University of Southern California, Southern California Earthquake Center, Los Angeles, CA 90089, USA
  • Scripps Institution of Oceanography, 9500 Gilman Drive, La Jolla, CA 92024, USA

  • Venue:
  • ICCS '07 Proceedings of the 7th international conference on Computational Science, Part I: ICCS 2007
  • Year:
  • 2007


Abstract

The Southern California Earthquake Center initiated a major large-scale earthquake simulation called TeraShake. The simulations propagated seismic waves across a domain of 600×300×80 km at 200-meter resolution, making them some of the largest and most detailed earthquake simulations of the southern San Andreas fault. The output from a single simulation may be as large as 47 terabytes of data spread across 400,000 files. Executing these large simulations requires high levels of expertise and resource coordination. We describe how we performed single-processor optimization of the application, optimization of the I/O handling, and optimization of execution initialization. We also examine the challenges presented by run-time data archive management and visualization. The improvements made to the application as it was recently scaled up to 40k BlueGene processors have created a community code that the wider SCEC community can use to perform large-scale earthquake simulations.
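The scale of these runs follows directly from the domain and resolution figures in the abstract. A minimal back-of-the-envelope sketch (not from the paper; the variable names and layout are illustrative) shows how a 600×300×80 km domain at 200-meter spacing yields the mesh size that drives the multi-terabyte output volumes described above:

```python
# Back-of-the-envelope mesh-size check for the TeraShake domain
# described in the abstract: 600 x 300 x 80 km at 200 m resolution.
# Variable names are illustrative, not from the TeraShake code.

domain_km = (600.0, 300.0, 80.0)  # x, y, z extent of the simulation domain
spacing_km = 0.2                  # 200-meter uniform grid spacing

# Grid points along each axis = extent / spacing.
dims = tuple(int(round(d / spacing_km)) for d in domain_km)
total_points = dims[0] * dims[1] * dims[2]

print(dims)          # (3000, 1500, 400)
print(total_points)  # 1800000000 -- 1.8 billion grid points
```

At 1.8 billion grid points, even a single-precision snapshot of one field variable occupies several gigabytes, which is consistent with a full run producing tens of terabytes across hundreds of thousands of files.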