Performance prediction of large-scale parallel system and application using macro-level simulation

  • Authors:
  • Ryutaro Susukita;Hisashige Ando;Mutsumi Aoyagi;Hiroaki Honda;Yuichi Inadomi;Koji Inoue;Shigeru Ishizuki;Yasunori Kimura;Hidemi Komatsu;Motoyoshi Kurokawa;Kazuaki J. Murakami;Hidetomo Shibamura;Shuji Yamamura;Yunqing Yu

  • Affiliations:
  • Information Technologies & Nanotechnologies, Fukuoka, Japan;Fujitsu, Tokyo, Japan;Kyushu University, Fukuoka, Japan;Information Technologies & Nanotechnologies, Fukuoka, Japan;Information Technologies & Nanotechnologies, Fukuoka, Japan;Kyushu University, Fukuoka, Japan;Fujitsu, Tokyo, Japan;Fujitsu, Tokyo, Japan;Fujitsu, Tokyo, Japan;RIKEN (The Institute of Physical & Chemical Research), Wako, Japan;Kyushu University, Fukuoka, Japan;Information Technologies & Nanotechnologies, Fukuoka, Japan;Fujitsu, Tokyo, Japan;Kyushu University, Fukuoka, Japan

  • Venue:
  • Proceedings of the 2008 ACM/IEEE conference on Supercomputing
  • Year:
  • 2008

Abstract

Predicting application performance on an HPC system is an important technology for designing the computing system and developing applications. However, accurate prediction is challenging, particularly for a future system with higher performance. In this paper, we present a new method for predicting application performance on HPC systems. The method combines modeling of sequential performance on a single processor with macro-level simulation of the application for parallel performance on the entire system. In the simulation, the execution flow is traced but kernel computations are omitted to reduce the execution time. Validation on a real terascale system showed that the predicted and measured performance agreed within 10% to 20%. We employed the method in designing a hypothetical petascale system of 32,768 SIMD-extended processor cores. Predicting application performance on the petascale system required several hours of macro-level simulation.
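To illustrate the general idea described in the abstract, the following is a minimal sketch of a macro-level simulation in Python. It is not the authors' implementation: all function names, the cost model coefficients, and the communication pattern are assumptions chosen only to show how execution flow can be traced while kernel computations are replaced by modeled time advances.

```python
# Minimal sketch of macro-level simulation (illustrative only; names and
# coefficients are hypothetical, not taken from the paper).
# Idea: trace the application's execution flow, but instead of executing each
# compute kernel, advance a per-process virtual clock by a modeled kernel time,
# and charge communication with a simple latency/bandwidth cost.

LATENCY = 5e-6      # assumed network latency (s)
BANDWIDTH = 1e9     # assumed network bandwidth (bytes/s)

def kernel_model(n):
    """Sequential-performance model for one kernel: predicted single-core
    time as a function of problem size n (coefficient is made up)."""
    return 2.0e-9 * n

def comm_model(nbytes):
    """Point-to-point communication cost: latency + size / bandwidth."""
    return LATENCY + nbytes / BANDWIDTH

def simulate(num_procs, n_per_proc, num_steps):
    """Macro-level simulation: the loop structure (execution flow) is
    followed, but kernels are skipped and replaced by modeled times."""
    clock = [0.0] * num_procs            # per-process virtual time
    for _ in range(num_steps):
        # Compute phase: advance each clock by the modeled kernel time
        for p in range(num_procs):
            clock[p] += kernel_model(n_per_proc)
        # Simple neighbor-exchange phase: both partners wait for the later
        # one, then pay the modeled communication cost
        for p in range(num_procs):
            q = (p + 1) % num_procs
            t = max(clock[p], clock[q]) + comm_model(8 * n_per_proc)
            clock[p] = clock[q] = t
    return max(clock)                    # predicted parallel execution time

if __name__ == "__main__":
    predicted = simulate(num_procs=1024, n_per_proc=1_000_000, num_steps=100)
    print("predicted time: %.3f s" % predicted)
```

Because the expensive kernels are never actually executed, a sketch like this runs in seconds even for thousands of simulated processes, which is the property that makes macro-level simulation of a petascale configuration feasible in hours rather than days.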