Towards a scalable fully-implicit fully-coupled resistive MHD formulation with stabilized FE methods

  • Authors:
  • J. N. Shadid (Sandia National Laboratories, Albuquerque, NM 87185, USA)
  • R. P. Pawlowski (Sandia National Laboratories, Albuquerque, NM 87185, USA)
  • J. W. Banks (Lawrence Livermore National Laboratory, Livermore, CA 94551, USA)
  • L. Chacón (Oak Ridge National Laboratory, Oak Ridge, TN 37831, USA)
  • P. T. Lin (Sandia National Laboratories, Albuquerque, NM 87185, USA)
  • R. S. Tuminaro (Sandia National Laboratories, Albuquerque, NM 87185, USA)

  • Venue:
  • Journal of Computational Physics
  • Year:
  • 2010


Abstract

This paper explores the development of a scalable, nonlinear, fully-implicit stabilized unstructured finite element (FE) capability for 2D incompressible (reduced) resistive MHD. The discussion considers the implementation of a stabilized FE formulation in the context of a fully-implicit time integration and direct-to-steady-state solution capability. The nonlinear solver strategy employs Newton-Krylov methods, which are preconditioned using fully-coupled algebraic multilevel preconditioners. These preconditioners are shown to enable a robust, scalable, and efficient solution approach for the large-scale sparse linear systems generated by the Newton linearization. Verification results demonstrate the expected order-of-accuracy for the stabilized FE discretization. The approach is tested on a variety of prototype problems, including both low-Lundquist-number examples (e.g., an MHD Faraday conduction pump and a hydromagnetic Rayleigh-Bénard linear stability calculation) and a moderately-high-Lundquist-number example (the magnetic island coalescence problem). Initial results that explore the scaling of the solution methods are presented on up to 4096 processors for problems with up to 64M unknowns on a Cray XT3/4. Additionally, a large-scale proof-of-capability calculation with 1 billion unknowns for the MHD Faraday pump problem on 24,000 cores is presented.