A big data urban growth simulation at a national scale: Configuring the GIS and neural network based Land Transformation Model to run in a High Performance Computing (HPC) environment

  • Authors:
  • Bryan C. Pijanowski;Amin Tayyebi;Jarrod Doucette;Burak K. Pekin;David Braun;James Plourde

  • Affiliations:
  • Department of Forestry and Natural Resources, Purdue University, 195 Marsteller Street, West Lafayette, IN 47907, USA
  • Department of Forestry and Natural Resources, Purdue University, 195 Marsteller Street, West Lafayette, IN 47907, USA and Department of Entomology, University of Wisconsin, Madison, WI 53706, USA
  • Department of Forestry and Natural Resources, Purdue University, 195 Marsteller Street, West Lafayette, IN 47907, USA
  • Department of Forestry and Natural Resources, Purdue University, 195 Marsteller Street, West Lafayette, IN 47907, USA and Institute for Conservation Research, San Diego Zoo Global, 15600 San Pasqu ...
  • Rosen Center for Advanced Computing, Information Technology Division, Purdue University, West Lafayette, IN 47907, USA and Thavron Solutions, Kokomo, IN 46906, USA
  • Department of Forestry and Natural Resources, Purdue University, 195 Marsteller Street, West Lafayette, IN 47907, USA and Worldwide Construction and Forestry Division, John Deere, 1515 5th Avenue, M ...

  • Venue:
  • Environmental Modelling & Software
  • Year:
  • 2014


Abstract

The Land Transformation Model (LTM) is a Land Use Land Cover Change (LUCC) model originally developed to simulate local-scale LUCC patterns. The model uses a commercial Windows-based GIS program to process and manage spatial data and an artificial neural network (ANN) program, wrapped in a series of batch routines, to learn spatial patterns in the data. In this paper, we present a redesigned LTM capable of running at continental scales and at a fine (30 m) resolution using a new architecture built on a Windows-based High Performance Computing (HPC) cluster. We discuss this architecture in the context of LUCC modeling that requires: (1) using an HPC cluster to run a modified version of the LTM; (2) managing datasets that are large in both file size and file count; (3) integrating tools executed with different scripting languages; and (4) coordinating a large number of steps, which necessitates several aspects of job management.
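The workflow pattern the abstract describes, partitioning a national-scale dataset into subregions and generating one scheduler job per subregion, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the tile naming scheme, grid dimensions, and the `run_ltm_ann.bat` executable name are all hypothetical placeholders.

```python
# Hypothetical sketch: split a national raster into a grid of tiles and
# render one batch job script per tile, so an HPC scheduler can run the
# ANN prediction step for each subregion in parallel.

def tile_ids(n_rows, n_cols):
    """Enumerate tile identifiers for an n_rows x n_cols grid of subregions."""
    return [f"tile_r{r}_c{c}" for r in range(n_rows) for c in range(n_cols)]

def job_script(tile, exe="run_ltm_ann.bat"):
    """Render a minimal Windows batch script for one tile (exe is a placeholder)."""
    return "\n".join([
        f"rem Job for {tile}",
        f"{exe} --input {tile}.tif --output {tile}_pred.tif",
    ])

# A 2 x 3 grid of subregions yields six independent job scripts.
jobs = {t: job_script(t) for t in tile_ids(2, 3)}
print(len(jobs))  # 6
```

In practice each generated script would be submitted to the cluster's queue, which addresses requirement (4): tracking, ordering, and resubmitting many per-tile jobs is the job-management burden the paper's architecture takes on.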