Cloud MapReduce for particle filter-based data assimilation for wildfire spread simulation

  • Authors:
  • Fan Bai; Xiaolin Hu

  • Affiliations:
  • Georgia State University; Georgia State University

  • Venue:
  • Proceedings of the High Performance Computing Symposium
  • Year:
  • 2013


Abstract

MapReduce is a domain-independent programming model for processing data in a highly parallel fashion. With MapReduce, parallel computing can be carried out automatically on large clusters of commodity machines. This paper presents a method that utilizes the parallel and distributed processing capability of Hadoop MapReduce for particle filter-based data assimilation in wildfire spread simulation. We parallelize the sampling and weight computation steps of the particle filtering algorithm based on the MapReduce programming model. Experimental results show that our approach significantly improves the performance of particle filter-based data assimilation.
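The parallelization described in the abstract can be illustrated with a minimal sketch: the sampling and weight-computation steps are per-particle and independent, so they fit a map phase, while weight normalization and resampling aggregate across all particles, which fits a reduce phase. The sketch below is an assumption-laden toy, not the paper's implementation: it replaces the wildfire spread simulator with a one-dimensional state model and runs both phases as plain Python functions rather than Hadoop tasks.

```python
import math
import random

def advance(state):
    """Sampling step (toy stand-in for the wildfire spread model):
    propagate one particle forward with process noise."""
    return state + 1.0 + random.gauss(0.0, 0.5)

def likelihood(state, observation):
    """Weight step: Gaussian likelihood of the observation given the state.
    The noise scale 1.0 is an arbitrary illustrative choice."""
    return math.exp(-0.5 * (observation - state) ** 2)

def map_phase(particles, observation):
    """Map: each particle is sampled and weighted independently,
    which is the part that parallelizes across map tasks."""
    weighted = []
    for s in particles:
        s_new = advance(s)
        weighted.append((s_new, likelihood(s_new, observation)))
    return weighted

def reduce_phase(weighted):
    """Reduce: normalize the weights over all particles, then
    resample (systematic resampling) to produce the next particle set."""
    total = sum(w for _, w in weighted)
    probs = [w / total for _, w in weighted]
    n = len(weighted)
    step = 1.0 / n
    u = random.uniform(0.0, step)
    cum, i, resampled = probs[0], 0, []
    for _ in range(n):
        while u > cum and i < n - 1:
            i += 1
            cum += probs[i]
        resampled.append(weighted[i][0])
        u += step
    return resampled

if __name__ == "__main__":
    random.seed(0)
    particles = [0.0] * 100          # initial particle set
    observation = 1.2                # one (synthetic) observation
    particles = reduce_phase(map_phase(particles, observation))
    print(len(particles))
```

One assimilation cycle is thus one MapReduce job; repeated cycles chain jobs, with the reduce output of one cycle feeding the map input of the next.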