A Hadoop-based packet trace processing tool

  • Authors:
  • Yeonhee Lee; Wonchul Kang; Youngseok Lee

  • Affiliation (all authors):
  • Chungnam National University, Daejeon, Republic of Korea

  • Venue:
  • TMA'11 Proceedings of the Third international conference on Traffic monitoring and analysis
  • Year:
  • 2011


Abstract

Internet traffic measurement and analysis have become a significant challenge because large packet trace files captured on fast links cannot be easily handled on a single server with limited computing and memory resources. Hadoop is a popular open-source cloud computing platform that provides a software programming framework called MapReduce and a distributed filesystem, HDFS, both of which are useful for analyzing large data sets. In this paper, we therefore present a Hadoop-based packet processing tool that scales to large data sets by harnessing MapReduce and HDFS. To handle large packet trace files in Hadoop efficiently, we devised a new binary input format, called PcapInputFormat, which hides the complexity of processing binary-formatted packet data and parsing each packet record. We also designed efficient traffic analysis MapReduce job models consisting of map and reduce functions. To evaluate our tool, we compared its computation time with that of a well-known packet-processing tool, CoralReef, and showed that our approach is better suited to processing a large set of packet data.
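To make the abstract's two ideas concrete, the sketch below illustrates (a) walking the fixed-size record headers of the libpcap on-disk format, the binary structure a component like PcapInputFormat must parse, and (b) a map/reduce-style aggregation (emit a key per packet, sum per key). This is a minimal, self-contained illustration of the general technique, not the paper's actual PcapInputFormat code; the class and method names are hypothetical, and it runs in memory rather than on Hadoop.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.HashMap;
import java.util.Map;

public class PcapSketch {
    // Build a minimal little-endian pcap trace in memory:
    // a 24-byte global header followed by 16-byte per-record headers.
    static byte[] makeTrace(int[] packetLengths) {
        int total = 24;
        for (int len : packetLengths) total += 16 + len;
        ByteBuffer buf = ByteBuffer.allocate(total).order(ByteOrder.LITTLE_ENDIAN);
        buf.putInt(0xa1b2c3d4);                    // pcap magic number
        buf.putShort((short) 2).putShort((short) 4); // file format version 2.4
        buf.putInt(0).putInt(0);                   // thiszone, sigfigs
        buf.putInt(65535);                         // snaplen
        buf.putInt(1);                             // linktype: Ethernet
        for (int len : packetLengths) {
            buf.putInt(0).putInt(0);               // ts_sec, ts_usec
            buf.putInt(len).putInt(len);           // incl_len, orig_len
            buf.put(new byte[len]);                // dummy payload
        }
        return buf.array();
    }

    // "Map" over each packet record, emitting (incl_len, 1);
    // "reduce" by summing the counts per key, here done inline in a HashMap.
    static Map<Integer, Integer> countByLength(byte[] trace) {
        ByteBuffer buf = ByteBuffer.wrap(trace).order(ByteOrder.LITTLE_ENDIAN);
        buf.position(24);                          // skip the global header
        Map<Integer, Integer> counts = new HashMap<>();
        while (buf.remaining() >= 16) {
            buf.getInt();                          // ts_sec (unused here)
            buf.getInt();                          // ts_usec (unused here)
            int inclLen = buf.getInt();            // captured length of this packet
            buf.getInt();                          // orig_len (unused here)
            buf.position(buf.position() + inclLen); // skip payload to next record
            counts.merge(inclLen, 1, Integer::sum); // the "reduce" step
        }
        return counts;
    }

    public static void main(String[] args) {
        byte[] trace = makeTrace(new int[]{60, 1500, 60});
        Map<Integer, Integer> counts = countByLength(trace);
        System.out.println(counts);                // e.g. {60=2, 1500=1}
    }
}
```

In a real Hadoop job, the record-walking logic would live in a custom RecordReader produced by the InputFormat, so that map tasks receive whole packet records even when HDFS splits the file at arbitrary byte offsets, and the per-key summation would be the reduce function.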