Automated benchmarking and analysis tool

  • Authors:
  • Tomas Kalibera, Jakub Lehotsky, David Majda, Branislav Repcek, Michal Tomcanyi, Antonin Tomecek, Petr Tuma, Jaroslav Urban

  • Affiliations:
  • Charles University, Prague, Czech Republic (all authors)

  • Venue:
  • ValueTools '06: Proceedings of the 1st International Conference on Performance Evaluation Methodologies and Tools
  • Year:
  • 2006

Abstract

Benchmarking is an important performance evaluation technique that provides performance data representative of real systems. Such data can be used to verify the results of performance modeling and simulation, or to detect performance changes. Automated benchmarking is an increasingly popular approach to tracking performance changes during software development, giving developers timely feedback on their work. In contrast with the advances in modeling and simulation tools, tools for automated benchmarking are usually implemented ad hoc for each project, wasting resources and limiting functionality.

We present the result of the BEEN project, a generic tool for automated benchmarking in a heterogeneous distributed environment. BEEN automates all steps of a benchmark experiment, from software building and deployment through measurement and load monitoring to the evaluation of results. Notable features include the separation of measurement from evaluation and the ability to adaptively scale the benchmark experiment based on the evaluation. BEEN has been designed to facilitate the automated detection of performance changes during software development (regression benchmarking).
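To illustrate the two features the abstract highlights, the sketch below shows a minimal adaptive benchmark driver: a measurement phase that only collects timing samples, a separate evaluation phase that judges their statistical precision, and a loop that scales the experiment (collects more samples) until the evaluation is satisfied. This is a hypothetical illustration of the general technique, not the BEEN API; the class and method names, the 95% confidence-interval criterion, and the 2% precision target are all assumptions made for the example.

```java
import java.util.ArrayList;
import java.util.List;

public class AdaptiveBenchmark {

    // Measurement phase: time one run of the workload, in nanoseconds.
    // Deliberately knows nothing about how the samples will be evaluated.
    static long measureOnce(Runnable workload) {
        long start = System.nanoTime();
        workload.run();
        return System.nanoTime() - start;
    }

    // Evaluation phase: relative half-width of a normal-approximation
    // 95% confidence interval for the mean of the collected samples.
    static double relativeError(List<Long> samples) {
        int n = samples.size();
        double mean = samples.stream().mapToLong(Long::longValue).average().orElse(0);
        double variance = samples.stream()
                .mapToDouble(s -> (s - mean) * (s - mean))
                .sum() / (n - 1);
        double halfWidth = 1.96 * Math.sqrt(variance / n);
        return halfWidth / mean;
    }

    public static void main(String[] args) {
        Runnable workload = () -> {              // stand-in workload
            double acc = 0;
            for (int i = 0; i < 1_000_000; i++) acc += Math.sqrt(i);
            if (acc < 0) System.out.println(acc); // defeat dead-code elimination
        };

        List<Long> samples = new ArrayList<>();
        int batch = 10;
        // Adaptive scaling: keep collecting batches until the evaluation
        // reports a sufficiently tight estimate, or a sample cap is hit.
        do {
            for (int i = 0; i < batch; i++) samples.add(measureOnce(workload));
        } while (relativeError(samples) > 0.02 && samples.size() < 1000);

        double meanMs = samples.stream().mapToLong(Long::longValue)
                .average().orElse(0) / 1e6;
        System.out.printf("mean = %.3f ms over %d samples%n", meanMs, samples.size());
    }
}
```

Keeping measurement and evaluation in separate routines, as above, is what makes the adaptive loop possible: the driver can re-run the evaluation on partial data and decide whether the experiment needs to grow, which is the behavior the abstract attributes to BEEN at the scale of a distributed experiment.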