Another viewpoint on "evaluating web software reliability based on workload and failure data extracted from server logs"

  • Authors:
  • Toan Huynh; James Miller

  • Affiliations:
  • Electrical and Computer Engineering Research Facility, Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Canada T6G 2V4 (both authors)

  • Venue:
  • Empirical Software Engineering
  • Year:
  • 2009


Abstract

This paper evaluates an approach to determining a website's reliability. The technique extracts workload measures and error codes from server logs and uses this information to calculate the reliability of a particular website. The study follows on from a previous study and can therefore be regarded as a "partial replication" of the original work (technically, because both studies are case studies rather than formal experiments, this description is inaccurate; no corresponding definition exists for case studies, so the term is used here to convey a general sense of purpose). Although the method proposed by the original study is feasible, the effectiveness of estimating website reliability from a single error type and a single workload measure is questionable. In this study, different error types and their usefulness for reliability analysis are examined and discussed. After a thorough investigation, we believe that reliability analysis for websites must be based on more specific error definitions, as these provide a superior reliability estimate for today's highly dynamic websites.
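
The abstract does not spell out the exact formulas or error classification the authors use. Purely as an illustration of the general idea of computing reliability from workload and failure data in server logs, the sketch below assumes a Nelson-style estimate R = 1 - f/n, where n is the total number of requests (the workload measure) and f is the number of requests whose HTTP status code falls in a chosen failure set; the log path, the Common Log Format parsing, and the specific status codes treated as failures are assumptions, not the paper's definitions.

```python
import re
from collections import Counter

# Matches the request and status fields of a Common Log Format line, e.g.
#   127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
LOG_LINE = re.compile(r'"\S+ \S+ \S+" (?P<status>\d{3}) ')


def reliability_from_log(path, failure_statuses=frozenset({500, 502, 503})):
    """Nelson-style estimate: R = 1 - failures / total_requests.

    The failure set (here, a few 5xx codes) is an illustrative choice;
    the paper argues that the error definition matters a great deal.
    """
    status_counts = Counter()
    with open(path) as log:
        for line in log:
            match = LOG_LINE.search(line)
            if match:
                status_counts[int(match.group("status"))] += 1

    total = sum(status_counts.values())          # workload: all logged requests
    failures = sum(status_counts[s] for s in failure_statuses)
    return 1.0 - failures / total if total else None


if __name__ == "__main__":
    # "access.log" is a hypothetical server log used only for this sketch.
    r = reliability_from_log("access.log")
    print(f"R = {r:.4f}" if r is not None else "no requests found")
```

Narrowing or widening `failure_statuses` (for example, counting only errors attributable to the application rather than to clients) changes the estimate, which is essentially the point the study makes about needing more specific error definitions.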