Analysis of Interrater Agreement in ISO/IEC 15504-Based Software Process Assessment

  • Authors:
  • H.-Y. Lee, H.-W. Jung, C.-S. Chung, J. Lee, K. Lee, H. Jeong

  • Venue:
  • APAQS '01 Proceedings of the Second Asia-Pacific Conference on Quality Software
  • Year:
  • 2001

Abstract

The emerging ISO/IEC 15504 standard provides a framework and a model for software process assessment and improvement. Reliable process assessment has two requirements: internal reliability and external reliability. The objective of this study is to provide an empirical case of external reliability, i.e., interrater agreement in ISO/IEC 15504-based software process assessment. Interrater agreement refers to the extent to which independent assessors agree in their ratings of software process attributes. Our dataset comes from two assessments conducted using the ISO/IEC 15504 standard. The results showed "substantial" to "excellent" agreement, implying that the two assessments achieved external reliability.
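Interrater agreement between two assessors rating the same items is commonly quantified with a chance-corrected statistic such as Cohen's kappa, where values in roughly the 0.61-0.80 range are conventionally labeled "substantial" (Landis and Koch) and higher values "excellent" (Fleiss). The abstract does not state which statistic the authors used, so the sketch below is only illustrative; the ratings and the N/P/L/F labels (the ISO/IEC 15504 "Not/Partially/Largely/Fully achieved" scale) are hypothetical sample data, not the paper's dataset.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same set of items.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected is the agreement expected by chance given each
    rater's marginal rating frequencies.
    """
    assert len(ratings_a) == len(ratings_b), "raters must rate the same items"
    n = len(ratings_a)
    # Proportion of items on which the raters actually agree.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each rater's marginal category counts.
    count_a = Counter(ratings_a)
    count_b = Counter(ratings_b)
    p_expected = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical process-attribute ratings from two independent assessors.
rater1 = ["F", "L", "L", "P", "F", "L", "N", "F", "L", "P"]
rater2 = ["F", "L", "P", "P", "F", "L", "N", "F", "F", "P"]
kappa = cohens_kappa(rater1, rater2)  # ~0.73, "substantial" agreement
```

With these sample ratings the observed agreement is 0.80 and the chance-expected agreement is 0.27, giving kappa of about 0.73, which falls in the "substantial" band the abstract refers to.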