Application of Metamorphic Testing to Supervised Classifiers

  • Authors:
  • Xiaoyuan Xie, Joshua Ho, Christian Murphy, Gail Kaiser, Baowen Xu, Tsong Yueh Chen


  • Venue:
  • QSIC '09: Proceedings of the 2009 Ninth International Conference on Quality Software
  • Year:
  • 2009

Abstract

Many applications in scientific computing, such as computational biology and computational linguistics, depend on machine learning algorithms to provide core functionality in their problem domains. However, such applications are difficult to test because there is often no "test oracle" to indicate what the correct output should be for arbitrary input. To help address the quality of such software, in this paper we present a technique for testing implementations of the supervised machine learning classification algorithms on which this software depends. Our technique is based on an approach called "metamorphic testing", which has been shown to be effective in the absence of an oracle. More importantly, we demonstrate that our technique serves not only for verification but also for validation. In addition to presenting the technique, we describe a case study performed on a real-world machine learning application framework, and discuss how programmers implementing machine learning algorithms can avoid the common pitfalls discovered in our study. We also discuss how our findings can be of use in areas outside scientific computing.
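
To give a concrete flavor of the technique, the sketch below checks one metamorphic relation in the spirit of the paper: permuting the attribute (feature) order of both the training and the test data should leave a k-nearest-neighbor classifier's predictions unchanged, since Euclidean distance does not depend on attribute order. The source and follow-up outputs are compared directly, so no oracle for the "correct" classification is needed. The dataset, the scikit-learn classifier, k = 3, and this particular relation are illustrative assumptions, not the authors' actual test harness.

```python
# Minimal metamorphic-testing sketch (illustrative, not the paper's harness):
# MR under test -- a consistent permutation of attribute columns in both the
# training and test sets must not change a kNN classifier's predictions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_train, y_train = X[:120], y[:120]   # source training set
X_test = X[120:]                      # inputs to classify; no labels needed

# Source test case: train and predict with the original attribute order.
clf_source = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
source_pred = clf_source.predict(X_test)

# Follow-up test case: apply one random column permutation consistently
# to both the training and the test attributes.
perm = rng.permutation(X.shape[1])
clf_followup = KNeighborsClassifier(n_neighbors=3).fit(X_train[:, perm], y_train)
followup_pred = clf_followup.predict(X_test[:, perm])

# The metamorphic relation: the two prediction vectors must be identical.
# (Strictly, floating-point summation order can perturb distances at machine
# precision; implementation-level issues of this general kind are among the
# pitfalls a case study like the paper's can surface.)
assert np.array_equal(source_pred, followup_pred), "metamorphic relation violated"
print("Attribute-permutation relation holds on this input.")
```

A violation of such a relation signals a defect in the implementation (verification), while a relation that the algorithm itself cannot satisfy signals a mismatch with the intended behavior (validation), which is the dual use the paper emphasizes.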