Technical efficiency-based selection of learning cases to improve forecasting accuracy of neural networks under monotonicity assumption

  • Authors:
  • Parag C. Pendharkar; James A. Rodger

  • Affiliations:
  • Information Systems, School of Business Administration, Capital College, Pennsylvania State University, 777 W. Harrisburg Pike, Middletown, PA; MIS and Decision Sciences, Eberly College of Business and Information Technology, Indiana University of Pennsylvania, Indiana, PA

  • Venue:
  • Decision Support Systems
  • Year:
  • 2003

Abstract

In this paper, we show that when an artificial neural network (ANN) model is used to learn monotonic forecasting functions, it may be useful to screen the training data so that the retained examples approximately satisfy the monotonicity property. We show how training examples that approximately satisfy a predetermined monotonicity property can be identified using a technical efficiency-based ranking obtained from the data envelopment analysis (DEA) model. Using a health care forecasting problem, the monotonicity assumption, and a predetermined threshold efficiency level, we apply DEA to split the training data into two mutually exclusive subsets, an "efficient" subset and an "inefficient" subset. We then compare the performance of ANNs trained on each subset. Our results indicate that the predictive performance of an ANN trained on the "efficient" training data subset is higher than that of an ANN trained on the "inefficient" training data subset.
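
The screening procedure summarized in the abstract can be sketched programmatically. The snippet below is a minimal illustration, not the authors' implementation: it assumes a standard input-oriented CCR DEA model in multiplier form, with the forecast predictors treated as DEA inputs and the forecast target as the single DEA output, which is one plausible mapping. The synthetic data, the 0.9 efficiency threshold, and the network configuration are placeholders rather than the paper's actual settings.

```python
# Minimal sketch (not the authors' code): score each training case with a CCR
# input-oriented DEA model, split the data at an efficiency threshold, and fit
# a small neural network to the "efficient" subset.
import numpy as np
from scipy.optimize import linprog
from sklearn.neural_network import MLPRegressor


def dea_efficiency(inputs, outputs):
    """Return a CCR technical-efficiency score in (0, 1] for every case (DMU)."""
    n, m = inputs.shape            # n cases, m input variables
    _, s = outputs.shape           # s output variables
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: output weights u (length s), then input weights v (length m).
        c = np.concatenate([-outputs[o], np.zeros(m)])        # maximize u'y_o
        A_eq = np.concatenate([np.zeros(s), inputs[o]])[None, :]
        b_eq = [1.0]                                          # normalization v'x_o = 1
        A_ub = np.hstack([outputs, -inputs])                  # u'y_j - v'x_j <= 0 for all j
        b_ub = np.zeros(n)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=(0, None), method="highs")
        scores[o] = -res.fun                                  # efficiency of case o
    return scores


# Synthetic stand-in for the health care training data used in the paper.
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 10.0, size=(200, 3))                     # predictor variables (DEA inputs)
y = X @ np.array([0.5, 1.2, 0.8]) + rng.normal(0, 0.5, 200)   # monotone target plus noise

scores = dea_efficiency(X, y.reshape(-1, 1))
threshold = 0.9                                               # illustrative cut-off only
efficient = scores >= threshold                               # "efficient" vs. "inefficient" split

ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
ann.fit(X[efficient], y[efficient])                           # train on the "efficient" subset
print(f"{efficient.sum()} efficient cases; R^2 on all data: {ann.score(X, y):.3f}")
```

Under this reading, cases near the DEA frontier are the ones most consistent with a monotonically increasing input-output relationship, so training on them approximately enforces the monotonicity assumption without modifying the network itself.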