OP-KNN: method and applications

  • Authors:
  • Qi Yu; Yoan Miche; Antti Sorjamaa; Alberto Guillen; Amaury Lendasse; Eric Séverin

  • Affiliations:
  • Qi Yu, Yoan Miche, Antti Sorjamaa, Amaury Lendasse: Department of Information and Computer Science, Aalto University School of Science and Technology, Aalto, Finland; Alberto Guillen: Department of Computer Technology and Architecture, University of Granada, Granada, Spain; Eric Séverin: Department GEA, University of Lille 1, Villeneuve d'ascq cedex, France

  • Venue:
  • Advances in Artificial Neural Systems
  • Year:
  • 2010

Abstract

This paper presents a methodology named Optimally Pruned K-Nearest Neighbors (OP-KNN), which has the advantage of competing with state-of-the-art methods while remaining fast. It builds a one-hidden-layer feedforward neural network that uses K-Nearest Neighbors as kernels to perform regression. Multiresponse Sparse Regression (MRSR) is used to rank each kth nearest neighbor, and Leave-One-Out estimation is then used to select the optimal number of neighbors and to estimate the generalization performance. Since the computational time of this method is small, this paper presents a strategy using OP-KNN to perform variable selection, which is tested successfully on eight real-life data sets from different application fields. In summary, the most significant characteristic of this method is that it provides good performance and a comparatively simple model at an extremely high learning speed.
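The pipeline described in the abstract can be sketched in a few lines of NumPy. This is a simplified illustration, not the authors' implementation: the hidden-layer "kernel" for neuron k is taken to be the target value of each point's kth nearest neighbor, the MRSR ranking step is replaced by a plain greedy forward selection of columns, and the Leave-One-Out error of the linear output layer is computed with the closed-form PRESS statistic. The function name `op_knn_sketch` and parameter `max_k` are illustrative choices.

```python
import numpy as np

def op_knn_sketch(X, y, max_k=10):
    """Simplified OP-KNN sketch (not the authors' exact algorithm).
    1. Hidden layer: column k holds the target of each sample's
       (k+1)-th nearest neighbour (the sample itself is excluded).
    2. Rank columns by greedy forward selection (stand-in for MRSR).
    3. Keep the model size whose leave-one-out (PRESS) error is lowest.
    """
    # Pairwise Euclidean distances; a point is never its own neighbour.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    order = np.argsort(d, axis=1)        # neighbours sorted by distance
    H = y[order[:, :max_k]]              # H[i, k] = target of k-th neighbour of i

    def press(cols):
        # Closed-form leave-one-out error of least squares on chosen columns.
        A = H[:, cols]
        P = A @ np.linalg.pinv(A)        # hat (projection) matrix
        resid = y - P @ y
        return np.mean((resid / (1.0 - np.diag(P))) ** 2)

    remaining, ranked, errors = list(range(max_k)), [], []
    while remaining:
        best = min(remaining, key=lambda c: press(ranked + [c]))
        remaining.remove(best)
        ranked.append(best)
        errors.append(press(ranked))
    k_opt = int(np.argmin(errors)) + 1   # optimal number of neighbours kept
    return ranked[:k_opt], float(min(errors))
```

Because the PRESS formula gives the Leave-One-Out error without refitting n models, the whole selection loop stays cheap, which is consistent with the paper's emphasis on learning speed.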