A kernel hat matrix based rejection criterion for outlier removal in support vector regression

  • Authors:
  • Franck Dufrenois;Jean Charles Noyer

  • Affiliations:
  • Laboratoire d'Analyse des Systèmes du Littoral, University of Calais, France;Laboratoire d'Analyse des Systèmes du Littoral, University of Calais, France

  • Venue:
  • IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
  • Year:
  • 2009

Abstract

In this paper, we propose a kernel hat matrix based learning stage for outlier removal. In particular, we show that the Gaussian kernel hat matrix has very interesting discriminative properties when appropriate values are chosen for the kernel parameters. We therefore develop a practical model selection criterion that cleanly separates the "outlier" distribution from the "dominant" distribution. This learning stage, applied beforehand to the training data set, offers a new way of down-weighting outliers that corrupt both the response and the predictor variables in regression tasks. Applications to simulated and real data show the robustness of the proposed approach.
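
The abstract does not spell out how the hat matrix is built or how the rejection threshold is selected, so the following is only a minimal sketch of the general idea, not the paper's method: it assumes the hat matrix is the kernel-ridge smoother H = K(K + λI)^{-1} built from a Gaussian kernel evaluated on the joint (x, y) samples (so outliers in either the predictors or the response can stand out in the diagonal leverage values), and it flags points whose leverage deviates strongly from the bulk before fitting a standard SVR on the retained data. The kernel width `gamma`, ridge term `lam`, and the MAD-based cutoff below are illustrative choices, not the model selection criterion developed in the paper.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVR

def kernel_hat_leverage(Z, gamma=0.5, lam=1e-2):
    """Diagonal of a Gaussian-kernel hat (smoother) matrix H = K (K + lam*I)^{-1}."""
    K = rbf_kernel(Z, Z, gamma=gamma)          # Gaussian (RBF) Gram matrix
    n = K.shape[0]
    H = K @ np.linalg.inv(K + lam * np.eye(n)) # kernel "hat" matrix
    return np.diag(H)                          # self-influence (leverage) of each sample

# Toy data: a noisy sinusoid with a few gross response outliers injected.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=80)
y[::15] += 3.0

# Kernel on the joint (x, y) samples so that predictor and response outliers
# can both affect the leverage values (assumption, see note above).
Z = np.column_stack([X, y])
lev = kernel_hat_leverage(Z)

# Illustrative screening rule: keep points whose leverage stays close to the
# bulk of the leverage distribution (robust MAD-based cutoff).
med = np.median(lev)
mad = np.median(np.abs(lev - med)) + 1e-12
keep = np.abs(lev - med) / mad < 3.0

# Train the SVR only on the retained ("dominant") points.
svr = SVR(kernel="rbf", gamma=0.5, C=10.0, epsilon=0.1).fit(X[keep], y[keep])
```

In this sketch the outlier screening is entirely decoupled from the SVR fit, which mirrors the paper's description of a learning stage applied beforehand to the training set; only the retained samples reach the regression step.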