2005 Special issue: A new classifier based on information theoretic learning with unlabeled data

  • Authors:
  • Kyu-Hwa Jeong; Jian-Wu Xu; Deniz Erdogmus; Jose C. Principe

  • Affiliations:
  • Kyu-Hwa Jeong, Jian-Wu Xu, Jose C. Principe: Computational NeuroEngineering Laboratory, Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611, USA
  • Deniz Erdogmus: Oregon Graduate Institute, OHSU, Portland, OR 97006, USA

  • Venue:
  • Neural Networks - 2005 Special issue: IJCNN 2005
  • Year:
  • 2005

Abstract

Supervised learning is conventionally performed with paired input-output (labeled) data. After training, the adaptive system's weights are fixed and testing is carried out on unlabeled data. Recently, the machine learning community has exploited unlabeled data in an attempt to improve classification performance. In this paper, we present an information theoretic learning (ITL) approach based on density divergence minimization that extends the training algorithm to use unlabeled data during testing. The method uses a boosting-like algorithm with an ITL-based cost function. Preliminary simulations suggest that the method has the potential to improve the performance of classifiers in the application phase.
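
The abstract does not spell out the cost function, so the following is only a minimal sketch of the kind of ITL quantities it refers to: Parzen-window density estimates and a Cauchy-Schwarz divergence between a labeled sample and an unlabeled sample, computed from pairwise Gaussian kernel evaluations (information potentials). The kernel size `sigma`, the function names, and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Sketch of ITL density-divergence quantities (assumed, not the paper's exact algorithm):
# Parzen density estimates with Gaussian kernels and the Cauchy-Schwarz divergence.
import numpy as np

def gaussian_kernel(diff, sigma):
    """Isotropic Gaussian kernel evaluated on pairwise difference vectors."""
    d = diff.shape[-1]
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    return np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * sigma ** 2)) / norm

def information_potential(x, y, sigma):
    """Cross-information potential: (1/NM) * sum_ij G_{sigma*sqrt(2)}(x_i - y_j).

    For Parzen estimates with kernel size sigma, this equals the integral of the
    product of the two estimated densities.
    """
    diff = x[:, None, :] - y[None, :, :]
    return np.mean(gaussian_kernel(diff, sigma * np.sqrt(2.0)))

def cauchy_schwarz_divergence(x, y, sigma):
    """D_CS = -log( V_xy^2 / (V_xx * V_yy) ); zero when the estimated densities match."""
    v_xy = information_potential(x, y, sigma)
    v_xx = information_potential(x, x, sigma)
    v_yy = information_potential(y, y, sigma)
    return -np.log(v_xy ** 2 / (v_xx * v_yy))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    labeled = rng.normal(0.0, 1.0, size=(200, 2))    # stands in for the training sample
    unlabeled = rng.normal(0.5, 1.0, size=(200, 2))  # stands in for test-phase unlabeled data
    print(cauchy_schwarz_divergence(labeled, unlabeled, sigma=0.5))
```

In a divergence-minimization scheme of the sort the abstract describes, a quantity like this would serve as the cost that the test-phase updates try to reduce; the boosting-like weighting of unlabeled samples is specific to the paper and is not reproduced here.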