Resolving Hidden Representations

  • Authors:
  • Cheng-Yuan Liou; Wei-Chen Cheng

  • Affiliations:
  • Both authors: Department of Computer Science and Information Engineering, National Taiwan University, Republic of China

  • Venue:
  • Neural Information Processing
  • Year:
  • 2008

Abstract

This paper presents a novel technique that separates the pattern representations in each hidden layer to facilitate classification tasks. The technique requires that all patterns in the same class have nearby representations and that patterns in different classes have distant representations. This requirement is applied to every pair of data patterns when training a selected hidden layer of an MLP or an RNN. The MLP can be trained layer by layer, in a feedforward manner, to obtain the resolved representations. The trained MLP can then serve as a kind of kernel function for categorizing multiple classes.
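
The following is a minimal sketch of the layer-by-layer training scheme described in the abstract, not the paper's exact formulation: a margin-based pairwise loss (an assumption) pulls same-class hidden representations together and pushes different-class ones apart, and each hidden layer is trained on the frozen output of the previous one. All function names and hyperparameters are illustrative.

```python
# Sketch only: the pairwise loss form, margin, and optimizer are assumptions,
# standing in for the paper's actual training rule.
import torch
import torch.nn as nn

def pairwise_separation_loss(h, labels, margin=2.0):
    """Pull same-class representations together, push different-class ones apart."""
    d = torch.cdist(h, h)                                   # pairwise distances in the hidden layer
    same = (labels[:, None] == labels[None, :]).float()     # 1 if the pair shares a class
    pull = same * d.pow(2)                                  # same class -> near representations
    push = (1 - same) * torch.clamp(margin - d, min=0).pow(2)  # different class -> distant
    return (pull + push).mean()

def train_layerwise(x, y, widths, epochs=200, lr=1e-2):
    """Train an MLP one hidden layer at a time, feeding the frozen output forward."""
    layers, inp = [], x
    for w in widths:
        layer = nn.Sequential(nn.Linear(inp.shape[1], w), nn.Tanh())
        opt = torch.optim.SGD(layer.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = pairwise_separation_loss(layer(inp), y)
            loss.backward()
            opt.step()
        layers.append(layer)
        inp = layer(inp).detach()                           # freeze this layer's output for the next one
    return nn.Sequential(*layers)

# Usage with toy data: two Gaussian classes in 2-D.
x = torch.cat([torch.randn(50, 2) - 2, torch.randn(50, 2) + 2])
y = torch.cat([torch.zeros(50), torch.ones(50)]).long()
net = train_layerwise(x, y, widths=[8, 4])
```

After training, the composed network maps inputs to representations in which the classes are well separated, which is the sense in which the trained MLP can act as a kernel-like feature map for downstream classifiers.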