Finding nominally conditioned multivariate polynomials using a four-layer perceptron having shared weights

  • Authors:
  • Yusuke Tanahashi;Kazumi Saito;Daisuke Kitakoshi;Ryohei Nakano

  • Affiliations:
  • Nagoya Institute of Technology, Nagoya, Japan;NTT Communication Science Laboratories, NTT Corporation, Soraku, Kyoto, Japan;Nagoya Institute of Technology, Nagoya, Japan;Nagoya Institute of Technology, Nagoya, Japan

  • Venue:
  • KES'06: Proceedings of the 10th International Conference on Knowledge-Based Intelligent Information and Engineering Systems - Volume Part II
  • Year:
  • 2006

Abstract

We present a method for discovering nominally conditioned polynomials that fit multivariate data containing numeric and nominal variables, using a four-layer perceptron with shared weights. Each polynomial is accompanied by a nominal condition specifying the subspace in which the polynomial applies. To obtain a succinct neural network, we employ weight sharing, in which each weight is constrained to take one of a small set of common weights; a near-zero common weight can then be eliminated. Our method iteratively merges and splits common weights based on second-order criteria, allowing it to escape local optima, and it selects the optimal number of hidden units by cross-validation. Experiments showed that our method restores the original sharing structure on an artificial data set and discovers fairly succinct rules on a real data set.
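For intuition only, the sketch below illustrates the general idea of weight sharing with elimination of near-zero common weights; it is not the authors' algorithm (in particular, it uses a simple 1-D clustering heuristic rather than the paper's second-order merge/split criteria), and all names (merge_weights, prune_near_zero, tol) are illustrative assumptions.

```python
import numpy as np

def merge_weights(weights, n_common, n_iter=50):
    """Group raw weights into n_common clusters and replace each weight by its
    cluster mean (the shared 'common weight'). Simple 1-D k-means-style
    assignment, for illustration only."""
    centers = np.linspace(weights.min(), weights.max(), n_common)
    labels = np.zeros(len(weights), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(np.abs(weights[:, None] - centers[None, :]), axis=1)
        for k in range(n_common):
            if np.any(labels == k):
                centers[k] = weights[labels == k].mean()
    return centers[labels], labels, centers

def prune_near_zero(shared, centers, labels, tol=1e-3):
    """Eliminate connections whose common weight is near zero."""
    keep = np.abs(centers) >= tol
    return np.where(keep[labels], shared, 0.0)

# Toy usage: 20 raw weights collapsed onto 3 common weights, then pruned.
rng = np.random.default_rng(0)
w = rng.normal(size=20)
shared, labels, centers = merge_weights(w, n_common=3)
pruned = prune_near_zero(shared, centers, labels)
print("common weights:", centers)
print("connections eliminated:", int(np.count_nonzero(pruned == 0.0)))
```

In this toy version the number of common weights is fixed in advance; the paper's method instead decides when to merge or split common weights using second-order information and chooses the number of hidden units by cross-validation.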