Nonlinear quantization on Hebbian-type associative memories

  • Authors:
  • Chishyan Liaw; Ching-Tsorng Tsai; Chao-Hui Ko

  • Affiliations:
  • Department of Computer Science, Tunghai University, Taichung, Taiwan 407; Department of Computer Science, Tunghai University, Taichung, Taiwan 407; Department of Information Management, Hsiuping Institute of Technology, Taichung, Taiwan 412

  • Venue:
  • Applied Intelligence
  • Year:
  • 2012


Abstract

Hebbian-type associative memory is characterized by its simple architecture. However, the hardware implementation of Hebbian-type associative memories normally becomes complicated when a large number of patterns is stored. To simplify the interconnection values of a network, a nonlinear quantization strategy is presented. The strategy exploits the property that the interconnection values are Gaussian distributed, and accordingly divides the interconnection weight values into a small number of unequal ranges. The ranges are chosen so that each carries an equal amount of information, and each range is quantized to a single value. The equation for the probability of direct convergence is derived; for nonlinearly quantized networks with a small number of ranges, this probability is comparable to that of the original networks. The effects of linear and nonlinear quantization are also assessed in terms of recall capability, information capacity, and the number of bits saved by quantization when storing interconnection values. The proposed nonlinear quantization strategy performs better than linear quantization while retaining a recall capability comparable to that of the original network. The proposed approach reduces the number of connection weights and the chip area of a Hebbian-type associative memory while approximately preserving its performance.
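The abstract describes splitting Gaussian-distributed weights into unequal ranges that each hold the same amount of information, then replacing every weight in a range with one representative value. A minimal sketch of that idea, assuming equal probability mass under the fitted Gaussian as the "equal information" criterion and the bin mean as the representative (the paper's exact rule may differ):

```python
import bisect
import random
import statistics

def equal_mass_edges(mu, sigma, n_ranges):
    """Cut points that split N(mu, sigma^2) into n_ranges intervals
    of equal probability mass (unequal widths)."""
    dist = statistics.NormalDist(mu, sigma)
    return [dist.inv_cdf(k / n_ranges) for k in range(1, n_ranges)]

def quantize_nonlinear(weights, n_ranges):
    """Nonlinear quantization sketch: fit a Gaussian to the weights,
    bin each weight by the equal-mass edges, and map every weight in
    a bin to that bin's mean."""
    mu = statistics.mean(weights)
    sigma = statistics.stdev(weights)
    edges = equal_mass_edges(mu, sigma, n_ranges)
    bins = [[] for _ in range(n_ranges)]
    for w in weights:
        bins[bisect.bisect_left(edges, w)].append(w)
    reps = [statistics.mean(b) if b else 0.0 for b in bins]
    return [reps[bisect.bisect_left(edges, w)] for w in weights]
```

With only `n_ranges` distinct values left, each interconnection weight needs just `ceil(log2(n_ranges))` storage bits, which is the hardware saving the abstract refers to.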