A novel minimum-size activation function and its derivative

  • Authors:
  • Manuel Carrasco-Robles; Luis Serrano

  • Affiliations:
  • Department of Electrical and Electronic Engineering, Public University of Navarra, Pamplona, Spain (both authors)

  • Venue:
  • IEEE Transactions on Circuits and Systems II: Express Briefs
  • Year:
  • 2009

Abstract

This brief presents two novel architectures: a nonlinear neural activation function and its derivative. Both are suitable for implementing neurons in multilayer perceptron networks with an on-chip backpropagation learning algorithm. The proposed activation function exhibits minimal area and power consumption, can be regarded as an approximation to the tanh(nx) function, and can be programmed to achieve any slope at the origin equal to or greater than 2. The proposed derivative also exhibits minimal area and maximizes its similarity to the ideal derivative in the vicinity of the origin, making it the best approximation for its degree of complexity. Both topologies are designed with subthreshold metal-oxide-semiconductor transistors in order to minimize power consumption. Likewise, they are designed as balanced, fully differential topologies, so that external influences, offset, and even-order distortion are reduced. Moreover, a detailed analysis using the General Translinear Principle shows that the activation function is affected by the body effect, whereas the derivative function is immune to it. The activation function and the proposed derivative are thoroughly analyzed, and measured results are presented for our implementation in 0.5-µm AMI Semiconductor (AMIS) CMOS technology.
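
As a concrete reference point, below is a minimal numerical sketch of the idealized functions that the circuits approximate, assuming the tanh(nx) model from the abstract with slope parameter n. The function names and the example value n = 2 are illustrative assumptions only, not the circuit-level design from the brief.

    import numpy as np

    def activation(x, n=2.0):
        """Idealized target activation: tanh(n*x).

        The slope at the origin equals n, so programming n >= 2
        corresponds to the 'slope equal to or greater than 2'
        mentioned in the abstract.
        """
        return np.tanh(n * x)

    def activation_derivative(x, n=2.0):
        """Ideal derivative of tanh(n*x): n * (1 - tanh(n*x)**2).

        This is the shape the proposed derivative circuit
        approximates most closely near the origin.
        """
        t = np.tanh(n * x)
        return n * (1.0 - t * t)

    if __name__ == "__main__":
        x = np.linspace(-2.0, 2.0, 5)
        print(activation(x))             # saturates toward +/-1
        print(activation_derivative(x))  # peaks at the origin with value n

In backpropagation, the derivative is evaluated at each neuron's operating point to scale the error signal, which is why an on-chip approximation that is accurate near the origin is the relevant design target.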