This paper investigates the learning dynamics of intrinsic plasticity (IP), a learning rule that tunes a neuron's activation function so that its output distribution becomes approximately exponential. The information-geometric properties of intrinsic plasticity are analyzed, and the resulting natural gradient intrinsic plasticity (NIP) dynamics are evaluated for a variety of input distributions. Together with a further new modification of the IP rule, NIP is shown to cope with drifting input distributions substantially better than the standard gradient rule in experiments with synthetic and real-world data.
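To make the mechanism concrete, the following is a minimal sketch of the standard (Euclidean) gradient IP rule for a single sigmoid neuron, in the style of Triesch's formulation: the gain `a` and bias `b` of `y = 1/(1 + exp(-(a*x + b)))` are adapted online so that the output distribution approaches an exponential with target mean `mu`. The function names, the learning rate, and the Gaussian input are illustrative assumptions, not taken from the paper; the natural gradient (NIP) variant discussed above would additionally rescale these updates by the inverse Fisher information.

```python
import numpy as np

def ip_update(a, b, x, mu=0.2, eta=0.01):
    """One online step of the intrinsic plasticity gradient rule
    (Triesch-style) for a sigmoid neuron y = 1/(1 + exp(-(a*x + b))).
    Drives the output distribution toward an exponential with mean mu."""
    y = 1.0 / (1.0 + np.exp(-(a * x + b)))
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y * y) / mu)
    da = eta / a + x * db  # gain update reuses the bias gradient term
    return a + da, b + db, y

# Illustrative run: adapt to standard-normal input (assumed, not from the paper).
rng = np.random.default_rng(0)
a, b = 1.0, 0.0
for _ in range(50000):
    a, b, _ = ip_update(a, b, rng.normal())

# After adaptation, the mean output over fresh samples should sit near mu.
xs = rng.normal(size=10000)
ys = 1.0 / (1.0 + np.exp(-(a * xs + b)))
print(float(ys.mean()))
```

Under non-stationary (drifting) input statistics, the same update loop simply keeps running, which is why the rate of convergence of the gradient dynamics, improved by NIP, matters for the drift experiments described in the abstract.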