The Liquid State Machine (LSM) is a biologically plausible computational neural network model for real-time computing on time-varying inputs, whose structure and function were inspired by the properties of neocortical columns in the mammalian central nervous system. The LSM uses spiking neurons connected by dynamic synapses to project inputs into a high-dimensional feature space, allowing classification of inputs by linear separation, similar to the approach used in support vector machines (SVMs). The performance of an LSM on pattern recognition tasks depends mainly on its parameter settings. Two parameters are of particular interest: the distribution of synaptic strengths and the synaptic connectivity. To design an efficient liquid filter that performs the desired kernel function, these parameters need to be optimized. We have studied performance as a function of these parameters for several models of synaptic connectivity. The results show that, to achieve good performance, large synaptic weights are required to compensate for a small number of synapses in the liquid filter, and vice versa. In addition, a larger variance of the synaptic weights results in better performance on LSM benchmark problems. We also propose a genetic algorithm-based approach that evolves the liquid filter from a minimal structure with no connections to an optimized kernel with a small number of synapses and high classification accuracy. This approach facilitates the design of an optimal LSM with reduced computational complexity. Results obtained with this genetic algorithm approach show that the synaptic weight distribution after evolution is similar in shape to that found in cortical circuitry.
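The pipeline described above — a sparse random recurrent "liquid" projecting a time-varying input into a high-dimensional state space, followed by a linear readout — can be sketched as follows. This is a minimal illustrative simplification, not the paper's model: a leaky-integrator rate network stands in for spiking neurons with dynamic synapses, and all parameter values (`N`, `p_conn`, `w_scale`, the toy two-class sinusoid task) are hypothetical choices for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical liquid parameters (illustrative values, not from the paper).
N = 100          # number of reservoir units
p_conn = 0.1     # synaptic connectivity: fraction of possible connections
w_scale = 1.0    # synaptic weight scale; the abstract notes that sparse
                 # connectivity must be offset by larger weights

# Sparse random recurrent weight matrix and input weights.
mask = rng.random((N, N)) < p_conn
W = w_scale * rng.normal(0.0, 1.0 / np.sqrt(p_conn * N), size=(N, N)) * mask
W_in = rng.normal(0.0, 1.0, size=N)

def liquid_state(u, leak=0.3):
    """Project an input time series u into the liquid's state space.

    A leaky tanh integrator approximates the liquid's dynamics; only the
    final high-dimensional state is kept as the feature vector.
    """
    x = np.zeros(N)
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u_t)
    return x

# Toy two-class task: slow vs. fast noisy sinusoids.
T = 50
t = np.arange(T)
def sample(cls):
    freq = 0.05 if cls == 0 else 0.25
    return np.sin(2 * np.pi * freq * t) + 0.05 * rng.normal(size=T)

labels = [0, 1] * 30
X = np.array([liquid_state(sample(c)) for c in labels])
y = np.array(labels)

# Linear readout: ridge regression, thresholded at 0.5 — classification
# by linear separation in the liquid's feature space.
lam = 1e-3
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
pred = (X @ W_out > 0.5).astype(int)
acc = (pred == y).mean()
```

Varying `p_conn` and `w_scale` jointly lets one explore the connectivity/weight trade-off the abstract reports; in the same spirit, a genetic algorithm could start from an all-zero `mask` and evolve connections toward high `acc` with few synapses.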