A Solution for the N-bit Parity Problem Using a Single Translated Multiplicative Neuron
Neural Processing Letters
Evolutionary algorithms have been successfully applied to the design and training of neural networks, including the optimization of network architecture, the learning of connection weights, and the selection of training data. While most existing evolutionary methods focus on only one of these aspects, this paper presents an integrated approach that employs evolutionary mechanisms to optimize all of these components simultaneously. The approach is especially effective when evolving irregular, not-strictly-layered networks of heterogeneous neurons with variable receptive fields. The core of our method is a neural tree representation scheme combined with a Bayesian evolutionary learning framework. The generality and flexibility of neural trees make it easy to express and modify complex neural architectures by means of standard crossover and mutation operators. The Bayesian evolutionary framework provides a theoretical foundation for finding compact neural networks from small data sets through principled exploitation of background knowledge available in the problem domain. The performance of the presented method is demonstrated on a suite of benchmark problems and compared with that of related methods.
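To make the neural tree idea concrete, the following is a minimal, hypothetical Python sketch of such a representation: internal nodes are summation units with weighted links to their children, leaves read input variables, and a simple Gaussian weight-mutation operator acts directly on the tree. The class and function names (`Node`, `mutate_weights`) are illustrative assumptions, not the paper's actual implementation, and the sketch omits structural mutation and crossover.

```python
import math
import random

class Node:
    """One node of a neural tree.

    kind == "input": a leaf that returns input component x[index].
    kind == "sigma": a summation unit applying tanh to the weighted
    sum of its children's outputs.
    """
    def __init__(self, kind, index=None, children=None, weights=None):
        self.kind = kind
        self.index = index
        self.children = children or []
        self.weights = weights or []

    def evaluate(self, x):
        if self.kind == "input":
            return x[self.index]
        s = sum(w * c.evaluate(x)
                for w, c in zip(self.weights, self.children))
        return math.tanh(s)

def mutate_weights(node, rng, sigma=0.1):
    """Perturb every connection weight in the tree with Gaussian noise,
    a standard mutation operator for evolving connection weights."""
    node.weights = [w + rng.gauss(0.0, sigma) for w in node.weights]
    for c in node.children:
        mutate_weights(c, rng, sigma)

# Usage: a two-input tree computing tanh(0.5*x0 - 0.5*x1).
rng = random.Random(0)
tree = Node("sigma",
            children=[Node("input", index=0), Node("input", index=1)],
            weights=[0.5, -0.5])
before = tree.evaluate([1.0, 1.0])   # tanh(0.0) == 0.0
mutate_weights(tree, rng)
after = tree.evaluate([1.0, 1.0])    # generally nonzero after mutation
```

Because the genotype is the tree itself, structural variation (growing, pruning, or swapping subtrees) can be implemented as ordinary tree edits, which is what makes standard crossover and mutation operators applicable to irregular architectures.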