Input and Output Feature Selection
ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
This paper addresses the genetic design of functional link networks (FLN). FLN are high-order perceptrons (HOP) without hidden units. Despite their linear nature, FLN can capture nonlinear input-output relationships, provided that they are fed with an adequate set of polynomial inputs constructed from the original input attributes. Given this set, training the network is very simple compared with a multilayer perceptron (MLP). However, finding the optimal subset of units is a difficult problem because of its nongradient nature and the large number of available units, especially for high degrees. Some constructive growing methods have been proposed to address this issue. Here, we rely instead on the global search capabilities of a genetic algorithm to scan the space of subsets of polynomial units, which is plagued by a host of local minima. By contrast, the quadratic error function of each individual FLN has only one minimum, which makes fitness evaluation practically noiseless. We find that surprisingly simple FLN compare favorably with other, more complex architectures derived by means of constructive and evolutionary algorithms on some UCI benchmark data sets. Moreover, our models are especially amenable to interpretation, due to an incremental approach that penalizes complex architectures and starts with a pool of single-attribute FLN.
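The scheme described in the abstract can be sketched as follows: expand the original attributes into polynomial units, evaluate each candidate subset by training an FLN with ordinary least squares (a noiseless fitness, since the quadratic error has a single minimum), and let a genetic algorithm search the space of binary unit masks. This is a minimal illustration, not the authors' implementation; the complexity-penalty weight, GA operators, and population settings are all assumptions.

```python
import itertools
import random
import numpy as np

def polynomial_units(X, degree):
    """Expand the attributes into all monomials up to `degree`.
    Returns the expanded design matrix and the index tuples labelling
    each column (e.g. (0, 0, 2) means x0 * x0 * x2 = x0^2 * x2)."""
    n_features = X.shape[1]
    terms = [t for d in range(1, degree + 1)
             for t in itertools.combinations_with_replacement(range(n_features), d)]
    columns = [np.prod(X[:, list(t)], axis=1) for t in terms]
    return np.column_stack(columns), terms

def fln_fitness(mask, Phi, y):
    """Train an FLN on the selected units by least squares (bias included)
    and return its penalized mean squared error. The penalty weight 1e-3
    per unit is an assumed value favoring simple architectures."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return np.inf
    A = np.column_stack([np.ones(len(y)), Phi[:, cols]])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.mean((A @ w - y) ** 2) + 1e-3 * cols.size

def genetic_search(Phi, y, pop_size=30, generations=50, seed=0):
    """Plain generational GA over binary masks of polynomial units."""
    rng = random.Random(seed)
    n = Phi.shape[1]
    pop = [[rng.random() < 0.2 for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda m: fln_fitness(np.array(m), Phi, y))
        pop = scored[:2]                        # elitism: keep the two best
        while len(pop) < pop_size:
            a, b = rng.sample(scored[:10], 2)   # truncation selection
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.3:              # bit-flip mutation
                i = rng.randrange(n)
                child[i] = not child[i]
            pop.append(child)
    return np.array(min(pop, key=lambda m: fln_fitness(np.array(m), Phi, y)))
```

For example, with two attributes and a target `y = x0*x1 + 0.5*x0`, a degree-2 expansion yields five units (x0, x1, x0², x0·x1, x1²), and the GA should settle on a mask containing the two units actually present in the target, since any extra unit only adds to the complexity penalty.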