References:
- Multilayer feedforward networks are universal approximators. Neural Networks.
- Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations.
- The cascade-correlation learning architecture. Advances in Neural Information Processing Systems 2.
- Feedforward nets for interpolation and classification. Journal of Computer and System Sciences.
- Parity with two layer feedforward nets. Neural Networks.
- Time Series Analysis, Forecasting and Control.
- Predicting Time Series with Support Vector Machines. ICANN '97: Proceedings of the 7th International Conference on Artificial Neural Networks.
- The Influence of the Sigmoid Function Parameters on the Speed of Backpropagation Learning. IWANN '96: Proceedings of the International Workshop on Artificial Neural Networks: From Natural to Artificial Neural Computation.
- Soft Computing - A Fusion of Foundations, Methodologies and Applications.
- Fast Learning Neural Network Intrusion Detection System. AIMS '09: Proceedings of the 3rd International Conference on Autonomous Infrastructure, Management and Security: Scalability of Networks and Services.
The paper offers a critical analysis of the procedure observed in many applications of neural networks: given a problem to be solved, a favorite network architecture is chosen and its parameters are tuned with some standard training algorithm, without taking into consideration relevant features of the problem or its possibly interdisciplinary nature. Three relevant benchmark problems are discussed to illustrate the thesis that "brute force solving is not the same as understanding".
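The procedure the abstract criticizes can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's own code): an arbitrarily chosen two-layer sigmoid network is tuned by plain backpropagation on the two-bit parity (XOR) benchmark, one of the classic problems the cited references address. The architecture size, learning rate, and iteration count are assumptions picked by trial and error, which is exactly the "brute force" point.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR / 2-bit parity benchmark: target is the parity of the two inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Arbitrarily chosen architecture: 2 inputs, 4 hidden sigmoid units, 1 output.
W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros(1)

lr = 0.5  # learning rate found by trial and error ("brute force" tuning)

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out0 = forward(X)
initial_loss = float(np.mean((out0 - y) ** 2))

for _ in range(5000):
    # forward pass
    h, out = forward(X)
    # backward pass for a squared-error loss on sigmoid units
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # full-batch gradient-descent updates
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
final_loss = float(np.mean((out - y) ** 2))
preds = (out > 0.5).astype(int)
print(initial_loss, final_loss, preds.ravel())
```

The network typically ends up separating the parity classes, but nothing in the procedure explains *why* the chosen architecture suffices, which is the gap between brute-force solving and understanding that the paper emphasizes.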