Artificial neural networks (ANNs) are used extensively to model unknown or unspecified functional relationships between the inputs and outputs of a "black box" system. To apply the generic ANN concept to an actual system modeling problem, the chosen (postulated) ANN structure must be trained: its parameters are selected so as to minimize the discrepancy between the modeled system output and the training set of observations. We treat the parameterization of ANNs as a potentially multi-modal optimization problem and introduce a corresponding global optimization (GO) framework. The practical viability of the GO-based ANN training approach is illustrated by finding close numerical approximations of one-dimensional, yet visibly challenging, functions. For this purpose, we have implemented a flexible ANN framework and an easily expandable set of test functions in the technical computing system Mathematica. The MathOptimizer Professional global-local optimization software is used to solve the induced (multi-dimensional) ANN calibration problems.
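The calibration scheme described above can be sketched in miniature: fit a small one-hidden-layer tanh network to samples of a wavy one-dimensional target by minimizing the sum of squared errors, using a multistart (global phase) wrapped around a simple improvement-only local search (local phase). This is only an illustrative toy, not the paper's implementation: the actual work uses Mathematica with the MathOptimizer Professional solver, and all function names and the crude local search here are hypothetical stand-ins.

```python
import math
import random

# Illustrative sketch only -- the paper's own implementation is in Mathematica
# with MathOptimizer Professional; every name below is a hypothetical stand-in.

H = 4  # hidden units; the network has 3*H + 1 tunable parameters

def ann(params, x):
    """One-hidden-layer tanh network evaluated at a scalar input x."""
    w, b, v = params[:H], params[H:2*H], params[2*H:3*H]
    c = params[3*H]
    return sum(v[j] * math.tanh(w[j] * x + b[j]) for j in range(H)) + c

def sse(params, data):
    """Calibration objective: sum of squared errors over the training set.
    As a function of the weights this is in general multi-modal."""
    return sum((ann(params, x) - y) ** 2 for x, y in data)

def local_search(params, data, iters=300, step=0.5, rng=random):
    """Crude local phase: accept-if-better Gaussian perturbations with a
    slowly shrinking step (a stand-in for a real local optimizer)."""
    best, best_f = list(params), sse(params, data)
    for _ in range(iters):
        cand = [p + rng.gauss(0.0, step) for p in best]
        f = sse(cand, data)
        if f < best_f:
            best, best_f = cand, f
        else:
            step *= 0.995
    return best, best_f

def multistart_train(data, starts=8, seed=0):
    """Global phase: multistart over random initial weight vectors, keeping
    the best local result -- the simplest global-local scheme."""
    rng = random.Random(seed)
    dim = 3 * H + 1
    best, best_f = None, float("inf")
    for _ in range(starts):
        p0 = [rng.uniform(-2.0, 2.0) for _ in range(dim)]
        p, f = local_search(p0, data, rng=rng)
        if f < best_f:
            best, best_f = p, f
    return best, best_f

# A one-dimensional but wavy target, sampled on [-2, 2]
data = [(x / 10.0, math.sin(3 * x / 10.0) + 0.3 * math.cos(7 * x / 10.0))
        for x in range(-20, 21)]
params, loss = multistart_train(data)
```

Because the SSE landscape has many local minima in the weights, a purely local method started from one point can stall; the multistart wrapper is the minimal global remedy, and stronger global solvers (as used in the paper) replace the outer loop while keeping the same induced calibration problem.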