The use of neural network models for time series forecasting has been motivated by experimental results indicating a high capacity for function approximation with good accuracy. These models generally use activation functions with fixed parameters, yet the choice of activation function strongly influences both the complexity and the performance of a neural network, and in practice only a limited repertoire of activation functions has been employed. We describe the use of a family of asymmetric activation functions with a free parameter for neural networks, and we prove that this family satisfies the requirements of the universal approximation theorem. We then present a methodology for the global optimization of both the free parameter of the activation function family and the connection weights between the processing units of the network. The main idea is to optimize the weights and the activation function of a Multilayer Perceptron (MLP) simultaneously, through an approach that combines the advantages of simulated annealing, tabu search, and a local learning algorithm. We consider two local learning algorithms: backpropagation with momentum (BPM) and Levenberg-Marquardt (LM). The overall purpose is to improve performance in time series forecasting.
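To illustrate the core idea of optimizing weights and the activation function jointly, the sketch below pairs an asymmetric activation with a free shape parameter and a bare-bones simulated-annealing loop that perturbs the network weights and the activation parameter together. This is a minimal sketch under assumptions: the complementary log-log parameterization (`cloglog`), the tiny network size, and the linear cooling schedule are illustrative choices only, not the paper's actual method, which additionally incorporates tabu search and a local learning step (BPM or LM).

```python
import math
import random

def cloglog(x, a):
    # Complementary log-log activation: asymmetric about its inflection
    # point; `a` is a free shape parameter (illustrative parameterization).
    # The argument is clamped to avoid overflow in exp().
    return 1.0 - math.exp(-math.exp(min(a * x, 30.0)))

def forward(x, p, n_hidden):
    # p = [w..., b..., v..., a]: hidden weights, hidden biases,
    # output weights, and the shared activation parameter `a`.
    w = p[:n_hidden]
    b = p[n_hidden:2 * n_hidden]
    v = p[2 * n_hidden:3 * n_hidden]
    a = p[-1]
    return sum(vi * cloglog(wi * x + bi, a) for wi, bi, vi in zip(w, b, v))

def anneal(data, n_hidden=4, steps=3000, seed=1):
    # Simulated annealing over the joint parameter vector: every proposal
    # perturbs the weights AND the activation's free parameter at once.
    rng = random.Random(seed)
    p = [rng.uniform(-1.0, 1.0) for _ in range(3 * n_hidden)] + [1.0]

    def mse(q):
        return sum((forward(x, q, n_hidden) - y) ** 2 for x, y in data) / len(data)

    cost = mse(p)
    for step in range(steps):
        t = max(1.0 * (1.0 - step / steps), 1e-4)   # linear cooling schedule
        q = [pi + rng.gauss(0.0, 0.1) for pi in p]  # joint perturbation
        c = mse(q)
        # accept improvements always, worse moves with Boltzmann probability
        if c < cost or rng.random() < math.exp((cost - c) / t):
            p, cost = q, c
    return p, cost

# usage: fit a small sample from an asymmetric target function
data = [(x / 10.0, 1.0 - math.exp(-math.exp(x / 10.0))) for x in range(-20, 21)]
params, err = anneal(data)
```

In the paper's hybrid scheme, a loop of this kind would be interleaved with tabu-list bookkeeping to avoid revisiting configurations and with a local learning algorithm (BPM or LM) to refine each accepted candidate; the sketch keeps only the annealing skeleton to show how the free activation parameter enters the search space alongside the weights.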