Trace-Based Methods for Solving Nonlinear Global Optimization and Satisfiability Problems
Journal of Global Optimization
Many learning algorithms are rooted in function-minimization methods, which can be classified as either local- or global-minimization algorithms. Algorithms at either extreme, pure local search or pure global search, do not work well on their own. The authors propose a hybrid method, called NOVEL (Nonlinear Optimization via External Lead), that combines global and local searches to explore the solution space, locate promising regions, and find local minima. To guide its exploration of the solution space, NOVEL follows a continuous, terrain-independent trace that does not get trapped in local minima. It then uses the local gradient to attract the search toward a local minimum, while the trace pulls the search out once little further improvement is found. Finally, NOVEL selects one initial point in each promising region and applies a descent algorithm from these points to find the local minima. It thereby avoids expensive descent runs from random starting points in unpromising regions. In an implementation based on differential- and difference-equation solvers, NOVEL demonstrated superior performance against the best global optimization algorithms on five benchmark comparisons.
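As a rough illustration of the trace-then-descend idea described above (not the authors' actual trajectory equations), the sketch below sweeps a continuous trace over a toy 1-D multimodal function, keeps the best point seen in each region as a promising start, and then runs plain gradient descent only from those starts. The objective `f`, the region width, and all step sizes are hypothetical choices for the demo.

```python
import math

# Toy 1-D multimodal objective (a hypothetical stand-in for a benchmark):
# several local minima, with the global minimum near x ~ -0.3.
def f(x):
    return x * x + 2.0 * math.sin(5.0 * x)

def grad_f(x, h=1e-6):
    # Central-difference gradient, used by the local-descent stage.
    return (f(x + h) - f(x - h)) / (2.0 * h)

def trace_search(lo, hi, steps=200):
    """Global stage: follow a continuous trace across the terrain and
    record the best point seen in each region (a crude stand-in for
    NOVEL's trace-based exploration of promising regions)."""
    region_best = {}
    for i in range(steps):
        x = lo + (hi - lo) * i / (steps - 1)
        region = int((x - lo) // 1.0)  # 1-unit-wide regions
        if region not in region_best or f(x) < f(region_best[region]):
            region_best[region] = x
    return list(region_best.values())

def descend(x0, lr=0.01, iters=500):
    """Local stage: plain gradient descent from a single starting point."""
    x = x0
    for _ in range(iters):
        x -= lr * grad_f(x)
    return x

# Final stage: run the expensive descent only from the promising starts,
# instead of from many random points.
starts = trace_search(-3.0, 3.0)
minima = [descend(x0) for x0 in starts]
best = min(minima, key=f)
```

The point of the structure is the division of labor: the trace stage is cheap and cannot get stuck, while the descent stage is accurate but costly, so it is invoked only once per promising region.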