Ant colony optimization (ACO) is an optimization technique inspired by the foraging behaviour of real ant colonies. Originally, the method was introduced for application to discrete optimization problems. Recently we proposed a first ACO variant for continuous optimization. In this work we choose the training of feed-forward neural networks for pattern classification as a test case for this algorithm. In addition, we propose hybrid algorithm variants that incorporate short runs of classical gradient techniques such as backpropagation. To evaluate our algorithms we apply them to classification problems from the medical field, and compare the results to those of basic algorithms from the literature. The results show, first, that the best of our algorithms are comparable to gradient-based algorithms for neural network training, and second, that our algorithms compare favorably with a basic genetic algorithm.
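The core idea described above can be sketched in code. The following is a minimal, illustrative implementation of a continuous-ACO scheme in the spirit of the one described (a solution archive with Gaussian sampling around ranked solutions), used to train a tiny feed-forward network on the XOR task. The network size, archive size, and the parameters `Q` and `XI` are illustrative assumptions, not the authors' exact settings, and the hybrid backpropagation step is omitted for brevity.

```python
# Illustrative sketch of continuous ACO training a tiny feed-forward
# network on XOR. All hyperparameters below are assumptions, not the
# paper's exact configuration.
import math
import random

random.seed(0)

# Tiny 2-2-1 network: 9 weights (2x2 hidden weights + 2 hidden biases
# + 2 output weights + 1 output bias).
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
DIM = 9

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x):
    h0 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h0 + w[7] * h1 + w[8])

def error(w):
    # Sum of squared errors over the training set.
    return sum((forward(w, x) - y) ** 2 for x, y in DATA)

ARCHIVE = 10        # k: number of solutions kept in the archive
ANTS = 20           # new candidate solutions sampled per iteration
Q, XI = 0.1, 0.85   # locality of search / convergence speed (assumed)

# Initialise the archive with random weight vectors, best first.
archive = sorted(([random.uniform(-1, 1) for _ in range(DIM)]
                  for _ in range(ARCHIVE)), key=error)
init_err = error(archive[0])

# Gaussian-kernel selection weights for the ranked archive solutions:
# better-ranked solutions are more likely to guide new samples.
ws = [math.exp(-(r ** 2) / (2 * (Q * ARCHIVE) ** 2))
      for r in range(ARCHIVE)]
probs = [w / sum(ws) for w in ws]

for _ in range(300):
    samples = []
    for _ in range(ANTS):
        # Each ant picks one archive solution and samples around it.
        g = random.choices(range(ARCHIVE), weights=probs)[0]
        guide = archive[g]
        sol = []
        for d in range(DIM):
            # Per-dimension std dev: average distance from the guide
            # to the other archive members, scaled by XI.
            sigma = XI * sum(abs(a[d] - guide[d])
                             for a in archive) / (ARCHIVE - 1)
            sol.append(random.gauss(guide[d], sigma))
        samples.append(sol)
    # Keep only the best ARCHIVE solutions (the "pheromone update").
    archive = sorted(archive + samples, key=error)[:ARCHIVE]

best = archive[0]
print(round(error(best), 3))
```

A hybrid variant as described in the abstract would interleave this sampling loop with a few backpropagation steps applied to the best archive members, refining each sampled weight vector locally before it competes for a place in the archive.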