A key task for connectionist research is the development and analysis of learning algorithms. This paper examines several supervised learning algorithms for single-cell and network models. At the heart of these algorithms is the pocket algorithm, a modification of perceptron learning that makes perceptron learning well behaved with nonseparable training data, even when the data are noisy and contradictory. These algorithms offer: speed, i.e. they are fast enough to handle large sets of training data; network scaling, i.e. network methods scale up almost as well as single-cell models as the number of inputs increases; analytic tractability, i.e. upper bounds on classification error can be derived; online learning, i.e. some variants can learn continually without referring to previous data; and winner-take-all (choice) groups, i.e. the algorithms can be adapted to select one of several possible classifications. These learning algorithms are suitable for applications in machine learning, pattern recognition, and connectionist expert systems.
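The pocket algorithm mentioned above can be sketched as follows: run ordinary perceptron updates, but keep ("pocket") a copy of the weight vector that has classified the most training examples correctly so far, so a usable answer survives even when the data are not linearly separable. This is a minimal illustrative version, not the paper's exact formulation; the function name and parameters are my own.

```python
import numpy as np

def pocket_train(X, y, epochs=200, seed=0):
    """Pocket algorithm sketch: perceptron updates on randomly drawn
    examples, retaining the best weight vector seen so far.
    X: (n, d) array of inputs; y: (n,) array of labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])  # absorb the bias into the weights
    w = np.zeros(d + 1)                   # current perceptron weights
    pocket_w = w.copy()                   # best ("pocketed") weights so far
    best_correct = 0

    def n_correct(wv):
        # Number of training examples the weight vector classifies correctly.
        return int(np.sum(np.sign(Xb @ wv) == y))

    for _ in range(epochs):
        i = rng.integers(n)               # draw a random training example
        if np.sign(Xb[i] @ w) != y[i]:    # misclassified: perceptron update
            w = w + y[i] * Xb[i]
            c = n_correct(w)
            if c > best_correct:          # improvement: update the pocket
                best_correct = c
                pocket_w = w.copy()
    return pocket_w
```

On separable data this behaves like the plain perceptron; on noisy or contradictory data the pocketed weights converge (in probability) toward a vector with minimal training error, which is what makes the method well behaved where the plain perceptron cycles.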