In this paper we present two supervised pattern classifiers designed using Boolean neural networks (BNN): 1) the nearest-to-an-exemplar (NTE) classifier and 2) the Boolean k-nearest neighbor (BKNN) classifier. The emphasis during the design of these classifiers was on simplicity, robustness, and ease of hardware implementation. Both classifiers use the idea of a radius of attraction (ROA) to achieve their goal. A mathematical analysis of the algorithms is presented to establish their feasibility. Both classifiers are tested on well-known data sets with binary and continuous feature values, yielding results comparable to those obtained by similar existing classifiers.
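To make the flavor of the BKNN idea concrete, the following is a minimal illustrative sketch, not the paper's actual hardware-oriented algorithm: a k-nearest-neighbor classifier over binary feature vectors using Hamming distance, where a hypothetical `roa` cutoff (standing in for the radius of attraction) limits how far an exemplar may be from the input and still cast a vote. All names and the exact voting scheme here are assumptions for illustration.

```python
# Illustrative sketch only (assumed details, not the paper's exact BKNN design):
# k-NN over binary vectors with Hamming distance and an optional ROA cutoff.
from collections import Counter

def hamming(a, b):
    """Number of bit positions where binary vectors a and b differ."""
    return sum(x != y for x, y in zip(a, b))

def bknn_classify(exemplars, x, k=3, roa=None):
    """Classify binary vector x by majority vote of its k nearest
    stored exemplars. `exemplars` is a list of (bit_vector, label).
    If roa is given, exemplars farther than roa bits are ignored."""
    dists = [(hamming(e, x), label) for e, label in exemplars]
    if roa is not None:
        dists = [d for d in dists if d[0] <= roa]
    dists.sort(key=lambda t: t[0])
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0] if votes else None

# Toy usage: two classes of 4-bit patterns.
train = [((0, 0, 0, 1), "A"), ((0, 0, 1, 1), "A"),
         ((1, 1, 0, 0), "B"), ((1, 1, 1, 0), "B")]
print(bknn_classify(train, (0, 0, 0, 0), k=3))  # → A
```

Because every operation reduces to bit comparison, counting, and thresholding, a classifier of this shape maps naturally onto Boolean logic, which is consistent with the paper's stated emphasis on ease of hardware implementation.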