Although many symbolic and connectionist (neural-network) learning algorithms address the same problem of learning from classified examples, very little is known about their comparative strengths and weaknesses. This paper presents the results of experiments comparing the ID3 symbolic learning algorithm with the perceptron and backpropagation connectionist learning algorithms on several large real-world data sets. The results show that ID3 and the perceptron run significantly faster than backpropagation, both during learning and during classification of novel examples. However, the probability of correctly classifying new examples is about the same for all three systems. On noisy data sets there is some indication that backpropagation classifies more accurately.
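To illustrate the kind of connectionist learner the abstract refers to, the following is a minimal sketch of the classic perceptron learning rule on a toy, linearly separable problem (logical AND). The data and parameters here are illustrative assumptions, not the paper's experimental setup.

```python
def train_perceptron(examples, epochs=20, lr=0.1):
    """Train a single-unit perceptron.

    examples: list of (features, label) pairs with label in {0, 1}.
    Returns the learned weights and bias.
    """
    n = len(examples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in examples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation > 0 else 0
            err = y - pred  # -1, 0, or +1
            # Classic perceptron update: move weights toward the target.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    """Threshold the weighted sum to produce a binary class."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy training set: logical AND, which is linearly separable,
# so the perceptron is guaranteed to converge on it.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Backpropagation generalizes this idea to multi-layer networks by propagating the error gradient through hidden units, which is what makes it slower to train than either the perceptron or ID3's decision-tree induction.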