Machine Learning - Special issue on multistrategy learning
This article describes a connectionist method for refining algorithms represented as generalized finite-state automata. The method translates the rule-like knowledge in an automaton into a corresponding artificial neural network, and then refines the reformulated automaton by applying backpropagation to a set of examples. This technique for translating an automaton into a network extends the KBANN algorithm, a system that translates a set of propositional rules into a corresponding neural network. The extended system, FSKBANN, allows one to refine the large class of algorithms that can be represented as state-based processes. As a test, FSKBANN is used to improve the Chou–Fasman algorithm, a method for predicting how globular proteins fold. Empirical evidence shows that the multistrategy approach of FSKBANN leads to a statistically significantly more accurate solution than both the original Chou–Fasman algorithm and a neural network trained using the standard approach. Extensive statistics report the types of errors made by the Chou–Fasman algorithm, the standard neural network, and the FSKBANN network.
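To make the core translation step concrete, the following is a minimal sketch (an illustrative reconstruction, not the authors' code) of the KBANN-style encoding the abstract describes: each automaton transition "in state s, on symbol a, go to state t" becomes a hidden AND unit, and each state unit ORs together the hidden units that target it. Large initial weights make the untrained network mimic the automaton exactly, while the soft sigmoid units keep every step differentiable so that backpropagation can refine the weights on examples. The parity automaton and the weight magnitude `w` here are arbitrary choices for illustration.

```python
import numpy as np

# Example DFA: parity of 1s over alphabet {0, 1}; state 1 means "odd so far".
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
n_states, n_symbols, w = 2, 2, 8.0

# One hidden AND unit per transition rule: strongly excited by its own
# state and symbol units, inhibited by all others, biased so that both
# antecedents must be active for the unit to fire.
rules = list(delta.items())
W_h = np.full((len(rules), n_states + n_symbols), -w)
b_h = np.full(len(rules), -1.5 * w)
# One output unit per state, ORing the hidden rules that lead to it.
W_o = np.zeros((n_states, len(rules)))
b_o = np.full(n_states, -0.5 * w)
for k, ((s, a), t) in enumerate(rules):
    W_h[k, s] = w
    W_h[k, n_states + a] = w
    W_o[t, k] = w

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def run(symbols, state=0):
    """Run the compiled network over a symbol sequence; return final state."""
    h = np.eye(n_states)[state]                 # one-hot start state
    for a in symbols:
        x = np.concatenate([h, np.eye(n_symbols)[a]])
        h = sigmoid(W_o @ sigmoid(W_h @ x + b_h) + b_o)  # soft, differentiable step
    return int(h.argmax())
```

Before any training, `run` reproduces the automaton's transitions (e.g. `run([1, 0, 1, 1])` ends in the odd-parity state); gradient descent on labeled sequences can then adjust the weights away from this rule-derived starting point, which is the refinement step the article evaluates on the Chou–Fasman algorithm.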