Using Knowledge-Based Neural Networks to Improve Algorithms: Refining the Chou–Fasman Algorithm for Protein Folding

  • Authors:
  • Richard Maclin and Jude W. Shavlik

  • Affiliations:
  • Computer Sciences Department, University of Wisconsin, 1210 W. Dayton St., Madison, WI 53706. MACLIN@CS.WISC.EDU; SHAVLIK@CS.WISC.EDU

  • Venue:
  • Machine Learning - Special issue on multistrategy learning
  • Year:
  • 1993


Abstract

This article describes a connectionist method for refining algorithms represented as generalized finite-state automata. The method translates the rule-like knowledge in an automaton into a corresponding artificial neural network, and then refines the reformulated automaton by applying backpropagation to a set of examples. This technique for translating an automaton into a network extends the KBANN algorithm, a system that translates a set of propositional rules into a corresponding neural network. The extended system, FSKBANN, allows one to refine the large class of algorithms that can be represented as state-based processes. As a test, FSKBANN is used to improve the Chou–Fasman algorithm, a method for predicting how globular proteins fold. Empirical evidence shows that the multistrategy approach of FSKBANN leads to a statistically significantly more accurate solution than both the original Chou–Fasman algorithm and a neural network trained using the standard approach. Extensive statistics report the types of errors made by the Chou–Fasman algorithm, the standard neural network, and the FSKBANN network.
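To illustrate the general KBANN-style idea the abstract describes (encode rule-like knowledge as initial network weights, then refine with backpropagation), here is a minimal, hypothetical sketch in Python. The rule, feature names, weight magnitude, and training data are invented for illustration and are not taken from the paper; the paper's networks are larger and state-based.

```python
# Illustrative sketch (not the authors' code): a single propositional rule is
# encoded as the initial weights of one sigmoid unit, then refined by gradient
# descent on labeled examples. All rules, features, and constants are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rule:  HELIX :- ALPHA_FORMER and not BREAKER
# KBANN-style encoding: weight +W for each positive antecedent, -W for each
# negated one, and a bias that makes the unit fire only when the body holds.
W = 4.0
features = ["ALPHA_FORMER", "BREAKER", "HYDROPHOBIC"]  # third feature is not in the rule
weights = np.array([W, -W, 0.0])
bias = -(1 - 0.5) * W  # one positive antecedent

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic training set whose "true" concept the initial rule only approximates:
# here, a helix also requires HYDROPHOBIC.
X = rng.integers(0, 2, size=(200, 3)).astype(float)
y = ((X[:, 0] == 1) & (X[:, 1] == 0) & (X[:, 2] == 1)).astype(float)

# Refine the rule-derived weights with gradient descent (standing in for the
# backpropagation step used on the full multi-layer networks).
lr = 0.5
for epoch in range(500):
    p = sigmoid(X @ weights + bias)
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    weights -= lr * grad_w
    bias -= lr * grad_b

print("refined weights:", np.round(weights, 2), "bias:", round(bias, 2))
```

After training, the weight on the third feature grows away from zero, showing how refinement can revise an initially incomplete rule; FSKBANN applies the same principle to the state-based structure of the Chou–Fasman algorithm.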