References (titles and venues as listed):
- APL '90 Conference Proceedings on APL 90: For the Future
- Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations
- APL '91 Proceedings of the International Conference on APL '91
- Language as an intellectual tool: from hieroglyphics to APL (IBM Systems Journal)
- IBM Systems Journal
- APL '92 Proceedings of the International Conference on APL
- APL '84 Proceedings of the International Conference on APL
- Time series forecasting using neural networks (APL '94 Proceedings of the International Conference on APL: The Language and Its Applications)
- A parallel correlation-based algorithm in J learns neural network connections (APL '94 Proceedings of the International Conference on APL: The Language and Its Applications)
- Early bankruptcy detection using neural networks (APL '95 Proceedings of the International Conference on Applied Programming Languages)
Neural networks, trained by backpropagation, are designed and described in the language J, an APL derivative with powerful function-encapsulation features. Both J [4,6,7] and APL [5] help to identify and isolate the parallelism that is inherent in network training algorithms. Non-critical details of data input and derived output are de-emphasized by relegating those functions to callable, stand-alone modules; such input and output modules can be isolated and customized individually to manage communication with arbitrary external storage systems. The central objective of this research is the design and precise description of a neural network training kernel. Such kernel designs are valuable for producing efficient, reusable computer code and for facilitating the transfer of neural network technology from developers to users.
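The abstract's key point, that the training algorithm can be expressed as whole-array operations with no explicit loops over neurons or connections, can be illustrated with a minimal sketch. This is not the paper's J code; it is a hypothetical Python/NumPy analogue of a full-batch backpropagation training step, where each forward pass, delta computation, and weight update is a single array expression, mirroring the style that J and APL encourage. All names (`train_step`, `W1`, `W2`, the XOR toy data) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def augment(a):
    # Append a bias column of ones, so biases ride along in the weight arrays.
    return np.hstack([a, np.ones((a.shape[0], 1))])

def train_step(x, y, w1, w2, lr=0.5):
    """One full-batch backprop update; every step is a whole-array operation."""
    h = sigmoid(augment(x) @ w1)            # hidden activations, all samples at once
    o = sigmoid(augment(h) @ w2)            # output activations
    d_o = (o - y) * o * (1.0 - o)           # output delta (squared-error loss)
    d_h = (d_o @ w2[:-1].T) * h * (1.0 - h) # hidden delta, backpropagated
    w2 -= lr * augment(h).T @ d_o           # gradients are plain matrix products
    w1 -= lr * augment(x).T @ d_h
    return float(np.mean((o - y) ** 2))

# XOR as a toy training set.
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
w1 = rng.normal(size=(3, 4))  # 2 inputs + bias -> 4 hidden units
w2 = rng.normal(size=(5, 1))  # 4 hidden + bias -> 1 output

losses = [train_step(x, y, w1, w2) for _ in range(5000)]
print(losses[0], losses[-1])
```

Note how the kernel itself contains no file handling at all: as in the modular design the abstract describes, any data input or output would live in separate callable modules, leaving `train_step` a pure array-to-array computation.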