Constructive learning algorithms are important because they address two practical difficulties of learning in artificial neural networks. First, it is not always possible to determine the minimal network consistent with a particular problem. Second, algorithms such as backpropagation can require networks larger than the minimal architecture for satisfactory convergence. Further, constructive algorithms have the advantage that polynomial-time learning is possible if network size is chosen by the learning algorithm so that learning of the problem under consideration is simplified. This article considers the representational ability of feedforward networks (FFNs) in terms of the fan-in required by the hidden units of a network. We define network order to be the maximum fan-in of the hidden units of a network. We prove, in terms of the problems they may represent, that a higher-order network (HON) is at least as powerful as any other FFN architecture of the same order. Next, we present a detailed theoretical development of a constructive, polynomial-time algorithm that determines an exact HON realization with minimal order for an arbitrary binary or bipolar mapping problem. This algorithm has no parameters that need tuning for good performance. We show how an FFN with sigmoidal hidden units can be determined from the HON realization in polynomial time. Finally, simulation results of the constructive HON algorithm are presented for the two-or-more-clumps problem, demonstrating that the algorithm performs well compared with the Tiling and Upstart algorithms.
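To make the notion of network order concrete, the following is a minimal sketch (not the paper's algorithm) of a single higher-order threshold unit on bipolar inputs: each weight is attached to a subset of inputs, the unit sums weighted products over those subsets, and its order is the size of the largest subset (its fan-in). With one weight on the pair of inputs, an order-2 unit realizes bipolar XOR, the standard example of a mapping that no order-1 (simple perceptron) unit can represent. The function name and weight encoding here are illustrative assumptions.

```python
from math import prod

def hon_unit(x, weights, threshold=0.0):
    """Evaluate one higher-order threshold unit on a bipolar input tuple x.

    `weights` maps tuples of input indices to coefficients; the longest
    tuple is the unit's order (maximum fan-in). Output is bipolar (+1/-1).
    Illustrative sketch only, not the constructive HON algorithm itself.
    """
    s = sum(w * prod(x[i] for i in idx) for idx, w in weights.items())
    return 1 if s >= threshold else -1

# Bipolar XOR via a single order-2 product term: y = sign(-x0 * x1).
xor_weights = {(0, 1): -1.0}
for x in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
    print(x, hon_unit(x, xor_weights))
```

Running this prints -1 for equal inputs and +1 for differing inputs, i.e. bipolar XOR, realized exactly by one order-2 unit.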