Learning internal representations by error propagation
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1
Artificial Neural Networks: Learning Algorithms, Performance Evaluation, and Applications
Feed-forward neural networks: learning algorithms, statistical properties, and applications
Sliding window adaptive fast QR and QR-lattice algorithms
IEEE Transactions on Signal Processing
Component-wise Iterative Optimization (CIO) is a method for handling large data sets in OLAP applications; it can be viewed as an enhancement of traditional batch methods such as least squares. The salient feature of the method is that it processes transactions one by one, optimizes the estimate of each parameter iteratively over the given objective function, and updates the model on the fly. Applying CIO to feed-forward neural networks with a single hidden layer yields a new learning algorithm. It exploits the internal structure of such networks by using CIO in closed-form expressions to update the weights between the hidden layer and the output layer. Its computational optimality is inherited directly from the corresponding property of CIO and is also demonstrated in an illustrative example.
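The abstract does not give the exact CIO recursions, but the idea it describes (process samples one at a time, then refresh each parameter with a closed-form per-component least-squares update) can be sketched as coordinate-wise minimization on incrementally accumulated normal-equation statistics. The function name `cio_linear`, the sufficient statistics `A`, `b`, and the `sweeps` parameter below are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def cio_linear(stream, n_features, sweeps=20, eps=1e-12):
    """Sketch of component-wise iterative optimization for least squares.

    Maintains running normal-equation statistics A = X^T X and b = X^T y.
    After each incoming sample ("transaction"), each weight w[j] is
    refreshed in turn by the closed-form coordinate minimizer of the
    squared-error objective with the other weights held fixed.
    (Illustrative assumption: the paper's exact CIO updates may differ.)
    """
    A = np.zeros((n_features, n_features))
    b = np.zeros(n_features)
    w = np.zeros(n_features)
    for x, y in stream:                  # process transactions one by one
        x = np.asarray(x, dtype=float)
        A += np.outer(x, x)              # update sufficient statistics
        b += y * x
        for _ in range(sweeps):
            for j in range(n_features):  # component-wise closed-form update
                r = b[j] - A[j] @ w + A[j, j] * w[j]
                if A[j, j] > eps:        # skip components not yet observed
                    w[j] = r / A[j, j]
    return w
```

In the neural-network setting described above, such closed-form coordinate updates would apply to the hidden-to-output weights, since the output layer is linear in those weights once the hidden activations are fixed.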