In this paper, the local minima-free conditions of outer-supervised feedforward neural networks (FNN) under batch-style learning are studied by means of the embedded subspace method. It is proven that if the number of hidden neurons is not less than the number of training samples, a condition that is sufficient but not necessary, the network necessarily converges to a global minimum with null cost; moreover, the inclusion of the range space of the outer-supervised signal matrix in the range space of the hidden output matrix is a necessary and sufficient condition for the error surface to be free of local minima. In addition, when the number of hidden neurons is less than the number of training samples but greater than the number of output neurons, it is demonstrated that the error surface still contains only global minima with null cost, provided the first-layer weights are adequately selected.
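To make the range-space condition concrete, the following NumPy sketch (not the paper's proof; all names, network sizes, and the choice of tanh activation are illustrative assumptions) checks the inclusion of the range space of the target matrix T in that of the hidden output matrix H, and verifies that when the hidden layer has at least as many neurons as there are training samples, randomly chosen first-layer weights generically yield an exact, null-cost solution for the output-layer weights.

```python
import numpy as np

# Illustrative sketch of the sufficient condition: with at least as many
# hidden neurons as training samples, random first-layer weights generically
# make H full row rank, so range(T) is contained in range(H) and output
# weights solving H @ W2 = T exactly exist (a global minimum with null cost).
rng = np.random.default_rng(0)

n_samples, n_inputs, n_outputs = 20, 5, 3
n_hidden = n_samples  # condition: hidden neurons >= training samples

X = rng.standard_normal((n_samples, n_inputs))   # training inputs
T = rng.standard_normal((n_samples, n_outputs))  # outer-supervised targets

# First-layer weights chosen at random ("adequately selected")
W1 = rng.standard_normal((n_inputs, n_hidden))
b1 = rng.standard_normal(n_hidden)
H = np.tanh(X @ W1 + b1)                         # hidden output matrix

# Range-space inclusion holds iff rank([H | T]) == rank(H)
assert np.linalg.matrix_rank(np.hstack([H, T])) == np.linalg.matrix_rank(H)

# Solve the output layer by least squares; the residual is numerically zero,
# so the batch error surface attains its global minimum with null cost.
W2, *_ = np.linalg.lstsq(H, T, rcond=None)
print("training error:", np.linalg.norm(H @ W2 - T))  # ~1e-13
```

Running the sketch with n_hidden smaller than n_samples would typically make the assertion fail and the residual nonzero, which is consistent with the condition being sufficient rather than necessary: equality of the two ranks, not the neuron count itself, is what guarantees the null-cost minimum.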