This paper presents a type of optimized neural network with limited-precision weights (LPWNN). Such networks, which require less memory for storing the weights and less expensive arithmetic units to perform the computations involved, are better suited to embedded-system implementation than networks with real-valued weights. Based on an analysis of the learning capability of LPWNNs, a Quantize Back-propagation Step-by-Step (QBPSS) algorithm is proposed for such networks to overcome the effects of limited precision. Methods for designing and training LPWNNs are presented, including the quantization of the non-linear activation function and the selection of the learning rate, network architecture, and weight precision. The performance of the optimized LPWNN was evaluated against a conventional neural network with double-precision floating-point weights on road-image recognition for an intelligent vehicle on an ARM9 embedded system; the results show that the optimized LPWNN runs about 7 times faster than the conventional network.
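The core idea, re-quantizing the weights to a fixed-point grid after every back-propagation update so training never leaves the limited-precision representation, can be illustrated with a minimal sketch. This is a hypothetical setup for a tiny 2-2-1 network on XOR, not the paper's exact QBPSS algorithm; the `quantize` helper, the bit widths (3 integer and 8 fractional bits), and the learning rate are all illustrative assumptions.

```python
import numpy as np

def quantize(w, frac_bits=8, int_bits=3):
    """Round to the nearest representable signed fixed-point value and clip.
    Assumption: int_bits integer bits plus frac_bits fractional bits."""
    step = 2.0 ** -frac_bits                 # smallest representable increment
    lim = 2.0 ** int_bits - step             # largest representable magnitude
    return np.clip(np.round(w / step) * step, -lim, lim)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny 2-2-1 network trained on XOR; weights are snapped back onto the
# fixed-point grid after every update, so all stored parameters stay
# within the limited-precision format throughout training.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1 = quantize(rng.normal(0.0, 1.0, (2, 2)))
b1 = quantize(rng.normal(0.0, 1.0, (1, 2)))
W2 = quantize(rng.normal(0.0, 1.0, (2, 1)))
b2 = quantize(rng.normal(0.0, 1.0, (1, 1)))
lr = 0.5

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    # backward pass (standard batch gradients for squared error)
    d_o = (o - y) * o * (1 - o)
    d_h = (d_o @ W2.T) * h * (1 - h)
    # update, then re-quantize each parameter tensor step by step
    W2 = quantize(W2 - lr * (h.T @ d_o))
    b2 = quantize(b2 - lr * d_o.sum(axis=0, keepdims=True))
    W1 = quantize(W1 - lr * (X.T @ d_h))
    b1 = quantize(b1 - lr * d_h.sum(axis=0, keepdims=True))
```

Note that with too few fractional bits, updates smaller than the quantization step round to zero and learning can stall; that trade-off is exactly why the abstract discusses selecting the weight precision alongside the learning rate.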