Mixed-signal neuromorphic networks (‘CrossNets’), based on hybrid CMOS/nanodevice circuits, may provide unprecedented performance on important pattern classification tasks. The synaptic weights required for such tasks may be imported from an external ‘precursor’ network with either continuous or discrete synaptic weights (in the former case, with quantization, or ‘clipping’, imposed by the binary character of the elementary synaptic nanodevices, the latching switches). Alternatively, the weights may be adjusted in situ (inside the CrossNet) using a pseudo-stochastic method, or set up using a mixed-mode method that partly employs external circuitry. Our calculations show that CrossNet pattern classifiers using any of these weight-adjustment methods may be remarkably resilient. For example, in a CrossNet whose synapses are formed by two small square arrays of 4 × 4 nanodevices each, the resulting weight discreteness may have a virtually negligible effect on classification fidelity, while the fraction of defective devices needed to degrade performance substantially ranges from ∼20% to as high as 90% (!), depending on the training method. Copyright © 2007 John Wiley & Sons, Ltd.
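The weight ‘clipping’ described above can be illustrated with a minimal sketch: a continuous precursor weight is mapped onto two 4 × 4 arrays of binary latching switches (one array contributing positive unit conductances, the other negative), so only weights of the form k/16 are representable, and a stuck-open defect simply removes one unit contribution. The array roles, the stuck-open defect model, and all function names here are illustrative assumptions, not the authors’ actual implementation.

```python
import numpy as np

RNG = np.random.default_rng(0)
N_CELLS = 16  # one 4x4 array of binary nanodevices per weight polarity (assumed)

def clip_weight(w_continuous, w_max=1.0):
    """Quantize a continuous weight in [-w_max, w_max] onto two binary arrays.

    One array contributes positive unit conductances, the other negative,
    so the representable weights are k / N_CELLS for k in -16..16.
    """
    k = int(round(w_continuous / w_max * N_CELLS))
    k = max(-N_CELLS, min(N_CELLS, k))  # saturate at full-scale
    pos = np.zeros(N_CELLS, dtype=bool)
    neg = np.zeros(N_CELLS, dtype=bool)
    if k > 0:
        pos[:k] = True
    else:
        neg[:-k] = True  # k == 0 sets nothing
    return pos, neg

def effective_weight(pos, neg, defect_fraction=0.0):
    """Read out the weight; a defective (stuck-open) device contributes 0."""
    alive_p = pos & (RNG.random(N_CELLS) >= defect_fraction)
    alive_n = neg & (RNG.random(N_CELLS) >= defect_fraction)
    return (alive_p.sum() - alive_n.sum()) / N_CELLS

pos, neg = clip_weight(0.55)
print(effective_weight(pos, neg))                        # exact: 9/16 = 0.5625
print(effective_weight(pos, neg, defect_fraction=0.2))   # randomly degraded readout
```

Under this toy model the quantization error is at most 1/32 of full scale, consistent with the abstract’s observation that weight discreteness at this array size is nearly negligible; defects, by contrast, shift the readout in proportion to the fraction of switched-on devices they disable.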