The principle of using nanoscale memory devices as artificial synapses in neuromorphic circuits is recognized as a promising way to build ground-breaking circuit architectures tolerant to defects and variability. Yet, actual experimental demonstrations of neural-network-type circuits built from non-conventional (non-CMOS) memory devices and displaying function-learning capabilities remain very scarce. We show here that carbon-nanotube-based memory elements can be used as artificial synapses, combined with conventional neurons and trained to perform functions through the application of a supervised learning algorithm. Notably, the same ensemble of eight devices can be trained multiple times to code, successively, any three-input linearly separable Boolean logic function despite device-to-device variability. This work thus represents one of the very few demonstrations of actual function learning with synapses based on nanoscale building blocks. The potential of such an approach for...
Karim Gacem, Jean-Marie Retrouvey, Djaafar Chabi, Arianna Filoramo, Weisheng Zhao, Jacques-Olivier Klein and Vincent Derycke
Click for full article
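The abstract does not spell out the supervised learning rule applied to the nanotube synapses, so as a rough conceptual point of reference, here is a minimal software sketch of how a three-input linearly separable Boolean function can be learned with a plain perceptron rule. Everything in it (threshold activation, learning rate, bias handling, the choice of AND as the target) is a generic textbook assumption for illustration, not the device-level procedure used in the paper.

```python
# Conceptual sketch only: a software perceptron trained with the classic
# perceptron rule on a 3-input linearly separable Boolean function (AND).
# The paper's synapses are physical nanotube memory devices updated by a
# hardware-adapted supervised rule; the parameters below are assumptions.
import itertools

def train_perceptron(target_fn, lr=0.1, epochs=100):
    inputs = list(itertools.product([0, 1], repeat=3))
    w = [0.0, 0.0, 0.0]   # one weight per input
    b = 0.0               # bias weight
    for _ in range(epochs):
        errors = 0
        for x in inputs:
            # Threshold (step) activation on the weighted sum
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target_fn(x) - y
            if err:
                errors += 1
                # Perceptron update: nudge weights toward the target output
                w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                b += lr * err
        if errors == 0:   # all 8 input patterns correct: function learned
            break
    return w, b

# Example: learn the 3-input AND function (linearly separable).
w, b = train_perceptron(lambda x: int(all(x)))
print(w, b)
```

Because the target function is linearly separable, the perceptron rule is guaranteed to converge; retraining the same weights on a different separable function, as the paper does with its eight devices, simply amounts to rerunning the loop with a new target.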