Open Access
Parallel Operation of Self‐Limited Analog Programming for Fast Array‐Level Weight Programming and Update
Author(s) - Song Hanchan, An Jangho, Son Seoil, Kim Young Seok, Park Juseong, Jeon Jae Bum, Kim Geunyoung, Kim Kyung Min
Publication year - 2020
Publication title - Advanced Intelligent Systems
Language(s) - English
Resource type - Journals
ISSN - 2640-4567
DOI - 10.1002/aisy.202000014
Subject(s) - memristor , neuromorphic engineering , mnist database , computer science , artificial neural network , programming paradigm , memistor , parallel computing , electronic engineering , resistive random access memory , artificial intelligence , voltage , electrical engineering , engineering , programming language
Memristive neural networks perform vector-matrix multiplication efficiently, which makes them attractive as accelerators for neuromorphic computing. To train the memristor cells in a memristive neural network, the analog conductance states of the memristors should be programmed in parallel; otherwise, the resulting long training time can limit the size of the neural network. Herein, a novel parallel programming method that exploits the self‐limited analog switching behavior of the memristor is proposed. A Pt/Ti:NbOₓ/NbOₓ/TiN charge-trap memristor device is used to demonstrate the programming, and a convolutional neural network trained on the MNIST dataset is emulated based on the device characteristics. In the simulation, the proposed programming method reduces the programming time to as little as 1/130 of that required by the sequential programming method. The simulation also suggests that the programming time required by the proposed method is unaffected by array size, which makes it very promising for high-density neural networks.
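
The contrast drawn in the abstract between sequential and parallel (self-limited) programming can be illustrated with a minimal sketch. The Python toy model below is not the authors' implementation: the pulse model, the conductance increment DELTA_G, the array size, and the function names are assumptions chosen only to show why a self-limited scheme makes the pulse count depend on the target conductance range rather than on the number of cells, and how a programmed crossbar performs a vector-matrix multiply.

```python
import numpy as np

# Toy model of a memristive crossbar. Conductances G (S) encode weights;
# applying an input voltage vector V (V) gives output currents I = G^T V
# (Kirchhoff current summation along each column), i.e. a vector-matrix multiply.
def crossbar_vmm(G, V):
    return G.T @ V

# Hypothetical pulse model: each programming pulse raises a cell's conductance
# by DELTA_G until the cell reaches its target value.
DELTA_G = 1e-6  # assumed conductance increment per pulse (S)

def sequential_program(targets):
    """Program one cell at a time: total pulse count grows with array size."""
    G = np.zeros_like(targets)
    pulses = 0
    for idx in np.ndindex(targets.shape):
        while G[idx] < targets[idx]:
            G[idx] += DELTA_G
            pulses += 1
    return G, pulses

def parallel_self_limited_program(targets):
    """Pulse all cells simultaneously; each cell self-limits (stops changing)
    once it reaches its own target, so the pulse count is set by the slowest
    cell rather than by the number of cells."""
    G = np.zeros_like(targets)
    pulses = 0
    while np.any(G < targets):
        G = np.minimum(G + DELTA_G, targets)  # self-limiting: clip at target
        pulses += 1
    return G, pulses

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = rng.uniform(0, 100e-6, size=(32, 32))  # target conductances (S)
    _, seq_pulses = sequential_program(targets)
    _, par_pulses = parallel_self_limited_program(targets)
    print(f"sequential pulses: {seq_pulses}, parallel pulses: {par_pulses}")
    # Read out a vector-matrix product from the programmed array
    V = rng.uniform(0, 0.2, size=32)   # read voltage vector (V)
    I = crossbar_vmm(targets, V)       # output currents (A)
```

In this toy model the sequential pulse count scales with the number of cells, while the parallel pulse count is fixed by the largest target alone, which mirrors the array-size independence claimed in the abstract; the actual speedup reported in the paper (up to 1/130 of the sequential programming time) comes from the authors' device-level simulation, not from this sketch.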
