Open Access
Gradient Descent on Multilevel Spin–Orbit Synapses with Tunable Variations
Author(s) - Lan Xiukai, Cao Yi, Liu Xiangyu, Xu Kaijia, Liu Chuan, Zheng Houzhi, Wang Kaiyou
Publication year - 2021
Publication title - Advanced Intelligent Systems
Language(s) - English
Resource type - Journals
ISSN - 2640-4567
DOI - 10.1002/aisy.202000182
Subject(s) - neuromorphic engineering, linearity, synapse, computer science, gradient descent, artificial neural network, materials science, electronic engineering, artificial intelligence, neuroscience, engineering, biology
Neuromorphic computing using multilevel nonvolatile memories as synapses offers opportunities for future energy‐ and area‐efficient artificial intelligence. Among these memories, artificial synapses based on current‐induced magnetization switching driven by spin–orbit torques (SOTs) have recently attracted great attention. Herein, the gradient descent algorithm, a primary learning algorithm, is implemented on a 2 × 1 SOT synaptic array. Successful pattern classifications are experimentally realized through the tuning of the cycle‐to‐cycle variation, linearity range, and linearity deviation of the multilevel SOT synapse. In addition, a larger m × n SOT synaptic array with m controlling transistors is proposed, and it is found that the classification accuracy can be improved dramatically by decreasing the cycle‐to‐cycle variation. This work paves a way for the application of spin–orbit device arrays in neuromorphic computing and highlights the crucial importance of cycle‐to‐cycle variation for multilevel SOT synapses.
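The abstract describes running gradient descent on a 2 × 1 SOT synaptic array whose weight updates are affected by limited linearity and cycle‐to‐cycle variation. The following is a minimal numerical sketch of that idea, not the authors' experimental procedure: a logistic neuron is trained by gradient descent while every weight update passes through a hypothetical nonideal‐synapse model. The parameters N_LEVELS, NONLINEARITY, and C2C_SIGMA, the toy patterns, and the update rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical device parameters -- illustrative only, not values from the paper.
N_LEVELS = 32        # number of discrete conductance states per synapse
NONLINEARITY = 1.0   # how strongly the update size depends on the current state
C2C_SIGMA = 0.05     # relative cycle-to-cycle variation of each update

def quantize(w):
    # Snap a weight to one of N_LEVELS evenly spaced states in [-1, 1].
    return np.round((w + 1.0) / 2.0 * (N_LEVELS - 1)) / (N_LEVELS - 1) * 2.0 - 1.0

def nonideal_update(w, grad, lr=0.5):
    # State-dependent (nonlinear) step size, perturbed by cycle-to-cycle noise.
    step = lr * grad * np.exp(-NONLINEARITY * abs(w))
    step *= 1.0 + C2C_SIGMA * rng.standard_normal()
    return quantize(np.clip(w - step, -1.0, 1.0))

# Toy two-input classification task, loosely mirroring a 2 x 1 synaptic array.
X = np.array([[0.2, 0.9], [0.8, 0.1], [0.3, 0.7], [0.9, 0.2]])
y = np.array([1.0, 0.0, 1.0, 0.0])
w = np.zeros(2)

for epoch in range(200):
    for xi, yi in zip(X, y):
        pred = 1.0 / (1.0 + np.exp(-xi @ w))   # logistic neuron output
        grad = (pred - yi) * xi                # gradient of the cross-entropy loss
        for j in range(2):
            w[j] = nonideal_update(w[j], grad[j])

print("learned weights:", w)
print("predictions:", 1.0 / (1.0 + np.exp(-X @ w)))
```

In such a model, raising C2C_SIGMA tends to destabilize convergence, which is consistent with the abstract's point that classification accuracy improves as the cycle‐to‐cycle variation decreases.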
