Hebbian Learning in FPGA Silicon Neuronal Network
Author(s) -
Jing Li,
Yuichi Katori,
Takashi Kohno
Publication year - 2013
Language(s) - English
Resource type - Conference proceedings
DOI - 10.12792/icisip2013.020
Subject(s) - Hebbian theory, computer science, Leabra, content-addressable memory, Hopfield network, artificial intelligence, artificial neural network, content-addressable storage, wake-sleep algorithm, generalization error
This paper describes a digital silicon neuronal network, trained with the Hebbian learning rule, that can perform auto-associative memory. In our previous work, we implemented a fully connected network of 256 silicon neurons based on the digital spiking silicon neuron (DSSN) model together with kinetic-model-based silicon synapses. In this work, we added circuit modules that provide a Hebbian learning function and fit the design into a Xilinx Virtex-6 XC6VSX315T FPGA device. We compare the auto-associative memory performance of several spike-time-dependent Hebbian learning rules against the correlation rule. The results show that Hebbian learning rules modeling both synaptic potentiation and depression improve the retrieval probability in our silicon neuronal network.
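The correlation rule mentioned in the abstract is the classical Hebbian prescription for auto-associative (Hopfield-style) memory: the weight matrix is the normalized sum of outer products of the stored patterns, and retrieval iterates a sign-threshold update from a noisy cue. The sketch below is a minimal software illustration of that idea, not the paper's FPGA implementation; all function names, the network size, and the STDP parameters (`a_plus`, `a_minus`, `tau`) are illustrative assumptions. The `stdp_dw` helper shows the general shape of a pair-based spike-time-dependent rule with both potentiation and depression, the class of rules the paper finds advantageous.

```python
import numpy as np

def hebbian_weights(patterns):
    # Correlation (Hebbian) rule: W = (1/N) * sum_p x_p x_p^T, with zero diagonal.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def retrieve(w, state, sweeps=5):
    # Asynchronous sign-threshold updates; each sweep visits every neuron once.
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            h = w[i] @ state
            state[i] = 1 if h >= 0 else -1
    return state

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    # Pair-based STDP window (illustrative parameters): potentiation when the
    # presynaptic spike precedes the postsynaptic one (dt > 0), depression otherwise.
    return a_plus * np.exp(-dt / tau) if dt > 0 else -a_minus * np.exp(dt / tau)

# Store two bipolar (+1/-1) patterns and recall one from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 64))
w = hebbian_weights(patterns)
cue = patterns[0].copy()
cue[:8] *= -1                      # flip 8 of 64 bits
recalled = retrieve(w, cue)
print(np.array_equal(recalled, patterns[0]))
```

With only two stored patterns in a 64-neuron network (well under the ~0.14N capacity of the correlation rule), an 8-bit corruption is normally recovered within a few sweeps.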