Ten thousand times faster: Classifying multidimensional data on a spiking neuromorphic hardware system.
Author(s) -
Michael Schmuker, Daniel Brüderle, Sven Schrader, Martin P. Nawrot
Publication year - 2011
Publication title -
Frontiers in Computational Neuroscience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.794
H-Index - 58
ISSN - 1662-5188
DOI - 10.3389/conf.fncom.2011.53.00109
Subject(s) - neuromorphic engineering , computer science , spiking neural network , classifier (uml) , computational neuroscience , systems neuroscience , artificial intelligence , artificial neural network , pattern recognition (psychology) , computer architecture , computer hardware , machine learning , neuroscience , myelin , oligodendrocyte , biology , central nervous system
Fig. 5: A) Input-output relation (firing rate) in the simulator. Firing rate averaged over all neurons in a group (glomerulus). Colors denote different glomeruli. Each point corresponds to one stimulus presentation. B) Same setup on hardware neurons, C) after calibration. D) Classifier performance on simulator and on hardware (without lateral inhibition).

Michael Schmuker (1,2), Daniel Brüderle (3), Sven Schrader (3), Martin P. Nawrot (1,2)

Motivation

Neuromorphic computing is an emerging technology that aims at bio-inspired high-performance computing with spiking neuronal networks. The FACETS/BrainScaleS neuromorphic hardware system runs networks of spiking neurons with a speedup of 10^4 [1]. Our aim was to implement a network of spiking neurons that can be trained in a supervised fashion, and to run this network on neuromorphic hardware to classify multidimensional data. The structure of the first layers of neuronal processing in the olfactory system provides a well-suited template for a neuronal architecture processing multidimensional data.

Challenges

Classifier circuit and learning rule

Challenge: Implement a supervised classifier that operates with spiking neurons.
Solution: A spiking network implemented in PyNN [2], running in the NEST simulator and on the FACETS/BrainScaleS hardware. A feature-encoding layer converges onto an association layer that has winner-take-most properties (Fig. 1). The network is trained in a supervised fashion, using a perceptron-like learning rule operating on firing rates (Fig. 1 caption).

Sampling data with virtual receptors

Challenge: Firing rates of spiking neurons can only represent a bounded and non-negative range of values. We need a suitable transformation mapping real values into that range.
Solution: Virtual receptors (VRs). The response strength of a VR depends on its distance to the presented data point [3].
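The perceptron-like learning rule used to train the classifier circuit can be sketched as a rate-based weight update. This is a minimal illustration, not the authors' exact implementation: the error term, learning rate, and parameter names below are assumptions following the standard perceptron rule applied to firing rates.

```python
import numpy as np

def perceptron_rate_update(w, input_rates, output_rate, target_rate, eta=0.01):
    """One perceptron-like weight update operating on firing rates.

    w            : (n_inputs,) weights from the feature-encoding layer
                   to one association-layer neuron
    input_rates  : (n_inputs,) presynaptic firing rates
    output_rate  : observed firing rate of the association neuron
    target_rate  : desired rate for the presented class (assumed parameter)
    eta          : learning rate (assumed value)
    """
    # Weights grow when the neuron fires too little for its class
    # and shrink when it fires too much.
    return w + eta * (target_rate - output_rate) * input_rates

# Example: a neuron that should fire for this stimulus but stays silent
w_new = perceptron_rate_update(np.zeros(2), np.array([1.0, 2.0]),
                               output_rate=0.0, target_rate=10.0, eta=0.1)
```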
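The virtual-receptor transform can be sketched as follows. The poster states only that receptor response is a function of distance to the data point; the Gaussian response curve and its width parameter below are assumptions chosen to yield a bounded, non-negative output.

```python
import numpy as np

def vr_responses(x, receptors, sigma=1.0):
    """Map a real-valued data point onto bounded, non-negative
    virtual-receptor activations.

    x         : (d,) data point
    receptors : (n, d) virtual receptor positions in data space
    sigma     : width of the distance-response curve (assumed)
    """
    # Euclidean distance from the data point to every receptor
    dists = np.linalg.norm(receptors - x, axis=1)
    # Closer receptors respond more strongly; output lies in (0, 1]
    return np.exp(-(dists ** 2) / (2.0 * sigma ** 2))

# Example: three receptors sampling a 2-D data space
receptors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
r = vr_responses(np.array([0.0, 0.0]), receptors)
```

Using more receptors than there are original data dimensions gives the dimensional oversampling mentioned above.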
We use a Neural Gas (NG) algorithm [4] to distribute virtual receptors in data space, like olfactory receptors sample chemical space (Fig. 2). Receptor response is computed as a function of the distance between data point and receptor. This transformation yields a bounded and non-negative representation of any real-valued data set. Dimensionality can be adjusted to exceed the number of original data dimensions (dimensional oversampling), enabling a sparser representation.

Decorrelation

Challenge: Virtual receptors provide correlated data, but the classifier learning rule works best with uncorrelated data.
Solution: Decorrelation through lateral inhibition in a preprocessing layer (see decorrelation layer in Fig. 1). Three kinds of inhibitory connectivity matrices were tested: NG-based (inhibitory connections between receptors given by the NG graph edges), correlation-based (inhibitory weight depends on the correlation between receptors), and random lateral inhibition. Correlation-based lateral inhibition yields the best decorrelation, followed by NG-based and random connectivity (Fig. 3). Benchmarking the impact of decorrelation on classifier performance shows an increase in accuracy with increasing lateral inhibition, but no clear preference for a specific method, probably a ceiling effect of the spiking classifier (Fig. 4).

Implementation on neuromorphic hardware

Challenge: Hardware neurons vary in their firing-rate response (Fig. 5B). The classifier learns on output rates, so rate variation has a negative impact on classifier performance.
Solution: Calibrate the sensitivity of neuron groups (glomeruli) to achieve a more homogeneous representation of input rates. We developed a calibration method that balances inhomogeneities across model glomeruli (Fig. 5C). After calibration, the hardware implementation of the classifier reaches the same performance as in the simulator (Fig. 5D).
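The correlation-based inhibitory connectivity can be sketched as building an inhibitory weight matrix from pairwise receptor correlations. The poster does not give the exact mapping from correlation to weight, so the clipping and scaling below are assumptions.

```python
import numpy as np

def correlation_inhibition(responses, w_max=1.0):
    """Inhibitory weight matrix derived from receptor correlations.

    responses : (n_samples, n_receptors) virtual-receptor activations
                recorded across stimuli
    w_max     : overall inhibitory strength (assumed scaling)

    Receptor pairs that respond similarly across stimuli inhibit each
    other more strongly, decorrelating the representation.
    """
    corr = np.corrcoef(responses, rowvar=False)
    np.fill_diagonal(corr, 0.0)                  # no self-inhibition
    # Only positively correlated pairs inhibit each other
    return -w_max * np.clip(corr, 0.0, 1.0)
```

The NG-based variant mentioned above would instead place inhibitory connections only along the edges of the Neural Gas graph, and the random variant draws connections uniformly.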
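The calibration step can be sketched as computing per-glomerulus gain factors that equalize mean output rates for a fixed reference input. This is a simplified stand-in for the authors' calibration method, whose details the poster does not specify.

```python
import numpy as np

def glomerulus_gains(measured_rates, target_rate=None):
    """Per-glomerulus gain factors that equalize mean output rates.

    measured_rates : (n_glomeruli,) mean firing rate of each neuron
                     group in response to one fixed reference input
    target_rate    : rate to calibrate towards; defaults to the
                     population mean (assumed choice)

    Scaling each group's input by its gain makes the hardware
    input-output curves more homogeneous across glomeruli.
    """
    measured_rates = np.asarray(measured_rates, dtype=float)
    if target_rate is None:
        target_rate = measured_rates.mean()
    return target_rate / measured_rates

# Example: three glomeruli with inhomogeneous responses (Hz)
gains = glomerulus_gains([20.0, 40.0, 10.0])
```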
1: Neuroinformatics & Theoretical Neuroscience, Institute of Biology, Freie Universität Berlin, Berlin, Germany
2: Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
3: Electronic Visions, Kirchhoff Institute for Physics, Universität Heidelberg, Germany
email: m.schmuker@fu-berlin.de