Contrastive Similarity Matching for Supervised Learning
Author(s) -
Shanshan Qin,
Nayantara Mudur,
Cengiz Pehlevan
Publication year - 2021
Publication title -
Neural Computation
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.235
H-Index - 169
eISSN - 1530-888X
pISSN - 0899-7667
DOI - 10.1162/neco_a_01374
Subject(s) - Hebbian theory, similarity (geometry), artificial intelligence, matching (statistics), computer science, artificial neural network, Leabra, pattern recognition (psychology), competitive learning, function (biology), deep learning, machine learning, mathematics, image (mathematics), statistics, wake-sleep algorithm, evolutionary biology, generalization error, biology
We propose a novel biologically plausible solution to the credit assignment problem motivated by observations in the ventral visual pathway and trained deep neural networks. In both, representations of objects in the same category become progressively more similar, while objects belonging to different categories become less similar. We use this observation to motivate a layer-specific learning goal in a deep network: each layer aims to learn a representational similarity matrix that interpolates between previous and later layers. We formulate this idea using a contrastive similarity matching objective function and derive from it deep neural networks with feedforward, lateral, and feedback connections and neurons that exhibit biologically plausible Hebbian and anti-Hebbian plasticity. Contrastive similarity matching can be interpreted as an energy-based learning algorithm, but with significant differences from others in how a contrastive function is constructed.
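The layer-specific goal described in the abstract, each layer matching a representational similarity matrix that interpolates between those of the previous and later layers, can be sketched numerically. This is a minimal NumPy illustration of the similarity matching idea only, not the paper's contrastive objective or learning dynamics; the variable names and the mixing coefficient `gamma` are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy activations for T stimuli: previous layer (d_prev units),
# current layer (d units), later layer (d_next units).
T, d_prev, d, d_next = 8, 5, 4, 3
X_prev = rng.standard_normal((d_prev, T))
Z = rng.standard_normal((d, T))
X_next = rng.standard_normal((d_next, T))

def sim(A):
    """T x T representational similarity matrix (inner products over stimuli)."""
    return A.T @ A

# Layer-specific target: a similarity matrix interpolating between the
# previous and later layers' similarities. `gamma` is an illustrative
# mixing coefficient, not the paper's notation.
gamma = 0.5
target = (1 - gamma) * sim(X_prev) + gamma * sim(X_next)

# Similarity matching loss for the current layer's representation Z;
# the paper minimizes a related objective via Hebbian/anti-Hebbian updates.
loss = np.sum((sim(Z) - target) ** 2) / T**2
print(f"similarity-matching loss: {loss:.3f}")
```

In the paper this kind of objective is optimized by networks with feedforward, lateral, and feedback connections rather than by direct gradient descent on `Z`.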
Accelerating Research
Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom