Nonlinear convergence boosts information coding in circuits with parallel outputs
Author(s) -
Gabrielle J. Gutierrez,
Fred Rieke,
Eric Shea-Brown
Publication year - 2021
Publication title -
Proceedings of the National Academy of Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 5.011
H-Index - 771
eISSN - 1091-6490
pISSN - 0027-8424
DOI - 10.1073/pnas.1921882118
Subject(s) - convergence , divergence , coding , nonlinear system , information theory , computation , algorithm , computer science , mathematics , statistics , physics
Abstract - Neural circuits are structured with layers of converging and diverging connectivity, and with selectivity-inducing nonlinearities at neurons and synapses. These components have the potential to hamper an accurate encoding of the circuit inputs. Past computational studies have optimized the nonlinearities of single neurons, or the connection weights in networks, to maximize encoded information, but they have not grappled with the simultaneous impact of convergent circuit structure and nonlinear response functions on efficient coding. Our approach is to compare model circuits with different combinations of convergence, divergence, and nonlinear neurons to discover how interactions between these components affect coding efficiency. We find that a convergent circuit with divergent parallel pathways can encode more information with nonlinear subunits than with linear subunits, despite the compressive loss induced by the convergence and by the nonlinearities when each is considered separately.
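The abstract's comparison can be illustrated with a minimal toy sketch, not the paper's actual model: two rectifying subunits converge onto each of two parallel (ON/OFF) pathway outputs, and a crude histogram-based plug-in estimator gauges the information each output carries about the stimulus. The rectification threshold, noise level, sample size, and estimator are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rectify(x, theta=0.0):
    # Hypothetical threshold-linear subunit nonlinearity (illustrative choice)
    return np.maximum(x - theta, 0.0)

def circuit_output(stimuli, nonlinear=True):
    # Each pathway sums two subunits that see the stimulus plus independent
    # noise (convergence); the OFF pathway sees the sign-inverted stimulus
    # (divergent parallel pathways). Setting nonlinear=False makes the
    # subunits linear for comparison.
    noise = rng.normal(0.0, 0.5, size=(2,) + stimuli.shape)
    f = rectify if nonlinear else (lambda x: x)
    on = f(stimuli + noise[0]) + f(stimuli + noise[1])
    off = f(-stimuli + noise[0]) + f(-stimuli + noise[1])
    return np.stack([on, off])

def mutual_info(x, y, bins=16):
    # Crude plug-in estimate of I(X;Y) in nats from a 2D histogram;
    # biased for finite samples, but adequate for a qualitative comparison.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Gaussian stimulus ensemble; compare a nonlinear-subunit pathway output
# against the same architecture with linear subunits.
s = rng.normal(size=20000)
on, off = circuit_output(s, nonlinear=True)
on_lin, off_lin = circuit_output(s, nonlinear=False)
mi_on = mutual_info(s, on)
mi_on_lin = mutual_info(s, on_lin)
```

Whether the nonlinear circuit wins in this toy depends on the noise and threshold settings; the point of the sketch is only to show the architecture (subunit nonlinearity, convergence, parallel ON/OFF divergence) that the paper's analysis varies.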
Accelerating Research
Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom