A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network
Conference Paper (articles published in conference proceedings)
 
ID 4605976
Author(s) Confavreux, Basile; Zenke, Friedemann; Agnes, Everton J.; Lillicrap, Timothy; Vogels, Tim P.
Author(s) at UniBasel Agnes, Everton Joao
Year 2020
Title A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network
Editor(s) Larochelle, H.; Ranzato, M.; Hadsell, R.; Balcan, M. F.; Lin, H.
Book title (Conference Proceedings) Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Place of Conference Vancouver, Canada
Publisher Curran Associates, Inc.
Pages 1-11
Abstract The search for biologically faithful synaptic plasticity rules has resulted in a large body of models. They are usually inspired by -- and fitted to -- experimental data, but they rarely produce neural dynamics that serve complex functions. These failures suggest that current plasticity models are still under-constrained by existing data. Here, we present an alternative approach that uses meta-learning to discover plausible synaptic plasticity rules. Instead of experimental data, the rules are constrained by the functions they implement and the structure they are meant to produce. Briefly, we parameterize synaptic plasticity rules by a Volterra expansion and then use supervised learning methods (gradient descent or evolutionary strategies) to minimize a problem-dependent loss function that quantifies how effectively a candidate plasticity rule transforms an initially random network into one with the desired function. We first validate our approach by re-discovering previously described plasticity rules, starting at the single-neuron level and “Oja’s rule”, a simple Hebbian plasticity rule that captures the direction of most variability of inputs to a neuron (i.e., the first principal component). We expand the problem to the network level and ask the framework to find Oja’s rule together with an anti-Hebbian rule such that an initially random two-layer firing-rate network will recover several principal components of the input space after learning. Next, we move to networks of integrate-and-fire neurons with plastic inhibitory afferents. We train for rules that achieve a target firing rate by countering tuned excitation. Our algorithm discovers a specific subset of the manifold of rules that can solve this task. Our work is a proof of principle of an automated and unbiased approach to unveil synaptic plasticity rules that obey biological constraints and can solve complex functions.
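The abstract's central idea can be illustrated with a minimal sketch (not from the paper; the data, coefficient names, and search method are illustrative assumptions): a single-neuron plasticity rule is parameterized by two Volterra-style coefficients, dw = eta * (theta[0] * y * x + theta[1] * y^2 * w), and a meta-level search minimizes a loss that measures how well the learned weights recover the first principal component of the inputs. Oja's rule corresponds to theta = (a, -a). Plain random search stands in for the gradient-descent / evolution-strategy optimizers the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input data with one dominant direction of variability
n_samples, n_inputs = 2000, 6
Q, _ = np.linalg.qr(rng.normal(size=(n_inputs, n_inputs)))  # random rotation
stds = np.array([3.0] + [1.0] * (n_inputs - 1))
X = (rng.normal(size=(n_samples, n_inputs)) * stds) @ Q.T
X -= X.mean(axis=0)
pc1 = np.linalg.svd(X, full_matrices=False)[2][0]  # target: first principal component

def run_rule(theta, eta=0.005):
    """Apply a candidate rule dw = eta*(theta[0]*y*x + theta[1]*y**2*w)
    to a randomly initialized weight vector; Oja's rule is theta = (a, -a)."""
    w = rng.normal(size=n_inputs)
    w /= np.linalg.norm(w)
    for x in X:
        y = w @ x
        w += eta * (theta[0] * y * x + theta[1] * y**2 * w)
        if not np.all(np.isfinite(w)) or np.linalg.norm(w) > 1e6:
            return None  # candidate rule diverged
    return w

def loss(theta):
    """Problem-dependent loss: misalignment with the first principal
    component, plus a penalty for leaving the unit sphere (Oja's
    stable fixed point has ||w|| = 1)."""
    w = run_rule(theta)
    if w is None:
        return np.inf
    nw = np.linalg.norm(w)
    return 1.0 - abs(w @ pc1) / nw + abs(nw - 1.0)

# Meta-optimization by simple random search over the rule coefficients
best_theta, best_loss = None, np.inf
for _ in range(200):
    theta = rng.uniform(-1, 1, size=2)
    cand = loss(theta)
    if cand < best_loss:
        best_theta, best_loss = theta, cand
```

The discovered `best_theta` should have opposite-signed coefficients of similar magnitude, i.e. an Oja-like Hebbian term stabilized by a y^2-weighted decay, mirroring the paper's re-discovery experiment at the single-neuron level.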
URL https://proceedings.neurips.cc/paper/2020/hash/bdbd5ebfde4934142c8a88e7a3796cd5-Abstract.html
edoc-URL https://edoc.unibas.ch/79146/
Full Text on edoc No
Digital Object Identifier DOI 10.1101/2020.10.24.353409
 
   
