2.
Front Neurosci ; 17: 1183321, 2023.
Article in English | MEDLINE | ID: mdl-37250397

ABSTRACT

We propose that in order to harness our understanding of neuroscience toward machine learning, we must first have powerful tools for training brain-like models of learning. Although substantial progress has been made toward understanding the dynamics of learning in the brain, neuroscience-derived models of learning have yet to demonstrate the same performance capabilities as methods in deep learning such as gradient descent. Inspired by the successes of machine learning using gradient descent, we introduce a bi-level optimization framework that seeks both to solve online learning tasks and to improve the ability to learn online using models of plasticity from neuroscience. We demonstrate that models of three-factor learning with synaptic plasticity taken from the neuroscience literature can be trained in Spiking Neural Networks (SNNs) with gradient descent via a framework of learning-to-learn to address challenging online learning problems. This framework opens a new path toward developing neuroscience-inspired online learning algorithms.
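The bi-level structure described in this abstract can be sketched in a deliberately minimal form: an inner loop learns online with a three-factor plasticity rule, and an outer loop tunes the rule's rate parameter by gradient descent on post-learning loss. This is a toy illustration only, not the paper's SNN framework; the linear neuron, the scalar tasks, the specific error-modulated rule, and the finite-difference outer gradient are all simplifying assumptions.

```python
import numpy as np

def inner_loop(eta, x, y_target, epochs=3):
    # Online learning on one task with a toy three-factor rule applied to a
    # single linear "neuron": delta_w = eta * pre * modulator, where the
    # modulator (third factor) is an error signal.
    w = 0.0
    for _ in range(epochs):
        for xi, yi in zip(x, y_target):
            post = w * xi            # post-synaptic activity
            mod = yi - post          # third factor (error / neuromodulator)
            w += eta * xi * mod      # three-factor plasticity update
    return w

def meta_loss(eta, tasks):
    # Outer-loop objective: average task loss after inner-loop learning.
    losses = []
    for x, y in tasks:
        w = inner_loop(eta, x, y)
        losses.append(np.mean((w * x - y) ** 2))
    return float(np.mean(losses))

rng = np.random.default_rng(0)
tasks = []
for a in (0.5, 2.0, -1.0):           # each task: learn y = a * x online
    x = rng.uniform(-1.0, 1.0, 20)
    tasks.append((x, a * x))

# Outer loop: tune the plasticity rate eta by finite-difference gradient
# descent on the meta-loss (a stand-in for backpropagating through learning).
eta, lr, eps = 0.01, 0.01, 1e-4
for _ in range(30):
    grad = (meta_loss(eta + eps, tasks) - meta_loss(eta - eps, tasks)) / (2 * eps)
    eta -= lr * grad
```

After the outer loop, the meta-learned rate yields lower post-learning loss than the initial rate, which is the learning-to-learn effect in miniature.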

3.
J Math Biol ; 85(4): 39, 2022 09 22.
Article in English | MEDLINE | ID: mdl-36136245

ABSTRACT

A model of population growth and dispersal is considered where the spatial habitat is a lattice and reproduction occurs generationally. The resulting discrete dynamical system exhibits velocity locking, where rational speed invasion fronts are observed to persist as parameters are varied. In this article, we construct locked fronts for a particular piecewise linear reproduction function. These fronts are shown to be linear combinations of exponentially decaying solutions to the linear system near the unstable state. Based upon these front solutions, we then derive expressions for the boundary of locking regions in parameter space. We obtain leading order expansions for the locking regions in the limit as the migration parameter tends to zero. Strict spectral stability in exponentially weighted spaces is also established.
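The setting above can be illustrated numerically (this is not the paper's analysis): a generational lattice map with a capped linear, i.e. piecewise linear, reproduction function and nearest-neighbor migration, whose invasion front advances at a well-defined speed. The parameter values, the cap f(u) = min(r*u, 1), and the half-level front-tracking criterion are assumptions chosen for illustration.

```python
import numpy as np

def step(u, r=2.0, m=0.2):
    # One generation: piecewise linear reproduction f(u) = min(r*u, 1),
    # followed by nearest-neighbor migration with rate m on the lattice.
    f = np.minimum(r * u, 1.0)
    new = (1.0 - m) * f
    new[1:] += 0.5 * m * f[:-1]
    new[:-1] += 0.5 * m * f[1:]
    return new

u = np.zeros(200)
u[:10] = 1.0                          # initially invaded region on the left

positions = []
for _ in range(40):
    u = step(u)
    # Front position: first lattice site still below the half-level.
    positions.append(int(np.argmin(u > 0.5)))
```

Tracking `positions` over many generations gives an empirical front speed; velocity locking corresponds to that speed holding at a fixed rational value over open regions of the (r, m) parameter plane.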


Subject(s)
Ecosystem , Population Growth , Reproduction
4.
Front Neurorobot ; 15: 629210, 2021.
Article in English | MEDLINE | ID: mdl-34630063

ABSTRACT

The adaptive changes in synaptic efficacy that occur between spiking neurons have been demonstrated to play a critical role in learning for biological neural networks. Despite this source of inspiration, many learning-focused applications using Spiking Neural Networks (SNNs) retain static synaptic connections, preventing additional learning after the initial training period. Here, we introduce a framework for simultaneously learning the underlying fixed weights and the rules governing the dynamics of synaptic plasticity and neuromodulated synaptic plasticity in SNNs through gradient descent. We further demonstrate the capabilities of this framework on a series of challenging benchmarks, learning the parameters of several plasticity rules including BCM, Oja's, and their respective neuromodulatory variants. The experimental results show that SNNs augmented with differentiable plasticity are sufficient for solving a set of challenging temporal learning tasks that a traditional SNN fails to solve, even in the presence of significant noise. These networks are also shown to be capable of producing locomotion on a high-dimensional robotic learning task, where near-minimal degradation in performance is observed in the presence of novel conditions not seen during the initial training period.
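Of the plasticity rules named in this abstract, Oja's rule has a particularly compact form: delta_w = eta * y * (x - y * w), a Hebbian term with a decay that normalizes the weight vector. The sketch below is a standalone rate-based version, not the paper's spiking, neuromodulated variant; the toy data, initial weights, and learning rate are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: variance concentrated along the first axis, so the principal
# component is approximately (1, 0).
X = rng.normal(0.0, 1.0, size=(5000, 2)) * np.array([np.sqrt(3.0), np.sqrt(0.1)])

w = np.array([0.6, 0.8])              # initial synaptic weight vector
eta = 0.02
for x in X:
    y = w @ x                         # post-synaptic activity
    w += eta * y * (x - y * w)        # Oja's rule: Hebbian term minus decay
```

Run on such data, the weight vector converges toward a unit vector along the leading principal component, which is the classical behavior that makes Oja's rule a useful primitive for the learned-plasticity setting described above.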
