1. Neural Comput; 36(7): 1424-1432, 2024 Jun 07.
Article in English | MEDLINE | ID: mdl-38669690

ABSTRACT

In recent years, there has been an intense debate about how learning in biological neural networks (BNNs) differs from learning in artificial neural networks. It is often argued that the updating of connections in the brain relies only on local information, and therefore a stochastic gradient-descent type optimization method cannot be used. In this note, we study a stochastic model for supervised learning in BNNs. We show that a (continuous) gradient step occurs approximately when each learning opportunity is processed by many local updates. This result suggests that stochastic gradient descent may indeed play a role in optimizing BNNs.
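To give a rough feel for the flavor of this result, the sketch below is a minimal illustration, not the paper's specific model: it assumes a simple weight-perturbation (zeroth-order) rule in which each local update uses only a random perturbation of the weights and the resulting scalar change in loss. Averaging many such local updates approximates a gradient-descent step on a toy linear regression example; the variable names, toy model, and perturbation scale are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not the paper's exact model): averaging many local,
# perturbation-based updates approximates a gradient-descent step.
# Each local update uses only a random perturbation and the scalar
# change in loss it causes -- no explicit backpropagated gradient.

rng = np.random.default_rng(0)

# Toy supervised problem: linear model, squared loss on one example (illustrative).
x = rng.normal(size=5)          # input
y_target = 1.0                  # target output
w = rng.normal(size=5)          # synaptic weights

def loss(w):
    return 0.5 * (w @ x - y_target) ** 2

sigma = 1e-3        # perturbation scale
n_updates = 20000   # local updates per learning opportunity

# Accumulate local updates: perturb the weights, observe the loss change,
# and move against the perturbation in proportion to that change.
accumulated = np.zeros_like(w)
for _ in range(n_updates):
    xi = rng.normal(size=w.shape)              # random local perturbation
    delta = loss(w + sigma * xi) - loss(w)     # scalar feedback signal
    accumulated += -(delta / sigma) * xi       # local weight-perturbation update

estimated_step = accumulated / n_updates       # average of the local updates
true_grad = (w @ x - y_target) * x             # analytic gradient for comparison

print("averaged local update :", estimated_step)
print("negative true gradient:", -true_grad)
```

Running this shows the averaged local update closely matching the negative analytic gradient: since the expected value of the perturbation-based update equals the gradient up to an O(sigma^2) bias, many local updates together act like one (approximate) gradient step.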


Subject(s)
Models, Neurological; Nerve Net; Neural Networks, Computer; Stochastic Processes; Supervised Machine Learning; Nerve Net/cytology; Nerve Net/physiology; Deep Learning