Results 1 - 5 of 5
1.
Cogn Neurodyn ; 18(3): 1323-1335, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38826641

ABSTRACT

To understand and improve models that describe various brain regions, it is important to study the dynamics of trained recurrent neural networks. Incorporating Dale's law into such models usually presents several challenges, yet it is an important constraint that allows computational models to better capture characteristics of the brain. Here we present a framework for training networks under this constraint and use it to train them on simple decision-making tasks. We characterized the eigenvalue distributions of the recurrent weight matrices of the trained networks. Interestingly, we found that the non-dominant eigenvalues of the recurrent weight matrix are distributed in a circle with radius less than 1 for networks whose initial condition before training was random normal, and in a ring for those whose initial condition was random orthogonal. In both cases, the radius depends neither on the fraction of excitatory and inhibitory units nor on the size of the network. The reduction of the radius, compared to networks trained without the constraint, has implications for the activity and dynamics that we discuss here. Supplementary Information: The online version contains supplementary material available at 10.1007/s11571-023-09956-w.
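As a minimal sketch of the kind of sign-constrained matrix the abstract describes (not the authors' training framework): under Dale's law every unit's outgoing weights share one sign, so a random Gaussian matrix can be made Dale-compliant by taking absolute values and flipping the sign of the inhibitory columns. The fraction of excitatory units and the 1/sqrt(n) scaling below are illustrative assumptions. Even before any training, the bulk of the spectrum of such a matrix sits inside a circle, with a single dominant outlier produced by the rank-one mean structure:

```python
import numpy as np

def dale_matrix(n, frac_exc=0.8, g=1.0, seed=0):
    """Random recurrent weight matrix obeying Dale's law:
    column j (the outgoing weights of unit j) has a single sign."""
    rng = np.random.default_rng(seed)
    magnitudes = np.abs(rng.normal(0.0, g / np.sqrt(n), size=(n, n)))
    # First frac_exc * n units excitatory (+), the rest inhibitory (-).
    signs = np.where(np.arange(n) < int(frac_exc * n), 1.0, -1.0)
    return magnitudes * signs[np.newaxis, :]

W = dale_matrix(200)
eigs = np.linalg.eigvals(W)
# Drop the single largest-magnitude eigenvalue (the outlier from the
# nonzero column means); the remaining bulk lies inside the unit circle.
bulk = np.sort(np.abs(eigs))[:-1]
```

This only illustrates the sign constraint and the dominant/non-dominant split; the ring-shaped spectra reported for orthogonal initialization, and the effect of training itself, are beyond this sketch.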

2.
Front Syst Neurosci ; 18: 1269190, 2024.
Article in English | MEDLINE | ID: mdl-38600907

ABSTRACT

Training neural networks to perform different tasks is relevant across various disciplines. In particular, Recurrent Neural Networks (RNNs) are of great interest in computational neuroscience. Open-source frameworks dedicated to machine learning, such as TensorFlow and Keras, have produced significant changes in the development of technologies that we currently use. This work contributes by comprehensively investigating and describing the application of RNNs to temporal processing through a study of a 3-bit flip-flop memory implementation. We delve into the entire modeling process, encompassing equations, task parametrization, and software development. The trained networks are analyzed in detail to elucidate their dynamics, aided by an array of visualization and analysis tools. Moreover, the provided code is versatile enough to facilitate the modeling of diverse tasks and systems. Finally, we show how memory states can be efficiently stored at the vertices of a cube in the dimensionally reduced space, supplementing previous results with a distinct approach.
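A sketch of how the 3-bit flip-flop task itself is typically parameterized (pulse probability, trial length, and number of bits below are assumed values, not the paper's settings): each input channel receives sparse +1/-1 pulses, and the target on that channel holds the sign of the most recent pulse, giving 2^3 = 8 stable memory states:

```python
import numpy as np

def flipflop_trials(n_trials=64, t_steps=100, n_bits=3, p_pulse=0.05, seed=0):
    """Generate input/target pairs for the n-bit flip-flop task.
    Inputs: sparse +1/-1 pulses per channel. Targets: the sign of the
    most recent pulse on that channel, held until the next pulse."""
    rng = np.random.default_rng(seed)
    pulses = rng.choice([0, 1], size=(n_trials, t_steps, n_bits),
                        p=[1.0 - p_pulse, p_pulse])
    signs = rng.choice([-1.0, 1.0], size=(n_trials, t_steps, n_bits))
    inputs = pulses * signs
    targets = np.zeros_like(inputs)
    state = np.zeros((n_trials, n_bits))  # 0 until the first pulse arrives
    for t in range(t_steps):
        hit = inputs[:, t, :] != 0
        state = np.where(hit, inputs[:, t, :], state)
        targets[:, t, :] = state
    return inputs, targets

x, y = flipflop_trials()
```

An RNN trained on (x, y) pairs like these must maintain the three bits simultaneously, which is what makes the cube-vertex structure appear in the dimensionally reduced state space.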

3.
J Comput Neurosci ; 51(4): 407-431, 2023 11.
Article in English | MEDLINE | ID: mdl-37561278

ABSTRACT

Recurrent Neural Networks (RNNs) are frequently used to model aspects of brain function and structure. In this work, we trained small fully connected RNNs to perform temporal and flow-control tasks with time-varying stimuli. Our results show that different RNNs can solve the same task by converging to different underlying dynamics, and that performance degrades gracefully as network size is decreased, interval duration is increased, or connectivity damage is induced. For the tasks considered, we explored how robust the trained networks are under different task parameterizations. In the process, we developed a framework that can be used to parameterize other tasks of interest in computational neuroscience. Our results help quantify different aspects of these models, which are normally used as black boxes but need to be understood in order to model the biological responses of cerebral cortex areas.


Subject(s)
Models, Neurological , Neurosciences , Neural Networks, Computer , Cerebral Cortex
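One of the perturbations the abstract mentions, connectivity damage, can be sketched as random ablation of a fraction of the recurrent weights (the damage fraction and matrix size below are illustrative, not the study's values); sweeping `frac` and re-measuring task performance gives the graceful-degradation curve described above:

```python
import numpy as np

def damage(W, frac, seed=0):
    """Ablate (zero out) a random fraction `frac` of the entries of a
    recurrent weight matrix, modeling connectivity damage."""
    rng = np.random.default_rng(seed)
    mask = rng.random(W.shape) >= frac  # keep each weight with prob 1 - frac
    return W * mask

W = np.random.default_rng(1).normal(size=(50, 50))
Wd = damage(W, 0.2)
```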
4.
Cogn Neurodyn ; 17(1): 257-275, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35469119

ABSTRACT

Different brain areas, such as the cortex and, more specifically, the prefrontal cortex, show great recurrence in their connections, even in early sensory areas. Several approaches and methods based on trained networks have been proposed to model and describe these regions. It is essential to understand the dynamics behind these models because they are used to build hypotheses about the functioning of brain areas and to explain experimental results. The main contribution here is a description of the dynamics through the classification and interpretation carried out with a set of numerical simulations. This study sheds light on the multiplicity of solutions obtained for the same tasks and shows the link between the spectra of linearized trained networks and the dynamics of their nonlinear counterparts. The patterns in the distribution of the eigenvalues of the recurrent weight matrix were studied and related to the dynamics of each task. Supplementary Information: The online version contains supplementary material available at 10.1007/s11571-022-09802-5.
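The linearization step behind such spectral analyses can be sketched for a standard rate model h' = -h + W tanh(h) (an assumed form, not necessarily the one used in the paper): the Jacobian at a state h scales column j of W by 1 - tanh^2(h_j), and the real parts of its eigenvalues determine local stability. The gain g = 0.8 and network size below are illustrative:

```python
import numpy as np

def jacobian(W, h):
    """Jacobian of the rate dynamics h' = -h + W @ tanh(h),
    linearized around the state h: column j of W is scaled by
    the local gain 1 - tanh(h_j)^2."""
    return -np.eye(len(h)) + W * (1.0 - np.tanh(h) ** 2)[np.newaxis, :]

rng = np.random.default_rng(0)
n = 100
g = 0.8  # sub-unity gain: the origin should be a stable fixed point
W = rng.normal(0.0, g / np.sqrt(n), size=(n, n))
J = jacobian(W, np.zeros(n))
eigs = np.linalg.eigvals(J)
# All eigenvalues of J have negative real part, so small perturbations
# around the origin decay.
```

Evaluating the same Jacobian at the fixed points of a trained network, rather than at the origin of a random one, is what links each eigenvalue pattern to the task dynamics.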

5.
MethodsX ; 6: 124-131, 2019.
Article in English | MEDLINE | ID: mdl-30671355

ABSTRACT

In this work, a simple implementation of fundamental frequency estimation is presented. The algorithm is based on a frequency-domain approach. It was developed mainly for tonal sounds and was used for canary birdsong analysis, although the method is not restricted to this kind of data and can easily be adapted to other sounds. Python libraries were used to implement a simple algorithm for obtaining the fundamental frequency. Open-source code is provided in the local university repository and on GitHub.
• The algorithm and its implementation are very simple and cover a range of potential applications in signal analysis.
• The implementation is written in Python and is very easy to use and modify.
• The present method is proposed to analyze data from sounds of Serinus canaria.
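A minimal frequency-domain estimator in the spirit of the method described above (this is a generic sketch, not the authors' published code): take the magnitude spectrum and report the frequency of its largest non-DC peak, which works well for tonal sounds like canary syllables:

```python
import numpy as np

def fundamental_frequency(signal, fs):
    """Estimate the fundamental of a (near-)tonal signal as the
    frequency of the largest peak in its magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

fs = 44100                                  # sampling rate in Hz
t = np.arange(0, 0.1, 1.0 / fs)             # 100 ms of signal
tone = np.sin(2 * np.pi * 2000.0 * t)       # synthetic 2 kHz tone
f0 = fundamental_frequency(tone, fs)
```

For real birdsong one would window the signal into short frames and run the estimator per frame; the frequency resolution is fs divided by the frame length, so frame choice trades time resolution against frequency resolution.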
