Results 1 - 20 of 85
1.
IEEE Trans Artif Intell ; 5(1): 80-91, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38500544

ABSTRACT

Deep learning models have recently achieved remarkable performance on many classification tasks. The superior performance of deep neural networks relies on large amounts of training data, which must also be equally distributed across classes to be effective. However, in most real-world applications, labeled data may be limited, with high imbalance ratios among the classes; the learning process of most classification algorithms is therefore adversely affected, resulting in unstable predictions and low performance. Three main categories of approaches address the problem of imbalanced learning: data-level, algorithm-level, and hybrid methods, which combine the first two. Data-generative methods are typically based on generative adversarial networks, which require significant amounts of data, while model-level methods demand extensive domain expertise to craft the learning objectives, making them less accessible to users without such knowledge. Moreover, the vast majority of these approaches are designed for and applied to imaging applications, less often to time series, and only rarely to both. To address these issues, we introduce GENDA, a generative neighborhood-based deep autoencoder that is simple yet effective in its design and can be successfully applied to both image and time-series data. GENDA learns latent representations that rely on the neighboring embedding space of the samples. Extensive experiments conducted on a variety of widely used real datasets demonstrate the efficacy of the proposed method. Impact Statement: Imbalanced data classification is a pressing issue in many real-world learning applications, hampering most classification tasks.
Fraud detection, biomedical imaging (categorizing healthy people versus patients), and object detection are some indicative domains with economic, social, and technological impact that are greatly affected by inherently imbalanced data distributions. However, the majority of existing algorithms that address the imbalanced classification problem are designed with a particular application in mind and are thus tied to specific datasets and even specific hyperparameters. The generative model introduced in this paper overcomes this limitation and produces improved results for a large class of imaging and time-series data, even under severe imbalance ratios, making it quite competitive.
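The neighborhood-based generation idea at the heart of GENDA can be illustrated independently of the full architecture. Below is a minimal sketch of latent-space neighbor interpolation for minority-class oversampling; it is not the authors' implementation: the encoder/decoder are omitted, and `latent`, `k`, and the interpolation scheme are illustrative assumptions.

```python
import numpy as np

def neighborhood_oversample(latent, n_new, k=5, seed=0):
    """Generate synthetic latent codes by interpolating each sampled
    point toward one of its k nearest neighbors (SMOTE-like, but in
    an autoencoder's embedding space). A decoder would then map the
    new codes back to the data space."""
    rng = np.random.default_rng(seed)
    n = len(latent)
    # pairwise distances between minority-class latent codes
    d = np.linalg.norm(latent[:, None] - latent[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbors per point
    base = rng.integers(0, n, size=n_new)      # random anchor points
    nbr = nn[base, rng.integers(0, k, size=n_new)]
    t = rng.random((n_new, 1))                 # interpolation weights in [0, 1)
    return latent[base] + t * (latent[nbr] - latent[base])
```

Because each synthetic code lies on a segment between two real minority samples, the new points stay inside the region of the embedding space already occupied by that class.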

2.
Curr Opin Neurobiol ; 85: 102853, 2024 04.
Article in English | MEDLINE | ID: mdl-38394956

ABSTRACT

The brain is a remarkably capable and efficient system. It can process and store huge amounts of noisy and unstructured information, using minimal energy. In contrast, current artificial intelligence (AI) systems require vast resources for training while still struggling to compete in tasks that are trivial for biological agents. Thus, brain-inspired engineering has emerged as a promising new avenue for designing sustainable, next-generation AI systems. Here, we describe how dendritic mechanisms of biological neurons have inspired innovative solutions for significant AI problems, including credit assignment in multi-layer networks, catastrophic forgetting, and high-power consumption. These findings provide exciting alternatives to existing architectures, showing how dendritic research can pave the way for building more powerful and energy efficient artificial learning systems.


Subject(s)
Gastropoda; Neurology; Animals; Artificial Intelligence; Machine Learning; Brain
3.
Elife ; 12, 2023 Dec 06.
Article in English | MEDLINE | ID: mdl-38054403

ABSTRACT

Pyramidal neurons, a mainstay of cortical regions, receive a plethora of inputs from various areas onto their morphologically distinct apical and basal trees. Both trees differentially contribute to the somatic response, defining distinct anatomical and possibly functional sub-units. To elucidate the contribution of each tree to the encoding of visual stimuli at the somatic level, we modeled the response pattern of a mouse L2/3 V1 pyramidal neuron to orientation tuned synaptic input. Towards this goal, we used a morphologically detailed computational model of a single cell that replicates electrophysiological and two-photon imaging data. Our simulations predict a synergistic effect of apical and basal trees on somatic action potential generation: basal tree activity, in the form of either depolarization or dendritic spiking, is necessary for producing somatic activity, despite the fact that most somatic spikes are heavily driven by apical dendritic spikes. This model provides evidence for synergistic computations taking place in the basal and apical trees of the L2/3 V1 neuron along with mechanistic explanations for tree-specific contributions and emphasizes the potential role of predictive and attentional feedback input in these cells.


Subject(s)
Primary Visual Cortex; Pyramidal Cells; Animals; Mice; Action Potentials/physiology; Dendrites/physiology; Neurons; Pyramidal Cells/physiology
4.
Curr Opin Neurobiol ; 83: 102812, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37980803

ABSTRACT

The brain is a highly efficient system that has evolved to optimize performance under limited resources. In this review, we highlight recent theoretical and experimental studies that support the view that dendrites make information processing and storage in the brain more efficient. This is achieved through the dynamic modulation of integration versus segregation of inputs and activity within a neuron. We argue that under conditions of limited energy and space, dendrites help biological networks to implement complex functions such as processing natural stimuli on behavioral timescales, performing the inference process on those stimuli in a context-specific manner, and storing the information in overlapping populations of neurons. A global picture starts to emerge, in which dendrites help the brain achieve efficiency through a combination of optimization strategies that balance the tradeoff between performance and resource utilization.


Subject(s)
Dendrites; Neurons; Dendrites/physiology; Neurons/physiology; Brain/physiology; Cognition
5.
Front Behav Neurosci ; 17: 1212139, 2023.
Article in English | MEDLINE | ID: mdl-37576932

ABSTRACT

Accumulating evidence from a wide range of studies, including behavioral, cellular, molecular, and computational findings, supports a key role for dendrites in the encoding and recall of new memories. Dendrites can integrate synaptic inputs in nonlinear ways, provide the substrate for local protein synthesis, and facilitate the orchestration of signaling pathways that regulate local synaptic plasticity. These capabilities allow them to act as a second layer of computation within the neuron and to serve as fundamental units of plasticity. As such, dendrites are integral parts of the memory engram, namely the physical representation of memories in the brain, and are increasingly studied during learning tasks. Here, we review experimental and computational studies that support a novel, dendritic view of the memory engram centered on nonlinear dendritic branches as elementary memory units. We highlight the potential implications of dendritic engrams for the learning and memory field and discuss future research directions.

6.
ArXiv ; 2023 Jun 12.
Article in English | MEDLINE | ID: mdl-37396597

ABSTRACT

The brain is a highly efficient system evolved to achieve high performance with limited resources. We propose that dendrites make information processing and storage in the brain more efficient through the segregation of inputs and their conditional integration via nonlinear events, the compartmentalization of activity and plasticity and the binding of information through synapse clustering. In real-world scenarios with limited energy and space, dendrites help biological networks process natural stimuli on behavioral timescales, perform the inference process on those stimuli in a context-specific manner, and store the information in overlapping populations of neurons. A global picture starts to emerge, in which dendrites help the brain achieve efficiency through a combination of optimization strategies balancing the tradeoff between performance and resource utilization.

7.
ArXiv ; 2023 Jun 13.
Article in English | MEDLINE | ID: mdl-37396619

ABSTRACT

The brain is a remarkably capable and efficient system. It can process and store huge amounts of noisy and unstructured information using minimal energy. In contrast, current artificial intelligence (AI) systems require vast resources for training while still struggling to compete in tasks that are trivial for biological agents. Thus, brain-inspired engineering has emerged as a promising new avenue for designing sustainable, next-generation AI systems. Here, we describe how dendritic mechanisms of biological neurons have inspired innovative solutions for significant AI problems, including credit assignment in multilayer networks, catastrophic forgetting, and high energy consumption. These findings provide exciting alternatives to existing architectures, showing how dendritic research can pave the way for building more powerful and energy-efficient artificial learning systems.

8.
Neuron ; 111(20): 3154-3175, 2023 10 18.
Article in English | MEDLINE | ID: mdl-37467748

ABSTRACT

One of the most captivating questions in neuroscience revolves around the brain's ability to efficiently and durably capture and store information. It must process continuous input from sensory organs while also encoding memories that can persist throughout a lifetime. What are the cellular-, subcellular-, and network-level mechanisms that underlie this remarkable capacity for long-term information storage? Furthermore, what contributions do distinct types of GABAergic interneurons make to this process? As the hippocampus plays a pivotal role in memory, our review focuses on three aspects: (1) delineation of hippocampal interneuron types and their connectivity, (2) interneuron plasticity, and (3) activity patterns of interneurons during memory-related rhythms, including the role of long-range interneurons and disinhibition. We explore how these three elements, together showcasing the remarkable diversity of inhibitory circuits, shape the processing of memories in the hippocampus.


Subject(s)
Hippocampus; Interneurons; Interneurons/physiology; Hippocampus/physiology
9.
bioRxiv ; 2023 May 24.
Article in English | MEDLINE | ID: mdl-37292929

ABSTRACT

While artificial machine learning systems achieve superhuman performance in specific tasks such as language processing and image and video recognition, they do so using extremely large datasets and huge amounts of power. The brain, on the other hand, remains superior in several cognitively challenging tasks while operating with the energy of a small lightbulb. We use a biologically constrained spiking neural network model to explore how neural tissue achieves such high efficiency and assess its learning capacity on discrimination tasks. We find that synaptic turnover, a form of structural plasticity whereby the brain continuously forms and eliminates synapses, increases both the speed and the performance of our network on all tasks tested. Moreover, it allows accurate learning using a smaller number of examples. Importantly, these improvements are most significant under conditions of resource scarcity, such as when the number of trainable parameters is halved or when task difficulty is increased. Our findings provide new insights into the mechanisms that underlie efficient learning in the brain and can inspire the development of more efficient and flexible machine learning algorithms.
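The synaptic turnover described above can be sketched as a prune-and-regrow step on a sparse weight matrix. This is a generic illustration of structural plasticity (magnitude-based pruning with random regrowth at fixed connectivity), not the authors' spiking-network model; `turnover_frac` and `regrow_scale` are hypothetical parameters.

```python
import numpy as np

def synaptic_turnover(W, mask, turnover_frac=0.1, regrow_scale=0.01, seed=0):
    """Eliminate the weakest active synapses and regrow the same number
    at random unused positions, keeping total connectivity constant.
    W: dense weight array; mask: boolean array marking existing synapses.
    Both are modified in place and returned."""
    rng = np.random.default_rng(seed)
    active = np.flatnonzero(mask)
    inactive = np.flatnonzero(~mask)
    n_swap = min(int(turnover_frac * active.size), inactive.size)
    # eliminate: the n_swap active synapses with the smallest magnitude
    weakest = active[np.argsort(np.abs(W.flat[active]))[:n_swap]]
    W.flat[weakest] = 0.0
    mask.flat[weakest] = False
    # form: n_swap new synapses at random previously-unused positions
    # (positions freed this step are excluded, so a pruned synapse
    # cannot regrow immediately)
    new = rng.choice(inactive, size=n_swap, replace=False)
    W.flat[new] = regrow_scale * rng.standard_normal(n_swap)
    mask.flat[new] = True
    return W, mask
```

Calling this once per training epoch keeps the parameter budget fixed while letting the connectivity pattern itself adapt, which is the resource-scarcity regime the abstract highlights.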

10.
Cell Rep ; 42(1): 111962, 2023 01 31.
Article in English | MEDLINE | ID: mdl-36640337

ABSTRACT

The lateral entorhinal cortex (LEC) provides multisensory information to the hippocampus, directly to the distal dendrites of CA1 pyramidal neurons. LEC neurons perform important functions for episodic memory processing, coding for contextually salient elements of an environment or experience. However, we know little about the functional circuit interactions between the LEC and the hippocampus. We combine functional circuit mapping and computational modeling to examine how long-range glutamatergic LEC projections modulate compartment-specific excitation-inhibition dynamics in hippocampal area CA1. We demonstrate that glutamatergic LEC inputs can drive local dendritic spikes in CA1 pyramidal neurons, aided by the recruitment of a disinhibitory VIP interneuron microcircuit. Our circuit mapping and modeling further reveal that LEC inputs also recruit CCK interneurons that may act as strong suppressors of dendritic spikes. These results highlight a cortically driven GABAergic microcircuit mechanism that gates nonlinear dendritic computations, which may support compartment-specific coding of multisensory contextual features within the hippocampus.


Subject(s)
Entorhinal Cortex; Hippocampus; Entorhinal Cortex/physiology; Hippocampus/physiology; Pyramidal Cells/physiology; Neurons/physiology; Dendrites/physiology; Interneurons/physiology
11.
Nat Commun ; 14(1): 131, 2023 01 10.
Article in English | MEDLINE | ID: mdl-36627284

ABSTRACT

Computational modeling has been indispensable for understanding how subcellular neuronal features influence circuit processing. However, the role of dendritic computations in network-level operations remains largely unexplored. This is partly because existing tools do not allow the development of realistic and efficient network models that account for dendrites. Current spiking neural networks, although efficient, are usually quite simplistic, overlooking essential dendritic properties. Conversely, circuit models with morphologically detailed neuron models are computationally costly, thus impractical for large-network simulations. To bridge the gap between these two extremes and facilitate the adoption of dendritic features in spiking neural networks, we introduce Dendrify, an open-source Python package based on Brian 2. Dendrify, through simple commands, automatically generates reduced compartmental neuron models with simplified yet biologically relevant dendritic and synaptic integrative properties. Such models strike a good balance between flexibility, performance, and biological accuracy, allowing us to explore dendritic contributions to network-level functions while paving the way for developing more powerful neuromorphic systems.


Subject(s)
Neural Networks, Computer; Neurons; Neurons/physiology; Computer Simulation; Dendrites/physiology
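The kind of reduced compartmental model that Dendrify automates can be written out by hand for the smallest case: a soma and one dendritic compartment coupled by an axial conductance, with a thresholded dendritic nonlinearity standing in for a regenerative dendritic event. The sketch below is plain NumPy with arbitrary illustrative parameters; it is not Dendrify or Brian 2 code, for which the package's own documentation should be consulted.

```python
import numpy as np

def two_compartment(I_dend, dt=0.1, T=200.0):
    """Leaky two-compartment neuron: dendritic input can trigger a
    regenerative dendritic event (modeled as an extra depolarizing
    current above a voltage threshold) that spreads to the soma.
    All parameters are illustrative, not fitted values."""
    n = int(T / dt)
    v_s = np.full(n, -70.0)           # somatic voltage (mV)
    v_d = np.full(n, -70.0)           # dendritic voltage (mV)
    g_leak, g_axial = 0.1, 0.05       # leak and coupling conductances
    v_rest, v_dthresh = -70.0, -50.0  # rest and dendritic-event threshold
    for t in range(1, n):
        # crude regenerative current once the dendrite crosses threshold
        i_nl = 2.0 if v_d[t-1] > v_dthresh else 0.0
        dv_d = (-g_leak * (v_d[t-1] - v_rest)
                + g_axial * (v_s[t-1] - v_d[t-1]) + I_dend + i_nl)
        dv_s = (-g_leak * (v_s[t-1] - v_rest)
                + g_axial * (v_d[t-1] - v_s[t-1]))
        v_d[t] = v_d[t-1] + dt * dv_d
        v_s[t] = v_s[t-1] + dt * dv_s
    return v_s, v_d
```

Even this toy version shows the qualitative point: dendritic input that crosses the local threshold depolarizes the soma more than the passive coupling alone would predict.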
12.
J Physiol ; 601(15): 3091-3102, 2023 08.
Article in English | MEDLINE | ID: mdl-36218068

ABSTRACT

For the past seven decades, the Hodgkin-Huxley (HH) formalism has been an invaluable tool in the arsenal of neuroscientists, allowing for robust and reproducible modelling of ionic conductances and the electrophysiological phenomena they underlie. Despite its apparent age, its role as a cornerstone of computational neuroscience has not waned. The discovery of dendritic regenerative events mediated by ionic and synaptic conductances has solidified the importance of HH-based models further, yielding new predictions concerning dendritic integration, synaptic plasticity and neuronal computation. These predictions are often validated through in vivo and in vitro experiments, advancing our understanding of the neuron as a biological system and emphasizing the importance of HH-based detailed computational models as an instrument of dendritic research. In this article, we discuss recent studies in which the HH formalism is used to shed new light on dendritic function and its role in neuronal phenomena.


Subject(s)
Models, Neurological; Neurons; Action Potentials/physiology; Neurons/physiology; Electrophysiological Phenomena; Neuronal Plasticity
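The HH formalism itself is compact: a membrane current-balance equation plus voltage-dependent gating variables m, h, and n. A minimal forward-Euler simulation with the classic squid-axon parameters might look like this (for illustration only; real work would use an established simulator):

```python
import numpy as np

def hh_step(V, m, h, n, I, dt=0.01):
    """One forward-Euler step of the classic Hodgkin-Huxley equations
    (squid giant axon parameters; V in mV, t in ms, I in uA/cm^2)."""
    a_m = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    b_m = 4 * np.exp(-(V + 65) / 18)
    a_h = 0.07 * np.exp(-(V + 65) / 20)
    b_h = 1 / (1 + np.exp(-(V + 35) / 10))
    a_n = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    b_n = 0.125 * np.exp(-(V + 65) / 80)
    I_Na = 120 * m**3 * h * (V - 50)   # sodium current (E_Na = 50 mV)
    I_K = 36 * n**4 * (V + 77)         # potassium current (E_K = -77 mV)
    I_L = 0.3 * (V + 54.4)             # leak current (E_L = -54.4 mV)
    V += dt * (I - I_Na - I_K - I_L)   # C_m = 1 uF/cm^2
    m += dt * (a_m * (1 - m) - b_m * m)
    h += dt * (a_h * (1 - h) - b_h * h)
    n += dt * (a_n * (1 - n) - b_n * n)
    return V, m, h, n

def simulate(I=10.0, T=50.0, dt=0.01):
    V, m, h, n = -65.0, 0.053, 0.596, 0.317  # approximate resting state
    trace = []
    for _ in range(int(T / dt)):
        V, m, h, n = hh_step(V, m, h, n, I, dt)
        trace.append(V)
    return np.array(trace)
```

With a sustained 10 uA/cm^2 current the model fires overshooting action potentials; with no input it sits near the -65 mV resting potential. Dendritic HH-style models extend exactly this scheme with additional coupled compartments and conductances.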
13.
Neuron ; 110(20): 3374-3388.e8, 2022 10 19.
Article in English | MEDLINE | ID: mdl-36041433

ABSTRACT

Individual memories are often linked so that the recall of one triggers the recall of another. For example, contextual memories acquired close in time can be linked, and this is known to depend on a temporary increase in excitability that drives the overlap between dorsal CA1 (dCA1) hippocampal ensembles that encode the linked memories. Here, we show that locus coeruleus (LC) cells projecting to dCA1 have a key permissive role in contextual memory linking, without affecting contextual memory formation, and that this effect is mediated by dopamine. Additionally, we found that LC-to-dCA1-projecting neurons modulate the excitability of dCA1 neurons and the extent of overlap between dCA1 memory ensembles as well as the stability of coactivity patterns within these ensembles. This discovery of a neuromodulatory system that specifically affects memory linking without affecting memory formation reveals a fundamental separation between the brain mechanisms modulating these two distinct processes.


Subject(s)
Dopamine; Locus Coeruleus; Locus Coeruleus/physiology; Dopamine/physiology; Memory/physiology; Hippocampus/physiology; Neurons/physiology
14.
Neuroscience ; 489: 1-3, 2022 05 01.
Article in English | MEDLINE | ID: mdl-35465871
15.
Adv Exp Med Biol ; 1359: 25-67, 2022.
Article in English | MEDLINE | ID: mdl-35471534

ABSTRACT

The first step toward understanding the brain is to learn how individual neurons process incoming signals, the vast majority of which arrive in their dendrites. Dendrites were first discovered at the beginning of the twentieth century and were characterized by great anatomical variability, both within and across species. Over the past years, a rich repertoire of active and passive dendritic mechanisms has been unveiled, which greatly influences their integrative power. Yet, our understanding of how dendrites compute remains limited, mainly because technological limitations make it difficult to record from dendrites directly and manipulate them. Computational modeling, on the other hand, is perfectly suited for this task. Biophysical models that account for the morphology as well as passive and active neuronal properties can explain a wide variety of experimental findings, shedding new light on how dendrites contribute to neuronal and circuit computations. This chapter aims to help the interested reader build biophysical models incorporating dendrites by detailing how their electrophysiological properties can be described using simple mathematical frameworks. We start by discussing the passive properties of dendrites and then give an overview of how active conductances can be incorporated, leading to realistic in silico replicas of biological neurons.


Subject(s)
Dendrites; Neurons; Biophysics; Computer Simulation; Dendrites/physiology; Neurons/physiology; Synapses/physiology
16.
Neuroscience ; 489: 34-43, 2022 05 01.
Article in English | MEDLINE | ID: mdl-34843894

ABSTRACT

GABAergic interneurons (INs) are a highly diverse class of neurons in the mammalian brain with a critical role in orchestrating multiple cognitive functions and maintaining the balance of excitation/inhibition across neuronal circuitries. In this perspective, we discuss recent findings regarding the ability of some IN subtypes to integrate incoming inputs in nonlinear ways within their dendritic branches. These recently discovered features may endow the specific INs with advanced computing capabilities, whose breadth and functional contributions remain an open question. Along these lines, we discuss theoretical and experimental evidence regarding the potential role of nonlinear IN dendrites in advancing single neuron computations and contributing to memory formation.


Subject(s)
Dendrites; Interneurons; Animals; Brain; Dendrites/physiology; GABAergic Neurons; Interneurons/physiology; Mammals; Neurons
17.
Neuroscience ; 489: 275-289, 2022 05 01.
Article in English | MEDLINE | ID: mdl-34656706

ABSTRACT

In this paper, we discuss the nonlinear computational power provided by dendrites in biological and artificial neurons. We start by briefly presenting biological evidence about the type of dendritic nonlinearities, respective plasticity rules and their effect on biological learning as assessed by computational models. Four major computational implications are identified as improved expressivity, more efficient use of resources, utilizing internal learning signals, and enabling continual learning. We then discuss examples of how dendritic computations have been used to solve real-world classification problems with performance reported on well known data sets used in machine learning. The works are categorized according to the three primary methods of plasticity used-structural plasticity, weight plasticity, or plasticity of synaptic delays. Finally, we show the recent trend of confluence between concepts of deep learning and dendritic computations and highlight some future research directions.


Subject(s)
Dendrites; Models, Neurological; Dendrites/physiology; Machine Learning; Neuronal Plasticity/physiology; Neurons/physiology
18.
Curr Opin Neurobiol ; 70: 1-10, 2021 10.
Article in English | MEDLINE | ID: mdl-34087540

ABSTRACT

This article highlights specific features of biological neurons and their dendritic trees, whose adoption may help advance artificial neural networks used in various machine learning applications. Advancements could take the form of increased computational capabilities and/or reduced power consumption. Proposed features include dendritic anatomy, dendritic nonlinearities, and compartmentalized plasticity rules, all of which shape learning and information processing in biological networks. We discuss the computational benefits provided by these features in biological neurons and suggest ways to adopt them in artificial neurons in order to exploit the respective benefits in machine learning.


Subject(s)
Models, Neurological; Neural Networks, Computer; Dendrites/physiology; Machine Learning; Neurons/physiology
19.
Trends Cogn Sci ; 25(4): 265-268, 2021 04.
Article in English | MEDLINE | ID: mdl-33608214

ABSTRACT

Legacy conferences are costly and time consuming, and exclude scientists lacking various resources or abilities. During the 2020 pandemic, we created an online conference platform, Neuromatch Conferences (NMC), aimed at developing technological and cultural changes to make conferences more democratic, scalable, and accessible. We discuss the lessons we learned.


Subject(s)
Pandemics; Humans
20.
Neuron ; 108(5): 968-983.e9, 2020 12 09.
Article in English | MEDLINE | ID: mdl-33022227

ABSTRACT

Cortical computations are critically reliant on their local circuit, GABAergic cells. In the hippocampus, a large body of work has identified an unprecedented diversity of GABAergic interneurons with pronounced anatomical, molecular, and physiological differences. Yet little is known about the functional properties and activity dynamics of the major hippocampal interneuron classes in behaving animals. Here we use fast, targeted, three-dimensional (3D) two-photon calcium imaging coupled with immunohistochemistry-based molecular identification to retrospectively map in vivo activity onto multiple classes of interneurons in the mouse hippocampal area CA1 during head-fixed exploration and goal-directed learning. We find examples of preferential subtype recruitment with quantitative differences in response properties and feature selectivity during key behavioral tasks and states. These results provide new insights into the collective organization of local inhibitory circuits supporting navigational and mnemonic functions of the hippocampus.


Subject(s)
CA1 Region, Hippocampal/cytology; CA1 Region, Hippocampal/diagnostic imaging; Imaging, Three-Dimensional/methods; Interneurons/ultrastructure; Microscopy, Fluorescence, Multiphoton/methods; Animals; CA1 Region, Hippocampal/chemistry; Calcium/analysis; Calcium/metabolism; Female; Interneurons/chemistry; Male; Mice; Mice, Transgenic; Microscopy, Confocal/methods