Results 1 - 20 of 45
1.
Helicobacter ; 29(4): e13101, 2024.
Article in English | MEDLINE | ID: mdl-38987862

ABSTRACT

BACKGROUND: Latin America has a high prevalence of Helicobacter pylori in children, which may lead to peptic ulcer disease and eventually gastric cancer in adulthood. Successful eradication is hindered by rising antimicrobial resistance. We summarize H. pylori resistance rates in Latin American children from 2008 to 2023. MATERIAL AND METHODS: Systematic review following PRISMA guidelines and the National Heart, Lung, and Blood Institute checklist to assess risk of bias (PROSPERO CRD42024517108), including original cross-sectional observational studies reporting resistance to commonly used antibiotics in Latin American children and adolescents. We searched the PubMed, LILACS, and SciELO databases. RESULTS: Of 51 studies, 45 were excluded. The quality of the six analyzed studies (297 H. pylori-positive samples) was satisfactory. Phenotypic methods (N = 3) reported higher resistance rates than genotypic studies (N = 3). Clarithromycin resistance ranged from 8.0% to 26.7% (6 studies; 297 samples), metronidazole from 1.9% to 40.2% (4 studies; 211 samples), and amoxicillin from 0% to 10.4% (3 studies; 158 samples); tetracycline resistance was not detected (3 studies; 158 samples), and levofloxacin resistance was 2.8% (1 study; 36 samples). CONCLUSION: The scarcity of Latin American studies on H. pylori resistance, together with their methodological heterogeneity, hinders conclusive findings. Resistance to clarithromycin and metronidazole (first-line drugs) is worrisome and likely contributes to lower eradication rates. Urgent systematic surveillance or individual susceptibility testing before treatment is needed to improve eradication.


Subject(s)
Anti-Bacterial Agents , Drug Resistance, Bacterial , Helicobacter Infections , Helicobacter pylori , Humans , Helicobacter pylori/drug effects , Helicobacter pylori/genetics , Helicobacter pylori/isolation & purification , Helicobacter Infections/microbiology , Helicobacter Infections/drug therapy , Helicobacter Infections/epidemiology , Latin America/epidemiology , Adolescent , Child , Anti-Bacterial Agents/pharmacology , Child, Preschool , Microbial Sensitivity Tests , Cross-Sectional Studies
2.
Chaos ; 34(5)2024 May 01.
Article in English | MEDLINE | ID: mdl-38809907

ABSTRACT

The properties of complex networked systems arise from the interplay between the dynamics of their elements and the underlying topology. Thus, to understand their behavior, it is crucial to gather as much information as possible about their topological organization. However, in large systems such as neuronal networks, the reconstruction of such topology is usually carried out from the information encoded in the dynamics on the network, such as spike train time series, and by measuring the transfer entropy between system elements. The topological information recovered by these methods does not necessarily capture the connectivity layout, but rather the causal flow of information between elements. New theoretical frameworks, such as Integrated Information Decomposition (Φ-ID), allow one to explore the modes in which information can flow between parts of a system, opening a rich landscape of interactions between network topology, dynamics, and information. Here, we apply Φ-ID on in silico and in vitro data to decompose the usual transfer entropy measure into different modes of information transfer, namely, synergistic, redundant, or unique. We demonstrate that the unique information transfer is the most relevant measure for uncovering structural topological details from network activity data, while redundant information only introduces residual information for this application. Although the retrieved network connectivity is still functional, it captures more details of the underlying structural topology because it does not take into account emergent higher-order interactions and information redundancy between elements, which are important for the functional behavior but mask the detection of the direct pairwise interactions that constitute the structural network topology.
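A minimal sketch of the pairwise quantity that Φ-ID further decomposes: a plug-in transfer entropy estimate between two binarized spike trains, using one-step histories. The function name, history length, and toy data are illustrative assumptions; the full synergistic/redundant/unique decomposition used in the paper is not implemented here.

```python
import numpy as np

def transfer_entropy(x, y):
    """Plug-in estimate of TE(x -> y) in bits for binary spike trains,
    using one-step histories; x and y are 0/1 arrays of equal length."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    counts = np.zeros((2, 2, 2))                    # joint counts of (y_{t+1}, y_t, x_t)
    for yt1, yt, xt in zip(y[1:], y[:-1], x[:-1]):
        counts[yt1, yt, xt] += 1
    p = counts / counts.sum()
    te = 0.0
    for yt1 in (0, 1):
        for yt in (0, 1):
            for xt in (0, 1):
                pj = p[yt1, yt, xt]
                if pj == 0:
                    continue
                p_full = pj / p[:, yt, xt].sum()                   # p(y_{t+1} | y_t, x_t)
                p_hist = p[yt1, yt, :].sum() / p[:, yt, :].sum()   # p(y_{t+1} | y_t)
                te += pj * np.log2(p_full / p_hist)
    return te

# toy data: y copies x with a one-step delay plus 10% bit flips
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1) ^ (rng.random(10_000) < 0.1)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```

With this toy data the estimate is large in the x-to-y direction and close to zero in the reverse direction, which is the asymmetry that makes transfer entropy attractive for connectivity reconstruction in the first place.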


Subject(s)
Computer Simulation , Models, Neurological , Nerve Net , Neurons , Nerve Net/physiology , Neurons/physiology , Animals , Entropy , Action Potentials/physiology
3.
Front Comput Neurosci ; 16: 836532, 2022.
Article in English | MEDLINE | ID: mdl-35465268

ABSTRACT

The last decade has witnessed remarkable progress in our understanding of the brain. This has mainly been based on the scrutiny and modeling of the transmission of activity among neurons across dynamic synapses. A main conclusion, thus far, is that essential features of the mind rely on collective phenomena that emerge from the cooperative interaction of many neurons which, together with other mediating cells, form a complex network whose details keep constantly adapting to their activity and surroundings. In parallel, theoretical and computational studies have been developed to understand many natural and artificial complex systems; these have successfully explained their striking emergent features and made precise the role of the interaction dynamics and other conditions behind the different collective phenomena they happen to display. Focusing on promising ideas that arise when comparing these neurobiology and physics studies, the present perspective article briefly reviews such scenarios, looking for clues about how high-level cognitive processes such as consciousness, intelligence, and identity can emerge. We thus show that basic concepts of physics, such as dynamical phases and non-equilibrium phase transitions, become quite relevant to brain activity, which is determined by factors at the subcellular, cellular, and network levels. We also show how these transitions depend on details of the mechanism by which stimuli are processed in a noisy background and, most importantly, that they may be detected in familiar electroencephalogram (EEG) recordings. We therefore associate the existence of such phases, which reveal a brain operating at (non-equilibrium) criticality, with the emergence of particularly interesting phenomena during memory tasks.

4.
Biology (Basel) ; 10(7)2021 Jul 11.
Article in English | MEDLINE | ID: mdl-34356502

ABSTRACT

Here we study a network of synaptic connections mixing excitatory and inhibitory neuron nodes that displays oscillations quite similar to electroencephalogram (EEG) brain waves, and we identify abrupt variations brought about by swift synaptic changes. We thus conclude that the corresponding changes in EEG series stem from the slowdown of activity in neuron populations due to synaptic restrictions. The latter generates an imbalance between excitation and inhibition that causes a quick, explosive increase of excitatory activity, which turns out to be a (first-order) transition among dynamic mental phases. Moreover, near this phase transition, our model system exhibits waves with a strong component in the so-called delta-theta domain that coexist with fast oscillations. These findings provide a simple explanation for the delta-gamma and theta-gamma modulation observed in actual brains, and open a solid and versatile path toward a deeper understanding of the large amounts of apparently erratic, readily accessible brain data.
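The spiking model of the paper is not reproduced here; instead, a minimal Wilson-Cowan-type excitatory-inhibitory rate model serves as a hedged stand-in to illustrate the underlying mechanism: weakening the inhibitory coupling shifts the excitation-inhibition balance toward excitation. All parameter values, and the use of a rate model rather than spiking neurons, are illustrative assumptions.

```python
import numpy as np

def simulate_ei(inhibition_scale=1.0, T=3.0, dt=1e-3):
    """Minimal Wilson-Cowan-type E-I rate model (illustrative parameters).
    `inhibition_scale` rescales the inhibitory-to-excitatory coupling."""
    s = lambda x, a, th: 1.0 / (1.0 + np.exp(-a * (x - th)))   # sigmoid gain
    c1, c2, c3, c4 = 16.0, 12.0 * inhibition_scale, 15.0, 3.0
    p_ext = 1.25                                               # external drive to E
    tau = 10e-3                                                # time constant (s)
    e, i = 0.1, 0.1
    trace = []
    for _ in range(int(T / dt)):
        e += dt * (-e + s(c1 * e - c2 * i + p_ext, 1.3, 4.0)) / tau
        i += dt * (-i + s(c3 * e - c4 * i, 2.0, 3.7)) / tau
        trace.append(e)
    return np.array(trace)

# reducing the inhibitory coupling typically raises the mean excitatory rate,
# a crude analogue of the excitation-inhibition imbalance discussed above
for scale in (1.0, 0.8, 0.6):
    r = simulate_ei(inhibition_scale=scale)
    print(f"inhibition scale {scale:.1f}: mean E rate {r[len(r)//2:].mean():.3f}")
```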

5.
Neural Netw ; 142: 44-56, 2021 Oct.
Article in English | MEDLINE | ID: mdl-33984735

ABSTRACT

The interplay between structure and function affects the emerging properties of many natural systems. Here we use an adaptive neural network model that couples activity and topological dynamics and reproduces the experimental temporal profiles of synaptic density observed in the brain. We prove that the existence of a transient period of relatively high synaptic connectivity is critical for the development of the system under noisy conditions, such that the resulting network can recover stored memories. Moreover, we show that intermediate synaptic densities provide optimal developmental paths with minimum energy consumption, and that ultimately it is the transient heterogeneity in the network that determines its evolution. These results could explain why the pruning curves observed in actual brain areas exhibit their characteristic temporal profiles, and they also suggest new design strategies for building biologically inspired neural networks with particular information processing capabilities.
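As a rough, hedged illustration of the role of a transient period of high connectivity, the following toy dynamics for the mean synaptic density combines an age-dependent growth term with pruning that acts only above a target adult density, producing an overshoot-then-pruning profile of the kind the abstract refers to. The functional forms and constants are assumptions for illustration, not the model of the paper.

```python
import numpy as np

def synaptic_density(k0=0.1, k_adult=8.0, overshoot=3.0, T=200.0, dt=0.01):
    """Toy mean-connectivity trajectory: fast early synaptogenesis followed by
    slower pruning toward a lower adult value (illustrative functional forms)."""
    k = k0
    trajectory = []
    for step in range(int(T / dt)):
        t = step * dt
        growth_rate = 2.0 * np.exp(-t / 10.0)           # synaptogenesis fades with age
        pruning_rate = 0.05 * max(k - k_adult, 0.0)     # pruning only above the adult level
        k += dt * (growth_rate * (k_adult + overshoot - k) - pruning_rate * k)
        trajectory.append(k)
    return np.array(trajectory)

k = synaptic_density()
print(f"peak connectivity {k.max():.2f}, final connectivity {k[-1]:.2f}")
```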


Subject(s)
Brain , Neural Networks, Computer
6.
Phys Rev Lett ; 124(21): 218301, 2020 May 29.
Article in English | MEDLINE | ID: mdl-32530670

ABSTRACT

The higher-order interactions of complex systems, such as the brain, are captured by their simplicial complex structure and have a significant effect on dynamics. However, the existing dynamical models defined on simplicial complexes make the strong assumption that the dynamics resides exclusively on the nodes. Here we formulate the higher-order Kuramoto model which describes the interactions between oscillators placed not only on nodes but also on links, triangles, and so on. We show that higher-order Kuramoto dynamics can lead to an explosive synchronization transition by using an adaptive coupling dependent on the solenoidal and the irrotational component of the dynamics.
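A minimal numerical sketch, under stated assumptions, of Kuramoto dynamics for phases attached to the links of a small simplicial complex (two triangles sharing an edge), coupled through the node-edge and edge-triangle incidence matrices. The particular complex, coupling strength, and integration scheme are illustrative, and the adaptive coupling to the solenoidal and irrotational components described in the abstract is not implemented.

```python
import numpy as np

# small simplicial complex: two filled triangles (0,1,2) and (1,2,3) sharing edge (1,2)
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
triangles = [(0, 1, 2), (1, 2, 3)]

B1 = np.zeros((4, len(edges)))                 # node-to-edge incidence matrix
for e, (i, j) in enumerate(edges):
    B1[i, e], B1[j, e] = -1.0, 1.0

B2 = np.zeros((len(edges), len(triangles)))    # edge-to-triangle incidence matrix
for t, (i, j, k) in enumerate(triangles):
    B2[edges.index((i, j)), t] = 1.0
    B2[edges.index((j, k)), t] = 1.0
    B2[edges.index((i, k)), t] = -1.0

rng = np.random.default_rng(1)
omega = rng.normal(0.0, 1.0, len(edges))       # natural frequencies of the link oscillators
theta = rng.uniform(0.0, 2 * np.pi, len(edges))
sigma, dt = 2.0, 1e-3                          # coupling strength and time step

for _ in range(50_000):                        # Euler integration of the link phases
    theta += dt * (omega
                   - sigma * B1.T @ np.sin(B1 @ theta)
                   - sigma * B2 @ np.sin(B2.T @ theta))

print("order parameter of link phases:", abs(np.exp(1j * theta).mean()))
```

The scalar order parameter printed at the end is only a crude summary; the analysis in the paper instead projects the link dynamics onto its solenoidal and irrotational (Hodge) components and tracks their synchronization separately.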

7.
Neural Netw ; 126: 108-117, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32208304

ABSTRACT

Here we study the emergence of chimera states, a recently reported phenomenon referring to the coexistence of synchronized and unsynchronized dynamical units, in a population of Morris-Lecar neurons coupled by both electrical and chemical synapses, constituting a hybrid synaptic architecture, as in actual brain connectivity. This scheme consists of a nonlocal network where nearest-neighbor neurons are coupled by electrical synapses, while the synapses from more distant neurons are of the chemical type. We demonstrate that peculiar dynamical behaviors, including chimera states and traveling waves, exist in such a hybrid coupled neural system, and we analyze how the relative abundance of chemical and electrical synapses affects the features of the chimera and of the different synchrony states (i.e., incoherent, traveling wave, and coherent), as well as the regions in the space of relevant parameters for their emergence. Additionally, we show that, when the relative population of chemical synapses increases further, a new intriguing chaotic dynamical behavior appears above the region for chimera states. It is characterized by the coexistence of two distinct synchronized states with different amplitudes and an unsynchronized state, and we denote it a chaotic amplitude chimera. We also discuss the computational implications of such a state.
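A sketch of how such a hybrid, nonlocal architecture might be wired on a ring, assuming (as an illustration, not the paper's exact construction) that neighbors within a short range are connected by gap junctions while more distant neighbors, up to the coupling radius, are connected by chemical synapses. The Morris-Lecar dynamics themselves are omitted, and the network size, radius, and electrical range are assumed values.

```python
import numpy as np

def hybrid_ring(n=100, radius=10, electrical_range=2):
    """Nonlocal ring coupling: neighbors within `electrical_range` hops are
    joined by gap junctions (electrical), neighbors between that and `radius`
    hops by chemical synapses. Returns the two adjacency matrices."""
    A_elec = np.zeros((n, n))
    A_chem = np.zeros((n, n))
    for i in range(n):
        for d in range(1, radius + 1):
            for j in ((i + d) % n, (i - d) % n):
                if d <= electrical_range:
                    A_elec[i, j] = 1.0
                else:
                    A_chem[i, j] = 1.0
    return A_elec, A_chem

# the relative abundance of chemical synapses is controlled by `electrical_range`
for er in (1, 3, 5):
    A_elec, A_chem = hybrid_ring(electrical_range=er)
    frac = A_chem.sum() / (A_elec.sum() + A_chem.sum())
    print(f"electrical range {er}: fraction of chemical synapses {frac:.2f}")
```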


Subject(s)
Electrical Synapses/physiology , Models, Neurological , Neurons/physiology , Animals , Brain/physiology , Connectome , Humans
8.
Front Comput Neurosci ; 13: 22, 2019.
Article in English | MEDLINE | ID: mdl-31057385

ABSTRACT

Nature exhibits countless examples of adaptive networks, whose topology constantly evolves, coupled to the activity generated by their function. The brain is an illustrative example of a system in which a dynamic complex network develops through the generation and pruning of synaptic contacts between neurons while memories are acquired and consolidated. Here, we consider a recently proposed brain development model to study how the mechanisms responsible for the evolution of brain structure affect and are affected by memory storage processes. Following recent experimental observations, we assume that the basic rules for adding and removing synapses depend on local synaptic currents at the respective neurons in addition to global mechanisms depending on the mean connectivity. In this way a feedback loop between "form" and "function" spontaneously emerges that influences the ability of the system to optimally store and retrieve sensory information in patterns of brain activity or memories. In particular, we report here that, as a consequence of such a feedback loop, oscillations in the activity of the system among the memorized patterns can occur, depending on parameters, reminiscent of the dynamics of mental processes. Such oscillations originate in the destabilization of memory attractors due to the pruning dynamics, which induces a kind of structural disorder or noise in the system on a long-term scale. This constantly modifies the synaptic disorder induced by the interference among the many patterns of activity memorized in the system. This new and intriguing oscillatory behavior is associated only with long-term synaptic mechanisms during the network evolution dynamics; it does not depend on the short-term synaptic processes assumed in other studies, which are not present in our model.

9.
Phys Rev E ; 99(2-1): 022307, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30934278

ABSTRACT

Recently there has been a surge of interest in network geometry and topology. Here we show that the spectral dimension plays a fundamental role in establishing a clear relation between the topological and geometrical properties of a network and its dynamics. Specifically, we explore the role of the spectral dimension in determining the synchronization properties of the Kuramoto model. We show that the synchronized phase can only be thermodynamically stable for spectral dimensions above four, and that phase entrainment of the oscillators can only be found for spectral dimensions greater than two. We numerically test our analytical predictions on the recently introduced model of network geometry called complex network manifolds, which displays a tunable spectral dimension.
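As a hedged sketch (not necessarily the estimator used in the paper), the spectral dimension d_s can be read off from the small-eigenvalue scaling of the graph Laplacian spectrum, for which the cumulative eigenvalue count behaves as N(λ) ∝ λ^(d_s/2). The fitting window and the ring-lattice test case, for which d_s should come out close to 1, are assumptions.

```python
import numpy as np

def spectral_dimension(L, n_fit=30):
    """Estimate d_s from the low-eigenvalue scaling N(lambda) ~ lambda^(d_s/2)
    of the Laplacian spectrum; n_fit sets the fitting window (an assumption)."""
    lam = np.sort(np.linalg.eigvalsh(L))[1:]           # drop the zero mode
    lam = lam[:n_fit]
    cumulative = np.arange(1, len(lam) + 1) / L.shape[0]
    slope, _ = np.polyfit(np.log(lam), np.log(cumulative), 1)
    return 2.0 * slope

# test case: 1D ring lattice, whose spectral dimension is 1
n = 1000
A = np.zeros((n, n))
idx = np.arange(n)
A[idx, (idx + 1) % n] = A[idx, (idx - 1) % n] = 1.0
L = np.diag(A.sum(axis=1)) - A
print("estimated spectral dimension:", spectral_dimension(L))
```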

10.
Neural Netw ; 110: 131-140, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30550865

ABSTRACT

We observe and study a self-organized phenomenon whereby the activity in a network of spiking neurons spontaneously terminates. We consider different types of populations, consisting of bistable model neurons connected electrically by gap junctions, or by either excitatory or inhibitory synapses, in a scale-free connection topology. We find that strongly synchronized population spiking events lead to complete cessation of activity in excitatory networks, but not in gap junction or inhibitory networks. We identify the underlying mechanism responsible for this phenomenon by examining the particular shape of the excitatory postsynaptic currents that arise in the neurons. We also examine the effects of the synaptic time constant, coupling strength, and channel noise on the occurrence of the phenomenon.


Subject(s)
Action Potentials , Neural Networks, Computer , Neurons , Action Potentials/physiology , Cortical Synchronization/physiology , Gap Junctions/physiology , Humans , Models, Neurological , Nerve Net/physiology , Neurons/physiology , Synapses/physiology
11.
Sci Rep ; 8(1): 9910, 2018 07 02.
Article in English | MEDLINE | ID: mdl-29967410

ABSTRACT

The dynamics of networks of neuronal cultures has recently been shown to depend strongly on the network geometry and, in particular, on its dimensionality. However, this phenomenon has so far remained mostly unexplored from the theoretical point of view. Here we reveal the rich interplay between network geometry and synchronization of coupled oscillators in the context of a simplicial complex model of manifolds called the Complex Network Manifold. The networks generated by this model combine small-world properties (infinite Hausdorff dimension) and a highly modular structure with finite and tunable spectral dimension. We show that the networks display frustrated synchronization for a wide range of the coupling strength of the oscillators, and that the synchronization properties are directly affected by the spectral dimension of the network.

12.
PLoS Comput Biol ; 13(7): e1005646, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28692643

ABSTRACT

Inverse Stochastic Resonance (ISR) is a phenomenon in which the average spiking rate of a neuron exhibits a minimum with respect to noise. ISR has been studied in individual neurons, but here, we investigate ISR in scale-free networks, where the average spiking rate is calculated over the neuronal population. We use Hodgkin-Huxley model neurons with channel noise (i.e., stochastic gating variable dynamics), and the network connectivity is implemented via electrical or chemical connections (i.e., gap junctions or excitatory/inhibitory synapses). We find that the emergence of ISR depends on the interplay between each neuron's intrinsic dynamical structure, channel noise, and network inputs, where the latter in turn depend on network structure parameters. We observe that with weak gap junction or excitatory synaptic coupling, network heterogeneity and sparseness tend to favor the emergence of ISR. With inhibitory coupling, ISR is quite robust. We also identify dynamical mechanisms that underlie various features of this ISR behavior. Our results suggest possible ways of experimentally observing ISR in actual neuronal systems.


Subject(s)
Action Potentials/physiology , Models, Neurological , Nerve Net/physiology , Neurons/physiology , Algorithms , Computational Biology , Humans , Stochastic Processes
13.
Phys Rev E ; 95(1-1): 012404, 2017 Jan.
Article in English | MEDLINE | ID: mdl-28208458

ABSTRACT

We investigate the behavior of a model neuron that receives a biophysically realistic noisy postsynaptic current based on uncorrelated spiking activity from a large number of afferents. We show that, with static synapses, such noise can give rise to inverse stochastic resonance (ISR) as a function of the presynaptic firing rate. We compare this to the case with dynamic synapses that feature short-term synaptic plasticity and show that the interval of presynaptic firing rate over which ISR exists can be extended or diminished. We consider both short-term depression and facilitation. Interestingly, we find that a double inverse stochastic resonance (DISR), with two distinct wells centered at different presynaptic firing rates, can appear.
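For concreteness, a Tsodyks-Markram-style dynamic synapse of the general kind contrasted here with static ones: a depression variable x (available resources) and a facilitation variable u (utilization) recover between spikes and jump at each presynaptic spike. The parameter values and the particular recovery scheme are illustrative assumptions, not those of the paper.

```python
import numpy as np

def tm_synapse(spike_times, tau_d=0.2, tau_f=0.6, U=0.2, T=1.5, dt=1e-4):
    """Tsodyks-Markram-style dynamic synapse: x = available resources
    (depression), u = utilization (facilitation). Returns the effective
    release amplitude u*x at each presynaptic spike (illustrative parameters)."""
    spikes = set(np.round(np.asarray(spike_times) / dt).astype(int))
    x, u = 1.0, U
    releases = []
    for step in range(int(T / dt)):
        x += dt * (1.0 - x) / tau_d        # resources recover toward 1
        u += dt * (U - u) / tau_f          # utilization relaxes toward its baseline U
        if step in spikes:
            u += U * (1.0 - u)             # facilitation: utilization jumps at a spike
            releases.append(u * x)         # transmitted amplitude
            x -= u * x                     # depression: resources are consumed
    return np.array(releases)

# a regular 20 Hz presynaptic train: the sequence of release amplitudes shows
# the net effect of depression and facilitation for these parameters
print(np.round(tm_synapse(np.arange(0.1, 1.1, 0.05)), 3))
```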


Subject(s)
Models, Neurological , Neurons/physiology , Synapses/physiology , Action Potentials , Animals , Neuronal Plasticity/physiology , Stochastic Processes
14.
Chaos ; 26(6): 065101, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27368790

ABSTRACT

In recent years, network scientists have directed their interest to the multi-layer character of real-world systems, explicitly considering the structural and dynamical organization of graphs made of diverse layers of connections between their constituents. Most complex systems include multiple subsystems and layers of connectivity and, in many cases, the interdependent components of systems interact through many different channels. Such a perspective has indeed proven to be an adequate representation for a wealth of features exhibited by networked systems in the real world. The contributions presented in this Focus Issue cover, from different points of view, the many achievements and still open questions in the field of multi-layer networks, such as: new frameworks and structures to represent and analyze heterogeneous complex systems, different aspects related to synchronization and centrality of complex networks, the interplay between layers, and applications to logistic, biological, social, and technological fields.


Subject(s)
Models, Theoretical , Algorithms
15.
PLoS One ; 11(1): e0145830, 2016.
Article in English | MEDLINE | ID: mdl-26730737

ABSTRACT

In this paper we analyze the interplay between the subthreshold oscillations of a single neuron conductance-based model and the short-term plasticity of a dynamic synapse with a depressing mechanism. In previous research, the computational properties of subthreshold oscillations and dynamic synapses have been studied separately. Our results show that dynamic synapses can influence different aspects of the dynamics of neuronal subthreshold oscillations. Factors such as maximum hyperpolarization level, oscillation amplitude and frequency or the resulting firing threshold are modulated by synaptic depression, which can even make subthreshold oscillations disappear. This influence reshapes the postsynaptic neuron's resonant properties arising from subthreshold oscillations and leads to specific input/output relations. We also study the neuron's response to another simultaneous input in the context of this modulation, and show a distinct contextual processing as a function of the depression, in particular for detection of signals through weak synapses. Intrinsic oscillations dynamics can be combined with the characteristic time scale of the modulatory input received by a dynamic synapse to build cost-effective cell/channel-specific information discrimination mechanisms, beyond simple resonances. In this regard, we discuss the functional implications of synaptic depression modulation on intrinsic subthreshold dynamics.


Subject(s)
Inhibitory Postsynaptic Potentials , Neurons/physiology , Synapses/physiology , Computer Simulation , Humans , Models, Neurological , Neurons/cytology
16.
Sci Rep ; 5: 12216, 2015 Jul 20.
Article in English | MEDLINE | ID: mdl-26193453

ABSTRACT

Here we illustrate how a well-founded study of the brain may originate from assuming analogies with phase-transition phenomena. Analyzing to what extent a weak signal endures in noisy environments, we identify the underlying mechanisms, and the result is a description of how the excitability associated with (non-equilibrium) phase changes and criticality optimizes the processing of the signal. Our setting is a network of integrate-and-fire nodes in which connections are heterogeneous, with rapidly time-varying intensities mimicking fatigue and potentiation. The emergent behavior is quite robust against modifications of the wiring topology--in fact, we considered networks ranging from fully connected to the Homo sapiens connectome--showing the essential role of synaptic flickering in computations. We also suggest how to experimentally disclose significant changes during actual brain operation.


Subject(s)
Brain/physiology , Nervous System Physiological Phenomena , Computer Simulation , Humans , Models, Neurological , Time Factors
17.
PLoS One ; 10(3): e0121156, 2015.
Article in English | MEDLINE | ID: mdl-25799449

ABSTRACT

We investigate the efficient transmission and processing of weak, subthreshold signals in a realistic neural medium in the presence of different levels of underlying noise. Assuming Hebbian weights for maximal synaptic conductances--which naturally balance the network with excitatory and inhibitory synapses--and considering short-term synaptic plasticity affecting such conductances, we found different dynamic phases in the system. These include a memory phase where populations of neurons remain synchronized, an oscillatory phase where transitions between different synchronized populations of neurons appear, and an asynchronous or noisy phase. When a weak stimulus is applied to each neuron and the level of noise in the medium is increased, we found efficient transmission of such stimuli around the transition and critical points separating the different phases, at well-defined levels of stochasticity in the system. We proved that this intriguing phenomenon is quite robust, as it occurs in different situations including several types of synaptic plasticity, different types and numbers of stored patterns, and diverse network topologies, namely diluted networks and complex topologies such as scale-free and small-world networks. We conclude that the robustness of the phenomenon in different realistic scenarios, including spiking neurons, short-term synaptic plasticity, and complex network topologies, makes it very likely that it also occurs in actual neural systems, as recent psychophysical experiments suggest.
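The spiking, conductance-based model of the paper is not reproduced here; the following Hopfield-style sketch only illustrates the Hebbian storage-and-retrieval idea behind the memory phase, with binary units, a Hebbian coupling matrix, and pattern overlaps as the readout. Pattern count, network size, and noise level are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 400, 3
patterns = rng.choice([-1, 1], size=(P, N))        # stored activity patterns

# Hebbian prescription for the couplings (no self-connections)
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# start from a corrupted copy of pattern 0 (15% of units flipped) and let the
# deterministic retrieval dynamics run for a few synchronous updates
state = patterns[0] * np.where(rng.random(N) < 0.15, -1, 1)
print("initial overlaps:", np.round(patterns @ state / N, 2))
for _ in range(5):
    state = np.where(W @ state >= 0, 1, -1)
print("final overlaps:  ", np.round(patterns @ state / N, 2))
```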


Subject(s)
Neurons/physiology , Synaptic Transmission , Computational Biology/methods , Models, Neurological , Neuronal Plasticity
18.
Bol Asoc Med P R ; 106(1): 54-6, 2014.
Article in English | MEDLINE | ID: mdl-24791367

ABSTRACT

Brenner tumors account for 1.5% to 2.5% of ovarian tumors. Nearly all are benign, and only 1% are malignant. Fewer than twenty-five cases of borderline Brenner tumor have been reported worldwide. Our case is the first reported in association with a bilateral ovarian serous cystadenofibroma and an endometrioid adenocarcinoma. This unusual case adds to the limited data on borderline Brenner tumors.


Subject(s)
Brenner Tumor/pathology , Cystadenoma, Serous/pathology , Endometrial Neoplasms/pathology , Estrogens , Neoplasms, Hormone-Dependent/pathology , Neoplasms, Multiple Primary/pathology , Ovarian Neoplasms/pathology , Antineoplastic Agents, Hormonal/adverse effects , Antineoplastic Agents, Hormonal/therapeutic use , Biomarkers, Tumor/analysis , Breast Neoplasms/drug therapy , Breast Neoplasms/surgery , Brenner Tumor/metabolism , Brenner Tumor/surgery , Carcinoma, Ductal, Breast/drug therapy , Carcinoma, Ductal, Breast/surgery , Combined Modality Therapy , Cystadenoma, Serous/surgery , Endometrial Neoplasms/chemically induced , Endometrial Neoplasms/etiology , Endometrial Neoplasms/surgery , Estrogens/metabolism , Female , Humans , Hysterectomy , Middle Aged , Neoplasms, Hormone-Dependent/chemically induced , Neoplasms, Hormone-Dependent/etiology , Neoplasms, Hormone-Dependent/surgery , Neoplasms, Multiple Primary/surgery , Neoplasms, Second Primary/chemically induced , Neoplasms, Second Primary/pathology , Neoplasms, Second Primary/surgery , Ovarian Cysts/complications , Ovarian Neoplasms/metabolism , Ovarian Neoplasms/surgery , Ovariectomy , Salpingectomy , Tamoxifen/adverse effects , Tamoxifen/therapeutic use
19.
Article in English | MEDLINE | ID: mdl-23637657

ABSTRACT

In this paper we review our research on the effect and computational role of dynamic synapses in feed-forward and recurrent neural networks. Among other results, we report the appearance of a new class of dynamical memories resulting from the destabilization of learned memory attractors. This has important consequences for dynamic information processing, allowing the system to sequentially access the information stored in the memories under changing stimuli. Although the storage capacity for stable memories decreases, our study demonstrated the positive effect of synaptic facilitation in recovering the maximum storage capacity and in enlarging the capacity of the system for memory recall under noisy conditions. Possibly, this new dynamical behavior can be associated with the voltage transitions between up and down states observed in cortical areas of the brain. We investigated the conditions under which the permanence times in the up state are power-law distributed, which is a sign of criticality, and concluded that the experimentally observed large variability of permanence times could be explained as the result of noisy dynamic synapses with large recovery times. Finally, we report how short-term synaptic processes can transmit weak signals over more than one frequency range in noisy neural networks, displaying a kind of stochastic multi-resonance. This effect is due to competition between activity-dependent synaptic fluctuations (due to dynamic synapses) and the existence of a neuron firing threshold that adapts to the incoming mean synaptic input.

20.
PLoS One ; 8(1): e50276, 2013.
Article in English | MEDLINE | ID: mdl-23349664

ABSTRACT

Short-term memory in the brain cannot in general be explained the way long-term memory can--as a gradual modification of synaptic weights--since it takes place too quickly. Theories based on some form of cellular bistability, however, do not seem able to account for the fact that noisy neurons can collectively store information in a robust manner. We show how a sufficiently clustered network of simple model neurons can be instantly induced into metastable states capable of retaining information for a short time (a few seconds). The mechanism is robust to different network topologies and kinds of neural model. This could constitute a viable means available to the brain for sensory and/or short-term memory with no need of synaptic learning. Relevant phenomena described by neurobiology and psychology, such as local synchronization of synaptic inputs and power-law statistics of forgetting avalanches, emerge naturally from this mechanism, and we suggest possible experiments to test its viability in more biological settings.
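A sketch of how a "sufficiently clustered" network of the kind invoked here might be generated: a random modular (block) adjacency matrix that is dense within clusters and sparse between them. The neuron dynamics needed to actually probe metastable retention are omitted, and the sizes and connection probabilities are assumed values.

```python
import numpy as np

def clustered_network(n_clusters=10, cluster_size=50, p_in=0.3, p_out=0.01, seed=3):
    """Random modular (clustered) directed adjacency matrix: connection
    probability p_in within clusters, p_out between them (illustrative values)."""
    rng = np.random.default_rng(seed)
    n = n_clusters * cluster_size
    labels = np.repeat(np.arange(n_clusters), cluster_size)
    same_cluster = labels[:, None] == labels[None, :]
    prob = np.where(same_cluster, p_in, p_out)
    A = (rng.random((n, n)) < prob).astype(float)
    np.fill_diagonal(A, 0.0)
    return A, labels

A, labels = clustered_network()
same = labels[:, None] == labels[None, :]
print(f"within-cluster density {A[same].mean():.3f}, "
      f"between-cluster density {A[~same].mean():.3f}")
```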


Subject(s)
Memory/physiology , Models, Neurological , Cluster Analysis , Humans , Neurons/cytology , Synapses/physiology