Results 1 - 14 of 14
1.
Article in English | MEDLINE | ID: mdl-25974542

ABSTRACT

We derive explicit, closed-form expressions for the cumulant densities of a multivariate, self-exciting Hawkes point process, generalizing a result of Hawkes in his earlier work on the covariance density and Bartlett spectrum of such processes. To do this, we represent the Hawkes process in terms of a Poisson cluster process and show how the cumulant density formulas can be derived by enumerating all possible "family trees," representing complex interactions between point events. We also consider the problem of computing the integrated cumulants, characterizing the average measure of correlated activity between events of different types, and derive the relevant equations.
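As an illustration of what the integrated cumulants provide, the following is a minimal numerical sketch (with a hypothetical two-type example, not taken from the paper) of the standard first- and second-order integrated cumulants of a stationary multivariate Hawkes process, obtained by summing the branching ("family tree") contributions over all generations.

import numpy as np

# Hypothetical two-type example: mu[i] is the baseline intensity of type i,
# Phi[i, j] is the integral of the kernel g_ij(t), i.e. the mean number of
# type-i events directly triggered by one type-j event (spectral radius < 1).
mu = np.array([0.5, 0.3])
Phi = np.array([[0.2, 0.1],
                [0.3, 0.1]])

R = np.linalg.inv(np.eye(2) - Phi)    # sums over all generations of offspring
rates = R @ mu                        # stationary intensities (first cumulant)
cov_int = R @ np.diag(rates) @ R.T    # integrated covariance (second cumulant)

print("rates:", rates)
print("integrated covariance:\n", cov_int)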


Subjects
Statistical Models
2.
Curr Opin Neurobiol ; 32: 38-44, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25463563

ABSTRACT

Our ability to collect large amounts of data from many cells has been paralleled by the development of powerful statistical models for extracting information from these data. Here we discuss how the activity of cell assemblies can be analyzed using these models, focusing on generalized linear models and maximum entropy models and describing a number of recent studies that employ these tools for analyzing multi-neuronal activity. We show results from simulations comparing inferred functional connectivity, pairwise correlations, and the real synaptic connections in simulated networks, demonstrating the power of statistical models in inferring functional connectivity. Further development of network reconstruction techniques based on statistical models should lead to more powerful methods for understanding the functional anatomy of cell assemblies.
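As a concrete illustration of the first class of models discussed, here is a minimal sketch (using made-up binned spike trains, not data from any of the studies reviewed) of fitting a logistic generalized linear model that predicts one neuron's spiking from the population activity in the previous time bin; the fitted weights play the role of functional couplings.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N, T = 10, 5000
spikes = (rng.random((T, N)) < 0.1).astype(int)   # fake binned spike trains

# Functional couplings onto neuron 0: regress its spikes at t on the
# population state at t - 1 (one-bin-lagged logistic GLM).
X, y = spikes[:-1], spikes[1:, 0]
glm = LogisticRegression(penalty="l2", C=1.0).fit(X, y)
functional_couplings = glm.coef_.ravel()
print(functional_couplings)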


Subjects
Statistical Models, Nerve Net/physiology, Neurons/physiology, Animals
3.
Math Biosci Eng ; 11(1): 149-65, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24245678

ABSTRACT

We derive learning rules for finding the connections between units in stochastic dynamical networks from the recorded history of a "visible" subset of the units. We consider two models. In both of them, the visible units are binary and stochastic. In one model the "hidden" units are continuous-valued, with sigmoidal activation functions, and in the other they are binary and stochastic like the visible ones. We derive exact learning rules for both cases. For the stochastic case, performing the exact calculation requires, in general, repeated summations over a number of configurations that grows exponentially with the size of the system and the data length, which is not feasible for large systems. We derive a mean-field theory, based on a factorized ansatz for the distribution of hidden-unit states, which offers an attractive alternative for large systems. We present the results of some numerical calculations that illustrate key features of the two models and, for the stochastic case, the exact and approximate calculations.
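To make the combinatorial cost concrete, the following toy sketch (a generic fully binary kinetic Ising model with parallel Glauber updates, under assumptions that need not match the paper's exact setup) computes the likelihood of a visible spin history by brute-force summation over all hidden histories; the number of terms grows as 2^(H(T+1)), which is what makes the exact calculation infeasible for large systems or long data sets.

import itertools
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(1)
V, H, T = 3, 2, 4                            # visible units, hidden units, time steps
N = V + H
J = rng.normal(0, 1 / np.sqrt(N), (N, N))    # couplings onto each unit (rows)

def step_logprob(s_next, s_prev):
    # log P(s_next | s_prev) for a parallel (synchronous) Glauber update.
    h = J @ s_prev
    return np.sum(s_next * h - np.log(2 * np.cosh(h)))

v = rng.choice([-1, 1], size=(T + 1, V)).astype(float)   # observed visible history

# Exact likelihood of the visible history: sum over all 2^(H*(T+1)) hidden
# histories, with a uniform prior over the initial hidden state.
logps = []
for z_flat in itertools.product([-1.0, 1.0], repeat=H * (T + 1)):
    z = np.array(z_flat).reshape(T + 1, H)
    x = np.hstack([v, z])                    # full visible + hidden states
    lp = -H * np.log(2)                      # uniform prior on the initial hidden state
    lp += sum(step_logprob(x[t + 1], x[t]) for t in range(T))
    logps.append(lp)
print("log-likelihood of the visible history:", logsumexp(logps))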


Subjects
Action Potentials/physiology, Neurological Models, Neural Networks (Computer), Neurons/physiology, Algorithms, Computer Simulation, Excitatory Postsynaptic Potentials, Humans, Inhibitory Postsynaptic Potentials, Theoretical Models, Probability, Stochastic Processes
4.
Phys Rev Lett ; 110(21): 210601, 2013 May 24.
Article in English | MEDLINE | ID: mdl-23745850

ABSTRACT

We describe how the couplings in an asynchronous kinetic Ising model can be inferred. We consider two cases: one in which we know both the spin history and the update times and one in which we know only the spin history. For the first case, we show that one can average over all possible choices of update times to obtain a learning rule that depends only on spin correlations and can also be derived from the equations of motion for the correlations. For the second case, the same rule can be derived within a further decoupling approximation. We study all methods numerically for fully asymmetric Sherrington-Kirkpatrick models, varying the data length, system size, temperature, and external field. Good convergence is observed in accordance with the theoretical expectations.
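A minimal sketch of generating the kind of data analyzed here: asynchronous (random sequential) Glauber dynamics of a fully asymmetric Sherrington-Kirkpatrick model, with hypothetical parameter values; the inference methods of the paper take such a spin history, with or without the update times, as input.

import numpy as np

rng = np.random.default_rng(2)
N, n_updates = 20, 200_000
beta, h_ext = 1.0, 0.1                         # inverse temperature, external field
J = rng.normal(0, 1 / np.sqrt(N), (N, N))      # fully asymmetric SK couplings
np.fill_diagonal(J, 0.0)

s = rng.choice([-1, 1], size=N)
history = np.empty((n_updates, N), dtype=np.int8)
for t in range(n_updates):
    i = rng.integers(N)                        # asynchronous: one spin per step
    h = beta * (J[i] @ s + h_ext)
    s[i] = 1 if rng.random() < 1 / (1 + np.exp(-2 * h)) else -1
    history[t] = s                             # spin history (update times known)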


Subjects
Likelihood Functions, Chemical Models, Kinetics
5.
Phys Rev Lett ; 106(4): 048702, 2011 Jan 28.
Article in English | MEDLINE | ID: mdl-21405370

ABSTRACT

There has been recent progress on inferring the structure of interactions in complex networks when they are in stationary states satisfying detailed balance, but little has been done for nonequilibrium systems. Here we introduce an approach to this problem, considering, as an example, the question of recovering the interactions in an asymmetrically coupled, synchronously updated Sherrington-Kirkpatrick model. We derive an exact iterative inversion algorithm and develop efficient approximations based on dynamical mean-field and Thouless-Anderson-Palmer equations that express the interactions in terms of equal-time and one-time-step-delayed correlation functions.
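For orientation, the following sketch implements the simplest member of this family of approximations, a naive mean-field inversion that expresses the couplings through the equal-time and one-time-step-delayed correlation matrices; the exact iterative algorithm and the TAP-level corrections described in the paper refine this estimate. Parameter values are hypothetical.

import numpy as np

rng = np.random.default_rng(3)
N, T = 20, 100_000
J_true = rng.normal(0, 0.3 / np.sqrt(N), (N, N))   # asymmetric SK couplings

# Synchronously updated (parallel) Glauber dynamics.
S = np.empty((T, N))
s = rng.choice([-1.0, 1.0], size=N)
for t in range(T):
    h = J_true @ s
    s = np.where(rng.random(N) < 1 / (1 + np.exp(-2 * h)), 1.0, -1.0)
    S[t] = s

m = S.mean(axis=0)
dS = S - m
C = dS[:-1].T @ dS[:-1] / (T - 1)          # equal-time correlations
D = dS[1:].T @ dS[:-1] / (T - 1)           # one-time-step-delayed correlations
A_inv = np.diag(1 / (1 - m**2))
J_nmf = A_inv @ D @ np.linalg.inv(C)       # naive mean-field reconstruction
print("mean absolute reconstruction error:", np.abs(J_nmf - J_true).mean())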

6.
Neural Comput ; 22(2): 427-47, 2010 Feb.
Article in English | MEDLINE | ID: mdl-19842988

ABSTRACT

Neuronal firing correlations are studied using simulations of a simple network model for a cortical column in a high-conductance state with dynamically balanced excitation and inhibition. Although correlations between individual pairs of neurons exhibit considerable heterogeneity, population averages show systematic behavior. When the network is in a stationary state, the average correlations are generically small: correlation coefficients are of order 1/N, where N is the number of neurons in the network. However, when the input to the network varies strongly in time, much larger values are found. In this situation, the network is out of balance and the synaptic conductance is low at the times when the strongest firing occurs. However, examination of the correlation functions of synaptic currents reveals that after these bursts, balance is restored within a few milliseconds by a rapid increase in inhibitory synaptic conductance. These findings suggest an extension of the notion of the balanced state to include balanced fluctuations of synaptic currents, with a characteristic timescale of a few milliseconds.
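The population-averaged quantity referred to here can be computed from any set of binned spike counts; a small sketch with hypothetical (independent Poisson) counts follows.

import numpy as np

def mean_pairwise_correlation(binned_counts):
    # Average correlation coefficient over all distinct neuron pairs.
    # binned_counts: array of shape (n_bins, n_neurons) of spike counts.
    c = np.corrcoef(binned_counts, rowvar=False)
    n = c.shape[0]
    off_diag = c[~np.eye(n, dtype=bool)]
    return np.nanmean(off_diag)

# Hypothetical example: independent Poisson spike counts, so the population
# average should be close to zero (compare with the order-1/N behavior).
rng = np.random.default_rng(4)
counts = rng.poisson(2.0, size=(10_000, 50))
print(mean_pairwise_correlation(counts))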


Subjects
Action Potentials/physiology, Cerebral Cortex/physiology, Nerve Net/physiology, Neural Networks (Computer), Neurons/physiology, Synaptic Transmission/physiology, Algorithms, Animals, Artificial Intelligence, Computer Simulation, Humans, Mathematical Concepts
7.
Article in English | MEDLINE | ID: mdl-19949460

ABSTRACT

Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the mean values and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
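A small sketch of the preprocessing step that the bin-size dependence hinges on: binning spike times at a chosen bin width and extracting the means and pairwise correlations to which a pairwise model is fitted. The helper name binned_means_and_corrs and the toy spike times are hypothetical.

import numpy as np

def binned_means_and_corrs(spike_times, n_neurons, t_max, bin_width):
    # Binarize spikes into bins of the given width and return the
    # sufficient statistics of a pairwise model: <s_i> and <s_i s_j>.
    n_bins = int(np.floor(t_max / bin_width))
    s = np.zeros((n_bins, n_neurons))
    for i, times in enumerate(spike_times):
        idx = (np.asarray(times) / bin_width).astype(int)
        idx = idx[idx < n_bins]
        s[idx, i] = 1.0                       # 1 if neuron i spiked in the bin
    return s.mean(axis=0), (s.T @ s) / n_bins

# Hypothetical usage with three toy spike trains (times in seconds):
spike_times = [[0.01, 0.25, 0.9], [0.02, 0.26], [0.5, 0.51, 0.52]]
means, corrs = binned_means_and_corrs(spike_times, 3, 1.0, 0.02)
print(means, corrs, sep="\n")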

8.
Phys Rev E Stat Nonlin Soft Matter Phys ; 79(5 Pt 1): 051915, 2009 May.
Article in English | MEDLINE | ID: mdl-19518488

ABSTRACT

We study pairwise Ising models for describing the statistics of multineuron spike trains, using data from a simulated cortical network. We explore efficient ways of finding the optimal couplings in these models and examine their statistical properties. To do this, we extract the optimal couplings for subsets of size up to 200 neurons, essentially exactly, using Boltzmann learning. We then study the quality of several approximate methods for finding the couplings by comparing their results with those found from Boltzmann learning. Two of these methods--inversion of the Thouless-Anderson-Palmer equations and an approximation proposed by Sessak and Monasson--are remarkably accurate. Using these approximations for larger subsets of neurons, we find that extracting couplings using data from a subset smaller than the full network tends systematically to overestimate their magnitude. This effect is described qualitatively by infinite-range spin-glass theory for the normal phase. We also show that a globally correlated input to the neurons in the network leads to a small increase in the average coupling. However, the pair-to-pair variation in the couplings is much larger than this and reflects intrinsic properties of the network. Finally, we study the quality of these models by comparing their entropies with that of the data. We find that they perform well for small subsets of the neurons in the network, but the fit quality starts to deteriorate as the subset size grows, signaling the need to include higher-order correlations to describe the statistics of large networks.
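A minimal sketch of Boltzmann learning by exact enumeration, feasible only for small subsets (a hypothetical N = 8 here): the gradient of the log-likelihood is the difference between the data and model means and pairwise correlations.

import itertools
import numpy as np

def exact_moments(h, J):
    # Exact <s_i> and <s_i s_j> of a pairwise Ising model P(s) ~ exp(h.s + s.J.s/2).
    N = len(h)
    states = np.array(list(itertools.product([-1, 1], repeat=N)), dtype=float)
    E = states @ h + 0.5 * np.einsum('ti,ij,tj->t', states, J, states)
    p = np.exp(E - E.max()); p /= p.sum()
    m = p @ states
    chi = states.T @ (states * p[:, None])
    return m, chi

def boltzmann_learning(m_data, chi_data, n_iter=2000, eta=0.1):
    N = len(m_data)
    h, J = np.zeros(N), np.zeros((N, N))
    for _ in range(n_iter):
        m, chi = exact_moments(h, J)
        h += eta * (m_data - m)               # gradient of the log-likelihood
        dJ = eta * (chi_data - chi)
        np.fill_diagonal(dJ, 0.0)
        J += dJ
    return h, J

# Hypothetical usage: recover couplings from the exact moments of a known model.
rng = np.random.default_rng(5)
N = 8
J_true = rng.normal(0, 0.3, (N, N)); J_true = (J_true + J_true.T) / 2
np.fill_diagonal(J_true, 0.0)
h_true = rng.normal(0, 0.3, N)
m_d, chi_d = exact_moments(h_true, J_true)
h_fit, J_fit = boltzmann_learning(m_d, chi_d)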


Subjects
Action Potentials/physiology, Biological Clocks/physiology, Neurological Models, Nerve Net/physiology, Neuronal Plasticity/physiology, Neurons/physiology, Synaptic Transmission/physiology, Computer Simulation, Feedback/physiology
9.
Network ; 17(2): 131-50, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16818394

ABSTRACT

We present a complete mean-field theory for a balanced state of a simple model of an orientation hypercolumn, together with a numerical procedure for solving the mean-field equations quantitatively. With our treatment, one can determine self-consistently both the firing rates and the firing correlations, without being restricted to specific neuron models. Here, we solve the mean-field equations numerically for integrate-and-fire neurons. Several known key properties of orientation-selective cortical neurons emerge naturally from the description: irregular firing with statistics close to, but not restricted to, Poisson statistics; an almost linear gain function (firing frequency as a function of stimulus contrast) of the neurons within the network; and a contrast-invariant tuning width of the neuronal firing. We find that the irregularity in firing depends sensitively on synaptic strengths. If the Fano factor is considerably larger (smaller) than 1 at some stimulus orientation, then it is also larger (respectively smaller) than 1 for all other stimulus orientations that elicit firing. We also find that the tuning of the noise in the input current is the same as the tuning of the external input, while that of the mean input current depends on both the external input and the intracortical connectivity.
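A small sketch of the statistic referred to here: the Fano factor of trial-to-trial spike counts, evaluated separately at each stimulus orientation, using hypothetical Poisson count data.

import numpy as np

def fano_factor(counts):
    # Variance-to-mean ratio of spike counts over repeated trials.
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

# Hypothetical counts: rows are stimulus orientations, columns are trials.
rng = np.random.default_rng(6)
counts_by_orientation = rng.poisson(lam=[[2], [5], [12], [5], [2]], size=(5, 100))
print([round(fano_factor(c), 2) for c in counts_by_orientation])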


Subjects
Contrast Sensitivity/physiology, Neurological Models, Neurons/physiology, Orientation/physiology, Visual Cortex/cytology, Visual Fields/physiology, Action Potentials/physiology, Animals, Computer Simulation, Humans, Neural Networks (Computer), Statistics as Topic, Visual Cortex/physiology, Visual Pathways/physiology
10.
Neural Comput ; 18(3): 634-59, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16483411

ABSTRACT

We study the spike statistics of neurons in a network with dynamically balanced excitation and inhibition. Our model, intended to represent a generic cortical column, comprises randomly connected excitatory and inhibitory leaky integrate-and-fire neurons, driven by excitatory input from an external population. The high connectivity permits a mean-field description in which synaptic currents can be treated as Gaussian noise, the mean and autocorrelation function of which are calculated self-consistently from the firing statistics of single model neurons. Within this description, a wide range of Fano factors is possible. We find that the irregularity of spike trains is controlled mainly by the strength of the synapses relative to the difference between the firing threshold and the post-firing reset level of the membrane potential. For moderately strong synapses, we find spike statistics very similar to those observed in primary visual cortex.
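A sketch of the single-neuron side of such a description: a leaky integrate-and-fire neuron driven by noise with a given mean and amplitude (white noise here, rather than the self-consistently determined coloured noise of the full theory), from which the irregularity of the spike train can be measured. Parameter values are hypothetical.

import numpy as np

rng = np.random.default_rng(7)
dt, t_max = 1e-4, 20.0                 # seconds
tau_m, v_reset, v_thresh = 0.02, 0.0, 1.0
mu, sigma = 0.8, 0.6                   # mean and amplitude of the input current

v, spikes, t = 0.0, [], 0.0
while t < t_max:
    noise = sigma * np.sqrt(2 * tau_m / dt) * rng.standard_normal()
    v += (dt / tau_m) * (-v + mu + noise)
    if v >= v_thresh:
        spikes.append(t)
        v = v_reset                    # post-firing reset
    t += dt

isi = np.diff(spikes)
print("CV of interspike intervals:", isi.std() / isi.mean())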


Subjects
Action Potentials/physiology, Cerebral Cortex/physiology, Nerve Net/physiology, Neural Pathways/physiology, Neurons/physiology, Algorithms, Animals, Cell Membrane/physiology, Geniculate Bodies/physiology, Humans, Neural Inhibition/physiology, Neural Networks (Computer), Synaptic Transmission/physiology, Visual Cortex/physiology, Visual Pathways/physiology
11.
Phys Rev E Stat Nonlin Soft Matter Phys ; 70(3 Pt 1): 031105, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15524504

ABSTRACT

We present a dynamical description and analysis of nonequilibrium transitions in the noisy one-dimensional Ginzburg-Landau equation for an extensive system, based on a weak-noise canonical phase-space formulation of the Freidlin-Wentzell or Martin-Siggia-Rose methods. We derive propagating nonlinear domain-wall, or soliton, solutions of the resulting canonical field equations with superimposed diffusive modes. The transition pathways are characterized by the nucleation and subsequent propagation of domain walls. We discuss the general switching scenario in terms of a dilute gas of propagating domain walls and evaluate the Arrhenius factor in terms of the associated action. We find excellent agreement with recent numerical optimization studies.
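For reference, a sketch of the model equation itself: an Euler-Maruyama integration of the noisy one-dimensional Ginzburg-Landau equation with assumed parameter values. It illustrates the dynamics in which domain walls can nucleate and propagate; the weak-noise canonical phase-space analysis of the paper is not reproduced here.

import numpy as np

rng = np.random.default_rng(8)
L, dx, dt, eps = 100.0, 0.5, 0.01, 0.25      # system size, grid, time step, noise
n_x, n_steps = int(L / dx), 20_000
phi = -np.ones(n_x)                          # start in one of the two minima

for _ in range(n_steps):
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    noise = np.sqrt(2 * eps * dt / dx) * rng.standard_normal(n_x)
    phi += dt * (lap + phi - phi**3) + noise   # Euler-Maruyama step

# Track how much of the system has switched to the competing phase:
print("fraction of sites with phi > 0:", np.mean(phi > 0))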

12.
Biosystems ; 71(3): 311-7, 2003 Oct.
Article in English | MEDLINE | ID: mdl-14563571

ABSTRACT

Large-scale expression data are today measured for thousands of genes simultaneously. This development has been followed by an exploration of theoretical tools to extract as much information from these data as possible. Several groups have used principal component analysis (PCA) for this task. However, since this approach is data-driven, care must be taken not to analyze the noise instead of the data. As a strong warning against uncritical use of the output of a PCA, we employ a newly developed procedure to judge the effective dimensionality of a specific data set. Although this data set was obtained during the development of the rat central nervous system, our finding is a general property of noisy time-series data. Based on knowledge of the noise level of the data, we find that the effective number of dimensions that are meaningful to use in a PCA is much lower than what could be expected from the number of measurements. We attribute this fact both to the effects of noise and to the lack of independence of the expression levels. Finally, we explore the possibility of increasing the dimensionality by performing more measurements within one time series, and conclude that this is not a fruitful approach.
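A sketch of the kind of check advocated here (not the paper's specific procedure): compare the PCA eigenvalue spectrum of a data set with the spectrum that pure measurement noise of the known level would produce, and count only the components that rise above it. The helper name effective_dimension and the toy data are hypothetical.

import numpy as np

def effective_dimension(data, noise_std, n_shuffles=200, rng=None):
    # Count principal components whose variance exceeds what pure
    # measurement noise of the given level would produce.
    if rng is None:
        rng = np.random.default_rng()
    X = data - data.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]
    null = np.array([
        np.linalg.eigvalsh(np.cov(rng.normal(0, noise_std, X.shape),
                                  rowvar=False))[::-1]
        for _ in range(n_shuffles)])
    threshold = null.max(axis=0)            # conservative noise-only envelope
    return int(np.sum(eigvals > threshold))

# Hypothetical example: 3 true dimensions buried in noise, 20 measured genes.
rng = np.random.default_rng(9)
latent = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 20))
data = latent + rng.normal(0, 0.5, size=latent.shape)
print(effective_dimension(data, noise_std=0.5, rng=rng))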


Subjects
Algorithms, Statistical Data Interpretation, Genetic Databases, Gene Expression Profiling/methods, Gene Expression Regulation/genetics, Principal Component Analysis, DNA Sequence Analysis/methods, Animals, Rats, Reproducibility of Results, Sensitivity and Specificity
13.
Neural Comput ; 14(10): 2371-96, 2002 Oct.
Article in English | MEDLINE | ID: mdl-12396567

ABSTRACT

We introduce a model of generalized Hebbian learning and retrieval in oscillatory neural networks modeling cortical areas such as hippocampus and olfactory cortex. Recent experiments have shown that synaptic plasticity depends on spike timing, especially at synapses from excitatory pyramidal cells, both in hippocampus and in sensory and cerebellar cortex. Here we study how such plasticity can be used to form memories and input representations when the neural dynamics are oscillatory, as is common in the brain (particularly in the hippocampus and olfactory cortex). Learning is assumed to occur in a phase of neural plasticity in which the network is clamped to external teaching signals. By suitable manipulation of the nonlinearity of the neurons or of the oscillation frequencies during learning, the model can be made, in a retrieval phase, either to categorize new inputs or to map them, in a continuous fashion, onto the space spanned by the imprinted patterns. We identify the first of these possibilities with the function of olfactory cortex and the second with the observed response characteristics of place cells in hippocampus. We investigate both kinds of networks analytically and by computer simulations, and we link the models to experimental findings, exploring, in particular, how the spike-timing dependence of the synaptic plasticity constrains the computational function of the network and vice versa.
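For concreteness, a sketch of a generic pair-based spike-timing-dependent plasticity rule of the kind the model builds on (an assumed exponential kernel, not the paper's specific learning rule): a synapse is potentiated when the presynaptic spike precedes the postsynaptic one and depressed otherwise.

import numpy as np

def stdp_weight_change(pre_times, post_times,
                       a_plus=0.01, a_minus=0.012, tau=0.02):
    # Total weight change from all pre/post spike pairs under an
    # exponential pair-based STDP kernel (times in seconds).
    dw = 0.0
    for t_post in post_times:
        for t_pre in pre_times:
            dt = t_post - t_pre
            if dt > 0:                      # pre before post: potentiation
                dw += a_plus * np.exp(-dt / tau)
            elif dt < 0:                    # post before pre: depression
                dw -= a_minus * np.exp(dt / tau)
    return dw

# Hypothetical usage: pre spikes consistently ~5 ms before post spikes.
pre = [0.100, 0.200, 0.300]
post = [0.105, 0.205, 0.305]
print(stdp_weight_change(pre, post))       # positive: net potentiation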


Subjects
Imprinting (Psychology), Neurological Models, Neural Networks (Computer), Periodicity, Hippocampus/physiology, Mental Recall/physiology, Neuronal Plasticity/physiology, Nonlinear Dynamics, Olfactory Pathways/physiology, Pyramidal Cells/physiology, Synapses/physiology
14.
Biosystems ; 65(2-3): 147-56, 2002.
Article in English | MEDLINE | ID: mdl-12069725

ABSTRACT

Large-scale expression data are today measured for thousands of genes simultaneously. This development has been followed by an exploration of theoretical tools to extract as much information from these data as possible. One line of work is to try to extract the underlying regulatory network. The models used thus far, however, contain many parameters, and a careful investigation is necessary in order not to over-fit the models. We employ principal component analysis to show how, in the context of linear additive models, one can get a rough estimate of the effective dimensionality (the number of information-carrying dimensions) of large-scale gene expression datasets. We treat both the lack of independence of different measurements in a time series and the fact that measurements are subject to some level of noise, both of which reduce the effective dimensionality and thereby constrain the complexity of models that can be built from the data.


Subjects
Gene Expression Profiling