Results 1 - 4 of 4
1.
PLoS Comput Biol; 8(5): e1002522, 2012.
Article in English | MEDLINE | ID: mdl-22615555

ABSTRACT

The functional networks of cultured neurons exhibit complex network properties similar to those found in vivo. Starting from random seeding, cultures undergo significant reorganization during the initial period in vitro, yet although they provide an ideal platform for observing developmental changes in neuronal connectivity, little is known about how a complex functional network evolves from isolated neurons. In the present study, the evolution of functional connectivity was estimated from correlations of spontaneous activity. Network properties were quantified using complex-network measures from graph theory and used to compare cultures at different stages of development during the first 5 weeks in vitro. Networks obtained from young cultures (14 days in vitro) exhibited a random topology, which evolved into a small-world topology during maturation. The topology change was accompanied by an increased presence of highly connected areas (hubs), and network efficiency increased with age. The small-world topology balances integration of network areas with segregation of specialized processing units. The emergence of such network structure in cultured neurons, despite the lack of external input, points to complex intrinsic biological mechanisms. Moreover, the functional network of cultures at mature ages is efficient and highly suited to complex processing tasks.


Subject(s)
Action Potentials/physiology; Models, Neurological; Models, Statistical; Nerve Net/physiology; Neurogenesis/physiology; Neurons/physiology; Animals; Cell Proliferation; Cells, Cultured; Computer Simulation; Humans
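
A minimal sketch of the kind of analysis this abstract describes, not the authors' code: build a functional network by thresholding a correlation matrix of spontaneous activity and compare its clustering and efficiency against a random graph. The networkx library, the 0.05 threshold, and the synthetic activity traces are illustrative assumptions.

```python
# Sketch: small-world metrics from a thresholded correlation matrix (assumed pipeline).
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_electrodes = 60
activity = rng.normal(size=(n_electrodes, 1000))   # placeholder spike-rate traces

corr = np.corrcoef(activity)                       # pairwise correlations of activity
np.fill_diagonal(corr, 0.0)
adjacency = (np.abs(corr) > 0.05).astype(int)      # assumed, arbitrary threshold

G = nx.from_numpy_array(adjacency)
C = nx.average_clustering(G)                       # segregation
E = nx.global_efficiency(G)                        # integration (handles disconnected graphs)

# Compare against a size-matched random graph to judge "small-worldness".
R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0)
print(f"clustering {C:.3f} (random {nx.average_clustering(R):.3f}), "
      f"efficiency {E:.3f} (random {nx.global_efficiency(R):.3f})")
```

A clustering coefficient well above the random baseline combined with comparable efficiency is the usual signature of a small-world topology.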
2.
IEEE Trans Biomed Eng; 59(1): 30-4, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21997245

ABSTRACT

Cultures of cortical neurons grown on multielectrode arrays exhibit spontaneous, robust, and recurrent patterns of highly synchronous activity called bursts. These bursts play a crucial role in the development and topological self-organization of neuronal networks. Thus, understanding the evolution of synchrony within these bursts could give insight into network growth and the functional processes involved in learning and memory. Functional connectivity networks can be constructed by observing patterns of synchrony that evolve during bursts. To capture this evolution, a modeling approach is adopted within a framework of emergent, evolving complex networks; by exploiting the multiple time scales of the system, it aims to show the importance of sequential and ordered synchronization in network function.


Subject(s)
Action Potentials/physiology; Nerve Net/physiology; Neural Networks, Computer; Neurons/physiology; Synaptic Transmission/physiology; Animals; Cells, Cultured; Computer Simulation; Rats
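
A minimal sketch of the general idea in this abstract, not the paper's model: detect network bursts from a population firing-rate threshold and build a functional-connectivity matrix from pairwise spike coincidences within the burst bins. The bin width, burst criterion, edge threshold, and synthetic spike matrix are all assumptions.

```python
# Sketch: burst detection and within-burst synchrony (assumed, simplified approach).
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_bins = 60, 6000                      # e.g. 10 ms bins -> 60 s of recording
spikes = rng.random((n_channels, n_bins)) < 0.02   # placeholder binary spike matrix

rate = spikes.sum(axis=0)                          # population firing rate per bin
in_burst = rate > rate.mean() + 2 * rate.std()     # assumed burst criterion

# Synchrony = co-activation counts restricted to burst bins.
burst_spikes = spikes[:, in_burst].astype(float)
sync = burst_spikes @ burst_spikes.T               # coincidence counts per channel pair
np.fill_diagonal(sync, 0.0)

connectivity = sync > np.percentile(sync, 90)      # keep only the strongest 10% of pairs
print("functional edges within bursts:", int(connectivity.sum() // 2))
```

Repeating this over successive recording days would give a sequence of networks whose evolution can then be compared across time scales.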
3.
IEEE Trans Neural Syst Rehabil Eng; 19(4): 345-55, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21622081

ABSTRACT

In order to harness the computational capacity of dissociated cultured neuronal networks, it is necessary to understand neuronal dynamics and connectivity on a mesoscopic scale. To this end, this paper uncovers dynamic spatiotemporal patterns emerging from electrically stimulated neuronal cultures by using hidden Markov models (HMMs) to characterize multi-channel spike trains as a progression of underlying states of neuronal activity. Because the choice of model parameters is critical, the experimentation aimed at selecting them is reported in detail. Results derived from ensemble neuronal data revealed highly repeatable patterns of state transitions on the order of milliseconds in response to probing stimuli.


Subject(s)
Electrodes; Neurons/physiology; Algorithms; Cells, Cultured; Choice Behavior; Markov Chains; Models, Neurological; Models, Statistical; Neural Networks, Computer; User-Computer Interface
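
A minimal sketch of an HMM fit to multi-channel spike data in the spirit of this abstract, not the authors' pipeline: the hmmlearn library, the Gaussian emission model applied to spike counts, the number of hidden states, and the synthetic data are all assumptions here.

```python
# Sketch: HMM state sequence from binned multi-channel spike counts (assumed setup).
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(2)
n_bins, n_channels = 2000, 60                      # e.g. 1 ms bins, 60-electrode array
counts = rng.poisson(0.1, size=(n_bins, n_channels)).astype(float)

model = GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
model.fit(counts)                                  # EM over the whole recording

states = model.predict(counts)                     # most likely state per bin (Viterbi)
transitions = np.flatnonzero(np.diff(states))      # bins where the network switches state
print("first state transitions (bin index ~ ms):", transitions[:10])
```

With real stimulated-culture data, the repeatability of the state-transition sequence across stimulus repetitions is what would be examined; the number of states and bin width would need the kind of parameter experimentation the paper reports.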
4.
IEEE Trans Neural Netw; 16(4): 983-8, 2005 Jul.
Article in English | MEDLINE | ID: mdl-16121739

ABSTRACT

Dynamic neural networks (DNNs), which are also known as recurrent neural networks, are often used for nonlinear system identification. The main contribution of this letter is the introduction of an efficient parameterization of a class of DNNs. Having fewer parameters to adjust simplifies the training problem and leads to more parsimonious models. The parameterization is based on approximation theory dealing with the ability of a class of DNNs to approximate finite trajectories of nonautonomous systems. The use of the proposed parameterization is illustrated through a numerical example, using data from a nonlinear model of a magnetic levitation system.


Subject(s)
Algorithms; Models, Statistical; Neural Networks, Computer; Nonlinear Dynamics; Pattern Recognition, Automated/methods; Computer Simulation
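
A minimal sketch of recurrent-network system identification in general, not the letter's proposed parameterization: a small RNN is trained to reproduce the output trajectory of a nonlinear plant from its input trajectory. PyTorch, the placeholder plant, the hidden size, and the training settings are assumptions.

```python
# Sketch: fitting a finite input/output trajectory with a recurrent network (assumed setup).
import torch
import torch.nn as nn

torch.manual_seed(0)
T = 500
u = torch.randn(1, T, 1)                              # excitation input
y = torch.tanh(torch.cumsum(0.05 * u, dim=1))         # placeholder nonlinear "plant"

class DNN(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, u):
        h, _ = self.rnn(u)                            # recurrent state trajectory
        return self.out(h)                            # predicted output trajectory

model = DNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):                                  # fit the finite trajectory
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(u), y)
    loss.backward()
    opt.step()
print("final trajectory MSE:", float(loss))
```

The letter's contribution is a more economical parameterization of such networks; the sketch above only shows the baseline identification task it addresses.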