Results 1 - 6 of 6
1.
Chaos; 33(7), 2023 Jul 01.
Article in English | MEDLINE | ID: mdl-37486668

ABSTRACT

Adaptivity is a dynamical feature that is omnipresent in nature, socio-economics, and technology. For example, adaptive couplings appear in various real-world systems, such as power grids, social networks, and neural networks, and they form the backbone of closed-loop control strategies and machine learning algorithms. In this article, we provide an interdisciplinary perspective on adaptive systems. We reflect on the notion and terminology of adaptivity in different disciplines and discuss what role adaptivity plays in various fields. We highlight common open challenges and offer perspectives on future research directions, aiming to inspire interdisciplinary approaches.
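To make the notion of an adaptive coupling concrete, here is a minimal sketch (not taken from the article) of a phase-oscillator network whose coupling weights slowly co-evolve with the phases; the model form, adaptation rule, and all parameter values are illustrative assumptions.

import numpy as np

# Minimal sketch of an adaptively coupled oscillator network: phases follow
# Kuramoto-type dynamics while the coupling weights slowly adapt to the
# phase differences. All parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
N = 20                                   # number of oscillators
dt, steps = 0.01, 20000                  # integration step and number of steps
eps = 0.01                               # adaptation rate (slow time scale)

omega = rng.normal(0.0, 0.5, N)          # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)   # initial phases
kappa = rng.uniform(-1.0, 1.0, (N, N))   # adaptive coupling matrix

for _ in range(steps):
    diff = theta[None, :] - theta[:, None]           # phase differences
    dtheta = omega + (kappa * np.sin(diff)).mean(axis=1)
    dkappa = -eps * (kappa + np.sin(diff + 0.5))     # plasticity rule with a phase lag
    theta = (theta + dt * dtheta) % (2 * np.pi)
    kappa = kappa + dt * dkappa

order = abs(np.exp(1j * theta).mean())   # Kuramoto order parameter in [0, 1]
print(f"order parameter after adaptation: {order:.3f}")

Because the weights are themselves state variables, the effective network topology is an outcome of the dynamics rather than a fixed input, which is the defining feature of the adaptive systems discussed above.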

2.
PLoS Comput Biol; 19(1): e1010813, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36716332

ABSTRACT

The advent of comprehensive synaptic wiring diagrams of large neural circuits has created the field of connectomics and given rise to a number of open research questions. One such question is whether it is possible to reconstruct the information stored in a recurrent network of neurons, given its synaptic connectivity matrix. Here, we address this question by determining when solving such an inference problem is theoretically possible in specific attractor network models and by providing a practical algorithm to do so. The algorithm builds on ideas from statistical physics to perform approximate Bayesian inference and is amenable to exact analysis. We study its performance on three different models, compare the algorithm to standard algorithms such as PCA, and explore the limitations of reconstructing stored patterns from synaptic connectivity.


Subject(s)
Neural Networks, Computer; Neurons; Bayes Theorem; Neurons/physiology; Algorithms; Models, Neurological
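As a rough illustration of the kind of spectral baseline the paper compares against (not its Bayesian algorithm), the sketch below builds a Hebbian connectivity matrix from random binary patterns and checks how much of each stored pattern lies in the leading eigenspace of that matrix; the network size, pattern count, and noiseless setting are assumptions.

import numpy as np

# Hedged sketch of a PCA-style spectral baseline for reading stored patterns
# out of a Hopfield-type connectivity matrix. This is not the paper's Bayesian
# algorithm; sizes and the noiseless setting are illustrative assumptions.
rng = np.random.default_rng(1)
N, P = 500, 5                                  # neurons, stored patterns
xi = rng.choice([-1.0, 1.0], size=(P, N))      # random binary patterns

J = (xi.T @ xi) / N                            # Hebbian connectivity matrix
np.fill_diagonal(J, 0.0)

# The P leading eigenvectors of J approximately span the space of stored patterns.
_, eigvecs = np.linalg.eigh(J)
top = eigvecs[:, -P:]                          # leading eigenvectors, shape (N, P)

# Fraction of each pattern's norm captured by that leading eigenspace
# (values close to 1 mean the pattern subspace is recoverable from J, up to mixing).
proj = top @ (top.T @ xi.T)                    # project patterns onto the eigenspace
captured = (proj**2).sum(axis=0) / N
print("fraction of each pattern captured:", np.round(captured, 3))
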
3.
Proc Natl Acad Sci U S A; 119(40): e2201854119, 2022 Oct 04.
Article in English | MEDLINE | ID: mdl-36161906

ABSTRACT

Exploiting data invariances is crucial for efficient learning in both artificial and biological neural circuits. Understanding how neural networks can discover representations that harness the underlying symmetries of their inputs is thus a central question in machine learning and neuroscience. Convolutional neural networks, for example, were designed to exploit translation symmetry, and their capabilities triggered the first wave of deep learning successes. However, learning convolutions directly from translation-invariant data with a fully connected network has so far proven elusive. Here we show how initially fully connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs, resulting in localized, space-tiling receptive fields. These receptive fields match the filters of a convolutional network trained on the same task. By carefully designing data models for the visual scene, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs, which has long been recognized as the hallmark of natural images. We provide an analytical and numerical characterization of the pattern-formation mechanism responsible for this phenomenon in a simple model and find an unexpected link between receptive field formation and the tensor decomposition of higher-order input correlations. These results provide a perspective on the development of low-level feature detectors in various sensory modalities and pave the way for studying the impact of higher-order statistics on learning in neural networks.


Subject(s)
Machine Learning; Neural Networks, Computer; Neurosciences
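A data model with the ingredients described above can be sketched in a few lines. The version below, with an assumed circulant covariance and tanh nonlinearity (illustrative choices, not necessarily the paper's), generates translation-invariant inputs and shows that the pointwise nonlinearity introduces the non-Gaussian, higher-order structure the abstract refers to.

import numpy as np

# Hedged sketch of a translation-invariant data model whose non-Gaussian,
# higher-order structure comes from a pointwise saturating nonlinearity.
# Covariance, nonlinearity, and parameters are illustrative assumptions.
rng = np.random.default_rng(2)
D, n_samples, xi_len = 64, 10000, 3.0          # input dimension, samples, correlation length

idx = np.arange(D)
dist = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(dist, D - dist)              # circular distance -> translation invariance
C = np.exp(-dist**2 / (2 * xi_len**2))         # circulant covariance matrix
L = np.linalg.cholesky(C + 1e-9 * np.eye(D))
z = rng.standard_normal((n_samples, D)) @ L.T  # Gaussian, translation-invariant inputs

x = np.tanh(3.0 * z)                           # saturating nonlinearity adds higher-order structure

def excess_kurtosis(a):
    a = (a - a.mean()) / a.std()
    return float((a**4).mean() - 3.0)

print("excess kurtosis, Gaussian inputs:    ", round(excess_kurtosis(z), 3))
print("excess kurtosis, non-Gaussian inputs:", round(excess_kurtosis(x), 3))
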
4.
J Stat Mech; 2020(12): 124010, 2020 Dec.
Article in English | MEDLINE | ID: mdl-34262607

ABSTRACT

Deep neural networks achieve stellar generalisation even when they have enough parameters to easily fit all their training data. We study this phenomenon by analysing the dynamics and the performance of over-parameterised two-layer neural networks in the teacher-student setup, where one network, the student, is trained on data generated by another network, called the teacher. We show how the dynamics of stochastic gradient descent (SGD) is captured by a set of differential equations and prove that this description is asymptotically exact in the limit of large inputs. Using this framework, we calculate the final generalisation error of student networks that have more parameters than their teachers. We find that the final generalisation error of the student increases with network size when training only the first layer, but stays constant or even decreases with size when training both layers. We show that these different behaviours have their root in the different solutions SGD finds for different activation functions. Our results indicate that achieving good generalisation in neural networks goes beyond the properties of SGD alone and depends on the interplay of at least the algorithm, the model architecture, and the data set.
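A minimal numerical sketch of the teacher-student setup described above follows; this is a plain simulation, not the paper's analytical treatment, and the network sizes, activation, and learning-rate scalings are assumptions.

import numpy as np

# Hedged sketch of the teacher-student setup: an over-parameterised student
# two-layer network is trained by online SGD on data labelled by a smaller,
# fixed teacher network. Sizes, activation, and learning rate are assumptions.
rng = np.random.default_rng(3)
D, M, K = 100, 2, 8                      # input dim, teacher / student hidden units
lr, steps = 0.1, 100_000

def forward(w, v, x):
    h = np.tanh(w @ x / np.sqrt(D))      # hidden activations
    return v @ h, h

w_T = rng.standard_normal((M, D))        # fixed teacher weights
v_T = np.ones(M)
w_S = rng.standard_normal((K, D))        # trainable student weights (both layers)
v_S = rng.standard_normal(K) / np.sqrt(K)

for _ in range(steps):
    x = rng.standard_normal(D)           # a fresh sample every step (online SGD)
    y_T, _ = forward(w_T, v_T, x)
    y_S, h = forward(w_S, v_S, x)
    err = y_S - y_T                      # gradient of the loss 0.5 * err**2
    w_S -= lr / np.sqrt(D) * np.outer(err * v_S * (1.0 - h**2), x)
    v_S -= lr / K * err * h

X = rng.standard_normal((5000, D))       # Monte-Carlo estimate of the generalisation error
mse = np.mean([(forward(w_S, v_S, x)[0] - forward(w_T, v_T, x)[0])**2 for x in X])
print("generalisation error (0.5 * MSE):", 0.5 * mse)

The paper itself replaces such simulations with a set of differential equations for the SGD dynamics that, as stated in the abstract, becomes asymptotically exact for large input dimension.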

5.
Phys Rev Lett; 118(1): 010601, 2017 Jan 06.
Article in English | MEDLINE | ID: mdl-28106416

ABSTRACT

Virtually every organism gathers information about its noisy environment and builds models from those data, mostly using neural networks. Here, we use stochastic thermodynamics to analyze the learning of a classification rule by a neural network. We show that the information acquired by the network is bounded by the thermodynamic cost of learning and introduce a learning efficiency η≤1. We discuss the conditions for optimal learning and analyze Hebbian learning in the thermodynamic limit.
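Written out, the bound stated in the abstract takes the schematic form below; identifying the "thermodynamic cost" with the total entropy production is a plausible reading rather than the paper's exact definition.

\eta \;=\; \frac{I}{\Delta S_{\mathrm{tot}}} \;\le\; 1
\qquad\Longleftrightarrow\qquad
I \;\le\; \Delta S_{\mathrm{tot}},

where I is the information the network acquires about the classification rule (in nats) and \Delta S_{\mathrm{tot}} is the total entropy produced during learning (in units of k_B).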

6.
Adv Protein Chem Struct Biol; 90: 67-117, 2013.
Article in English | MEDLINE | ID: mdl-23582202

ABSTRACT

Zinc finger domains are among the most common structural motifs in eukaryotic cells, which employ the motif in some of their most important proteins (including TFIIIA, CTCF, and ZiF268). These DNA-binding proteins contain up to 37 zinc finger domains connected by flexible linker regions. They have been shown to be important organizers of the 3D structure of chromosomes and as such have been called master weavers of the genome. Using NMR and numerical simulations, much progress has been made over the past few decades in understanding their various functions and their modes of DNA binding, but a large knowledge gap remains. One problem with existing theoretical models of zinc finger protein-DNA binding is that they are aimed at describing specific binding; furthermore, they either focus exclusively on microscopic details or neglect such details entirely. We present the Flexible Linker Model, which is aimed explicitly at describing nonspecific binding. It captures the most important effects of flexible linkers and allows a qualitative investigation of how these linkers affect the nonspecific binding affinity of zinc finger proteins to DNA. Our results indicate that the flexible linkers increase the binding affinity by several orders of magnitude. Moreover, they show that the binding map for proteins with more than one domain exhibits structures that have not been observed or described before and that fit well with existing theories of facilitated target location. The increased binding affinity is also consistent with recent experiments that had previously lacked an explanation. We further explore the class of proteins with flexible linkers that are unstructured until they bind. We have developed a methodology to characterize these flexible proteins: employing the concept of barcodes, we propose a similarity measure for comparing such flexible proteins. This measure is validated by comparing a purely geometric similarity measure with a topological similarity measure that takes both geometry and topology into account.


Subject(s)
Chromosomes/metabolism; DNA-Binding Proteins/metabolism; Models, Biological; Amino Acid Sequence; Animals; CCCTC-Binding Factor; Cell Nucleus/chemistry; Chromatin/chemistry; Chromatin/metabolism; Chromosomes/chemistry; DNA-Binding Proteins/chemistry; Humans; Molecular Sequence Data; Protein Binding; Repressor Proteins/chemistry; Repressor Proteins/metabolism; Xenopus laevis/metabolism; Zinc Fingers
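The order-of-magnitude effect of a flexible linker on affinity can be illustrated with a standard avidity-style estimate; this toy calculation is not the Flexible Linker Model, and every number in it (single-domain affinity, linker length) is an assumption.

import numpy as np

# Hedged toy estimate (not the Flexible Linker Model): a flexible linker keeps
# a second binding domain at a high local concentration near the DNA once the
# first domain is bound, multiplying the overall affinity. All numbers are
# illustrative assumptions.
N_A = 6.022e23                         # Avogadro's number [1/mol]
K_single = 1e5                         # affinity of one isolated domain [1/M] (assumed)
n_seg, b = 10, 0.4e-9                  # linker segments and segment length [m] (assumed)

# Ideal-chain estimate: the Gaussian end-to-end distribution evaluated at
# contact gives the effective local concentration of the tethered second domain.
r2 = n_seg * b**2                                     # mean-square end-to-end distance [m^2]
p_contact = (3.0 / (2.0 * np.pi * r2))**1.5           # probability density at contact [1/m^3]
c_eff = p_contact / N_A * 1e-3                        # effective concentration [mol/L]

K_linked = K_single * K_single * c_eff                # two linked domains [1/M]
print(f"effective local concentration: {c_eff:.2f} M")
print(f"single domain:      {K_single:.1e} 1/M")
print(f"two linked domains: {K_linked:.1e} 1/M ({K_linked / K_single:.1e}-fold enhancement)")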