IMA J Math Appl Med Biol ; 14(3): 227-39, 1997 Sep.
Article in English | MEDLINE | ID: mdl-9306676

ABSTRACT

A principle of information-entropy maximization is introduced to characterize the optimal representation of an arbitrarily varying quantity by a neural output confined to a finite interval. We then study the conditions under which a neuron can effectively fulfil the requirements imposed by this information-theoretic optimality principle. We show that this can be achieved with the natural properties available to the neuron. Specifically, we first deduce that neural (monotonically increasing and saturating) nonlinearities are potentially efficient for achieving entropy maximization for any given input signal. Second, we derive simple laws that adaptively adjust the modifiable parameters of a neuron toward maximum entropy. Remarkably, the adaptation laws that realize entropy maximization are found to belong to the class of anti-Hebbian laws (a class with experimental grounding), with a special, yet simple, nonlinear form. The present results highlight the usefulness of general information-theoretic principles in furthering the understanding of neural systems and their remarkable performance in information processing.
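The idea in the abstract can be illustrated numerically. The sketch below is not the paper's own derivation: it uses the well-known infomax gradient for a single sigmoidal unit (Bell and Sejnowski's 1995 form, Δw ∝ 1/w + x(1 − 2y), Δb ∝ 1 − 2y) as an assumed stand-in for the adaptation laws described, and all names, the learning rate, and the Gaussian input are illustrative choices. Adapting the neuron's gain w and bias b by this rule drives the bounded output toward a uniform, i.e. maximum-entropy, distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    # Monotonically increasing, saturating neural nonlinearity on (0, 1).
    return 1.0 / (1.0 + np.exp(-u))

def output_entropy(y, bins=20):
    # Empirical entropy of the bounded output, estimated by histogram.
    counts, _ = np.histogram(y, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A non-uniform (Gaussian) input signal the neuron must represent.
x = rng.normal(loc=2.0, scale=0.5, size=20_000)

w, b = 0.2, 0.0   # modifiable gain and bias (poorly matched at start)
eta = 0.01        # learning rate (illustrative value)

h_before = output_entropy(sigmoid(w * x + b))

# Online gradient ascent on output entropy; for a single sigmoid unit
# this gives the simple nonlinear update below.
for xi in x:
    y = sigmoid(w * xi + b)
    w += eta * (1.0 / w + xi * (1.0 - 2.0 * y))
    b += eta * (1.0 - 2.0 * y)

h_after = output_entropy(sigmoid(w * x + b))
```

After adaptation the output histogram is close to flat, so `h_after` exceeds `h_before`; the update needs only the input `xi` and the neuron's own output `y`, consistent with the abstract's claim that maximization is achievable with quantities locally available to the neuron.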


Subject(s)
Mathematics , Models, Neurological , Neurons, Afferent/physiology , Adaptation, Physiological , Animals , Humans , Nonlinear Dynamics , Signal Transduction/physiology , Synaptic Transmission/physiology