Results 1 - 4 of 4
1.
Front Neurosci ; 17: 1180066, 2023.
Article in English | MEDLINE | ID: mdl-37781257

ABSTRACT

Introduction: Extracting regularities from ongoing stimulus streams to form predictions is crucial for adaptive behavior. Such regularities exist in terms of the content of the stimuli and their timing, both of which are known to interactively modulate sensory processing. In real-world stimulus streams such as music, regularities can occur at multiple levels, both in terms of contents (e.g., predictions relating to individual notes vs. their more complex groups) and timing (e.g., pertaining to timing between intervals vs. the overall beat of a musical phrase). However, it is unknown whether the brain integrates predictions in a manner that is mutually congruent (e.g., if "beat" timing predictions selectively interact with "what" predictions falling on pulses which define the beat), and whether integrating predictions in different timing conditions relies on dissociable neural correlates.
Methods: To address these questions, our study manipulated "what" and "when" predictions at different levels - (local) interval-defining and (global) beat-defining - within the same stimulus stream, while neural activity was recorded using electroencephalogram (EEG) in participants (N = 20) performing a repetition detection task.
Results: Our results reveal that temporal predictions based on beat or interval timing modulated mismatch responses to violations of "what" predictions happening at the predicted time points, and that these modulations were shared between types of temporal predictions in terms of the spatiotemporal distribution of EEG signals. Effective connectivity analysis using dynamic causal modeling showed that the integration of "what" and "when" predictions selectively increased connectivity at relatively late cortical processing stages, between the superior temporal gyrus and the fronto-parietal network.
Discussion: Taken together, these results suggest that the brain integrates different predictions with a high degree of mutual congruence, but in a shared and distributed cortical network. This finding contrasts with recent studies indicating separable mechanisms for beat-based and memory-based predictive processing.
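
The mismatch-response comparison described in this abstract can be illustrated with a minimal sketch. This is not the authors' pipeline (which used full-scalp EEG and dynamic causal modeling); it only shows, on simulated single-channel data, how deviant-minus-standard difference waves at temporally predicted vs. unpredicted positions might be compared across subjects. All names, windows, and simulated parameters below are assumptions.

```python
# Illustrative sketch (not the authors' pipeline): compare mismatch responses
# ("what"-violation minus standard ERPs) at temporally predicted vs. unpredicted
# positions. All data here are simulated; shapes and names are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_trials, n_samples = 20, 100, 300   # 300 samples ~ 600 ms at 500 Hz
sfreq = 500.0
times = np.arange(n_samples) / sfreq

def simulate_epochs(mmn_gain):
    """Single-channel epochs with a negativity around 100-200 ms scaled by mmn_gain."""
    mmn = -mmn_gain * np.exp(-((times - 0.15) ** 2) / (2 * 0.03 ** 2))
    return mmn + rng.normal(0, 1.0, size=(n_subjects, n_trials, n_samples))

# Deviant and standard epochs in two timing conditions (predicted vs. unpredicted onset).
conditions = {
    "predicted":   (simulate_epochs(2.0), simulate_epochs(0.5)),
    "unpredicted": (simulate_epochs(1.0), simulate_epochs(0.5)),
}

window = (times >= 0.10) & (times <= 0.20)       # analysis window for the mismatch response
mmn_amplitude = {}
for name, (deviants, standards) in conditions.items():
    # Difference wave per subject: mean deviant ERP minus mean standard ERP.
    diff_wave = deviants.mean(axis=1) - standards.mean(axis=1)
    mmn_amplitude[name] = diff_wave[:, window].mean(axis=1)

# Did temporal predictability modulate the mismatch response? (paired t-test across subjects)
t, p = stats.ttest_rel(mmn_amplitude["predicted"], mmn_amplitude["unpredicted"])
print(f"predicted vs. unpredicted MMN amplitude: t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")
```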

2.
Hear Res ; 438: 108857, 2023 10.
Article in English | MEDLINE | ID: mdl-37639922

ABSTRACT

Perception is sensitive to statistical regularities in the environment, including temporal characteristics of sensory inputs. Interestingly, implicit learning of temporal patterns in one modality can also improve their processing in another modality. However, it is unclear how cross-modal learning transfer affects neural responses to sensory stimuli. Here, we recorded neural activity of human volunteers using electroencephalography (EEG) while participants were exposed to brief sequences of randomly timed auditory or visual pulses. Some trials consisted of a repetition of the temporal pattern within the sequence, and subjects were tasked with detecting these trials. Unknown to the participants, some trials reappeared throughout the experiment across both modalities (Transfer) or only within a modality (Control), enabling implicit learning in one modality and its transfer. Using a novel method for analyzing single-trial EEG responses, we showed that learning temporal structures within and across modalities is reflected in neural learning curves. These putative neural correlates of learning transfer were similar both when temporal information learned in audition was transferred to visual stimuli and vice versa. The modality-specific mechanisms for learning temporal information and the general mechanisms that mediate learning transfer across modalities had distinct physiological signatures: temporal learning within modalities relied on modality-specific brain regions, while learning transfer affected beta-band activity in frontal regions.
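
As an illustration of the neural learning-curve idea in this abstract, the sketch below fits a saturating exponential to simulated single-trial response amplitudes across repeated presentations of a re-occurring pattern. The functional form, parameters, and data are assumptions chosen for illustration; the study's actual single-trial analysis is not reproduced here.

```python
# Illustrative sketch only: fit a "neural learning curve" to single-trial response
# amplitudes across repeated presentations of a re-occurring pattern. The exponential
# form and the simulated amplitudes are assumptions, not the paper's exact model.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
n_presentations = 30
presentation = np.arange(1, n_presentations + 1)

# Simulated single-trial amplitudes that grow with repeated exposure (plus noise).
true_curve = 1.0 + 2.0 * (1 - np.exp(-presentation / 8.0))
amplitudes = true_curve + rng.normal(0, 0.4, size=n_presentations)

def learning_curve(n, baseline, gain, rate):
    """Saturating exponential: baseline + gain * (1 - exp(-n / rate))."""
    return baseline + gain * (1 - np.exp(-n / rate))

params, _ = curve_fit(learning_curve, presentation, amplitudes, p0=(1.0, 1.0, 5.0))
baseline, gain, rate = params
print(f"baseline={baseline:.2f}, gain={gain:.2f}, rate constant={rate:.2f} presentations")

# Learning transfer could then be probed by comparing the fitted rate (or gain) for
# sequences that re-occur across modalities vs. only within a single modality.
```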


Subject(s)
Auditory Perception; Learning; Humans; Electroencephalography; Frontal Lobe; Healthy Volunteers
3.
Curr Biol ; 32(11): 2548-2555.e5, 2022 06 06.
Article in English | MEDLINE | ID: mdl-35487221

ABSTRACT

Recent studies have shown that stimulus history can be decoded by using broadband sensory impulses to reactivate mnemonic representations.1-4 However, memories of previous stimuli can also be used to form sensory predictions about upcoming stimuli.5,6 Predictive mechanisms allow the brain to create a probable model of the outside world, which can be updated when errors are detected between the model's predictions and external inputs.7-10 Direct recordings in the auditory cortex of awake mice have established neural mechanisms by which encoding might handle working memory and predictive processes without "overwriting" recent sensory events when predictions are triggered by oddballs within a sequence.11 However, it remains unclear whether mnemonic and predictive information can be decoded from cortical activity simultaneously during passive, implicit sequence processing, even in anesthetized models. Here, we recorded neural activity elicited by repeated stimulus sequences using electrocorticography (ECoG) in the auditory cortex of anesthetized rats, where events within the sequence (referred to henceforth as "vowels," for simplicity) were occasionally replaced with a broadband noise burst or omitted entirely. We show that both stimulus history and predicted stimuli can be decoded from neural responses to broadband impulses, at overlapping latencies but based on independent and uncorrelated data features. We also demonstrate that predictive representations are dynamically updated over the course of stimulation.
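
The decoding logic described in this abstract can be sketched generically: a cross-validated linear classifier predicting which "vowel" a response to a broadband impulse reflects, whether as stimulus history or as the predicted upcoming stimulus. This is not the study's analysis; the feature dimensions, classifier choice, and simulated data below are assumptions.

```python
# Illustrative decoding sketch (not the authors' analysis): cross-validated linear
# classification of which "vowel" preceded (stimulus history) or was expected to follow
# (prediction) a broadband impulse, from simulated single-trial response features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_trials, n_features, n_classes = 200, 60, 4      # e.g. 60 electrode/time features, 4 vowels
labels = rng.integers(0, n_classes, size=n_trials)

# Simulated responses: each class shifts the feature pattern slightly.
class_patterns = rng.normal(0, 0.5, size=(n_classes, n_features))
responses = class_patterns[labels] + rng.normal(0, 1.0, size=(n_trials, n_features))

decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(decoder, responses, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f} (chance = {1 / n_classes:.2f})")

# Independence of history vs. prediction codes could be assessed by training a decoder
# on one label set and testing its generalization (or lack thereof) to the other.
```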


Subject(s)
Auditory Cortex; Acoustic Stimulation; Animals; Auditory Cortex/physiology; Auditory Perception/physiology; Electrocorticography; Memory, Short-Term/physiology; Mice; Rats
4.
Cereb Cortex ; 31(7): 3226-3236, 2021 06 10.
Article in English | MEDLINE | ID: mdl-33625488

ABSTRACT

In contrast to classical views of working memory (WM) maintenance, recent research investigating activity-silent neural states has demonstrated that persistent neural activity in sensory cortices is not necessary for active maintenance of information in WM. Previous studies in humans have measured putative memory representations indirectly, by decoding memory contents from neural activity evoked by a neutral impulse stimulus. However, it is unclear whether memory contents can also be decoded in different species and attentional conditions. Here, we employ a cross-species approach to test whether auditory memory contents can be decoded from electrophysiological signals recorded in different species. Awake human volunteers (N = 21) were exposed to auditory pure tone and noise burst stimuli during an auditory sensory memory task using electroencephalography. In a closely matching paradigm, anesthetized female rats (N = 5) were exposed to comparable stimuli while neural activity was recorded using electrocorticography from the auditory cortex. In both species, the acoustic frequency could be decoded from neural activity evoked by pure tones as well as neutral frozen noise burst stimuli. This finding demonstrates that memory contents can be decoded in different species and different states using homologous methods, suggesting that the mechanisms of sensory memory encoding are evolutionarily conserved across species.
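
A minimal sketch of the cross-species idea in this abstract: one homologous decoding pipeline applied to simulated stand-ins for the human EEG and rat ECoG datasets. Feature counts, frequency classes, and the classifier are assumptions for illustration, not the study's actual methods.

```python
# Minimal sketch of applying one homologous decoding pipeline to two species' recordings
# (human EEG, rat ECoG). Data are simulated placeholders; dimensions and frequency
# classes are assumptions for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

def simulate_dataset(n_trials, n_features, n_freqs, effect=0.6):
    """Evoked-response features whose mean pattern depends on the tone frequency class."""
    labels = rng.integers(0, n_freqs, size=n_trials)
    patterns = rng.normal(0, effect, size=(n_freqs, n_features))
    features = patterns[labels] + rng.normal(0, 1.0, size=(n_trials, n_features))
    return features, labels

datasets = {
    "human EEG (pure tones)": simulate_dataset(n_trials=300, n_features=64, n_freqs=3),
    "rat ECoG (pure tones)":  simulate_dataset(n_trials=300, n_features=32, n_freqs=3),
}

decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
for name, (features, labels) in datasets.items():
    scores = cross_val_score(decoder, features, labels, cv=5)
    print(f"{name}: frequency decoding accuracy = {scores.mean():.2f} (chance = 1/3)")
```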


Subject(s)
Acoustic Stimulation/methods; Auditory Cortex/physiology; Auditory Perception/physiology; Memory, Short-Term/physiology; Adult; Animals; Electrocorticography/methods; Electroencephalography/methods; Female; Humans; Male; Middle Aged; Rats; Rats, Wistar; Reaction Time/physiology; Species Specificity; Young Adult