Results 1 - 20 of 52
1.
Elife ; 12, 2024 Jun 21.
Article in English | MEDLINE | ID: mdl-38904659

ABSTRACT

Dynamic attending theory proposes that the ability to track temporal cues in the auditory environment is governed by entrainment, the synchronization between internal oscillations and regularities in external auditory signals. Here, we focused on two key properties of internal oscillators: their preferred rate, the default rate in the absence of any input; and their flexibility, how they adapt to changes in rhythmic context. We developed methods to estimate oscillator properties (Experiment 1) and compared the estimates across tasks and individuals (Experiment 2). Preferred rates, estimated as the stimulus rates with peak performance, showed a harmonic relationship across measurements and were correlated with individuals' spontaneous motor tempo. Estimates from motor tasks were slower than those from the perceptual task, and the degree of slowing was consistent for each individual. Task performance decreased with trial-to-trial changes in stimulus rate, and responses on individual trials were biased toward the preceding trial's stimulus properties. Flexibility, quantified as an individual's ability to adapt to faster-than-previous rates, decreased with age. These findings show domain-specific rate preferences for the assumed oscillatory system underlying rhythm perception and production, and that this system loses its ability to flexibly adapt to changes in the external rhythmic context during aging.
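
To make the trial-to-trial carryover described above concrete, here is a minimal Python sketch (not the authors' analysis; all rates, coefficients, and trial counts are invented) that regresses each trial's produced rate on the current and the previous stimulus rate; a positive weight on the previous rate indicates a bias toward the preceding trial.

import numpy as np

rng = np.random.default_rng(0)

n_trials = 200
stim_rate = rng.uniform(1.0, 4.0, size=n_trials)   # stimulus rates in Hz (hypothetical)
prev_rate = np.r_[stim_rate[0], stim_rate[:-1]]    # previous trial's stimulus rate

# Simulated produced rates: mostly track the current rate, plus a small pull
# toward the preceding trial's rate and some noise (all numbers are made up).
produced = 0.85 * stim_rate + 0.15 * prev_rate + rng.normal(0, 0.1, n_trials)

# Regress the produced rate on the current and previous stimulus rate.
X = np.column_stack([np.ones(n_trials), stim_rate, prev_rate])
beta, *_ = np.linalg.lstsq(X, produced, rcond=None)
print(f"weight on current rate:  {beta[1]:.2f}")
print(f"carryover from previous: {beta[2]:.2f}")   # > 0 means biased toward the previous trial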


Subject(s)
Attention , Auditory Perception , Humans , Adult , Attention/physiology , Female , Male , Young Adult , Aged , Auditory Perception/physiology , Middle Aged , Aging/physiology , Acoustic Stimulation , Adolescent
2.
Atten Percept Psychophys ; 86(4): 1400-1416, 2024 May.
Article in English | MEDLINE | ID: mdl-38557941

ABSTRACT

Music training is associated with better beat processing in the auditory modality. However, it is unknown how rhythmic training that emphasizes visual rhythms, such as dance training, might affect beat processing, or whether training effects in general are modality specific. Here we examined how music and dance training interacted with modality during audiovisual integration and synchronization to auditory and visual isochronous sequences. In two experiments, musicians, dancers, and controls completed an audiovisual integration task and an audiovisual target-distractor synchronization task using dynamic visual stimuli (a bouncing figure). The groups performed similarly on the audiovisual integration tasks (Experiments 1 and 2). However, in the finger-tapping synchronization task (Experiment 1), musicians were more influenced by auditory distractors when synchronizing to visual sequences, whereas dancers were more influenced by visual distractors when synchronizing to auditory sequences. When participants synchronized with whole-body movements instead of finger-tapping (Experiment 2), all groups were more influenced by the visual distractor than the auditory distractor. Taken together, these results highlight how training is associated with audiovisual processing, and how different types of visual rhythmic stimuli and different movements alter beat perception and production outcome measures. Implications for the modality appropriateness hypothesis are discussed.


Subject(s)
Attention , Dancing , Music , Psychomotor Performance , Humans , Dancing/psychology , Female , Male , Young Adult , Attention/physiology , Psychomotor Performance/physiology , Adult , Auditory Perception/physiology , Time Perception , Practice, Psychological , Pattern Recognition, Visual/physiology , Adolescent , Visual Perception/physiology , Reaction Time
3.
Elife ; 12, 2024 Jan 30.
Article in English | MEDLINE | ID: mdl-38289225

ABSTRACT

Synchronization between auditory stimuli and brain rhythms is beneficial for perception. In principle, auditory perception could be improved by facilitating neural entrainment to sounds via brain stimulation. However, high inter-individual variability in brain stimulation effects calls the usefulness of this approach into question. Here we aimed to modulate auditory perception by modulating neural entrainment to frequency-modulated (FM) sounds using transcranial alternating current stimulation (tACS). In addition, we evaluated the advantage of using tACS montages spatially optimized for each individual's anatomy and functional data compared to a standard montage applied to all participants. Across two different sessions, 2 Hz tACS was applied targeting auditory brain regions. Concurrent with tACS, participants listened to FM stimuli with a modulation rate matching the tACS frequency but with different phase lags relative to the tACS, and detected silent gaps embedded in the FM sound. We observed that tACS modulated the strength of behavioral entrainment to the FM sound in a phase-lag-specific manner. Both the optimal tACS lag and the magnitude of the tACS effect were variable across participants and sessions. Inter-individual variability of tACS effects was best explained by the strength of the inward electric field, which in turn depended on the field's focality and proximity to the target brain region. Although additional evidence is necessary, our results also suggest that spatially optimizing the electrode montage could be a promising tool for reducing inter-individual variability in tACS effects. This work demonstrates that tACS effectively modulates entrainment to sounds depending on the optimality of the electric field. However, the poor session-to-session reliability of the optimal tACS lag calls for caution when planning tACS experiments based on separate sessions.
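
A minimal sketch of how a phase-lag-specific effect like the one described above can be quantified (illustrative only; the six phase lags and hit rates are invented, and this is not the authors' analysis): fit a cosine to detection performance as a function of tACS-to-FM phase lag, then read off the modulation depth and the optimal lag.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical gap-detection hit rates at six tACS-to-FM phase lags (radians).
lags = np.linspace(0.0, 2.0 * np.pi, 6, endpoint=False)
hit_rate = np.array([0.62, 0.70, 0.74, 0.68, 0.58, 0.55])

def cosine_model(lag, mean, depth, best_lag):
    # One cycle of performance modulation across the tested phase lags.
    return mean + depth * np.cos(lag - best_lag)

(mean, depth, best_lag), _ = curve_fit(cosine_model, lags, hit_rate, p0=[0.65, 0.05, 0.0])
print(f"modulation depth: {depth:.3f}, optimal lag: {best_lag % (2 * np.pi):.2f} rad")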


Subject(s)
Transcranial Direct Current Stimulation , Humans , Acoustic Stimulation , Reproducibility of Results , Sound , Electric Stimulation
4.
Eur J Neurosci ; 57(9): 1529-1545, 2023 May.
Article in English | MEDLINE | ID: mdl-36895107

ABSTRACT

A growing body of evidence suggests that steady-state evoked potentials may be a useful measure of beat perception, particularly when obtaining traditional, explicit measures of beat perception is difficult, such as with infants or non-human animals. Although attending to a stimulus is not necessary for most traditional applications of steady-state evoked potentials, it is unknown how attention affects steady-state evoked potentials that arise in response to beat perception. Additionally, most applications of steady-state evoked potentials to measure beat perception have used repeating rhythms or real music. Therefore, it is unclear how the steady-state response relates to the robust beat perception that occurs with non-repeating rhythms. Here, we used electroencephalography to record participants' brain activity as they listened to non-repeating musical rhythms while either attending to the rhythms or while distracted by a concurrent visual task. Non-repeating auditory rhythms elicited steady-state evoked potentials at perceived beat frequencies (perception was validated in a separate sensorimotor synchronization task) that were larger when participants attended to the rhythms compared with when they were distracted by the visual task. Therefore, although steady-state evoked potentials appear to index beat perception to non-repeating musical rhythms, this technique may be limited to when participants are known to be attending to the stimulus.
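
For readers unfamiliar with the steady-state approach, a minimal sketch (synthetic single-channel data, an assumed 500 Hz sampling rate, and a 2 Hz beat frequency; not the authors' pipeline) of extracting the EEG amplitude at a putative beat frequency:

import numpy as np

fs = 500.0                      # assumed EEG sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)    # 60 s of data
rng = np.random.default_rng(1)

# Placeholder single-channel EEG: a weak 2 Hz component (the putative beat
# frequency) buried in broadband noise.
eeg = 0.5 * np.sin(2 * np.pi * 2.0 * t) + rng.normal(0, 5, t.size)

spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

beat_hz = 2.0
beat_amp = spectrum[np.argmin(np.abs(freqs - beat_hz))]
# Compare against neighbouring bins to correct for broadband noise,
# a common step in steady-state / frequency-tagging analyses.
neighbours = (np.abs(freqs - beat_hz) > 0.2) & (np.abs(freqs - beat_hz) < 1.0)
print(f"amplitude at {beat_hz} Hz: {beat_amp:.3f} "
      f"(neighbouring bins: {spectrum[neighbours].mean():.3f})")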


Subject(s)
Evoked Potentials , Music , Electroencephalography , Auditory Perception/physiology , Attention/physiology
5.
Neuroimage ; 268: 119883, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36657693

ABSTRACT

Listening in everyday life requires attention to be deployed dynamically - when listening is expected to be difficult and when relevant information is expected to occur - to conserve mental resources. Conserving mental resources may be particularly important for older adults who often experience difficulties understanding speech. In the current study, we use electro- and magnetoencephalography to investigate the neural and behavioral mechanics of attention regulation during listening and the effects that aging has on these. We first show in younger adults (17-31 years) that neural alpha oscillatory activity indicates when in time attention is deployed (Experiment 1) and that deployment depends on listening difficulty (Experiment 2). Experiment 3 investigated age-related changes in auditory attention regulation. Middle-aged and older adults (54-72 years) show successful attention regulation but appear to utilize timing information differently compared to younger adults (20-33 years). We show a notable age-group dissociation in recruited brain regions. In younger adults, superior parietal cortex underlies alpha power during attention regulation, whereas, in middle-aged and older adults, alpha power emerges from more ventro-lateral areas (posterior temporal cortex). This difference in the sources of alpha activity between age groups only occurred during task performance and was absent during rest (Experiment S1). In sum, our study suggests that middle-aged and older adults employ different neural control strategies compared to younger adults to regulate attention in time under listening challenges.


Subject(s)
Aging , Speech Perception , Middle Aged , Humans , Aged , Aging/physiology , Auditory Perception/physiology , Brain/physiology , Magnetoencephalography , Temporal Lobe , Speech Perception/physiology
6.
Sci Rep ; 12(1): 20466, 2022 Nov 28.
Article in English | MEDLINE | ID: mdl-36443344

ABSTRACT

Rhythmic structure in speech, music, and other auditory signals helps us track, anticipate, and understand the sounds in our environment. The dynamic attending framework proposes that biological systems possess internal rhythms, generated via oscillatory mechanisms, that synchronize with (entrain to) rhythms in the external world. Here, we focused on two properties of internal oscillators: preferred rate, the default rate of an oscillator in the absence of any input, and flexibility, the oscillator's ability to adapt to changes in external rhythmic context. We aimed to develop methods that can reliably estimate preferred rate and flexibility on an individual basis. The experiment was a synchronization-continuation finger-tapping paradigm with a unique design: stimulus rates were finely sampled over a wide range, and each rate was presented only once. Individuals tapped their finger to 5-event isochronous stimulus sequences and continued the rhythm at the same pace. Preferred rate was estimated by assessing the best-performance conditions, where the difference between the stimulus rate and continuation tapping rate (tempo-matching error) was minimal. The results revealed multiple, harmonically related preferred rates for each individual. We maximized the differences in stimulus rate between consecutive trials to challenge individuals' flexibility, which was then estimated by how much tempo-matching errors in synchronization tapping increased with this manipulation. Both measures showed test-retest reliability. The findings demonstrate the influence of properties of the auditory context on rhythmic entrainment, and have implications for the development of methods that can improve attentional synchronization and hearing.
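
A minimal sketch of the tempo-matching-error logic described above, using invented tapping data (the rate range, trial count, toy response bias, and smoothing are assumptions, not the study's parameters): the preferred-rate estimate is the stimulus rate at which continuation tapping deviates least from the stimulus rate.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: one trial per finely sampled stimulus rate (Hz) and the
# mean rate of the participant's continuation tapping on that trial.
stim_rates = np.linspace(0.5, 4.0, 81)
# Toy response bias that vanishes near harmonically related rates (0.75, 1.5, 3 Hz).
bias = -0.15 * np.sin(2 * np.pi * np.log2(stim_rates / 1.5))
continuation = stim_rates + bias + rng.normal(0, 0.05, stim_rates.size)

# Tempo-matching error: absolute difference between continuation and stimulus rate.
tempo_matching_error = np.abs(continuation - stim_rates)

# Smooth the error curve across neighbouring rates and take the minimum as the
# preferred-rate estimate (local minima would give the harmonically related set).
smoothed = np.convolve(tempo_matching_error, np.ones(5) / 5, mode="same")
preferred = stim_rates[np.argmin(smoothed)]
print(f"estimated preferred rate: {preferred:.2f} Hz")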


Subject(s)
Physical Therapy Modalities , Vitelliform Macular Dystrophy , Humans , Reproducibility of Results , Upper Extremity , Fingers , Hearing
7.
Elife ; 11, 2022 Sep 12.
Article in English | MEDLINE | ID: mdl-36094165

ABSTRACT

Neural activity in the auditory system synchronizes to sound rhythms, and brain-environment synchronization is thought to be fundamental to successful auditory perception. Sound rhythms are often operationalized in terms of the sound's amplitude envelope. We hypothesized that - especially for music - the envelope might not best capture the complex spectro-temporal fluctuations that give rise to beat perception and synchronized neural activity. This study investigated (1) neural synchronization to different musical features, (2) tempo-dependence of neural synchronization, and (3) dependence of synchronization on familiarity, enjoyment, and ease of beat perception. In this electroencephalography study, 37 human participants listened to tempo-modulated music (1-4 Hz). Independent of whether the analysis approach was based on temporal response functions (TRFs) or reliable components analysis (RCA), the spectral flux of music - as opposed to the amplitude envelope - evoked the strongest neural synchronization. Moreover, music with slower beat rates, high familiarity, and easy-to-perceive beats elicited the strongest neural response. Our results demonstrate the importance of spectro-temporal fluctuations in music for driving neural synchronization, and highlight the sensitivity of neural synchronization to musical tempo, familiarity, and beat salience.
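
To illustrate the distinction between the two stimulus descriptions compared above, here is a short sketch (placeholder audio; the window lengths and sampling rate are arbitrary choices, not the paper's settings) computing an amplitude envelope and a simple spectral-flux time course:

import numpy as np
from scipy.signal import stft, hilbert

fs = 22050
rng = np.random.default_rng(3)
audio = rng.normal(0, 1, fs * 10)   # placeholder for a 10 s music excerpt

# Amplitude envelope: magnitude of the analytic signal.
envelope = np.abs(hilbert(audio))

# Spectral flux: frame-to-frame increases in spectral magnitude, half-wave
# rectified and summed over frequency, tracking spectro-temporal change
# (note onsets, pitch changes) that the envelope alone can miss.
freqs, times, Z = stft(audio, fs=fs, nperseg=1024, noverlap=768)
flux = np.sum(np.clip(np.diff(np.abs(Z), axis=1), 0, None), axis=0)
print(envelope.shape, flux.shape, times.shape)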


When we listen to a melody, the activity of our neurons synchronizes to the music: in fact, it is likely that the closer the match, the better we can perceive the piece. However, it remains unclear exactly which musical features our brain cells synchronize to. Previous studies, which have often used 'simplified' music, have highlighted that the amplitude envelope (how the intensity of the sounds changes over time) could be involved in this phenomenon, alongside factors such as musical training, attention, familiarity with the piece or even enjoyment. Whether differences in neural synchronization could explain why musical tastes vary between people is also still a matter of debate. In their study, Weineck et al. aim to better understand what drives neuronal synchronization to music. A technique known as electroencephalography was used to record brain activity in 37 volunteers listening to instrumental music whose tempo ranged from 60 to 240 beats per minute. The tunes varied across an array of features such as familiarity, enjoyment and how easy the beat was to perceive. Two different approaches were then used to calculate neural synchronization, which yielded converging results. The analyses revealed that three types of factors were associated with a strong neural synchronization. First, amongst the various cadences, a tempo of 60-120 beats per minute elicited the strongest match with neuronal activity. Interestingly, this beat is commonly found in Western pop music, is usually preferred by listeners, and often matches spontaneous body rhythms such as walking pace. Second, synchronization was linked to variations in pitch and sound quality (known as 'spectral flux') rather than in the amplitude envelope. And finally, familiarity and perceived beat saliency (but not enjoyment or musical expertise) were connected to stronger synchronization. These findings help to better understand how our brains allow us to perceive and connect with music. The work conducted by Weineck et al. should help other researchers to investigate this field; in particular, it shows how important it is to consider spectral flux rather than amplitude envelope in experiments that use actual music.


Subject(s)
Music , Auditory Perception/physiology , Brain , Electroencephalography , Humans , Recognition, Psychology
8.
J Exp Psychol Hum Percept Perform ; 48(7): 755-770, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35587435

ABSTRACT

Detecting and learning structure in sounds is fundamental to human auditory perception. Evidence for auditory perceptual learning comes from previous studies where listeners were better at detecting repetitions of a short noise snippet embedded in longer, ongoing noise when the same snippet recurred across trials compared with when the snippet was novel in each trial. However, previous work has mainly used (a) temporally regular presentations of the repeating noise snippet and (b) highly predictable intertrial onset timings for the snippet sequences. As a result, it is unclear how these temporal features affect perceptual learning. In five online experiments, participants judged whether a repeating noise snippet was present, unaware that the snippet could be unique to that trial or used in multiple trials. In two experiments, temporal regularity was manipulated by jittering the timing of noise-snippet repetitions within a trial. In two subsequent experiments, temporal onset certainty was manipulated by varying the onset time of the entire snippet sequence across trials. We found that both temporal jittering and onset uncertainty reduced auditory perceptual learning. In addition, we observed that these reductions in perceptual learning were ameliorated when the same snippet occurred in both temporally manipulated and unmanipulated trials. Our study demonstrates the importance of temporal regularity and onset certainty for auditory perceptual learning. (PsycInfo Database Record (c) 2022 APA, all rights reserved).
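
A minimal sketch of the stimulus construction implied above (durations, spacing, and jitter range are hypothetical, not the study's parameters): embed a repeating noise snippet in ongoing noise with jittered repetition timing.

import numpy as np

fs = 20000
rng = np.random.default_rng(4)

snippet = rng.normal(0, 1, int(0.2 * fs))    # 0.2 s noise snippet (hypothetical duration)
trial = rng.normal(0, 1, int(2.0 * fs))      # 2 s of ongoing noise

# Embed repetitions of the snippet; jitter each onset around a regular 0.5 s
# spacing to de-regularize the within-trial timing (an illustration of the
# "temporal jitter" manipulation; the study's actual timings may differ).
nominal_onsets = np.array([0.5, 1.0, 1.5])   # seconds
onsets = nominal_onsets + rng.uniform(-0.05, 0.05, nominal_onsets.size)
for onset in onsets:
    start = int(onset * fs)
    trial[start:start + snippet.size] = snippet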


Subject(s)
Auditory Perception , Learning , Acoustic Stimulation , Humans
9.
J Neurosci ; 42(5): 894-908, 2022 Feb 2.
Article in English | MEDLINE | ID: mdl-34893547

ABSTRACT

Auditory stimuli are often rhythmic in nature. Brain activity synchronizes with auditory rhythms via neural entrainment, and entrainment seems to be beneficial for auditory perception. However, it is not clear to what extent neural entrainment in the auditory system is reliable over time, which is a necessary prerequisite for targeted intervention. The current study aimed to establish the reliability of neural entrainment over time and to predict individual differences in auditory perception from associated neural activity. Across two different sessions, human listeners (21 females, 17 males) detected silent gaps presented at different phase locations of a 2 Hz frequency-modulated (FM) noise while EEG activity was recorded. As expected, neural activity was entrained by the 2 Hz FM noise. Moreover, gap detection was sinusoidally modulated by the phase of the 2 Hz FM into which the gap fell. Critically, both the strength of neural entrainment as well as the modulation of performance by the stimulus rhythm were highly reliable over sessions. Moreover, gap detection was predictable from pregap neural 2 Hz phase and alpha amplitude. Our results demonstrate that neural entrainment in the auditory system and the resulting behavioral modulation are reliable over time, and both entrained delta and nonentrained alpha oscillatory activity contribute to near-threshold stimulus perception. The latter suggests that improving auditory perception might require simultaneously targeting entrained brain rhythms as well as the alpha rhythm.

SIGNIFICANCE STATEMENT: Neural activity synchronizes to the rhythms in sounds via neural entrainment, which seems to be important for successful auditory perception. A natural hypothesis is that improving neural entrainment, for example, via brain stimulation, should benefit perception. However, the extent to which neural entrainment is reliable over time, a necessary prerequisite for targeted intervention, has not been established. Using electroencephalogram recordings, we demonstrate that both neural entrainment to FM sounds and stimulus-induced behavioral modulation are reliable over time. Moreover, moment-by-moment fluctuations in perception are best predicted by entrained delta phase and nonentrained alpha amplitude. This work suggests that improving auditory perception might require simultaneously targeting entrained brain rhythms as well as the alpha rhythm.
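
As an illustration of predicting single-trial gap detection from pre-gap neural measures (simulated trials; this is not the authors' model), one could fit a logistic regression with circularly encoded delta phase and alpha amplitude as predictors:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 500

# Hypothetical single-trial predictors: pre-gap 2 Hz (delta) phase in radians
# and pre-gap alpha amplitude in arbitrary units.
phase = rng.uniform(-np.pi, np.pi, n)
alpha = rng.normal(1.0, 0.3, n)

# Simulated detection: more likely at a "good" delta phase and with low alpha.
p_hit = 1 / (1 + np.exp(-(0.8 * np.cos(phase) - 1.5 * (alpha - 1.0))))
hit = (rng.random(n) < p_hit).astype(int)

# Encode phase circularly as sine and cosine so the model can recover a
# preferred phase; include alpha amplitude as a third predictor.
X = np.column_stack([np.sin(phase), np.cos(phase), alpha])
model = LogisticRegression().fit(X, hit)
print("weights (sin, cos, alpha):", np.round(model.coef_[0], 2))

Encoding phase as sine and cosine is a common way to handle the circularity of phase within a linear model.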


Subject(s)
Alpha Rhythm , Auditory Cortex/physiology , Delta Rhythm , Periodicity , Adult , Auditory Pathways/physiology , Auditory Perception , Evoked Potentials, Auditory , Female , Humans , Male
10.
Philos Trans R Soc Lond B Biol Sci ; 376(1835): 20200336, 2021 Oct 11.
Article in English | MEDLINE | ID: mdl-34420382

ABSTRACT

In this perspective paper, we focus on the study of synchronization abilities across the animal kingdom. We propose an ecological approach to studying nonhuman animal synchronization that begins from observations about when, how and why an animal might synchronize spontaneously with natural environmental rhythms. We discuss what we consider to be the most important, but thus far largely understudied, temporal, physical, perceptual and motivational constraints that must be taken into account when designing experiments to test synchronization in nonhuman animals. First and foremost, different species are likely to be sensitive to and therefore capable of synchronizing at different timescales. We also argue that it is fruitful to consider the latent flexibility of animal synchronization. Finally, we discuss the importance of an animal's motivational state for showcasing synchronization abilities. We demonstrate that the likelihood that an animal can successfully synchronize with an environmental rhythm is context-dependent and suggest that the list of species capable of synchronization is likely to grow when tested with ecologically honest, species-tuned experiments. This article is part of the theme issue 'Synchrony and rhythm interaction: from the brain to behavioural ecology'.


Subject(s)
Ethology/methods , Invertebrates/physiology , Vertebrates/physiology , Animals , Behavior, Animal , Ecology/methods , Periodicity
12.
J Acoust Soc Am ; 149(4): 2546, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33940875

ABSTRACT

Humans can perceive a regular psychological pulse in music known as the beat. The evolutionary origins and neural mechanisms underlying this ability are hypothetically linked to imitative vocal learning, a rare trait found only in some species of mammals and birds. Beat perception has been demonstrated in vocal-learning parrots but not in songbirds. We trained European starlings (Sturnus vulgaris) on two sound discriminations to investigate their perception of the beat and temporal structure in rhythmic patterns. First, we trained birds on a two-choice discrimination between rhythmic patterns of tones that contain or lack a regular beat. Despite receiving extensive feedback, the starlings were unable to distinguish these two types of patterns. Next, we probed the temporal cues that starlings use for discriminating rhythms in general. We trained birds to discriminate a baseline set of isochronous and triplet tone sequences. On occasional probe trials, we presented transformations of the baseline patterns. The starlings' responses to the probes suggest that they relied on absolute temporal features to sort the sounds into "fast" and "slow" and otherwise ignored the rhythmic patterning that was present. Our results suggest that starlings attend to local temporal features in rhythms and are less sensitive to the global temporal organization.


Subject(s)
Starlings , Animals , Auditory Perception , Cues , Discrimination, Psychological , Humans , Learning
13.
Curr Biol ; 29(13): 2237-2243.e4, 2019 Jul 8.
Article in English | MEDLINE | ID: mdl-31257140

ABSTRACT

Our visual system provides a distance-invariant percept of object size by integrating retinal image size with viewing distance (size constancy). Single-unit studies with animals have shown that some distance cues, especially oculomotor cues such as vergence and accommodation, can modulate the signals in the thalamus or V1 at the initial processing stage [1-7]. Accordingly, one might predict that size constancy emerges much earlier in time [8-10], even as visual signals are being processed in the thalamus. So far, the studies that have looked directly at size coding have either used fMRI (poor temporal resolution [11-13]) or relied on inadequate stimuli (pictorial illusions presented on a monitor at a fixed distance [11, 12, 14, 15]). Here, we physically moved the monitor to different distances, a more ecologically valid paradigm that emulates what happens in everyday life and is an example of the increasing trend of "bringing the real world into the lab." Using this paradigm in combination with electroencephalography (EEG), we examined the computation of size constancy in real time with real-world viewing conditions. Our study provides strong evidence that, even though oculomotor distance cues have been shown to modulate the spiking rate of neurons in the thalamus and in V1, the integration of viewing distance cues and retinal image size takes at least 150 ms to unfold, which suggests that the size-constancy-related activation patterns in V1 reported in previous fMRI studies (e.g., [12, 13]) reflect the later processing within V1 and/or top-down input from other high-level visual areas.


Subject(s)
Distance Perception/physiology , Eye Movements/physiology , Visual Cortex/physiology , Adult , Cues , Electroencephalography , Female , Humans , Illusions , Male , Middle Aged , Young Adult
14.
Neuroimage ; 185: 96-101, 2019 Jan 15.
Article in English | MEDLINE | ID: mdl-30336253

ABSTRACT

Neural activity phase-locks to rhythm in both music and speech. However, the literature currently lacks a direct test of whether cortical tracking of comparable rhythmic structure is similar across domains. Moreover, although musical training improves multiple aspects of music and speech perception, the relationship between musical training and cortical tracking of rhythm has not been compared directly across domains. We recorded the electroencephalogram (EEG) from 28 participants (14 female) with a range of musical training who listened to melodies and sentences with identical rhythmic structure. We compared cerebral-acoustic coherence (CACoh) between the EEG signal and single-trial stimulus envelopes (as a measure of cortical entrainment) across domains and correlated years of musical training with CACoh. We hypothesized that neural activity would be comparably phase-locked across domains, and that the amount of musical training would be associated with increasingly strong phase locking in both domains. We found that participants with only a few years of musical training had a comparable cortical response to music and speech rhythm, partially supporting the hypothesis. However, the cortical response to music rhythm increased with years of musical training while the response to speech rhythm did not, leading to an overall greater cortical response to music rhythm across all participants. We suggest that task demands shaped the asymmetric cortical tracking across domains.
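
A minimal sketch of a cerebral-acoustic coherence computation in the spirit of the measure above (toy signals; the sampling rate, window length, delay, and signal-to-noise ratio are invented, and this is not the authors' pipeline):

import numpy as np
from scipy.signal import coherence

fs = 250.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(6)

# Placeholder signals: a stimulus amplitude envelope with a ~2 Hz rhythm, and
# an "EEG" channel containing a weak, delayed copy of it plus noise.
envelope = 1 + np.sin(2 * np.pi * 2.0 * t)
eeg = 0.3 * np.roll(envelope, int(0.1 * fs)) + rng.normal(0, 1, t.size)

# Magnitude-squared coherence between the EEG channel and the envelope.
freqs, cacoh = coherence(eeg, envelope, fs=fs, nperseg=int(4 * fs))
print(f"coherence at 2 Hz: {cacoh[np.argmin(np.abs(freqs - 2.0))]:.2f}")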


Subject(s)
Cerebral Cortex/physiology , Music , Pitch Perception/physiology , Speech Perception/physiology , Adult , Brain Mapping/methods , Electroencephalography/methods , Female , Humans , Male , Young Adult
15.
Atten Percept Psychophys ; 81(2): 571-589, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30488190

ABSTRACT

Listeners resolve ambiguities in speech perception using multiple sources, including non-local or distal speech rate (i.e., the speech rate of material surrounding a particular region). The ability to resolve ambiguities is particularly important for the perception of casual, everyday productions, which are often produced using phonetically reduced forms. Here, we examine whether the distal speech rate effect is specific to a lexical class of words and/or to particular lexical or phonological contexts. In Experiment 1, we examined whether distal speech rate influenced perception of phonologically similar content words differing in number of syllables (e.g., form/forum). In Experiment 2, we used both transcription and word-monitoring tasks to examine whether distal speech rate influenced perception of a reduced vowel, causing lexical reorganization (e.g., cease, see us). Distal speech rate influenced perception of lexical content in both experiments. This demonstrates that distal rate influences perception of a lexical class other than function words and affects perception in a variety of phonological and lexical contexts. These results support a view that distal speech rate is a pervasive source of information with far-reaching consequences for perception of lexical content and word segmentation.


Subject(s)
Phonetics , Recognition, Psychology , Speech Perception/physiology , Verbal Behavior , Adolescent , Adult , Female , Humans , Male , Young Adult
16.
J Neurosci ; 38(34): 7428-7439, 2018 Aug 22.
Article in English | MEDLINE | ID: mdl-30012685

ABSTRACT

Increased memory load is often signified by enhanced neural oscillatory power in the alpha range (8-13 Hz), which is taken to reflect inhibition of task-irrelevant brain regions. The corresponding neural correlates of memory decay, however, are not yet well understood. In the current study, we investigated auditory short-term memory decay in humans using a delayed matching-to-sample task with pure-tone sequences. First, in a behavioral experiment, we modeled memory performance over six different delay-phase durations. Second, in a MEG experiment, we assessed alpha-power modulations over three different delay-phase durations. In both experiments, the temporal expectation for the to-be-remembered sound was manipulated so that it was either temporally expected or not. In both studies, memory performance declined over time, but this decline was weaker when the onset time of the to-be-remembered sound was expected. Similarly, patterns of alpha power in and alpha-tuned connectivity between sensory cortices changed parametrically with delay duration (i.e., decrease in occipitoparietal regions, increase in temporal regions). Temporal expectation not only counteracted alpha-power decline in heteromodal brain areas (i.e., supramarginal gyrus), but also had a beneficial effect on memory decay, counteracting memory performance decline. Correspondingly, temporal expectation also boosted alpha connectivity within attention networks known to play an active role during memory maintenance. The present data show how patterns of alpha power orchestrate short-term memory decay and encourage a more nuanced perspective on alpha power across brain space and time beyond its inhibitory role.

SIGNIFICANCE STATEMENT: Our sensory memories of the physical world fade quickly. We show here that this decay of short-term memory can be counteracted by so-called temporal expectation; that is, knowledge of when to expect a sensory event that an individual must remember. We also show that neural oscillations in the "alpha" (8-13 Hz) range index both the degree of memory decay (for brief sound patterns) and the respective memory benefit from temporal expectation. Spatially distributed cortical patterns of alpha power show opposing effects in auditory versus visual sensory cortices. Moreover, alpha-tuned connectivity changes within supramodal attention networks reflect the allocation of neural resources as short-term memory representations fade.


Subject(s)
Alpha Rhythm/physiology , Anticipation, Psychological/physiology , Memory, Short-Term/physiology , Time Factors , Acoustic Stimulation , Adult , Attention/physiology , Auditory Perception/physiology , Female , Humans , Magnetoencephalography , Male , Signal Detection, Psychological , Young Adult
17.
PLoS Biol ; 15(11): e1002615, 2017 Nov.
Article in English | MEDLINE | ID: mdl-29091710

ABSTRACT

[This corrects the article DOI: 10.1371/journal.pbio.2002794.].

18.
PLoS Biol ; 15(9): e2002794, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28926570
19.
Nat Commun ; 8: 15801, 2017 Jun 27.
Article in English | MEDLINE | ID: mdl-28654081

ABSTRACT

Healthy aging is accompanied by listening difficulties, including decreased speech comprehension, that stem from an ill-understood combination of sensory and cognitive changes. Here, we use electroencephalography to demonstrate that auditory neural oscillations of older adults entrain less firmly and less flexibly to speech-paced (∼3 Hz) rhythms than younger adults' during attentive listening. These neural entrainment effects are distinct in magnitude and origin from the neural response to sound per se. Non-entrained parieto-occipital alpha (8-12 Hz) oscillations are enhanced in young adults, but suppressed in older participants, during attentive listening. Entrained neural phase and task-induced alpha amplitude exert opposite, complementary effects on listening performance: higher alpha amplitude is associated with reduced entrainment-driven behavioural performance modulation. Thus, alpha amplitude as a task-driven, neuro-modulatory signal can counteract the behavioural corollaries of neural entrainment. Balancing these two neural strategies may present new paths for intervention in age-related listening difficulties.


Subject(s)
Aging/physiology , Auditory Perception , Brain/physiology , Acoustic Stimulation , Adolescent , Adult , Aged , Brain/diagnostic imaging , Electroencephalography , Female , Humans , Male , Middle Aged , Young Adult
20.
PLoS One ; 12(2): e0172454, 2017.
Article in English | MEDLINE | ID: mdl-28225796

ABSTRACT

Entrainment of neural oscillations on multiple time scales is important for the perception of speech. The perception of musical rhythms, and in particular of a regular beat in musical rhythms, is also likely to rely on entrainment of neural oscillations. One recently proposed approach to studying beat perception in the context of neural entrainment and resonance (the "frequency-tagging" approach) has received an enthusiastic response from the scientific community. A specific version of the approach involves comparing frequency-domain representations of acoustic rhythm stimuli to the frequency-domain representations of neural responses to those rhythms (measured by electroencephalography, EEG). The relative amplitudes at specific EEG frequencies are compared to the relative amplitudes at the same stimulus frequencies, and enhancements at beat-related frequencies in the EEG signal are interpreted as reflecting an internal representation of the beat. Here, we show that frequency-domain representations of rhythms are sensitive to the acoustic features of the tones making up the rhythms (tone duration, onset/offset ramp duration); in fact, relative amplitudes at beat-related frequencies can be completely reversed by manipulating tone acoustics. Crucially, we show that changes to these acoustic tone features, and in turn changes to the frequency-domain representations of rhythms, do not affect beat perception. Instead, beat perception depends on the pattern of onsets (i.e., whether a rhythm has a simple or complex metrical structure). Moreover, we show that beat perception can differ for rhythms that have numerically identical frequency-domain representations. Thus, frequency-domain representations of rhythms are dissociable from beat perception. For this reason, we suggest caution in interpreting direct comparisons of rhythms and brain signals in the frequency domain. Instead, we suggest that combining EEG measurements of neural signals with creative behavioral paradigms is of more benefit to our understanding of beat perception.
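
To see why frequency-domain representations of rhythms depend on tone acoustics, here is a short sketch (a toy rhythm and invented tone parameters, not the study's stimuli) that compares FFT amplitudes at beat-related frequencies for the same onset pattern rendered with different tone durations and ramps:

import numpy as np

fs = 1000.0                          # sampling rate of the amplitude envelope (Hz)
pattern = [1, 0, 1, 1, 0, 1, 0, 0]   # one bar of a toy rhythm on an 8-point grid
grid_ms = 200                        # grid spacing in ms
n_bars = 16

def rhythm_envelope(tone_ms, ramp_ms):
    """Amplitude envelope of the rhythm for a given tone and onset/offset ramp duration."""
    tone = np.ones(int(tone_ms * fs / 1000))
    ramp = np.linspace(0, 1, int(ramp_ms * fs / 1000))
    tone[:ramp.size] *= ramp
    tone[-ramp.size:] *= ramp[::-1]
    bar = np.zeros(int(len(pattern) * grid_ms * fs / 1000))
    for i, on in enumerate(pattern):
        if on:
            start = int(i * grid_ms * fs / 1000)
            bar[start:start + tone.size] = tone
    return np.tile(bar, n_bars)

# The same onset pattern, rendered with two different tone acoustics, yields
# different relative amplitudes at the beat-related frequencies (1.25, 2.5, 5 Hz).
for tone_ms, ramp_ms in [(100, 10), (180, 80)]:
    env = rhythm_envelope(tone_ms, ramp_ms)
    spec = np.abs(np.fft.rfft(env)) / env.size
    freqs = np.fft.rfftfreq(env.size, 1 / fs)
    amps = [spec[np.argmin(np.abs(freqs - f))] for f in (1.25, 2.5, 5.0)]
    print(f"tone {tone_ms} ms, ramp {ramp_ms} ms -> amplitudes {np.round(amps, 4)}")

The relative amplitudes at the three frequencies differ between the two renderings even though the onset pattern, and hence the perceived beat, is unchanged.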


Subject(s)
Auditory Perception/physiology , Brain/physiology , Music , Periodicity , Time Perception/physiology , Acoustic Stimulation , Electroencephalography , Female , Humans , Male , Young Adult