Results 1 - 13 of 13
1.
Behav Brain Sci ; 44: e73, 2021 09 30.
Article in English | MEDLINE | ID: mdl-34588047

ABSTRACT

Music uses the evolutionarily unique temporal sensitivity of the auditory system and its tight coupling to the motor system to create a common neurophysiological clock between individuals that facilitates action coordination. We propose that this shared common clock arises from entrainment to musical rhythms, the process by which partners' brains and bodies become temporally aligned to the same rhythmic pulse.


Subject(s)
Music , Auditory Perception , Brain , Humans
2.
Front Hum Neurosci ; 15: 717810, 2021.
Article in English | MEDLINE | ID: mdl-34588966

ABSTRACT

Interpersonal synchrony refers to the temporal coordination of actions between individuals and is a common feature of social behaviors, from team sport to ensemble music performance. Interpersonal synchrony of many rhythmic (periodic) behaviors displays dynamics of coupled biological oscillators. The current study addresses oscillatory dynamics on the levels of brain and behavior between music duet partners performing at spontaneous (uncued) rates. Wireless EEG was measured from N = 20 pairs of pianists as they performed a melody first in Solo performance (at their spontaneous rate of performance), and then in Duet performances at each partner's spontaneous rate. Influences of partners' spontaneous rates on interpersonal synchrony were assessed by correlating differences in partners' spontaneous rates of Solo performance with Duet tone onset asynchronies. Coupling between partners' neural oscillations was assessed by correlating amplitude envelope fluctuations of cortical oscillations at the Duet performance frequency between observed partners and between surrogate (re-paired) partners, who performed the same melody but at different times. Duet synchronization was influenced by partners' spontaneous rates in Solo performance. The size and direction of the difference in partners' spontaneous rates were mirrored in the size and direction of the Duet asynchronies. Moreover, observed Duet partners showed greater inter-brain correlations of oscillatory amplitude fluctuations than did surrogate partners, suggesting that performing in synchrony with a musical partner is reflected in coupled cortical dynamics at the performance frequency. The current study provides evidence that dynamics of oscillator coupling are reflected in both behavioral and neural measures of temporal coordination during musical joint action.
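The core behavioral analysis described above, correlating partners' differences in spontaneous Solo rates with their Duet tone-onset asynchronies, reduces to a Pearson correlation across pairs. The sketch below uses simulated data with a built-in effect (the N = 20 matches the study, but the values, effect size, and noise level are hypothetical, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 20   # matching the study's N = 20 pairs

# Hypothetical data: difference in partners' spontaneous Solo
# inter-onset rates (partner 1 minus partner 2, in ms) ...
rate_diff = rng.normal(0.0, 40.0, n_pairs)
# ... and mean signed Duet tone-onset asynchrony per pair, generated
# here to mirror the rate difference (toy effect size, not real data)
asynchrony = 0.3 * rate_diff + rng.normal(0.0, 5.0, n_pairs)

# Positive r: the size and direction of the rate difference are
# mirrored in the size and direction of the Duet asynchronies
r = np.corrcoef(rate_diff, asynchrony)[0, 1]
```

A signed (rather than absolute) asynchrony measure is what lets the correlation capture direction as well as magnitude, as the abstract emphasizes.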

3.
J Neurosci ; 41(33): 7065-7075, 2021 08 18.
Article in English | MEDLINE | ID: mdl-34261698

ABSTRACT

At any given moment, our sensory systems receive multiple, often rhythmic, inputs from the environment. Processing of temporally structured events in one sensory modality may guide both behavioral and neural processing of events in other sensory modalities, but direct evidence for such cross-modal influences has remained elusive. Here, we used human electroencephalography (EEG) to test the cross-modal influences of a continuous auditory frequency-modulated (FM) sound on visual perception and visual cortical activity. We report systematic fluctuations in perceptual discrimination of brief visual stimuli in line with the phase of the FM sound. We further show that this rhythmic modulation in visual perception is related to an accompanying rhythmic modulation of neural activity recorded over visual areas. Importantly, in our task, perceptual and neural visual modulations occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. As such, the results provide a critical validation for the existence and functional role of cross-modal entrainment and demonstrate its utility for organizing the perception of multisensory stimulation in the natural environment.

SIGNIFICANCE STATEMENT Our sensory environment is filled with rhythmic structures that are often multisensory in nature. Here, we show that the alignment of neural activity to the phase of an auditory frequency-modulated (FM) sound has cross-modal consequences for vision: it yields systematic fluctuations in perceptual discrimination of brief visual stimuli that are mediated by an accompanying rhythmic modulation of neural activity recorded over visual areas. These cross-modal effects on visual neural activity and perception occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. The current work shows that continuous auditory fluctuations in the natural environment can provide a pacing signal for neural activity and perception across the senses.
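The reported phase-dependence of visual discrimination can be illustrated by binning single-trial accuracy by the auditory phase at target onset. The snippet below simulates trials with a hypothetical sinusoidal modulation of hit probability; the modulation depth, baseline accuracy, and bin count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 4000

# Phase of the FM sound at each visual target onset (hypothetical trials)
phase = rng.uniform(-np.pi, np.pi, n_trials)
# Toy assumption: detection probability fluctuates sinusoidally with phase
p_hit = 0.6 + 0.15 * np.cos(phase)
hit = rng.random(n_trials) < p_hit

# Bin single-trial accuracy by auditory phase
n_bins = 8
edges = np.linspace(-np.pi, np.pi, n_bins + 1)
bin_idx = np.clip(np.digitize(phase, edges) - 1, 0, n_bins - 1)
acc = np.array([hit[bin_idx == b].mean() for b in range(n_bins)])
```

In an analysis of this shape, a peak-to-trough difference across phase bins (here roughly the built-in modulation depth) is the behavioral signature of entrainment.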


Subject(s)
Acoustic Stimulation , Periodicity , Visual Cortex/physiology , Visual Perception/physiology , Adult , Association Learning/physiology , Electroencephalography , Female , Humans , Male , Young Adult
4.
Trends Cogn Sci ; 24(6): 481-495, 2020 06.
Article in English | MEDLINE | ID: mdl-32317142

ABSTRACT

At any given moment, we receive multiple signals from our different senses. Prior research has shown that signals in one sensory modality can influence neural activity and behavioural performance associated with another sensory modality. Recent human and nonhuman primate studies suggest that such cross-modal influences in sensory cortices are mediated by the synchronisation of ongoing neural oscillations. In this review, we consider two mechanisms proposed to facilitate cross-modal influences on sensory processing, namely cross-modal phase resetting and neural entrainment. We consider how top-down processes may further influence cross-modal processing in a flexible manner, and we highlight fruitful directions for further research.


Subject(s)
Auditory Perception , Parietal Lobe , Acoustic Stimulation
5.
Curr Opin Psychol ; 29: 254-260, 2019 10.
Article in English | MEDLINE | ID: mdl-31302478

ABSTRACT

Human performance fluctuates over time. Rather than random, the complex time course of variation reflects, among other factors, influences from regular periodic processes operating at multiple time scales. In this review, we consider evidence for how our performance ebbs and flows over fractions of seconds as we engage with sensory objects, over minutes as we perform tasks, and over hours according to homeostatic factors. We propose that rhythms of performance at these multiple tempos arise from the interplay among three sources of influence: intrinsic fluctuations in brain activity, periodicity of external stimulation, and the anticipation of the temporal structure of external stimulation by the brain.


Subject(s)
Brain/physiology , Task Performance and Analysis , Circadian Rhythm , Electroencephalography , Evoked Potentials , Humans , Periodicity
6.
Psychophysiology ; 56(5): e13331, 2019 05.
Article in English | MEDLINE | ID: mdl-30657185

ABSTRACT

Spatiotemporal context plays an important role in episodic memory. While temporal context effects have been frequently studied in the laboratory, ecologically valid spatial context manipulations are difficult to implement in stationary conditions. We investigated whether the neural correlates of successful encoding (subsequent memory effect) can be captured in a real-world environment. An off-the-shelf Android smartphone was used for wireless mobile EEG acquisition and stimulus presentation. Participants encoded single words, each of which was presented at a different location on a university campus. Locations were approximately 10-12 m away from each other, half of them with striking features (landmarks) nearby. We predicted landmarks would improve recall performance. After a first free recall task of verbal stimuli indoors, participants performed a subsequent recall outdoors, in which words and locations were recalled. As predicted, significantly more words presented at landmark locations as well as significantly more landmark than nonlandmark locations were recalled. ERP analysis yielded a larger posterior positive deflection during encoding for hits compared to misses in the 400-800 ms interval. Likewise, time-frequency analysis revealed a significant difference during encoding for hits compared to misses in the form of stronger alpha (200-300 ms) and theta (300-400 ms) power increases. Our results confirm that a vibrant spatial context is beneficial in episodic memory processing and that the underlying neural correlates can be captured with unobtrusive smartphone EEG technology. The advent of mobile EEG technology promises to unveil the relevance of natural physical activity and natural environments on memory.
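The subsequent-memory contrast reported here (hits vs. misses in the 400-800 ms encoding window) can be sketched on simulated epochs. Trial counts, noise levels, and the injected effect size below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250                             # sampling rate (Hz)
t = np.arange(-0.2, 1.0, 1 / fs)     # epoch: -200 ms to 1000 ms
n_hits, n_miss = 60, 60

# Toy epochs: later-remembered words (hits) carry an extra positive
# deflection between 400 and 800 ms (hypothetical amplitude of 1.5)
win = (t >= 0.4) & (t <= 0.8)
epochs = rng.normal(0.0, 2.0, (n_hits + n_miss, t.size))
epochs[:n_hits, win] += 1.5          # subsequent memory effect for hits

erp_hit = epochs[:n_hits].mean(axis=0)
erp_miss = epochs[n_hits:].mean(axis=0)
sme = erp_hit[win].mean() - erp_miss[win].mean()   # mean hit-miss difference
```

With enough trials, the averaged hit-minus-miss difference in the window recovers the injected deflection, which is the logic behind the ERP contrast in the study.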


Subject(s)
Brain Waves/physiology , Cerebral Cortex/physiology , Electroencephalography/instrumentation , Evoked Potentials/physiology , Executive Function/physiology , Memory, Episodic , Mental Recall/physiology , Pattern Recognition, Visual/physiology , Space Perception/physiology , Adult , Female , Humans , Male , Monitoring, Ambulatory , Neurophysiological Monitoring , Smartphone , Young Adult
7.
Brain Res ; 1716: 27-38, 2019 08 01.
Article in English | MEDLINE | ID: mdl-28693821

ABSTRACT

Although music performance has been widely studied in the behavioural sciences, less work has addressed the underlying neural mechanisms, perhaps due to technical difficulties in acquiring high-quality neural data during tasks requiring natural motion. The advent of wireless electroencephalography (EEG) presents a solution to this problem by allowing for neural measurement with minimal motion artefacts. In the current study, we provide the first validation of a mobile wireless EEG system for capturing the neural dynamics associated with piano performance. First, we propose a novel method for synchronously recording music performance and wireless mobile EEG. Second, we provide results of several timing tests that characterize the timing accuracy of our system. Finally, we report EEG time domain and frequency domain results from N=40 pianists demonstrating that wireless EEG data capture the unique temporal signatures of musicians' performances with fine-grained precision and accuracy. Taken together, we demonstrate that mobile wireless EEG can be used to measure the neural dynamics of piano performance with minimal motion constraints. This opens many new possibilities for investigating the brain mechanisms underlying music performance.


Subject(s)
Electroencephalography/instrumentation , Motor Skills/physiology , Adult , Brain/physiology , Electroencephalography/methods , Female , Humans , Male , Music/psychology , Psychomotor Performance/physiology , Wireless Technology/instrumentation
8.
Front Neurosci ; 12: 309, 2018.
Article in English | MEDLINE | ID: mdl-29867321

ABSTRACT

Electroencephalography (EEG) source localization approaches are often used to disentangle the spatial patterns mixed up in scalp EEG recordings. However, approaches differ substantially between experiments, may be strongly parameter-dependent, and results are not necessarily meaningful. In this paper, we provide a pipeline for EEG source estimation, from raw EEG data pre-processing using EEGLAB functions up to source-level analysis as implemented in Brainstorm. The pipeline is tested using a data set of 10 individuals performing an auditory attention task. The analysis approach estimates sources of 64-channel EEG data without the prerequisite of individual anatomies or individually digitized sensor positions. First, we show advanced EEG pre-processing using EEGLAB, which includes artifact attenuation using independent component analysis (ICA). ICA is a linear decomposition technique that aims to reveal the underlying statistical sources of mixed signals and is further a powerful tool to attenuate stereotypical artifacts (e.g., eye movements or heartbeat). Data submitted to ICA are pre-processed to facilitate good-quality decompositions. Aiming toward an objective approach to component identification, the semi-automatic CORRMAP algorithm is applied for the identification of components representing prominent and stereotypic artifacts. Second, we present a step-wise approach to estimate active sources of auditory cortex event-related processing, on a single subject level. The presented approach assumes that no individual anatomy is available and therefore the default anatomy ICBM152, as implemented in Brainstorm, is used for all individuals. Individual noise modeling in this dataset is based on the pre-stimulus baseline period. For EEG source modeling we use the OpenMEEG algorithm as the underlying forward model based on the symmetric Boundary Element Method (BEM).
We then apply the method of dynamical statistical parametric mapping (dSPM) to obtain physiologically plausible EEG source estimates. Finally, we show how to perform group level analysis in the time domain on anatomically defined regions of interest (auditory scout). The proposed pipeline needs to be tailored to the specific datasets and paradigms. However, the straightforward combination of EEGLAB and Brainstorm analysis tools may be of interest to others performing EEG source localization.
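The pipeline itself is built on MATLAB tools (EEGLAB and Brainstorm), but one of its steps, building the noise model from the pre-stimulus baseline, can be illustrated independently of either toolbox. The sketch below computes a channel-by-channel noise covariance from a toy 64-channel epoch; channel count, epoch layout, and data are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 250
n_ch = 64
t = np.arange(-0.2, 0.8, 1 / fs)               # 200 ms baseline + 800 ms post
data = rng.normal(0.0, 1.0, (n_ch, t.size))    # toy 64-channel epoch

# Noise model from the pre-stimulus baseline, as in the pipeline:
# demean each channel, then form the sample covariance across channels
baseline = data[:, t < 0]
baseline = baseline - baseline.mean(axis=1, keepdims=True)
noise_cov = baseline @ baseline.T / (baseline.shape[1] - 1)
```

In an inverse method such as dSPM, this covariance is what whitens the data so that source estimates are scaled relative to baseline noise.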

9.
Ann N Y Acad Sci ; 2018 May 14.
Article in English | MEDLINE | ID: mdl-29756657

ABSTRACT

A major question facing cognitive neuroscience is how to measure interbrain synchrony between individuals performing joint actions. We describe the application of a novel method for measuring musicians' interbrain synchrony: amplitude envelope correlations (AECs). Amplitude envelopes (AEs) reflect energy fluctuations in cortical oscillations over time; AE correlations measure the degree to which two envelope fluctuations are temporally correlated, such as cortical oscillations arising from two individuals performing a joint action. Wireless electroencephalography was recorded from two pianists performing a musical duet; an analysis pipeline is described for computing AEs of cortical oscillations at the duet performance frequency (number of tones produced per second) to test whether these oscillations reflect the temporal dynamics of partners' performances. The pianists' AE correlations were compared with correlations based on a distribution of AEs simulated from white noise signals using the same methods. The AE method was also applied to the temporal characteristics of the pianists' performances, to show that the observed pair's AEs reflect the temporal dynamics of their performance. AE correlations offer a promising approach for assessing interbrain correspondences in cortical activity associated with performing joint tasks.
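The AEC computation can be sketched with a numpy-only Hilbert transform: extract each signal's amplitude envelope, correlate the envelopes, and compare against white-noise signals processed identically. The 4 Hz "performance frequency", the shared slow envelope, and the noise levels below are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 250
n = fs * 60                    # one minute of toy signal
t = np.arange(n) / fs

def envelope(x):
    """Amplitude envelope via an FFT-based Hilbert transform (even-length x)."""
    h = np.zeros(len(x))
    h[0] = h[len(x) // 2] = 1.0
    h[1:len(x) // 2] = 2.0
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

# Two toy "cortical" signals at a 4 Hz performance frequency whose
# amplitudes wax and wane together (coupling is built in for illustration)
shared = 1.0 + 0.5 * np.sin(2 * np.pi * 0.2 * t)
sig_a = shared * np.sin(2 * np.pi * 4 * t) + 0.1 * rng.normal(size=n)
sig_b = shared * np.sin(2 * np.pi * 4 * t + 1.0) + 0.1 * rng.normal(size=n)

aec_pair = np.corrcoef(envelope(sig_a), envelope(sig_b))[0, 1]

# Null reference: white noise processed through the identical pipeline
aec_null = np.corrcoef(envelope(rng.normal(size=n)),
                       envelope(rng.normal(size=n)))[0, 1]
```

Note that the envelope correlation is high here despite the 1-radian phase offset between the signals: AECs index co-fluctuation of oscillation energy, not phase alignment, which is what makes them suited to comparing two brains.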

10.
Brain Topogr ; 31(5): 811-826, 2018 09.
Article in English | MEDLINE | ID: mdl-29488040

ABSTRACT

The acoustic envelope of human speech correlates with the syllabic rate (4-8 Hz) and carries important information for intelligibility, which is typically compromised in multi-talker, noisy environments. In order to better understand the dynamics of selective auditory attention to low frequency modulated sound sources, we conducted a two-stream auditory steady-state response (ASSR) selective attention electroencephalogram (EEG) study. The two streams consisted of 4 and 7 Hz amplitude and frequency modulated sounds presented from the left and right side. One of two streams had to be attended while the other had to be ignored. The attended stream always contained a target, allowing for the behavioral confirmation of the attention manipulation. EEG ASSR power analysis revealed a significant increase in 7 Hz power for the attend compared to the ignore conditions. There was no significant difference in 4 Hz power when the 4 Hz stream had to be attended compared to when it had to be ignored. This lack of 4 Hz attention modulation could be explained by a distracting effect of a third frequency at 3 Hz (beat frequency) perceivable when the 4 and 7 Hz streams are presented simultaneously. Taken together, our results show that low frequency modulations at syllabic rate are modulated by selective spatial attention. Whether attention effects act as an enhancement of the attended stream or a suppression of the to-be-ignored stream may depend on how well auditory streams can be segregated.
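The frequency-tagging logic of an ASSR power analysis can be sketched by reading spectral amplitude out of the bins at the tagged rates. Signal amplitudes and durations below are arbitrary toy values; note also that the 3 Hz beat described above is a perceptual difference frequency, so the linear spectrum of the mixture itself shows no 3 Hz peak:

```python
import numpy as np

fs = 500
t = np.arange(0, 10, 1 / fs)   # 10 s toy recording

# Toy ASSR with frequency tags at 4 Hz and 7 Hz; in real data, attention
# would scale these amplitudes (the values here are arbitrary)
eeg = 1.0 * np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

spec = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amp_at(f_hz):
    """Spectral amplitude in the bin closest to f_hz."""
    return spec[np.argmin(np.abs(freqs - f_hz))]
```

Comparing `amp_at(4)` and `amp_at(7)` between attend and ignore conditions is, in essence, the contrast the study computes.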


Subject(s)
Acoustic Stimulation , Attention/physiology , Electroencephalography , Adult , Auditory Cortex/physiology , Auditory Perception , Cues , Female , Functional Laterality/physiology , Healthy Volunteers , Humans , Male , Middle Aged , Psychomotor Performance/physiology , Space Perception/physiology , Young Adult
11.
Neuroimage ; 167: 396-407, 2018 02 15.
Article in English | MEDLINE | ID: mdl-29170070

ABSTRACT

Neural oscillations can synchronize to external rhythmic stimuli, as for example in speech and music. While previous studies have mainly focused on elucidating the fundamental concept of neural entrainment, less is known about the time course of entrainment. In this human electroencephalography (EEG) study, we unravel the temporal evolution of neural entrainment by contrasting short and long periods of rhythmic stimulation. Listeners had to detect short silent gaps that were systematically distributed with respect to the phase of a 3 Hz frequency-modulated tone. We found that gap detection performance was modulated by the stimulus stream with a consistent stimulus phase across participants for short and long stimulation. Electrophysiological analysis confirmed neural entrainment effects at 3 Hz and the 6 Hz harmonic for both short and long stimulation lengths. 3 Hz source level analysis revealed that longer stimulation resulted in a phase shift of a participant's neural phase relative to the stimulus phase. Phase coupling increased over the first second of stimulation, but no effects for phase coupling strength were observed over time. The dynamic evolution of phase alignment suggests that the brain attunes to external rhythmic stimulation by adapting the brain's internal representation of incoming environmental stimuli.
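The two quantities tracked over time in this study, phase-coupling strength and the neural-stimulus phase lag, can be sketched with a phase-locking analysis on simulated trials. The entrainment ramp over the first second and the 0.5 rad steady lag below are toy assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 250
t = np.arange(0, 4, 1 / fs)                # 4 s of 3 Hz stimulation
n_trials = 50

stim_phase = 2 * np.pi * 3 * t
lock = np.clip(t, 0.0, 1.0)                # coupling ramps up over the first second

rel = []                                   # per-trial relative phase, as unit phasors
for _ in range(n_trials):
    jitter = (1.0 - lock) * rng.normal(0.0, np.pi, t.size)
    neural_phase = stim_phase + 0.5 + jitter   # 0.5 rad steady lag (toy assumption)
    rel.append(np.exp(1j * (neural_phase - stim_phase)))

mean_phasor = np.mean(rel, axis=0)
plv = np.abs(mean_phasor)                  # phase-locking value per sample
lag = np.angle(mean_phasor)                # neural-stimulus phase lag per sample
```

The phase-locking value rises as the jitter shrinks, mimicking coupling that builds over the first second, while the angle of the mean phasor recovers the steady lag, analogous to the phase shift the study observed with longer stimulation.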


Subject(s)
Auditory Cortex/physiology , Auditory Perception/physiology , Brain Waves/physiology , Electroencephalography Phase Synchronization/physiology , Acoustic Stimulation , Adult , Female , Humans , Male , Time Factors , Young Adult
12.
Brain Res ; 1626: 198-210, 2015 Nov 11.
Article in English | MEDLINE | ID: mdl-25934332

ABSTRACT

The dynamic attending theory, as originally proposed by Jones (1976, Psychol. Rev. 83(5), 323-355), posits that tone sequences presented at a regular rhythm entrain attentional oscillations and thereby facilitate the processing of sounds presented in phase with this rhythm. The increased interest in neural correlates of dynamic attending requires robust behavioral indicators of the phenomenon. Here we aimed to replicate and complement the most prominent experimental implementation of dynamic attending (Jones et al., 2002, Psychol. Sci. 13(4), 313-319). The paradigm uses a pitch comparison task in which two tones, the initial and the last of a longer series, have to be compared. In-between the two, distractor tones with variable pitch are presented, at a regular pace. A comparison tone presented in phase with the entrained rhythm is hypothesized to lead to better behavioral performance. Aiming for a conceptual replication, four different variations of the original paradigm were created which were followed by an exact replication attempt. Across all five experiments, only 40 of the 140 tested participants showed the hypothesized pattern of an inverted U-shaped profile in task accuracy, and the group average effects did not replicate the pattern reported by Jones et al. (2002) in any of the five experiments. However, clear evidence for a relationship between musicality and overall behavioral performance was found. This study casts doubt on the suitability of the pitch comparison task for demonstrating auditory dynamic attending. We discuss alternative tasks that have been shown to support dynamic attending theory, thus lending themselves more readily to studying its neural correlates. This article is part of a Special Issue entitled SI: Prediction and Attention.
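The hypothesized inverted U-shaped accuracy profile can be tested with a simple quadratic fit over comparison-tone positions. The accuracy values below are invented purely to illustrate the shape of the predicted pattern; they are not data from any of the five experiments:

```python
import numpy as np

# Comparison-tone onsets from very early to very late, relative to the
# entrained rhythm (0 = in phase); accuracies invented for illustration
positions = np.array([-2, -1, 0, 1, 2])
accuracy = np.array([0.62, 0.70, 0.78, 0.71, 0.63])

# An inverted-U profile shows up as a negative quadratic coefficient,
# with peak accuracy at the in-phase position
quad, lin, const = np.polyfit(positions, accuracy, 2)
```

A per-participant fit of this form is one concrete way to operationalize "showed the hypothesized pattern", as counted for the 40 of 140 participants above.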


Subject(s)
Attention , Pitch Discrimination , Acoustic Stimulation , Adult , Female , Humans , Male , Models, Psychological , Reproducibility of Results , Research Design , Young Adult
13.
Psychophysiology ; 52(4): 600-4, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25353087

ABSTRACT

Every individual has a preferred musical tempo, which peaks slightly above 120 beats per minute and is subject to interindividual variation. The preferred tempo is believed to be associated with rhythmic body movements as well as motor cortex activity. However, a long-standing question is whether preferred tempo is determined biologically. To uncover the neural correlates of preferred tempo, we first determined an individual's preferred tempo using a multistep procedure. Subsequently, we correlated the preferred tempo with a general EEG timing parameter as well as perceptual and motor EEG correlates, namely individual alpha frequency, auditory evoked gamma band response, and motor beta activity. Results showed a significant relation between preferred tempo and the frequency of motor beta activity. These findings suggest that individual tempo preferences result from neural activity in the motor cortex, explaining the interindividual variation.
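The key analysis, relating each participant's preferred tempo to the frequency of their motor beta activity, is a correlation across individuals. The sketch below uses simulated data with a built-in linear relation; sample size, beta-peak distribution, and effect size are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 30   # hypothetical sample size

# Toy data, purely for illustration: motor beta peak frequency (Hz)
# and preferred tempo (bpm, centred slightly above 120)
beta_freq = rng.normal(20.0, 2.0, n)
tempo = 124.0 + 8.0 * (beta_freq - 20.0) + rng.normal(0.0, 5.0, n)

r = np.corrcoef(beta_freq, tempo)[0, 1]
```

In the actual study the candidate predictors (individual alpha frequency, gamma band response, beta frequency) would each be correlated with preferred tempo in this way, with only the beta measure reaching significance.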


Subject(s)
Auditory Cortex/physiology , Auditory Perception/physiology , Beta Rhythm/physiology , Individuality , Music , Acoustic Stimulation , Adult , Electroencephalography , Female , Humans , Male , Young Adult