Results 1 - 14 of 14
1.
Neuropsychologia ; 182: 108521, 2023 04 15.
Article in English | MEDLINE | ID: mdl-36870471

ABSTRACT

Congenital amusia is a neurodevelopmental disorder of musical processing. Previous research demonstrates that although explicit musical processing is impaired in congenital amusia, implicit musical processing can be intact. However, little is known about whether implicit knowledge could improve explicit musical processing in individuals with congenital amusia. To this end, we developed a training method utilizing redescription-associate learning, aimed at transferring implicit representations of perceptual states into explicit forms through verbal description and then establishing associations between the reported perceptual states and responses via feedback, to investigate whether the explicit processing of melodic structure could be improved in individuals with congenital amusia. Sixteen amusics and 11 controls rated the degree of expectedness of melodies during EEG recording before and after training. In the interim, half of the amusics received nine training sessions on melodic structure, while the other half received no training. Results, based on effect size estimation, showed that at pretest, amusics but not controls failed to explicitly distinguish the regular from the irregular melodies and to exhibit an ERAN in response to the irregular endings. At posttest, trained but not untrained amusics performed as well as controls at both the behavioral and neural levels. At the 3-month follow-up, the training effects were still maintained. These findings present novel electrophysiological evidence of neural plasticity in the amusic brain, suggesting that redescription-associate learning may be an effective method to remediate impaired explicit processes for individuals with other neurodevelopmental disorders who have intact implicit knowledge.


Subject(s)
Auditory Perceptual Disorders , Music , Humans , Acoustic Stimulation/methods , Learning , Pitch Perception/physiology
2.
Psychon Bull Rev ; 29(4): 1472-1479, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35318581

ABSTRACT

Musical expertise is known to affect speech perception at units below the clause/sentence level. This study investigated whether the musician's advantage extends to a higher and more central level of speech processing (i.e., clause segmentation). Two groups of participants (musician vs. nonmusician) were presented with sentences that contained an internal clause boundary. The acoustic correlates of the boundary were manipulated in six conditions: all-cue, pause-only, final-lengthening-only, pitch-reset-only, pause-and-final-lengthening-in-combination, and no-cue conditions. Participants were asked to judge whether the sentence they heard had an internal boundary. Results showed that the musicians detected more boundaries than the nonmusicians in the all-cue and the pause-only conditions, but fewer boundaries in the no-cue condition. Further analyses of cue weight showed that both musicians and nonmusicians placed more importance on pause than on the other two cues, but this weighting bias was more pronounced for the musicians. These results suggest that music training is associated with increased perceptual acuity not only to the acoustic markings of speech boundaries but also to the weighting of the cues. Our findings extend the role of musical expertise to sentence-level speech processing.


Subject(s)
Music , Speech Perception , Acoustic Stimulation , Humans , Language , Pitch Perception , Speech
3.
Cogn Emot ; 36(2): 211-229, 2022 03.
Article in English | MEDLINE | ID: mdl-34702138

ABSTRACT

People tend to choose smaller, immediate rewards over larger, delayed rewards. This phenomenon is thought to be associated with emotional engagement. However, few studies have demonstrated the real-time impact of incidental emotions on intertemporal choices. This research investigated the effects of music-induced incidental emotions on intertemporal choices by playing happy or sad music while participants made their choices. We found that music-induced happiness made participants prefer smaller-but-sooner rewards (SS), whereas music-induced sadness made participants prefer larger-but-later rewards (LL). Time perception partially mediated this effect: the greater the perceived temporal difference, the more likely participants were to prefer SS. Tempo and mode were then manipulated to disentangle the effects of arousal and mood on intertemporal choices. Only tempo-induced arousal, but not mode-induced mood, affected intertemporal choices. These results suggest a role of arousal in intertemporal decision making and provide evidence in support of equate-to-differentiate theory with regard to the non-compensatory mechanism in intertemporal choices.


Subject(s)
Delay Discounting , Music , Arousal , Decision Making , Emotions , Humans , Reward
4.
Autism Res ; 15(2): 222-240, 2022 02.
Article in English | MEDLINE | ID: mdl-34792299

ABSTRACT

Whether autism spectrum disorder (ASD) is associated with a global processing deficit remains controversial. Global integration requires extraction of regularity across various timescales, yet little is known about how individuals with ASD process regularity at local (short timescale) versus global (long timescale) levels. To this end, we used event-related potentials to investigate whether individuals with ASD would show different neural responses to local (within trial) versus global (across trials) emotion regularities extracted from sequential facial expressions; and if so, whether this visual abnormality would generalize to the music (auditory) domain. Twenty individuals with ASD and 21 age- and IQ-matched individuals with typical development participated in this study. At an early processing stage, ASD participants exhibited preserved neural responses to violations of local emotion regularity for both faces and music. At a later stage, however, there was an absence of neural responses in ASD to violations of global emotion regularity for both faces and music. These findings suggest that the autistic brain responses to emotion regularity are modulated by the timescale of sequential stimuli, and provide insight into the neural mechanisms underlying emotional processing in ASD.


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Music , Autism Spectrum Disorder/psychology , Brain , Emotions/physiology , Facial Expression , Humans
5.
Iperception ; 11(6): 2041669520971655, 2020.
Article in English | MEDLINE | ID: mdl-33282171

ABSTRACT

Absolute pitch (AP) is a superior ability to identify or produce musical tones without a reference tone. Although a few studies have investigated the relationship between AP and high-level music processing such as tonality and syntactic processing, very little is known about whether AP is related to musical tension processing. To address this issue, 20 AP possessors and 20 matched non-AP possessors listened to major and minor melodies and rated the levels of perceived and felt musical tension using a continuous response digital interface dial. Results indicated that the major melodies were perceived and felt as less tense than the minor ones by both AP and non-AP possessors. However, there was weak evidence for no differences between AP and non-AP possessors in the perception and experience of musical tension, suggesting that AP may be independent of the processing of musical tension. The implications of these findings are discussed.

6.
Psychophysiology ; 57(9): e13598, 2020 09.
Article in English | MEDLINE | ID: mdl-32449180

ABSTRACT

The processing of temporal structure has been widely investigated, but evidence on how the brain processes temporal and nontemporal structures simultaneously is sparse. Using event-related potentials (ERPs), we examined how the brain responds to temporal (metric) and nontemporal (harmonic) structures in music simultaneously, and whether these processes are impacted by musical expertise. Fifteen musicians and 15 nonmusicians rated the degree of completeness of musical sequences with or without violations in metric or harmonic structures. In the single violation conditions, the ERP results showed that both musicians and nonmusicians exhibited an early right anterior negativity (ERAN) as well as an N5 to temporal violations ("when"), and only an N5-like response to nontemporal violations ("what"), which were consistent with the behavioral results. In the double violation condition, however, only the ERP results, but not the behavioral results, revealed a significant interaction between temporal and nontemporal violations at a later integrative stage, as manifested by an enlarged N5 effect compared to the single violation conditions. These findings provide the first evidence that the human brain uses different neural mechanisms in processing metric and harmonic structures in music, which may shed light on how the brain generates predictions for "what" and "when" events in the natural environment.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Evoked Potentials/physiology , Music , Adult , Female , Humans , Male , Time Perception , Young Adult
7.
Brain Cogn ; 135: 103577, 2019 10.
Article in English | MEDLINE | ID: mdl-31202155

ABSTRACT

This study investigated whether individuals with congenital amusia, a neurogenetic disorder of musical pitch perception, were able to process musical emotions in single chords either automatically or consciously. In Experiments 1 and 2, we used a cross-modal affective priming paradigm to elicit automatic emotional processing through ERPs, in which target facial expressions were preceded by either affectively congruent or incongruent chords with a stimulus onset asynchrony (SOA) of 200 msec. Results revealed automatic emotional processing of major/minor triads (Experiment 1) and consonant/dissonant chords (Experiment 2) in controls, who showed longer reaction times and an increased N400 for incongruent than congruent trials, while amusics failed to exhibit such a priming effect at both behavioral and electrophysiological levels. In Experiment 3, we further examined conscious emotional evaluation of the same chords in amusia. Results showed that amusics were unable to consciously differentiate the emotions conveyed by major and minor chords and by consonant and dissonant chords, as compared with controls. These findings suggest impairments in both automatic and conscious emotional processing of music in amusia. The implications of these findings in relation to musical emotional processing are discussed.


Subject(s)
Auditory Perceptual Disorders/physiopathology , Emotions/physiology , Evoked Potentials, Auditory/physiology , Music/psychology , Acoustic Stimulation , Adult , Auditory Perceptual Disorders/psychology , Electroencephalography , Female , Humans , Male , Pitch Perception/physiology , Reaction Time/physiology , Young Adult
8.
Psychophysiology ; 56(9): e13394, 2019 09.
Article in English | MEDLINE | ID: mdl-31111968

ABSTRACT

In music, harmonic syntactic structures are organized hierarchically through local and long-distance dependencies. This study investigated whether congenital amusia, a neurodevelopmental disorder of pitch perception, is associated with impaired processing of harmonic syntactic structures. For stimuli, we used harmonic sequences containing two phrases, where the first phrase ended with a half cadence and the second with an authentic cadence. In Experiment 1, we manipulated the ending chord of the authentic cadence to be either syntactically regular or irregular based on local dependencies. Sixteen amusics and 16 controls judged the expectedness of these chords while their EEG waveforms were recorded. In comparison to the regular endings, irregular endings elicited an ERAN, an N5, and a late positive component in controls but not in amusics, indicating that amusics were impaired in processing local syntactic dependencies. In Experiment 2, we manipulated the half cadence of the harmonic sequences to either adhere to or violate long-distance syntactic dependencies. In response to irregular harmonic sequences, an ERAN-like component and an N5 were elicited in controls but not in amusics, suggesting that amusics were impaired in processing long-distance syntactic dependencies. Furthermore, for controls, the neural processing of local and long-distance syntactic dependencies was correlated at the later integration stage but not at the early detection stage. These findings indicate that amusia is associated with impairment in the detection and integration of local and long-distance syntactic violations. The implications of these findings in terms of hierarchical music-syntactic processing are discussed.


Subject(s)
Auditory Perception/physiology , Auditory Perceptual Disorders/physiopathology , Evoked Potentials, Auditory/physiology , Music , Adult , Electroencephalography , Female , Humans , Male , Young Adult
9.
Behav Brain Res ; 359: 362-369, 2019 02 01.
Article in English | MEDLINE | ID: mdl-30458161

ABSTRACT

Music can convey meanings by imitating phenomena of the extramusical world, and these imitation-induced musical meanings can be understood by listeners. Although the human mirror system (HMS) is implicated in imitation, little is known about the HMS's role in making sense of meaning that derives from musical imitation. To address this issue, we used fMRI to examine listeners' brain activities during the processing of imitation-induced musical meaning with a cross-modal semantic priming paradigm. Eleven normal individuals and 11 individuals with congenital amusia, a neurodevelopmental disorder of musical processing, participated in the experiment. Target pictures with either an upward or downward movement were primed by semantically congruent or incongruent melodic sequences characterized by the direction of pitch change (upward or downward). When contrasting the incongruent with the congruent condition between the two groups, we found greater activations in the left supramarginal gyrus/inferior parietal lobule and inferior frontal gyrus in normals but not in amusics. The implications of these findings in terms of the role of the HMS in understanding imitation-induced musical meaning are discussed.


Subject(s)
Auditory Perception/physiology , Auditory Perceptual Disorders/physiopathology , Brain/physiopathology , Imitative Behavior/physiology , Music , Association , Auditory Perceptual Disorders/diagnostic imaging , Brain/diagnostic imaging , Brain/physiology , Brain Mapping , Female , Humans , Judgment/physiology , Magnetic Resonance Imaging , Male , Repetition Priming/physiology , Semantics , Visual Perception/physiology , Young Adult
10.
Psychophysiology ; 55(2)2018 02.
Article in English | MEDLINE | ID: mdl-28833189

ABSTRACT

Syntactic processing is essential for musical understanding. Although the processing of harmonic syntax has been well studied, very little is known about the neural mechanisms underlying rhythmic syntactic processing. The present study investigated the neural processing of rhythmic syntax and whether, and to what extent, long-term musical training impacts such processing. Fourteen musicians and 14 nonmusicians listened to syntactic-regular or syntactic-irregular rhythmic sequences and judged the completeness of these sequences. Nonmusicians, as well as musicians, showed a P600 effect to syntactic-irregular endings, indicating that musical exposure and perceptual learning of music are sufficient to enable nonmusicians to process rhythmic syntax at the late stage. However, musicians, but not nonmusicians, also exhibited an early right anterior negativity (ERAN) response to syntactic-irregular endings, which suggests that musical training modulates only the early, but not the late, stage of rhythmic syntactic processing. These findings reveal for the first time the neural mechanisms underlying the processing of rhythmic syntax in music, which has important implications for theories of hierarchically organized music cognition and comparative studies of syntactic processing in music and language.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Cognition/physiology , Evoked Potentials/physiology , Music , Professional Competence , Adult , Brain Mapping , Electroencephalography , Female , Humans , Male , Young Adult
11.
Neuropsychologia ; 96: 29-38, 2017 02.
Article in English | MEDLINE | ID: mdl-28039057

ABSTRACT

Music is a unique communication system for human beings. Iconic musical meaning is one dimension of musical meaning, which emerges from musical information resembling sounds of objects, qualities of objects, or qualities of abstract concepts. The present study investigated whether congenital amusia, a disorder of musical pitch perception, impacts the processing of iconic musical meaning. With a cross-modal semantic priming paradigm, target images were primed by semantically congruent or incongruent musical excerpts, which were characterized by direction (upward or downward) of pitch change (Experiment 1), or were selected from natural music (Experiment 2). Twelve Mandarin-speaking amusics and 12 controls performed a recognition (implicit) and a semantic congruency judgment (explicit) task while their EEG waveforms were recorded. Unlike controls, amusics failed to elicit an N400 effect when musical meaning was represented by direction of pitch change, regardless of the nature of the tasks (implicit versus explicit). However, the N400 effect in response to musical meaning in natural musical excerpts was observed for both the groups in both types of tasks. These results indicate that amusics are able to process iconic musical meaning through multiple acoustic cues in natural musical excerpts, but not through the direction of pitch change. This is the first study to investigate the processing of musical meaning in congenital amusia, providing evidence in support of the "melodic contour deafness hypothesis" with regard to iconic musical meaning processing in this disorder.


Subject(s)
Auditory Perceptual Disorders/pathology , Brain Mapping , Evoked Potentials/physiology , Pitch Perception/physiology , Sound Localization/physiology , Acoustic Stimulation , Adult , Analysis of Variance , Auditory Perceptual Disorders/diagnostic imaging , Cues , Electroencephalography , Female , Humans , Male , Music , Photic Stimulation , Recognition, Psychology/physiology , Statistics as Topic , Young Adult
12.
Neuropsychologia ; 91: 490-498, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27659874

ABSTRACT

Hierarchical structure with units of different timescales is a key feature of music. For the perception of such structures, the detection of each boundary is crucial. Here, using electroencephalography (EEG), we explored the perception of hierarchical boundaries in music and tested whether musical expertise modifies such processing. Musicians and non-musicians were presented with musical excerpts containing boundaries at three hierarchical levels: section, phrase, and period boundaries. A non-boundary condition served as the baseline. Recordings from musicians showed that a CPS (closure positive shift) was evoked at all three boundaries, and its amplitude increased with the hierarchical level, suggesting that musicians can represent musical events at different timescales in a hierarchical way. For non-musicians, the CPS was elicited only at the period boundary, and indistinguishable negativities were induced at all three boundaries, indicating that non-musicians perceived the boundaries in a different and less differentiated way. Our findings reveal, for the first time, an ERP correlate of perceiving hierarchical boundaries in music, and show that phrasing ability can be enhanced by musical expertise.


Subject(s)
Brain/physiology , Evoked Potentials, Auditory/physiology , Music , Pitch Perception/physiology , Professional Competence , Acoustic Stimulation , Adult , Analysis of Variance , Brain Mapping , Electroencephalography , Female , Humans , Male , Psychoacoustics , Reaction Time/physiology , Young Adult
13.
Neuropsychologia ; 77: 128-36, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26254996

ABSTRACT

This event-related potential (ERP) study investigated whether music can convey the concept of movement. Using a semantic priming paradigm, natural musical excerpts were presented to non-musicians, followed by semantically congruent or incongruent pictures that depicted objects either in motion or at rest. The priming effects were tested in object decision and implicit recognition tasks to distinguish the effects of automatic conceptual activation from response competition. Results showed that in both tasks, pictures that were incongruent with the preceding musical excerpts elicited a larger N400 than congruent pictures, suggesting that music can prime the representations of movement concepts. Results of the multiple regression analysis showed that movement expression could be well predicted by specific acoustic and musical features, indicating associations between music per se and the processing of iconic musical meaning.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Motion Perception/physiology , Music/psychology , Acoustic Stimulation/methods , Electroencephalography , Emotions/physiology , Evoked Potentials , Female , Humans , Male , Neuropsychological Tests , Photic Stimulation/methods , Recognition, Psychology/physiology , Regression Analysis , Signal Processing, Computer-Assisted , Young Adult
14.
Psychophysiology ; 51(6): 520-8, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24611598

ABSTRACT

The processing of extramusical meaning can be reflected in N400 effects of the ERP. However, how conceptual representations can be activated in music still needs to be specified. We investigated the activation of iconic meaningful representations in music by using a cross-modal semantic priming paradigm with an implicit task. Pictures of spatial scenes were semantically congruent or incongruent to preceding music in three stimulus onset asynchrony (SOA) conditions. The results revealed that the semantically incongruent target pictures elicited larger N400 amplitude than the congruent target pictures. Moreover, the semantic priming effect was modulated by the SOAs. The N400 effect was observed in the 200-ms and 800-ms SOA conditions, but not in the 1,200-ms SOA condition. These results suggest that extramusical meaning purely due to iconic sign quality can be activated, and that the conceptual activation in music can be rapid and automatic.


Subject(s)
Evoked Potentials/physiology , Music/psychology , Space Perception/physiology , Visual Perception/physiology , Electroencephalography , Female , Humans , Male , Photic Stimulation , Young Adult