Results 1 - 20 of 22
1.
Dev Sci ; 27(2): e13436, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37551932

ABSTRACT

The environment in which infants learn language is multimodal and rich with social cues. Yet, the effects of cues such as eye contact on early speech perception have not been closely examined. This study assessed the role of ostensive speech, signalled through the speaker's eye gaze direction, on infants' word segmentation abilities. A familiarisation-then-test paradigm was used while electroencephalography (EEG) was recorded. Ten-month-old Dutch-learning infants were familiarised with audio-visual stories in which a speaker recited four sentences with one repeated target word. The speaker addressed them either with direct or with averted gaze while speaking. In the test phase following each story, infants heard familiar and novel words presented via audio-only. Infants' familiarity with the words was assessed using event-related potentials (ERPs). As predicted, infants showed a negative-going ERP familiarity effect to the isolated familiarised words relative to the novel words over the left-frontal region of interest during the test phase. While the word familiarity effect did not differ as a function of the speaker's gaze over the left-frontal region of interest, there was also a (not predicted) positive-going early ERP familiarity effect over right fronto-central and central electrodes in the direct gaze condition only. This study provides electrophysiological evidence that infants can segment words from audio-visual speech, regardless of the ostensiveness of the speaker's communication. However, the speaker's gaze direction seems to influence the processing of familiar words.

RESEARCH HIGHLIGHTS:
- We examined 10-month-old infants' ERP word familiarity response using audio-visual stories, in which a speaker addressed infants with direct or averted gaze while speaking.
- Ten-month-old infants can segment and recognise familiar words from audio-visual speech, indicated by their negative-going ERP response to familiar, relative to novel, words.
- This negative-going ERP word familiarity effect was present for isolated words over left-frontal electrodes regardless of whether the speaker offered eye contact while speaking.
- An additional positivity in response to familiar words was observed for direct gaze only, over right fronto-central and central electrodes.
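A familiarity effect of the kind described above is typically quantified as a mean ERP amplitude averaged over trials, the channels of a region of interest, and a time window, then contrasted between conditions. A minimal NumPy sketch (function names, array layout, and the illustrative time window are assumptions, not the authors' pipeline):

```python
import numpy as np

def roi_mean_amplitude(epochs, times, roi_idx, t_window):
    """Mean amplitude over trials, ROI channels, and a time window.

    epochs : array (n_trials, n_channels, n_samples)
    times  : array (n_samples,), seconds relative to word onset
    roi_idx: indices of the region-of-interest channels
    """
    tmask = (times >= t_window[0]) & (times <= t_window[1])
    return epochs[:, roi_idx][:, :, tmask].mean()

def familiarity_effect(familiar, novel, times, roi_idx, t_window):
    """Familiar-minus-novel difference; a negative value corresponds to a
    negative-going familiarity effect."""
    return (roi_mean_amplitude(familiar, times, roi_idx, t_window)
            - roi_mean_amplitude(novel, times, roi_idx, t_window))
```

The sign convention makes the reported effect directly readable: a left-frontal ROI with more negative amplitudes for familiarised words yields a negative value.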


Subject(s)
Speech Perception , Speech , Infant , Humans , Speech/physiology , Fixation, Ocular , Language , Evoked Potentials/physiology , Speech Perception/physiology
2.
Dev Cogn Neurosci ; 64: 101297, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37778275

ABSTRACT

Eye gaze is a powerful ostensive cue in infant-caregiver interactions, with demonstrable effects on language acquisition. While the link between gaze following and later vocabulary is well-established, the effects of eye gaze on other aspects of language, such as speech processing, are less clear. In this EEG study, we examined the effects of the speaker's eye gaze on ten-month-old infants' neural tracking of naturalistic audiovisual speech, a marker for successful speech processing. Infants watched videos of a speaker telling stories, addressing the infant with direct or averted eye gaze. We assessed infants' speech-brain coherence at stress (1-1.75 Hz) and syllable (2.5-3.5 Hz) rates, tested for differences in attention by comparing looking times and EEG theta power in the two conditions, and investigated whether neural tracking predicts later vocabulary. Our results showed that infants' brains tracked the speech rhythm both at the stress and syllable rates, and that infants' neural tracking at the syllable rate predicted later vocabulary. However, speech-brain coherence did not significantly differ between direct and averted gaze conditions and infants did not show greater attention to direct gaze. Overall, our results suggest significant neural tracking at ten months, related to vocabulary development, but not modulated by the speaker's gaze.
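Speech-brain coherence of the kind reported above is commonly estimated as the magnitude-squared coherence between the speech amplitude envelope and an EEG channel, averaged within a frequency band. A minimal SciPy sketch (the band edges come from the abstract; sampling rates, filter settings, and function names are assumptions, not the authors' pipeline):

```python
import numpy as np
from scipy.signal import butter, coherence, filtfilt, hilbert

def speech_envelope(audio, audio_fs, out_fs):
    """Slow amplitude envelope of the speech signal, resampled to the EEG rate."""
    env = np.abs(hilbert(audio))                         # broadband envelope
    b, a = butter(4, 10 / (audio_fs / 2), btype="low")   # keep fluctuations < 10 Hz
    env = filtfilt(b, a, env)
    idx = np.linspace(0, len(env) - 1,
                      int(len(env) * out_fs / audio_fs)).astype(int)
    return env[idx]

def band_coherence(envelope, eeg_channel, fs, fmin, fmax):
    """Mean magnitude-squared coherence within [fmin, fmax] Hz."""
    f, cxy = coherence(envelope, eeg_channel, fs=fs, nperseg=int(fs * 4))
    mask = (f >= fmin) & (f <= fmax)
    return cxy[mask].mean()
```

With the bands from the abstract, tracking at the stress rate would be `band_coherence(env, ch, fs, 1.0, 1.75)` and at the syllable rate `band_coherence(env, ch, fs, 2.5, 3.5)`.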


Subject(s)
Fixation, Ocular , Speech Perception , Infant , Humans , Speech , Language Development , Language , Brain
3.
Neuroimage ; 252: 119049, 2022 05 15.
Article in English | MEDLINE | ID: mdl-35248707

ABSTRACT

Music is often described in the laboratory and in the classroom as a beneficial tool for memory encoding and retention, with a particularly strong effect when words are sung to familiar compared to unfamiliar melodies. However, the neural mechanisms underlying this memory benefit, especially as it relates to familiar music, are not well understood. The current study examined whether neural tracking of the slow syllable rhythms of speech and song is modulated by melody familiarity. Participants became familiar with twelve novel melodies over four days prior to MEG testing. Neural tracking of the same utterances spoken and sung revealed greater cerebro-acoustic phase coherence for sung compared to spoken utterances, but did not show an effect of familiar melody when stimuli were grouped by their assigned (trained) familiarity. However, when participants' subjective ratings of perceived familiarity were used to group stimuli, a large effect of familiarity was observed. This effect was not specific to song, as it was observed in both sung and spoken utterances. Exploratory analyses revealed some in-session learning of unfamiliar and spoken utterances, with increased neural tracking for untrained stimuli by the end of the MEG testing session. Our results indicate that top-down factors like familiarity are strong modulators of neural tracking for music and language. Participants' neural tracking was related to their perception of familiarity, which was likely driven by a combination of effects from repeated listening, stimulus-specific melodic simplicity, and individual differences. Beyond simply the acoustic features of music, top-down factors built into the music listening experience, like repetition and familiarity, play a large role in the way we attend to and encode information presented in a musical context.


Subject(s)
Music , Singing , Auditory Perception , Humans , Recognition, Psychology , Speech
4.
Neurobiol Lang (Camb) ; 3(3): 495-514, 2022.
Article in English | MEDLINE | ID: mdl-37216063

ABSTRACT

During speech processing, neural activity in non-autistic adults and infants tracks the speech envelope. Recent research in adults indicates that this neural tracking relates to linguistic knowledge and may be reduced in autism. Such reduced tracking, if present already in infancy, could impede language development. In the current study, we focused on children with a family history of autism, who often show a delay in first language acquisition. We investigated whether differences in tracking of sung nursery rhymes during infancy relate to language development and autism symptoms in childhood. We assessed speech-brain coherence at either 10 or 14 months of age in a total of 22 infants with high likelihood of autism due to family history and 19 infants without family history of autism. We analyzed the relationship between speech-brain coherence in these infants and their vocabulary at 24 months as well as autism symptoms at 36 months. Our results showed significant speech-brain coherence in the 10- and 14-month-old infants. We found no evidence for a relationship between speech-brain coherence and later autism symptoms. Importantly, speech-brain coherence at the stressed syllable rate (1-3 Hz) predicted later vocabulary. Follow-up analyses showed evidence for a relationship between tracking and vocabulary only in 10-month-olds but not in 14-month-olds and indicated possible differences between the likelihood groups. Thus, early tracking of sung nursery rhymes is related to language development in childhood.

5.
Front Psychol ; 12: 680882, 2021.
Article in English | MEDLINE | ID: mdl-34552527

ABSTRACT

Rhyme perception is an important predictor of future literacy. Assessing rhyme abilities, however, commonly requires children to make explicit rhyme judgements on single words. Here we explored whether infants already implicitly process rhymes in natural rhyming contexts (child songs) and whether this response correlates with later vocabulary size. In a passive listening ERP study, 10.5-month-old Dutch infants were exposed to rhyming and non-rhyming child songs. Two types of rhyme effects were analysed: (1) ERPs elicited by the first rhyme occurring in each song (rhyme sensitivity) and (2) ERPs elicited by rhymes repeating after the first rhyme in each song (rhyme repetition). Only for the latter was a tentative negativity found for rhymes from 0 to 200 ms after the onset of the rhyme word. This rhyme repetition effect correlated with productive vocabulary at 18 months, but not with any other vocabulary measure (perception at 10.5 or 18 months). While awaiting future replication, the study indicates precursors of phonological awareness already during infancy, observed with ecologically valid linguistic stimuli.

6.
Front Hum Neurosci ; 15: 629648, 2021.
Article in English | MEDLINE | ID: mdl-34163338

ABSTRACT

The nature of phonological representations has been extensively studied in phonology and psycholinguistics. While full specification is still the norm in psycholinguistic research, underspecified representations may better account for perceptual asymmetries. In this paper, we report on a mismatch negativity (MMN) study with Dutch listeners who took part in a passive oddball paradigm to investigate when the brain notices the difference between expected and observed vowels. In particular, we tested neural discrimination (indicating perceptual discrimination) of the tense mid vowel pairs /o/-/ø/ (place contrast), /e/-/ø/ (labiality or rounding contrast), and /e/-/o/ (place and labiality contrast). Our results show (a) a perceptual asymmetry for place in the /o/-/ø/ contrast, supporting underspecification of [CORONAL] and replicating earlier results for German, and (b) a perceptual asymmetry for labiality for the /e/-/ø/ contrast, which was not reported in the German study. A labial deviant [ø] (standard /e/) yielded a larger MMN than a deviant [e] (standard /ø/). No asymmetry was found for the two-feature contrast. This study partly replicates a similar MMN study on German vowels, and partly presents new findings indicating cross-linguistic differences. Although the vowel inventories of Dutch and German are to a large extent comparable, their (morpho)phonological systems are different, which is reflected in processing.

7.
Infancy ; 25(5): 699-718, 2020 09.
Article in English | MEDLINE | ID: mdl-32794372

ABSTRACT

Infants exploit acoustic boundaries to perceptually organize phrases in speech. This prosodic parsing ability is well-attested and is a cornerstone to the development of speech perception and grammar. However, infants also receive linguistic input in child songs. This study provides evidence that infants parse songs into meaningful phrasal units and replicates previous research for speech. Six-month-old Dutch infants (n = 80) were tested in the song or speech modality in the head-turn preference procedure. First, infants were familiarized with two versions of the same word sequence: One version represented a well-formed unit, and the other contained a phrase boundary halfway through. At test, infants were presented with two passages, each containing one version of the familiarized sequence. The results for speech replicated the previously observed preference for the passage containing the well-formed sequence, but only in a more fine-grained analysis. The preference for well-formed phrases was also observed in the song modality, indicating that infants recognize phrase structure in song. There were acoustic differences between stimuli of the current and previous studies, suggesting that infants are flexible in their processing of boundary cues while also providing a possible explanation for differences in effect sizes.


Subject(s)
Child Development/physiology , Choice Behavior/physiology , Infant Behavior/physiology , Recognition, Psychology/physiology , Singing , Speech Perception/physiology , Female , Humans , Infant , Male
8.
Brain Sci ; 10(1)2020 Jan 09.
Article in English | MEDLINE | ID: mdl-31936586

ABSTRACT

Children's songs are omnipresent and highly attractive stimuli in infants' input. Previous work suggests that infants process linguistic-phonetic information from simplified sung melodies. The present study investigated whether infants learn words from ecologically valid children's songs. Testing 40 Dutch-learning 10-month-olds in a familiarization-then-test electroencephalography (EEG) paradigm, this study asked whether infants can segment repeated target words embedded in songs during familiarization and subsequently recognize those words in continuous speech in the test phase. To replicate previous speech work and compare segmentation across modalities, infants participated in both song and speech sessions. Results showed a positive event-related potential (ERP) familiarity effect to the final compared to the first target occurrences during both song and speech familiarization. No evidence was found for word recognition in the test phase following either song or speech. Comparisons across the stimuli of the present and a comparable previous study suggested that acoustic prominence and speech rate may have contributed to the polarity of the ERP familiarity effect and its absence in the test phase. Overall, the present study provides evidence that 10-month-old infants can segment words embedded in songs, and it raises questions about the acoustic and other factors that enable or hinder infant word segmentation from songs and speech.

9.
Front Psychol ; 11: 589096, 2020.
Article in English | MEDLINE | ID: mdl-33584424

ABSTRACT

Eye gaze is a ubiquitous cue in child-caregiver interactions, and infants are highly attentive to eye gaze from very early on. However, the question of why infants show gaze-sensitive behavior, and what role this sensitivity to gaze plays in their language development, is not yet well-understood. To gain a better understanding of the role of eye gaze in infants' language learning, we conducted a broad systematic review of the developmental literature for all studies that investigate the role of eye gaze in infants' language development. Across 77 peer-reviewed articles containing data from typically developing human infants (0-24 months) in the domain of language development, we identified two broad themes. The first tracked the effect of eye gaze on four developmental domains: (1) vocabulary development, (2) word-object mapping, (3) object processing, and (4) speech processing. Overall, there is considerable evidence that infants learn more about objects and are more likely to form word-object mappings in the presence of eye gaze cues, both of which are necessary for learning words. In addition, there is good evidence for longitudinal relationships between infants' gaze following abilities and later receptive and expressive vocabulary. However, many domains (e.g., speech processing) are understudied; further work is needed to decide whether gaze effects are specific to particular tasks, such as word-object mapping, or whether they reflect a general learning-enhancement mechanism. The second theme explored the reasons why eye gaze might be facilitative for learning, addressing the question of whether eye gaze is treated by infants as a specialized socio-cognitive cue. We concluded that the balance of evidence supports the idea that eye gaze facilitates infants' learning by enhancing their arousal, memory, and attentional capacities to a greater extent than other low-level attentional cues. However, as yet, there are too few studies that directly compare the effect of eye gaze cues and non-social, attentional cues for strong conclusions to be drawn. We also suggest that there might be a developmental effect, with eye gaze developing over the first 2 years of life into a truly ostensive cue that enhances language learning across the board.

10.
Infant Behav Dev ; 52: 130-139, 2018 08.
Article in English | MEDLINE | ID: mdl-30086413

ABSTRACT

Children's songs often contain rhyming words at phrase endings. In this study, we investigated whether infants can already recognize this phonological pattern in songs. Earlier studies using lists of spoken words were equivocal on infants' spontaneous processing of rhymes (Hayes et al., 2000; Jusczyk et al., 1999). Songs, however, constitute an ecologically valid rhyming stimulus, which could allow for spontaneous processing of this phonological pattern in infants. Novel children's songs with rhyming and non-rhyming lyrics using pseudo-words were presented to 35 9-month-old Dutch infants using the Headturn Preference Procedure. On average, infants listened longer to the non-rhyming songs, although around half of the infants exhibited a preference for the rhyming songs. These results highlight that infants have the processing abilities to benefit from their natural rhyming input for the development of their phonological abilities.


Subject(s)
Auditory Perception/physiology , Music , Phonetics , Female , Germany , Humans , Infant , Male
11.
Brain Lang ; 172: 16-21, 2017 09.
Article in English | MEDLINE | ID: mdl-27059522

ABSTRACT

The CNTNAP2 gene encodes a cell-adhesion molecule that influences the properties of neural networks and the morphology and density of neurons and glial cells. Previous studies have shown association of CNTNAP2 variants with language-related phenotypes in health and disease. Here, we report associations of a common CNTNAP2 polymorphism (rs7794745) with variation in grey matter in a region in the dorsal visual stream. We tried to replicate an earlier study on 314 subjects by Tan et al. (2010), but now in a substantially larger group of more than 1700 subjects. Carriers of the T allele showed reduced grey matter volume in left superior occipital gyrus, while we did not replicate associations with grey matter volume in other regions identified by Tan et al. (2010). Our work illustrates the importance of independent replication in neuroimaging genetic studies of language-related candidate genes.


Subject(s)
Gray Matter/pathology , Membrane Proteins/genetics , Nerve Tissue Proteins/genetics , Occipital Lobe/metabolism , Occipital Lobe/pathology , Polymorphism, Single Nucleotide/genetics , Alleles , Female , Genetic Association Studies , Humans , Language , Male , Neuroimaging
12.
Neuropsychologia ; 95: 21-29, 2017 01 27.
Article in English | MEDLINE | ID: mdl-27939189

ABSTRACT

In everyday communication speakers often refer in speech and/or gesture to objects in their immediate environment, thereby shifting their addressee's attention to an intended referent. The neurobiological infrastructure involved in the comprehension of such basic multimodal communicative acts remains unclear. In an event-related fMRI study, we presented participants with pictures of a speaker and two objects while they concurrently listened to her speech. In each picture, one of the objects was singled out, either through the speaker's index-finger pointing gesture or through a visual cue that made the object perceptually more salient in the absence of gesture. A mismatch (compared to a match) between speech and the object singled out by the speaker's pointing gesture led to enhanced activation in left IFG and bilateral pMTG, showing the importance of these areas in conceptual matching between speech and referent. Moreover, a match (compared to a mismatch) between speech and the object made salient through a visual cue led to enhanced activation in the mentalizing system, arguably reflecting an attempt to converge on a jointly attended referent in the absence of pointing. These findings shed new light on the neurobiological underpinnings of the core communicative process of comprehending a speaker's multimodal referential act and stress the power of pointing as an important natural device to link speech to objects.


Subject(s)
Brain/physiology , Gestures , Speech Perception/physiology , Visual Perception/physiology , Adolescent , Adult , Brain Mapping , Comprehension/physiology , Cues , Female , Humans , Magnetic Resonance Imaging , Male , Neuropsychological Tests , Young Adult
13.
Brain Lang ; 163: 22-31, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27639117

ABSTRACT

Native speakers of Dutch do not always adhere to prescriptive grammar rules in their daily speech. These grammatical norm violations can elicit emotional reactions in language purists, mostly high-educated people, who claim that for them these constructions are truly ungrammatical. However, linguists generally assume that grammatical norm violations are in fact truly grammatical, especially when they occur frequently in a language. In an fMRI study we investigated the processing of grammatical norm violations in the brains of language purists, and compared them with truly grammatical and truly ungrammatical sentences. Grammatical norm violations were found to be unique in that their processing resembled not only the processing of truly grammatical sentences (in left medial Superior Frontal Gyrus and Angular Gyrus), but also that of truly ungrammatical sentences (in Inferior Frontal Gyrus), despite what theories of grammar would usually lead us to believe.


Subject(s)
Brain/physiology , Linguistics , Speech Perception/physiology , Speech , Adult , Brain Mapping , Female , Frontal Lobe/physiology , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Netherlands
14.
Front Psychol ; 6: 667, 2015.
Article in English | MEDLINE | ID: mdl-26074838

ABSTRACT

Both categorization and segmentation processes play a crucial role in face perception. However, the functional relation between these subprocesses is currently unclear. The present study investigates the temporal relation between segmentation-related and category-selective responses in the brain, using electroencephalography (EEG). Surface segmentation and category content were both manipulated using texture-defined objects, including faces. This allowed us to study brain activity related to segmentation and to categorization. In the main experiment, participants viewed texture-defined objects for a duration of 800 ms. EEG results revealed that segmentation-related responses precede category-selective responses. Three additional experiments revealed that the presence and timing of categorization depends on stimulus properties and presentation duration. Photographic objects were presented for a long and short (92 ms) duration and evoked fast category-selective responses in both cases. On the other hand, presentation of texture-defined objects for a short duration only evoked segmentation-related but no category-selective responses. Category-selective responses were much slower when evoked by texture-defined than by photographic objects. We suggest that in case of categorization of objects under suboptimal conditions, such as when low-level stimulus properties are not sufficient for fast object categorization, segmentation facilitates the slower categorization process.

15.
J Autism Dev Disord ; 44(2): 443-51, 2014 Feb.
Article in English | MEDLINE | ID: mdl-23838729

ABSTRACT

Superiority in visual search for individuals diagnosed with autism spectrum disorder (ASD) is a well-reported finding. We administered two visual search tasks to individuals with ASD and matched controls. One showed no difference between the groups, and one did show the expected superior performance for individuals with ASD. We propose an explanation for this discrepancy in terms of load theory. We suggest that there is a limit to the superiority in visual search for individuals with ASD, related to the perceptual load of the stimuli. When perceptual load becomes so high that no additional task-(ir)relevant information can be processed, performance will be based on single stimulus identification, in which no differences between individuals with ASD and controls have been demonstrated.


Subject(s)
Child Development Disorders, Pervasive/physiopathology , Visual Perception , Adolescent , Case-Control Studies , Eye Movements , Female , Humans , Male , Reaction Time , Young Adult
16.
Neuroimage Clin ; 3: 65-72, 2013.
Article in English | MEDLINE | ID: mdl-24179850

ABSTRACT

Atypical visual perception in people with autism spectrum disorders (ASD) is hypothesized to stem from an imbalance in excitatory and inhibitory processes in the brain. We used neuronal oscillations in the gamma frequency range (30-90 Hz), which emerge from a balanced interaction of excitation and inhibition in the brain, to assess contextual modulation processes in early visual perception. Electroencephalography was recorded in 12 high-functioning adults with ASD and 12 age- and IQ-matched control participants. Oscillations in the gamma frequency range were analyzed in response to stimuli consisting of small line-like elements. Orientation-specific contextual modulation was manipulated by parametrically increasing the amount of homogeneously oriented elements in the stimuli. The stimuli elicited a strong steady-state gamma response around the refresh-rate of 60 Hz, which was larger for controls than for participants with ASD. The amount of orientation homogeneity (contextual modulation) influenced the gamma response in control subjects, while for subjects with ASD this was not the case. The atypical steady-state gamma response to contextual modulation in subjects with ASD may capture the link between an imbalance in excitatory and inhibitory neuronal processing and atypical visual processing in ASD.
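A steady-state response around a 60 Hz refresh rate, as described above, is typically quantified as spectral power in a narrow band centred on the driving frequency. A small illustrative sketch (band width, window choice, and function name are assumptions, not the authors' method):

```python
import numpy as np

def steady_state_power(eeg_channel, fs, f0, bw=1.0):
    """Spectral power in a narrow band (f0 +/- bw/2 Hz) around the driving
    frequency f0, from the windowed amplitude spectrum of one epoch."""
    n = len(eeg_channel)
    spec = np.abs(np.fft.rfft(eeg_channel * np.hanning(n))) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    band = (freqs >= f0 - bw / 2) & (freqs <= f0 + bw / 2)
    return spec[band].mean()
```

In practice the value at the driving frequency is compared against neighbouring bands (e.g. `steady_state_power(x, fs, 45)`) as a noise estimate, or contrasted across conditions and groups as in the study above.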

17.
Brain Connect ; 3(1): 41-9, 2013.
Article in English | MEDLINE | ID: mdl-23259692

ABSTRACT

Communication and integration of information between brain regions plays a key role in healthy brain function. Conversely, disruption in brain communication may lead to cognitive and behavioral problems. Autism is a neurodevelopmental disorder that is characterized by impaired social interactions and aberrant basic information processing. Aberrant brain connectivity patterns have indeed been hypothesized to be a key neural underpinning of autism. In this study, graph analytical tools are used to explore the possible deviant functional brain network organization in autism at a very early stage of brain development. Electroencephalography (EEG) recordings in 12 toddlers with autism (mean age 3.5 years) and 19 control subjects were used to assess interregional functional brain connectivity, with functional brain networks constructed at the level of temporal synchronization between brain regions underlying the EEG electrodes. Children with autism showed a significantly increased normalized path length and reduced normalized clustering, suggesting a reduced global communication capacity already during early brain development. In addition, whole brain connectivity was found to be significantly reduced in these young patients suggesting an overall under-connectivity of functional brain networks in autism. Our findings support the hypothesis of abnormal neural communication in autism, with deviating effects already present at the early stages of brain development.
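Normalized path length and clustering of the kind reported above are typically computed on a thresholded connectivity matrix and expressed as ratios against degree-preserving random surrogates. A sketch with networkx (the edge density, number of surrogates, and handling of disconnected surrogates are assumptions, not the authors' pipeline):

```python
import networkx as nx
import numpy as np

def _char_path_length(g):
    # Characteristic path length of the largest connected component,
    # in case a surrogate graph ends up disconnected.
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return nx.average_shortest_path_length(giant)

def normalized_graph_metrics(conn, density=0.15, n_rand=10, seed=0):
    """Clustering coefficient and characteristic path length of a thresholded
    connectivity matrix, each normalized by the mean over degree-preserving
    random surrogates (double-edge swaps)."""
    rng = np.random.default_rng(seed)
    n = conn.shape[0]
    i, j = np.triu_indices(n, k=1)
    w = conn[i, j]
    k = max(1, int(round(density * len(w))))   # number of edges to keep
    thresh = np.sort(w)[-k]                    # keep the strongest edges
    g = nx.Graph()
    g.add_nodes_from(range(n))
    g.add_edges_from((a, b) for a, b, v in zip(i, j, w) if v >= thresh)
    c, l = nx.average_clustering(g), _char_path_length(g)
    c_rand, l_rand = [], []
    for _ in range(n_rand):
        gr = g.copy()
        nx.double_edge_swap(gr, nswap=2 * gr.number_of_edges(),
                            max_tries=100 * gr.number_of_edges(),
                            seed=int(rng.integers(1 << 30)))
        c_rand.append(nx.average_clustering(gr))
        l_rand.append(_char_path_length(gr))
    return c / np.mean(c_rand), l / np.mean(l_rand)
```

On this convention, the finding above corresponds to an increased normalized path length (second return value) together with reduced normalized clustering (first return value) in the autism group.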


Subject(s)
Autistic Disorder/physiopathology , Brain/physiopathology , Neural Pathways/physiopathology , Child, Preschool , Electroencephalography , Female , Humans , Male , Nerve Net/physiopathology , Signal Processing, Computer-Assisted
18.
Proc Natl Acad Sci U S A ; 109(52): 21504-9, 2012 Dec 26.
Article in English | MEDLINE | ID: mdl-23236162

ABSTRACT

The human brain has the extraordinary capability to transform cluttered sensory input into distinct object representations. For example, it is able to rapidly and seemingly without effort detect object categories in complex natural scenes. Surprisingly, category tuning is not sufficient to achieve conscious recognition of objects. What neural process beyond category extraction might elevate neural representations to the level where objects are consciously perceived? Here we show that visible and invisible faces produce similar category-selective responses in the ventral visual cortex. The pattern of neural activity evoked by visible faces could be used to decode the presence of invisible faces and vice versa. However, only visible faces caused extensive response enhancements and changes in neural oscillatory synchronization, as well as increased functional connectivity between higher and lower visual areas. We conclude that conscious face perception is more tightly linked to neural processes of sustained information integration and binding than to processes accommodating face category tuning.


Subject(s)
Consciousness/physiology , Neurons/physiology , Visual Cortex/physiology , Visual Perception/physiology , Face , Female , Humans , Male
19.
PLoS One ; 7(10): e46995, 2012.
Article in English | MEDLINE | ID: mdl-23115634

ABSTRACT

The genetic FOXP2-CNTNAP2 pathway has been shown to be involved in the language capacity. We investigated whether a common variant of CNTNAP2 (rs7794745) is relevant for syntactic and semantic processing in the general population by using a visual sentence processing paradigm while recording ERPs in 49 healthy adults. While both AA homozygotes and T-carriers showed a standard N400 effect to semantic anomalies, the response to subject-verb agreement violations differed across genotype groups. T-carriers displayed an anterior negativity preceding the P600 effect, whereas for the AA group only a P600 effect was observed. These results provide another piece of evidence that the neuronal architecture of the human faculty of language is shaped differently by effects that are genetically determined.


Subject(s)
Evoked Potentials , Language , Membrane Proteins/physiology , Nerve Tissue Proteins/physiology , Adolescent , Adult , Electroencephalography , Humans , Membrane Proteins/genetics , Nerve Tissue Proteins/genetics , Reference Values , Young Adult
20.
Neuroimage ; 52(4): 1633-44, 2010 Oct 01.
Article in English | MEDLINE | ID: mdl-20493954

ABSTRACT

In a recent fMRI study we showed that left posterior middle temporal gyrus (LpMTG) subserves the retrieval of a word's lexical-syntactic properties from the mental lexicon (long-term memory), while left posterior inferior frontal gyrus (LpIFG) is involved in unifying (on-line integration of) this information into a sentence structure (Snijders et al., 2009). In addition, the right IFG, right MTG, and the right striatum were involved in the unification process. Here we report results from a psychophysiological interaction (PPI) analysis in which we investigated the effective connectivity between LpIFG and LpMTG during unification, and how the right hemisphere areas and the striatum are functionally connected to the unification network. LpIFG and LpMTG both showed enhanced connectivity during the unification process with a region slightly superior to our previously reported LpMTG. Right IFG better predicted right temporal activity when unification processes were more strongly engaged, just as LpIFG better predicted left temporal activity. Furthermore, the striatum showed enhanced coupling to LpIFG and LpMTG during unification. We conclude that bilateral inferior frontal and posterior temporal regions are functionally connected during sentence-level unification. Cortico-subcortical connectivity patterns suggest cooperation between inferior frontal and striatal regions in performing unification operations on lexical-syntactic representations retrieved from LpMTG.


Subject(s)
Brain/physiology , Comprehension/physiology , Language , Magnetic Resonance Imaging , Memory/physiology , Nerve Net/physiology , Neural Pathways/physiology , Semantics , Adolescent , Adult , Female , Humans , Male , Young Adult