Results 1 - 20 of 37
1.
Front Hum Neurosci ; 18: 1358380, 2024.
Article in English | MEDLINE | ID: mdl-38638804

ABSTRACT

Auditory processing of speech and non-speech stimuli often involves the analysis and acquisition of non-adjacent sound patterns. Previous studies using speech material have demonstrated (i) children's early emerging ability to extract non-adjacent dependencies (NADs) and (ii) a relation between basic auditory perception and this ability. Yet, it is currently unclear whether children show similar sensitivities and similar perceptual influences for NADs in the non-linguistic domain. We conducted an event-related potential study with 3-year-old children using a sine-tone-based oddball task, which simultaneously tested for NAD learning and auditory perception by means of varying sound intensity. Standard stimuli were AxB sine-tone sequences, in which specific A elements predicted specific B elements across variable x elements. NAD deviants violated the dependency between A and B, and intensity deviants were reduced in amplitude. Both elicited similar frontally distributed positivities, suggesting successful deviant detection. Crucially, there was a predictive relationship between the amplitude of the sound-intensity discrimination effect and the amplitude of the NAD learning effect. These results are taken as evidence that NAD learning in the non-linguistic domain is functional in 3-year-olds and that basic auditory processes are related to the learning of higher-order auditory regularities also outside the linguistic domain.
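The oddball design described in this abstract can be sketched in code. The following is an illustrative toy, not the study's actual stimulus script: the pair names, the set of middle elements, the deviant proportions, and the amplitude-reduction factor are all assumptions.

```python
import random

# Toy sketch of an AxB oddball stream (illustrative parameters only):
# specific A elements predict specific B elements across a variable
# middle element x; NAD deviants break the A->B pairing, and intensity
# deviants keep the pairing but reduce amplitude.
PAIRS = {"A1": "B1", "A2": "B2"}        # assumed dependency pairs
MIDDLE = ["x1", "x2", "x3"]             # assumed variable middle elements

def make_trial(kind, rng):
    """Return one (A, x, B) triplet plus a relative intensity (1.0 = standard)."""
    a = rng.choice(sorted(PAIRS))
    x = rng.choice(MIDDLE)
    if kind == "standard":
        return (a, x, PAIRS[a]), 1.0
    if kind == "nad_deviant":           # violate the A -> B dependency
        other = [k for k in PAIRS if k != a][0]
        return (a, x, PAIRS[other]), 1.0
    if kind == "intensity_deviant":     # intact pairing, reduced amplitude
        return (a, x, PAIRS[a]), 0.7    # reduction factor is an assumption
    raise ValueError(kind)

def make_stream(n_standard=80, n_nad=10, n_intensity=10, seed=0):
    rng = random.Random(seed)
    kinds = (["standard"] * n_standard + ["nad_deviant"] * n_nad
             + ["intensity_deviant"] * n_intensity)
    rng.shuffle(kinds)
    return [(kind, *make_trial(kind, rng)) for kind in kinds]

stream = make_stream()
```

An ERP analysis would then compare responses time-locked to the B element of standard triplets against each deviant type.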

2.
iScience ; 26(7): 106977, 2023 Jul 21.
Article in English | MEDLINE | ID: mdl-37332672

ABSTRACT

A critical component of language is the ability to recombine sounds into larger structures. Although animals also reuse sound elements across call combinations to generate meaning, examples are generally limited to pairs of distinct elements, even when repertoires contain sufficient sounds to generate hundreds of combinations. This combinatoriality might be constrained by the perceptual-cognitive demands of disambiguating between complex sound sequences that share elements. We test this hypothesis by probing the capacity of chestnut-crowned babblers to process combinations of two versus three distinct acoustic elements. We found that babblers responded more quickly and for longer to playbacks of recombined versus familiar bi-element sequences, but we found no evidence of differential responses to playbacks of recombined versus familiar tri-element sequences, suggesting a cognitively prohibitive jump in processing demands. We propose that overcoming constraints in the ability to process increasingly complex combinatorial signals was necessary for the productive combinatoriality that is characteristic of language to emerge.

3.
Dev Cogn Neurosci ; 57: 101149, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36084447

ABSTRACT

Language acquisition requires infants' ability to track dependencies between distant speech elements. Infants as young as 3 months have been shown to successfully identify such non-adjacent dependencies between syllables, and this ability has been related to the maturity of infants' pitch processing. The present study tested whether 8- to 10-month-old infants (N = 68) can also learn dependencies at smaller segmental levels and whether the relation between dependency and pitch processing extends to other auditory features. Infants heard syllable sequences encoding an item-specific dependency between either non-adjacent vowels or non-adjacent consonants. These frequent standard sequences were interspersed with infrequent intensity deviants and dependency deviants, which violated the non-adjacent relationship. Both vowel and consonant groups showed electrophysiological evidence for detection of the intensity manipulation. However, evidence for dependency learning was only found for infants hearing the dependencies across vowels, not consonants, and only in a subgroup of infants who had an above-average language score in a behavioral test. In a correlation analysis, we found no relation between intensity and dependency processing. We conclude that item-specific, segment-based non-adjacent dependencies are not easily learned by infants and that, where learning occurs, vowels are more accessible to the task than consonants, but only for infants who display advanced language skills.

4.
Cortex ; 152: 36-52, 2022 07.
Article in English | MEDLINE | ID: mdl-35504050

ABSTRACT

Despite humans' ability to communicate about concepts relating to different senses, word learning research tends to focus largely on labeling visual objects. Although sensory modality is known to influence memory and learning, its specific role in word learning remains largely unclear. We investigated associative word learning in adults, that is, the association of an object with its label, by means of event-related brain potentials (ERPs). We evaluated how learning is affected by object modality (auditory vs visual) and the temporal synchrony of object-label presentations (sequential vs simultaneous). Across 4 experiments, adults were, in training phases, presented with either visual objects (real-world images) or auditory objects (environmental sounds) in temporal synchrony with, or followed by, novel pseudowords (2 × 2 design). Objects and pseudowords were paired either in a consistent or an inconsistent manner. In subsequent testing phases, the consistent pairs were presented in matching or violated pairings. Here, behavioral and ERP responses should reveal whether consistent object-pseudoword pairs had been successfully associated with one another during training. The visual-object experiments yielded behavioral learning effects and an increased N400 amplitude for violated versus matched pairings, indicating short-term retention of object-word associations, in both the simultaneous and sequential presentation conditions. For the auditory-object experiments, only the simultaneous, but not the sequential, presentation revealed similar results. Across all experiments, we found behavioral and ERP correlates of associative word learning to be affected by sensory modality and, in part, by the temporal synchrony of object-label combinations. Based on our findings, we argue for independent advantages of temporal synchrony and the visual modality in associative word learning.


Subject(s)
Electroencephalography , Evoked Potentials , Adult , Conditioning, Classical , Electroencephalography/methods , Evoked Potentials/physiology , Female , Humans , Learning , Male , Verbal Learning/physiology
5.
J Cogn Neurosci ; 34(8): 1467-1487, 2022 07 01.
Article in English | MEDLINE | ID: mdl-35604359

ABSTRACT

Successful language processing entails tracking (morpho)syntactic relationships between distant units of speech, so-called nonadjacent dependencies (NADs). Many cues to such dependency relations have been identified, yet the linguistic elements encoding them have received little attention. In the present investigation, we tested whether and how these elements, here syllables, consonants, and vowels, affect behavioral learning success as well as learning-related changes in neural activity in relation to item-specific NAD learning. In a set of two EEG studies with adults, we compared learning under conditions where either all segment types (Experiment 1) or only one segment type (Experiment 2) was informative. The collected behavioral and ERP data indicate that, when all three segment types are available, participants mainly rely on the syllable for NAD learning. With only one segment type available for learning, adults also perform most successfully with syllable-based dependencies. Although we find no evidence for successful learning across vowels in Experiment 2, dependencies between consonants seem to be identified at least passively at the phonetic-feature level. Together, these results suggest that successful item-specific NAD learning may depend on the availability of syllabic information. Furthermore, they highlight consonants' distinctive power to support lexical processes. Although syllables show a clear facilitatory function for NAD learning, the underlying mechanisms of this advantage require further research.


Subject(s)
NAD , Speech Perception , Adult , Humans , Language , Phonetics , Speech
6.
J Psycholinguist Res ; 50(6): 1487-1509, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34424452

ABSTRACT

Existing work on the acquisition of polarity-sensitive expressions (PSIs) suggests that children show an early sensitivity to the restricted distribution of negative polarity items (NPIs), but may be delayed in the acquisition of positive polarity items (PPIs). However, past studies primarily targeted PSIs that are highly frequent in children's language input. In this paper, we report an experimental investigation of children's comprehension of two NPIs and two PPIs in German. Based on corpus data indicating that the four tested PSIs are present in child-directed speech but rare in young children's utterances, we conducted an auditory rating task with adults and 11- to 12-year-old children. The results demonstrate that, even at 11-12 years of age, children do not yet show a completely target-like comprehension of the investigated PSIs. While they are adult-like in their responses to one of the tested NPIs, their responses did not demonstrate a categorical distinction between licensed and unlicensed PSI uses for the other tested expressions. The effect was driven by a higher acceptance of sentences containing unlicensed PSIs, indicating a lack of awareness of their distributional restrictions. The results of our study pose new questions for the developmental time scale of the acquisition of polarity items.


Subject(s)
Comprehension , Speech Perception , Adult , Child , Child, Preschool , Humans , Language , Language Development , Speech
7.
Sci Adv ; 7(30)2021 Jul.
Article in English | MEDLINE | ID: mdl-34290100

ABSTRACT

Rawski et al. revisit our recent findings suggesting the latent ability to process nonadjacent dependencies ("Non-ADs") in monkeys and apes. Specifically, the authors question the relevance of our findings for the evolution of human syntax. We argue that (i) these conclusions hinge upon an assumption that language processing is necessarily hierarchical, which remains an open question, and (ii) our goal was to probe the foundational cognitive mechanisms facilitating the processing of syntactic Non-ADs-namely, the ability to recognize predictive relationships in the input.

8.
Dev Cogn Neurosci ; 50: 100975, 2021 08.
Article in English | MEDLINE | ID: mdl-34139635

ABSTRACT

In order to become proficient native speakers, children have to learn the morpho-syntactic relations between distant elements in a sentence, so-called non-adjacent dependencies (NADs). Previous research suggests that NAD learning in children comprises different developmental stages: up to 2 years of age, children are able to learn NADs associatively under passive listening conditions, whereas starting around the age of 3-4 years, children fail to learn NADs during passive listening. To test whether the transition between these developmental stages occurs gradually, we tested children's NAD learning in a foreign language using event-related potentials (ERPs). We found ERP evidence of NAD learning at the ages of 1, 2, and 3 years. The amplitude of the ERP effect indexing NAD learning, however, decreased with age. These findings might indicate a gradual transition in children's ability to learn NADs associatively. Cognitively, this transition might be driven by children's increasing knowledge of their native language, which hinders NAD learning in novel contexts. Neuroanatomically, maturation of the prefrontal cortex might play a crucial role, promoting top-down learning and thereby affecting bottom-up, associative learning. In sum, our study suggests that NAD learning under passive listening conditions undergoes a gradual transition between different developmental stages during early childhood.


Subject(s)
Language , Learning , Auditory Perception , Child, Preschool , Evoked Potentials , Female , Humans , Language Development , Male
9.
Sci Adv ; 6(43)2020 10.
Article in English | MEDLINE | ID: mdl-33087361

ABSTRACT

The ability to track syntactic relationships between words, particularly over distances ("nonadjacent dependencies"), is a critical faculty underpinning human language, although its evolutionary origins remain poorly understood. While some monkey species are reported to process auditory nonadjacent dependencies, comparative data from apes are missing, complicating inferences regarding shared ancestry. Here, we examined nonadjacent dependency processing in common marmosets, chimpanzees, and humans using "artificial grammars": strings of arbitrary acoustic stimuli composed of adjacent (nonhumans) or nonadjacent (all species) dependencies. Individuals from each species (i) generalized the grammars to novel stimuli and (ii) detected grammatical violations, indicating that they processed the dependencies between constituent elements. Furthermore, there was no difference between marmosets and chimpanzees in their sensitivity to nonadjacent dependencies. These notable similarities between monkeys, apes, and humans indicate that nonadjacent dependency processing, a crucial cognitive facilitator of language, is an ancestral trait that evolved at least ~40 million years before language itself.


Subject(s)
Hominidae , Animals , Biological Evolution , Haplorhini , Humans , Language , Linguistics , Pan troglodytes
10.
Front Psychol ; 11: 1597, 2020.
Article in English | MEDLINE | ID: mdl-32760327

ABSTRACT

Recent scholarship emphasizes the scaffolding role of language for cognition. Language, it is claimed, is a cognition-enhancing niche (Clark, 2006), a programming tool for cognition (Lupyan and Bergen, 2016), even a neuroenhancement (Dove, 2019), and it augments cognitive functions such as memory, categorization, cognitive control, and meta-cognitive abilities ("thinking about thinking"). Yet the notion that language enhances or augments cognition, and in particular cognitive control, does not easily fit in with embodied approaches to language processing, or so we will argue. Accounts aiming to explain how language enhances various cognitive functions often employ a notion of abstract representation. Yet embodied approaches to language processing have it that language processing crucially (according to some accounts, even exclusively) involves embodied, modality-specific, i.e., non-abstract representations. In coming to understand a particular phrase or sentence, a prior experience has to be simulated or reenacted. The representation thus activated is embodied (modality-specific), as sensorimotor regions of the brain are thereby recruited. In this paper, we will first discuss the notion of representation, clarify what it takes for a representation to be embodied or abstract, and distinguish between conceptual and (other) linguistic representations. We will then put forward a characterization of cognitive control and examine its representational infrastructure. The remainder of the paper will be devoted to arguing that language augments cognitive control. To that end, we will draw on two lines of research, which investigate how language augments cognitive control: (i) research on the availability of linguistic labels and (ii) research on the active usage of a linguistic code, specifically, in inner speech. Ultimately, we will argue that the cognition-enhancing capacity of language can be explained once we assume that it provides us with (a) abstract, non-embodied representations and with (b) abstract, sparse linguistic representations that may serve as easy-to-manipulate placeholders for fully embodied or otherwise more detailed representations.

11.
Top Cogn Sci ; 12(3): 843-858, 2020 07.
Article in English | MEDLINE | ID: mdl-32729673

ABSTRACT

Learning and processing natural language requires the ability to track syntactic relationships between words and phrases in a sentence, which are often separated by intervening material. These nonadjacent dependencies can be studied using artificial grammar learning paradigms and structured sequence processing tasks. These approaches have been used to demonstrate that human adults, infants, and some nonhuman animals are able to detect and learn dependencies between nonadjacent elements within a sequence. However, learning nonadjacent dependencies appears to be more cognitively demanding than detecting dependencies between adjacent elements, and only occurs in certain circumstances. In this review, we discuss different types of nonadjacent dependencies in language and in artificial grammar learning experiments, and how these differences might impact learning. We summarize different types of perceptual cues that facilitate learning by highlighting the relationship between dependent elements, bringing them closer together physically, attentionally, or perceptually. Finally, we review artificial grammar learning experiments in human adults, infants, and nonhuman animals, and discuss how similarities and differences observed across these groups can provide insights into how language is learned across development and how these language-related abilities might have evolved.
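The distinction this review draws between adjacent and nonadjacent dependencies can be made concrete with a toy string generator. This is a hedged illustration with invented element names, not the grammar of any cited study:

```python
import random

# Adjacent dependency: every neighboring pair of elements is predictive.
# Nonadjacent dependency: the first element predicts the last across a
# freely varying intervening item. Element names are invented.
def adjacent_string(rng):
    i = rng.choice([1, 2])
    return [f"A{i}", f"B{i}", f"C{i}"]   # A1 -> B1 -> C1, etc.

def nonadjacent_string(rng, grammatical=True):
    i = rng.choice([1, 2])
    middle = rng.choice(["x1", "x2", "x3"])          # varies freely
    last = f"C{i}" if grammatical else f"C{3 - i}"   # swap pairing to violate
    return [f"A{i}", middle, last]

rng = random.Random(1)
adj = adjacent_string(rng)
ok = nonadjacent_string(rng, grammatical=True)
bad = nonadjacent_string(rng, grammatical=False)
```

Detecting the violation in `bad` requires tracking the first element across the intervening item, which is the added processing demand discussed above.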


Subject(s)
Biological Evolution , Human Development , Language , Learning , Linguistics , Adult , Animals , Human Development/physiology , Humans , Infant , Learning/physiology
12.
Dev Cogn Neurosci ; 45: 100821, 2020 10.
Article in English | MEDLINE | ID: mdl-32658761

ABSTRACT

Despite the prominence of non-visual semantic features for some words (e.g., siren or thunder), little is known about when and how the meanings of words that refer to auditory objects can be acquired in early infancy. Because associative learning is an important mechanism of word learning, we ask whether associations between sounds and words lead to similar learning effects as associations between visual objects and words. In an event-related potential (ERP) study, 10- to 12-month-old infants were presented with pairs of environmental sounds and pseudowords in either a consistent (where sound-word mapping can occur) or inconsistent manner. Subsequently, the infants were presented with sound-pseudoword combinations either matching or violating the consistent pairs from the training phase. In the training phase, we observed word-form familiarity effects and pairing-consistency effects for ERPs time-locked to the onset of the word. The test phase revealed N400-like effects for violated pairs as compared to matching pairs. These results indicate that associative word learning is also possible for auditory objects before infants' first birthday. The specific temporal occurrence of the N400-like effect and the topographical distribution of the ERPs suggest that the object's modality has an impact on how novel words are processed.


Subject(s)
Brain Mapping/methods , Electroencephalography/methods , Evoked Potentials/physiology , Female , Humans , Infant , Male
13.
Top Cogn Sci ; 12(3): 859-874, 2020 07.
Article in English | MEDLINE | ID: mdl-30033636

ABSTRACT

Most human language learners acquire language primarily via the auditory modality. This is one reason why auditory artificial grammars play a prominent role in the investigation of the development and evolutionary roots of human syntax. The present position paper brings together findings from human and non-human research on the impact of auditory cues on learning about linguistic structures, with a special focus on how different types of cues and biases in auditory cognition may contribute to success and failure in artificial grammar learning (AGL). The basis of our argument is the link between auditory cues and syntactic structure across languages and development. Cross-species comparison suggests that many aspects of auditory cognition that are relevant for language are not human-specific and are present even in rather distantly related species. Furthermore, auditory cues and biases have an impact on learning, which we discuss using the example of auditory perception and AGL studies. This observation, together with the significant role of auditory cues in language processing, supports the idea that auditory cues served as a bootstrap to syntax during language evolution. Yet this also means that potentially human-specific syntactic abilities are not due to basic auditory differences between humans and non-human animals but are based upon more advanced cognitive processes.


Subject(s)
Auditory Perception , Biological Evolution , Language , Learning , Linguistics , Animals , Auditory Perception/physiology , Cues , Humans , Learning/physiology , Speech Perception/physiology
14.
Front Hum Neurosci ; 13: 412, 2019.
Article in English | MEDLINE | ID: mdl-31866842

ABSTRACT

In order to memorize sentences, we use both processes of language comprehension during encoding and processes of language production during maintenance. While the former processes are easily testable via controlled presentation of the input, the latter are more difficult to assess directly, as language production is typically initiated and controlled internally. In the present event-related potential (ERP) study, we track subvocal rehearsal of sentences, with the goal of studying the concomitant planning processes with the help of a silent cued-production task. Native speakers of German read different types of sentences word by word and were then prompted by a visual cue to silently repeat each individual word in a rehearsal phase. In order to assess both local and global effects of sentence planning, we presented correct sentences, syntactically or semantically violated sentences, or random word-order sequences. Semantic violations during reading elicited an N400 effect at the noun violating the selectional restrictions of the preceding verb. Syntactic violations, induced by a gender incongruency between determiner and noun, led to a P600 effect at the same position. Different ERP patterns occurred during the silent production phase. Here, semantically violated sentences elicited an early fronto-central negativity at the verb, while syntactically violated sentences elicited a late right-frontal positivity at the determiner. Random word order was accompanied by long-lasting slow waves during the production phase. The findings are consistent with models of hierarchical sentence planning and further indicate that the ongoing working memory processes are qualitatively distinct from comprehension mechanisms and neurophysiologically specific to syntactic- and lexical-semantic-level planning. In conclusion, active working memory maintenance of sentences is likely to comprise specific stages of sentence production, as indicated by ERP correlates of syntactic and semantic planning at the phrasal and clausal levels, respectively.

15.
Front Psychol ; 10: 376, 2019.
Article in English | MEDLINE | ID: mdl-30894822

ABSTRACT

One unresolved question about polarity sensitivity in theoretical linguistics concerns whether and to what extent negative and positive polarity items are parallel. Using event-related brain potentials (ERPs), previous studies found N400 and/or P600 components for negative and positive polarity violations with inconsistent results. We report on an ERP study of German polarity items. Both negative and positive polarity violations elicited biphasic N400/P600 effects relative to correct polarity conditions. Furthermore, negative polarity violations elicited a P600-only effect relative to positive polarity violations. The lack of a graded N400 effect indicates that both kinds of violations involve similar semantic processing costs. We attribute the increase in P600 amplitude of negative polarity violations relative to positive polarity violations to their different nature: the former are syntactic anomalies triggering structural reanalysis, whereas the latter are pragmatic oddities inducing discourse reanalysis. We conclude that negative and positive polarity violations involve at least partly distinct mechanisms.

16.
Dev Sci ; 22(1): e12700, 2019 01.
Article in English | MEDLINE | ID: mdl-29949219

ABSTRACT

Infants' ability to learn complex linguistic regularities from early on has been revealed by electrophysiological studies indicating that 3-month-olds, but not adults, can automatically detect non-adjacent dependencies between syllables. While different ERP responses in adults and infants suggest that both linguistic rule learning and its link to basic auditory processing undergo developmental changes, systematic investigations of the developmental trajectories are scarce. In the present study, we assessed 2- and 4-year-olds' ERP indicators of pitch discrimination and linguistic rule learning in a syllable-based oddball design. To test for the relation between auditory discrimination and rule learning, ERP responses to pitch changes were used as a predictor for potential linguistic rule-learning effects. Results revealed that 2-year-olds, but not 4-year-olds, showed ERP markers of rule learning. Although 2-year-olds' rule learning was not dependent on differences in pitch perception, 4-year-old children demonstrated such a dependency: those children who showed more pronounced responses to pitch changes still showed an effect of rule learning. These results narrow down the developmental decline of the ability for automatic linguistic rule learning to between 2 and 4 years of age and, moreover, point towards a strong modulation of this change by auditory processes. At an age when the ability for automatic linguistic rule learning phases out, rule learning can still be observed in children with enhanced auditory responses. The observed interrelations are plausible causes of age-of-acquisition effects and inter-individual differences in language learning.


Subject(s)
Language Development , Learning/physiology , Pitch Perception/physiology , Adult , Auditory Perception/physiology , Child , Child, Preschool , Female , Humans , Individuality , Linguistics , Male
17.
Neuroimage ; 152: 647-657, 2017 05 15.
Article in English | MEDLINE | ID: mdl-28288909

ABSTRACT

Sentences are easier to remember than random word sequences, likely because linguistic regularities facilitate chunking of words into meaningful groups. The present electroencephalography study investigated the neural oscillations modulated by this so-called sentence superiority effect during the encoding and maintenance of sentence fragments versus word lists. We hypothesized a chunking-related modulation of neural processing during the encoding and retention of sentences (i.e., sentence fragments) as compared to word lists. Time-frequency analysis revealed a two-fold oscillatory pattern for the memorization of sentences: Sentence encoding was accompanied by higher delta amplitude (4 Hz), originating both from regions processing syntax as well as semantics (bilateral superior/middle temporal regions and fusiform gyrus). Subsequent sentence retention was reflected in decreased theta (6 Hz) and beta/gamma (27-32 Hz) amplitude instead. Notably, whether participants simply read or properly memorized the sentences did not impact chunking-related activity during encoding. Therefore, we argue that the sentence superiority effect is grounded in highly automatized language processing mechanisms, which generate meaningful memory chunks irrespective of task demands.
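As a rough illustration of the band-amplitude measures named in this abstract (delta at 4 Hz, beta/gamma near 30 Hz), the following sketch estimates narrow-band amplitude with a plain Fourier projection on a synthetic trace. The sampling rate, signal composition, and amplitudes are assumptions; real EEG pipelines use proper time-frequency methods such as wavelet transforms.

```python
import math

FS = 250  # assumed sampling rate in Hz

def band_amplitude(signal, freq, fs=FS):
    """Amplitude of the sinusoidal component at `freq` Hz (Fourier projection)."""
    n = len(signal)
    c = sum(v * math.cos(2 * math.pi * freq * i / fs) for i, v in enumerate(signal))
    s = sum(v * math.sin(2 * math.pi * freq * i / fs) for i, v in enumerate(signal))
    return 2 * math.hypot(c, s) / n

# Synthetic 2 s trace: strong 4 Hz (delta) plus weaker 30 Hz (beta/gamma).
t = [i / FS for i in range(2 * FS)]
trace = [1.5 * math.sin(2 * math.pi * 4 * x)
         + 0.5 * math.sin(2 * math.pi * 30 * x) for x in t]

delta = band_amplitude(trace, 4)    # ~1.5
gamma = band_amplitude(trace, 30)   # ~0.5
```

Because the trace spans an integer number of cycles of both components, the projection recovers each amplitude almost exactly; the study's comparison of conditions amounts to contrasting such band amplitudes across encoding and retention windows.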


Subject(s)
Brain Waves , Brain/physiology , Comprehension/physiology , Memory/physiology , Reading , Speech Perception/physiology , Adult , Delta Rhythm , Female , Humans , Male , Semantics , Young Adult
18.
Sci Rep ; 6: 36259, 2016 11 09.
Article in English | MEDLINE | ID: mdl-27827366

ABSTRACT

There is considerable interest in understanding the ontogeny and phylogeny of the human language system, yet, neurobiological work at the interface of both fields is absent. Syntactic processes in language build on sensory processing and sequencing capabilities on the side of the receiver. While we better understand language-related ontogenetic changes in the human brain, it remains a mystery how neurobiological processes at specific human development stages compare with those in phylogenetically closely related species. To address this knowledge gap, we measured EEG event-related potentials (ERPs) in two macaque monkeys using a paradigm developed to evaluate human infant and adult brain potentials associated with the processing of non-adjacent ordering relationships in sequences of syllable triplets. Frequent standard triplet sequences were interspersed with infrequent voice pitch or non-adjacent rule deviants. Monkey ERPs show early pitch and rule deviant mismatch responses that are strikingly similar to those previously reported in human infants. This stands in contrast to adults' later ERP responses for rule deviants. The results reveal how non-adjacent sequence ordering relationships are processed in the primate brain and provide evidence for evolutionarily conserved neurophysiological effects, some of which are remarkably like those seen at an early human developmental stage.


Subject(s)
Acoustic Stimulation/methods , Brain/physiology , Evoked Potentials , Adult , Animals , Electroencephalography , Evoked Potentials, Auditory , Evolution, Molecular , Humans , Infant , Language , Language Development , Macaca mulatta , Pitch Perception
19.
Front Hum Neurosci ; 10: 551, 2016.
Article in English | MEDLINE | ID: mdl-27877120

ABSTRACT

Sensitivity to regularities plays a crucial role in the acquisition of various linguistic features from spoken language input. Artificial grammar learning paradigms explore pattern recognition abilities in a set of structured sequences (i.e., of syllables or letters). In the present study, we investigated the functional underpinnings of learning phonological regularities in auditorily presented syllable sequences. While previous neuroimaging studies either focused on functional differences between the processing of correct vs. incorrect sequences or between different levels of sequence complexity, here the focus is on the neural foundation of the actual learning success. During functional magnetic resonance imaging (fMRI), participants were exposed to a set of syllable sequences with an underlying phonological rule system, known to ensure performance differences between participants. We expected that successful learning and rule application would require phonological segmentation and phoneme comparison. As an outcome of four alternating learning and test fMRI sessions, participants split into successful learners and non-learners. Relative to non-learners, successful learners showed increased task-related activity in a fronto-parietal network of brain areas encompassing the left lateral premotor cortex as well as bilateral superior and inferior parietal cortices during both learning and rule application. These areas were previously associated with phonological segmentation, phoneme comparison, and verbal working memory. Based on these activity patterns and the phonological strategies for rule acquisition and application, we argue that successful learning and processing of complex phonological rules in our paradigm is mediated via a fronto-parietal network for phonological processes.

20.
Front Psychol ; 7: 133, 2016.
Article in English | MEDLINE | ID: mdl-26903930

ABSTRACT

In the present study, we investigate how early and late L2 learners process L2 grammatical traits that are either present or absent in their native language (L1). Thirteen early (age of acquisition, AoA = 4 years) and thirteen late (AoA = 18 years) Spanish learners of Basque performed a grammaticality judgment task on auditory Basque sentences while their event-related brain potentials (ERPs) were recorded. The sentences contained violations of a syntactic property specific to the participants' L2, i.e., ergative case, or violations of a syntactic property present in both of the participants' languages, i.e., verb agreement. Two forms of verb agreement were tested: subject agreement, found in the participants' L1 and L2, and object agreement, present only in the participants' L2. Behaviorally, early bilinguals were more accurate in the judgment task than late L2 learners. Early bilinguals showed native-like ERPs for verb agreement, which differed from the late learners' ERP pattern. Nonetheless, approximation to native-likeness was greater for subject-verb agreement, the type of verb agreement present in the participants' L1, than for object-verb agreement, the type present only in the participants' L2. For ergative argument alignment, unique to the L2, the two non-native groups showed similar ERP patterns, which did not correspond to the natives' ERP pattern. We conclude that non-native syntactic processing approximates native processing for early L2 acquisition and high proficiency levels when the syntactic property is common to the L1 and L2. However, syntactic traits that are not present in the L1 do not rely on native-like processing, despite early AoA and high proficiency.
