Results 1 - 10 of 10
1.
Cortex; 151: 147-161, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35413597

ABSTRACT

Converging behavioral and neuroimaging evidence suggests parallel activation of native (L1) and second (L2) language codes in bilinguals, with the modulation of the N400 as the most likely neural correlate of such L1-L2 interplay at the lexico-semantic level. However, this relatively late effect may reflect secondary controlled processes, in contrast to earlier modulations found in monolinguals (<200 msec) indicative of fast and automatic lexico-semantic L1 access, which has so far not been documented for bilingualism. To address this, we investigated early neurophysiological crosslinguistic activation during bilingual word access. EEG signals were recorded from a group of late bilinguals during a masked-priming crosslinguistic task in which L1 (Russian) words were presented as subliminal primes for 50 msec before L2 (English) target words. Prime-target pairs matched phonologically only, semantically only, both phonologically and semantically, or not at all. Cluster-based random permutation analyses revealed a main effect of semantic similarity at 40-60 msec over centro-posterior scalp sites, reflecting a negative-going shift of ERP amplitudes for semantically similar prime-target pairs. Importantly, neural source reconstruction showed activations within a left-hemispheric network comprising the middle and superior temporal cortex and the angular gyrus as the most likely neural substrate of this early semantic effect. Furthermore, analyses also revealed significant differences over frontocentral sites for the main effects of semantic and phonological similarity, ranging from 312 to 356 msec and from 380 to 444 msec, respectively, thus confirming previously described N400 crosslinguistic effects. Our findings confirm the existence of an integrated brain network for the bilingual lexicon and reveal the earliest (∼50 msec) crosslinguistic effect reported so far, suggesting fast and automatic L1-L2 interplay, followed by later (possibly top-down controlled) processing stages.
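The cluster-based random permutation analysis mentioned in this abstract can be illustrated with a minimal numpy sketch. This is not the authors' actual pipeline (which operated on multi-channel EEG and spatio-temporal clusters); the sign-flip permutation scheme, the cluster-forming threshold, and the function names are illustrative assumptions for a single-channel case.

```python
import numpy as np

rng = np.random.default_rng(0)

def cluster_mass(diff, threshold):
    """Mass (sum of absolute values) of the largest contiguous
    above-threshold cluster in a 1-D per-timepoint statistic."""
    above = np.abs(diff) > threshold
    best = current = 0.0
    for flag, val in zip(above, np.abs(diff)):
        current = current + val if flag else 0.0
        best = max(best, current)
    return best

def cluster_permutation_test(cond_a, cond_b, threshold=1.0, n_perm=1000):
    """cond_a, cond_b: (n_subjects, n_timepoints) ERP amplitudes.
    Returns the observed cluster mass and a permutation p-value
    obtained by randomly sign-flipping each subject's difference wave."""
    observed = cluster_mass((cond_a - cond_b).mean(axis=0), threshold)
    count = 0
    for _ in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=cond_a.shape[0])[:, None]
        perm_diff = (signs * (cond_a - cond_b)).mean(axis=0)
        if cluster_mass(perm_diff, threshold) >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)
```

A timepoint-wise t-statistic would normally replace the raw mean difference, and clusters would extend over channels as well as time; the sketch keeps only the core logic of cluster formation plus permutation.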


Subject(s)
Multilingualism , Semantics , Electroencephalography/methods , Evoked Potentials/physiology , Female , Humans , Language , Male
2.
J Physiol Paris; 102(1-3): 50-8, 2008.
Article in English | MEDLINE | ID: mdl-18485679

ABSTRACT

Numerous previous neuroimaging studies suggest an involvement of cortical motor areas not only in action execution but also in action recognition and understanding. Motor areas of the human brain have also been found to activate during the processing of written and spoken action-related words and sentences. Even more strikingly, stimuli referring to different bodily effectors produced specific somatotopic activation patterns in the motor areas. However, metabolic neuroimaging results can be ambiguous with respect to the processing stage they reflect. This is a serious limitation when hypotheses concerning linguistic processes are tested, since in this case it is usually crucial to distinguish early lexico-semantic processing from strategic effects or mental imagery that may follow lexico-semantic information access. Timing information is therefore pivotal for determining the functional significance of motor areas in action recognition and action-word comprehension. Here, we review attempts to reveal the time course of these processes using neurophysiological methods (EEG, MEG and TMS) in the visual and auditory domains. We highlight the importance of choosing appropriate paradigms, in combination with the corresponding method, for the extraction of timing information. The findings are discussed in the general context of putative brain mechanisms of word and object recognition.


Subject(s)
Brain/physiology , Comprehension/physiology , Language , Neurophysiology , Electroencephalography/methods , Evoked Potentials, Visual/physiology , Humans , Photic Stimulation/methods , Reaction Time/physiology , Time Factors
3.
Eur J Neurosci; 23(3): 811-21, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16487161

ABSTRACT

In order to explore the activation dynamics of the human action recognition system, we investigated electrophysiological distinctions between the brain responses to sounds produced by human finger and tongue movements. Of special interest were the questions of how early these differences may occur, and whether the neural activation at the early stages of processing involves cortical motor representations related to the generation of these sounds. For this purpose, we employed a high-density EEG set-up and recorded the mismatch negativity (MMN) using a recently developed multi-deviant paradigm that allows acquisition of a large number of trials within a given time period. Deviant stimuli were naturally recorded finger and tongue clicks, as well as control stimuli with similar physical features but without the clear action associations (as verified in a separate behavioural experiment). Both natural stimuli produced larger MMNs than their respective control stimuli at approximately 100 ms, indicating activation of memory traces for familiar action-related sounds. Furthermore, MMN topography at this latency differed between the brain responses to the natural finger and natural tongue sounds. Source estimation revealed the strongest sources for finger sounds in centrolateral areas of the left hemisphere, suggesting that hearing a sound related to finger actions evokes activity in motor areas associated with the dominant hand. Tongue sounds, in turn, produced activation in more inferior brain areas. Our data suggest that motor areas in the human brain are part of neural systems subserving the early automatic recognition of action-related sounds.
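The MMN amplitudes compared in this and the following abstracts are conventionally quantified from a deviant-minus-standard difference wave. A minimal numpy sketch of that measurement; the function name, the window argument, and the 70-130 ms default (approximating the ~100 ms latency reported here) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mmn_peak(standard_erp, deviant_erp, times, window=(0.07, 0.13)):
    """Compute the deviant-minus-standard difference wave and locate
    its most negative point (the MMN peak) inside a latency window.
    `times` is in seconds; returns (difference wave, peak amplitude,
    peak latency)."""
    diff = deviant_erp - standard_erp
    mask = (times >= window[0]) & (times <= window[1])
    i = np.argmin(diff[mask])
    return diff, float(diff[mask][i]), float(times[mask][i])
```

In practice the ERPs would be averages over many artifact-free trials per condition, and mean amplitude in a window around the peak is often preferred to a single-sample peak value.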


Subject(s)
Brain Mapping , Contingent Negative Variation/physiology , Fingers , Movement/physiology , Nerve Net/physiology , Sound , Tongue , Acoustic Stimulation/methods , Adult , Electroencephalography/methods , Female , Humans , Male , Speech Perception/physiology , Time Factors
4.
Hear Res; 199(1-2): 31-9, 2005 Jan.
Article in English | MEDLINE | ID: mdl-15574298

ABSTRACT

The effect of different types of real-life noise on the central auditory processing of speech and non-speech sounds was evaluated by means of the mismatch negativity (MMN) and behavioral responses. Subjects (19-34 years old; 6 males, 4 females) were presented, in separate conditions, with either speech or non-speech stimuli of approximately equal complexity in five background conditions: babble noise, industrial noise, traffic noise, wide-band noise, and silence. Whereas there were no effects of stimulus type or noise on the behavioral responses, the MMN results revealed that speech and non-speech sounds are processed differently in both silent and noisy conditions. Speech processing was more affected than non-speech processing in all noise conditions. Moreover, different noise types had a differential effect on the pre-attentive discrimination of speech and non-speech sounds, as reflected in the MMN. Babble and industrial noises dramatically reduced the MMN amplitudes for both stimulus types, while traffic noise affected only speech stimuli.


Subject(s)
Acoustic Stimulation/methods , Evoked Potentials, Auditory/physiology , Noise , Speech Perception/physiology , Adult , Audiometry, Evoked Response , Audiometry, Pure-Tone , Auditory Threshold , Female , Humans , Male , Noise/adverse effects
5.
Neuroimage; 14(3): 607-16, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11506534

ABSTRACT

Brain responses to the same spoken syllable completing a Finnish word or a pseudo-word were studied. Native Finnish-speaking subjects were instructed to ignore the sound stimuli and watch a silent movie while the mismatch negativity (MMN), an automatic index of experience-dependent auditory memory traces, was recorded. The MMN to each syllable was larger when it completed a word than when it completed a pseudo-word. This enhancement, reaching its maximum amplitude at about 150 ms after the word's recognition point, did not occur in subjects with no knowledge of Finnish. These results provide the first demonstration of the presence of memory traces for individual spoken words in the human brain. Using whole-head magnetoencephalography, the major intracranial source of this word-related MMN was localized to the left superior temporal lobe.


Subject(s)
Brain/physiology , Language , Memory/physiology , Speech Perception/physiology , Adult , Electroencephalography , Finland , Humans , Magnetoencephalography
6.
Neuroimage; 12(6): 657-63, 2000 Dec.
Article in English | MEDLINE | ID: mdl-11112397

ABSTRACT

The key question in understanding the nature of speech perception is whether the human brain has unique speech-specific mechanisms or treats all sounds equally. We assessed possible differences between the processing of speech and complex nonspeech sounds in the two cerebral hemispheres by measuring the magnetic equivalent of the mismatch negativity, the brain's automatic change-detection response, which was elicited by speech sounds and by similarly complex nonspeech sounds with either fast or slow acoustic transitions. Our results suggest that the right hemisphere is predominant in the perception of slow acoustic transitions, whereas neither hemisphere clearly dominates the discrimination of nonspeech sounds with fast acoustic transitions. In contrast, the perception of speech stimuli with similarly rapid acoustic transitions was dominated by the left hemisphere, which may be explained by the presence of acoustic templates (long-term memory traces) for speech sounds formed in this hemisphere.


Subject(s)
Attention/physiology , Auditory Cortex/physiology , Auditory Perception/physiology , Dominance, Cerebral/physiology , Magnetoencephalography , Speech Perception/physiology , Time Perception/physiology , Adolescent , Adult , Brain Mapping , Contingent Negative Variation/physiology , Evoked Potentials, Auditory/physiology , Female , Humans , Male , Phonetics , Sound Spectrography
7.
Neuroreport; 11(13): 2893-6, 2000 Sep 11.
Article in English | MEDLINE | ID: mdl-11006961

ABSTRACT

The potential use of different auditory evoked brain responses for determining the cerebral lateralization of speech function was evaluated. Cortical magnetic fields elicited by plosive syllables or analogous complex non-speech sounds were recorded with a 122-channel magnetometer. We estimated the parameters of the magnetic P1, N1 and P2 responses to both stimulus types in the two hemispheres and found no hemispheric asymmetry for any of the responses. No correlation was observed between the right-ear advantage, determined with a dichotic listening test, and any of the asymmetry indices calculated for the speech-elicited responses. These results suggest that P1, N1 and P2 responses to speech signals do not indicate the lateralization of speech function in the brain. The results are discussed in relation to previous studies suggesting that the mismatch negativity (MMN) seems to be the only early auditory cortex response sensitive to the lateralization of speech function.


Subject(s)
Auditory Cortex/physiology , Electromagnetic Fields , Evoked Potentials/physiology , Functional Laterality/physiology , Speech Perception/physiology , Adolescent , Adult , Auditory Cortex/anatomy & histology , Female , Humans , Male , Reaction Time/physiology
8.
Brain Cogn; 43(1-3): 392-8, 2000.
Article in English | MEDLINE | ID: mdl-10857733

ABSTRACT

The goal of the present study was to evaluate the differences between dichotic listening and the mismatch negativity as measures of speech lateralization in the human brain. For this purpose, we recorded the magnetic equivalent of the mismatch negativity, elicited by a consonant-vowel syllable change, and tested the same subjects in the dichotic listening procedure. The results showed that both methods indicated left-hemisphere dominance in speech processing. However, the mismatch negativity suggested slightly stronger left-hemisphere dominance than the right-ear advantage did. No clear correlation was found between the laterality indices calculated from the mismatch negativity and from the right-ear advantage in dichotic listening. A possible explanation for this finding is that the two measures reflect different stages of speech processing in the human brain.
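Laterality indices of the kind compared in this study are commonly computed as (L - R) / (L + R) from per-hemisphere (or per-ear) response magnitudes. A one-line sketch; the function name and sign convention are illustrative assumptions, not taken from the paper.

```python
def laterality_index(left: float, right: float) -> float:
    """(L - R) / (L + R): positive values indicate left-hemisphere
    dominance, negative values right-hemisphere dominance.
    Inputs are non-negative response magnitudes, e.g. MMNm dipole
    moments or correct-report counts from dichotic listening."""
    if left + right == 0:
        raise ValueError("at least one magnitude must be non-zero")
    return (left - right) / (left + right)
```

For example, a left-hemisphere dipole moment of 30 nAm against 10 nAm on the right yields an index of +0.5, while equal magnitudes yield 0.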


Subject(s)
Brain/physiology , Dichotic Listening Tests/methods , Functional Laterality/physiology , Speech Perception/physiology , Adult , Female , Humans , Male
9.
Neuroreport; 10(10): 2189-92, 1999 Jul 13.
Article in English | MEDLINE | ID: mdl-10424696

ABSTRACT

This study explored the effects of acoustic noise on the cerebral asymmetry of speech perception. We measured magnetic fields of the brain elicited by consonant-vowel syllables in silence and in white noise. Background noise affected brain responses to these stimuli differently in the left and right auditory cortices. Its suppressive effect on cortical responses was found mainly in the left hemisphere, whereas the right hemisphere was unaffected or exhibited increased activity in noise. The locations of the P1, N1, and P2 activity sources in noise differed from those in silence in the right but not in the left hemisphere. These results suggest an increased right-hemisphere role in speech sound processing in noisy conditions, involving the recruitment of additional right auditory cortex structures.


Subject(s)
Functional Laterality/physiology , Mental Processes/physiology , Noise , Speech Perception/physiology , Adult , Analysis of Variance , Evoked Potentials, Auditory/physiology , Humans , Magnetoencephalography , Male , Reaction Time/physiology
10.
Neurosci Lett; 251(2): 141-4, 1998 Jul 24.
Article in English | MEDLINE | ID: mdl-9718994

ABSTRACT

The present study explored the effects of background noise on the cerebral functional asymmetry of speech perception. The magnetic equivalent (MMNm) of the mismatch negativity (MMN), elicited by a consonant-vowel syllable change presented in silence and in background white noise, was measured with a whole-head magnetometer. In silence, the MMNm to speech stimuli, recorded from the auditory cortex, was stronger in the left than in the right hemisphere. However, when speech signals were presented against a white-noise background, the MMNm in the left hemisphere diminished while that in the right hemisphere increased in amplitude and dipole moment. These results confirm that in silence, speech signals are mainly discriminated in the left hemisphere's auditory cortex. In noisy conditions, however, the involvement of the left hemisphere's auditory cortex in speech discrimination is considerably decreased, while that of the right hemisphere increases.


Subject(s)
Brain/physiology , Evoked Potentials, Auditory/physiology , Speech Perception/physiology , Humans , Magnetics , Magnetoencephalography/methods