ABSTRACT
We quantified the electroencephalogram (EEG) signals associated with selective attention processing in experienced simultaneous interpreters, calculating the phase-locked responses evoked by a 40-Hz auditory steady-state response (40-Hz ASSR) and inter-trial coherence (ITC) values that are robust to environmental changes. Assuming that an interpreter's attention ability improves with years of simultaneous interpretation experience, we divided the participants into two groups: experts with more than 15 years of experience (E group; n = 7) and beginners with less than 1 year of experience (B group; n = 15). We also compared two conditions: simultaneous interpretation (SI) and shadowing (SH). We found a significant interaction in the ITC between years of SI experience (E and B groups) and task (SI and SH). This result demonstrates that the number of years of SI experience influences selective attention during interpretation.
ABSTRACT
This study analyzed the selective attention processing related to cognitive load during simultaneous interpretation (SI). We tested simultaneous interpreters' brain function using EEG signals and calculated inter-trial coherence (ITC) extracted from the 40-Hz auditory steady-state response (ASSR). In this experiment, we set two conditions: Japanese-to-English interpretation and Japanese shadowing. We also compared two subject groups: S rank, with more than 15 years of SI experience (n = 7), and C rank, with less than one year of experience (n = 15). The ITCs for the S rank in the interpreting condition were significantly higher than those for the C rank in the shadowing condition (p < 0.001). Our results demonstrate that the 40-Hz ASSR might be a good indicator of selective attention and cognitive load during SI under ecologically valid environmental conditions. It might also be used to detect attention and cognitive-control dysfunction in ADHD or schizophrenia.
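The ITC used in both abstracts has a standard definition: at the stimulation frequency (here 40 Hz), take each trial's spectral phase as a unit vector and measure the length of their mean across trials, which ranges from 0 (random phases) to 1 (perfect phase locking). A minimal NumPy sketch of that computation follows; the function name and the synthetic test signals are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def itc_at_frequency(trials, fs, freq=40.0):
    """Inter-trial coherence at a single frequency.

    trials: array of shape (n_trials, n_samples), one EEG epoch per row.
    fs: sampling rate in Hz.
    Returns ITC in [0, 1]; 1 means perfectly phase-locked across trials.
    """
    n_trials, n_samples = trials.shape
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    bin_idx = np.argmin(np.abs(freqs - freq))       # FFT bin nearest the target
    spectra = np.fft.rfft(trials, axis=1)[:, bin_idx]
    phases = spectra / np.abs(spectra)               # unit phase vectors
    return np.abs(phases.mean())                     # length of the mean vector

# Synthetic check: identical 40-Hz sinusoids are fully phase-locked,
# while trials with random phase offsets are not.
fs = 1000
t = np.arange(1000) / fs
locked = np.array([np.sin(2 * np.pi * 40 * t) for _ in range(50)])
rng = np.random.default_rng(0)
jittered = np.array([np.sin(2 * np.pi * 40 * t + rng.uniform(0, 2 * np.pi))
                     for _ in range(50)])
print(itc_at_frequency(locked, fs))     # ~1.0
print(itc_at_frequency(jittered, fs))   # well below 1
```

In practice the phase at 40 Hz would be estimated per electrode from stimulus-locked epochs (e.g. via wavelet or short-time Fourier analysis), but the phase-averaging step is the same.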
Subjects
Electroencephalography, Auditory Evoked Potentials, Acoustic Stimulation, Attention, Electroencephalography Phase Synchronization
ABSTRACT
In communication, language can be interpreted differently depending upon the emotional context. To clarify the effect of emotional context on language processing, we performed experiments using a cross-modal priming paradigm with an auditorily presented prime and a visually presented target. The primes were the names of people that were spoken with a happy, sad, or neutral intonation; the targets were interrogative one-word sentences with emotionally neutral content. Using magnetoencephalography, we measured neural activities during silent reading of the targets presented in a happy, sad, or neutral context. We identified two conditional differences: the happy and sad conditions produced less activity than the neutral condition in the right posterior inferior and middle frontal cortices in the latency window from 300 to 400 ms; the happy and neutral conditions produced greater activity than the sad condition in the left posterior inferior frontal cortex in the latency window from 400 to 500 ms. These results suggest that the use of emotional context stored in the right frontal cortex starts at ~300 ms, that integration of linguistic information with emotional context starts at ~400 ms in the left frontal cortex, and that language comprehension dependent on emotional context is achieved by ~500 ms.