1.
Neuropsychologia ; 174: 108315, 2022 09 09.
Article in English | MEDLINE | ID: mdl-35798066

ABSTRACT

Co-speech hand gestures are a ubiquitous form of nonverbal communication that can express additional information not present in speech. Hand gestures may become more relevant when verbal production is impaired, as in speakers with post-stroke aphasia. In fact, speakers with aphasia produce more gestures than non-brain-damaged speakers. Furthermore, there is evidence that speakers with aphasia produce gestures that convey information essential to understanding their communication. In the present study, we addressed the question of whether these gestures catch the attention of their addressees. Healthy volunteers (observers) watched short video clips while their eye movements were recorded. These video clips featured speakers with aphasia and non-brain-damaged speakers describing two different scenarios (buying a sweater or having witnessed an accident). Our results show that hand gestures produced by speakers with aphasia are on average attended to longer than gestures produced by non-brain-damaged speakers. This effect remained significant even when we controlled for the longer duration of the gestural movements in speakers with aphasia. Further, the amount of information in speech correlated with gesture attention: gestures produced by speakers with less informative speech were attended to more frequently. In conclusion, our findings suggest that listeners reallocate their attention and focus more strongly on nonverbal information from co-speech gestures when speech comprehension becomes challenging due to the speaker's verbal production deficits. These findings support a communicative function of co-speech gestures and advocate for instructing people with aphasia to convey in gesture what they cannot express verbally, because interlocutors take notice of these gestures.
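For illustration, a minimal sketch of how a duration-controlled gesture-attention measure of this kind could be computed from fixation data. The table layout, column names, and normalization are hypothetical, not the authors' pipeline:

```python
import pandas as pd

# Hypothetical fixation log: one row per fixation, with the region of
# interest (ROI) it landed on and the clip it belongs to.
fixations = pd.DataFrame({
    "clip":     ["aphasia_01", "aphasia_01", "control_01", "control_01"],
    "roi":      ["hands", "face", "hands", "face"],
    "duration": [0.42, 1.10, 0.18, 1.35],   # seconds
})

# Hypothetical total duration of gestural movement per clip, used to
# control for the longer gestures of speakers with aphasia.
gesture_time = pd.Series({"aphasia_01": 3.2, "control_01": 1.9})

# Cumulative dwell time on the gesturing hands per clip ...
dwell = (fixations[fixations["roi"] == "hands"]
         .groupby("clip")["duration"].sum())

# ... normalized by gesture duration, so longer gestures alone cannot
# explain longer gaze times.
print(dwell / gesture_time)
```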


Subject(s)
Aphasia , Gestures , Aphasia/etiology , Attention , Eye-Tracking Technology , Humans , Speech
2.
Neuroimage ; 258: 119375, 2022 09.
Article in English | MEDLINE | ID: mdl-35700949

ABSTRACT

Which processes in the human brain lead to the categorical perception of speech sounds? Investigation of this question is hampered by the fact that categorical speech perception is normally confounded by acoustic differences in the stimulus. By using ambiguous sounds, however, it is possible to dissociate acoustic from perceptual stimulus representations. Twenty-seven normally hearing individuals took part in an fMRI study in which they were presented with an ambiguous syllable (intermediate between /da/ and /ga/) in one ear and with a disambiguating acoustic feature (the third formant, F3) in the other ear. Multi-voxel pattern searchlight analysis was used to identify brain areas that consistently differentiated between response patterns associated with different syllable reports. By comparing responses to different stimuli with identical syllable reports and to identical stimuli with different syllable reports, we disambiguated whether these regions primarily differentiated the acoustics of the stimuli or the syllable report. We found that BOLD activity patterns in left perisylvian regions (STG, SMG), left inferior frontal regions (vMC, IFG, AI), left supplementary motor cortex (SMA/pre-SMA), and right motor and somatosensory regions (M1/S1) represent listeners' syllable report irrespective of stimulus acoustics. Most of these regions are outside of what is traditionally regarded as auditory or phonological processing areas. Our results indicate that the process of speech sound categorization implicates decision-making mechanisms and auditory-motor transformations.
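A minimal sketch of a searchlight decoding analysis of this kind, using nilearn's SearchLight; the file names, labels, and parameter values are illustrative assumptions, not the study's actual pipeline:

```python
from nilearn.decoding import SearchLight
from nilearn.image import load_img
from sklearn.model_selection import KFold

# Hypothetical inputs: single-trial beta images and, per trial, the
# listener's syllable report ("da" vs. "ga") -- not the authors' data.
betas = load_img("single_trial_betas.nii.gz")  # 4D: one volume per trial
reports = ["da", "ga"] * 20                    # one label per trial volume

# Searchlight decoding: a small sphere is moved through the brain and a
# classifier is trained at each location to predict the syllable report.
sl = SearchLight(
    mask_img="brain_mask.nii.gz",  # restrict the analysis to brain voxels
    radius=6.0,                    # sphere radius in mm (illustrative)
    estimator="svc",               # linear support vector classifier
    cv=KFold(n_splits=4),
)
sl.fit(betas, reports)

# sl.scores_ holds one cross-validated accuracy per searchlight center;
# above-chance regions consistently differentiate the syllable reports.
```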


Subject(s)
Auditory Cortex , Speech Perception , Acoustic Stimulation/methods , Auditory Cortex/diagnostic imaging , Auditory Cortex/physiology , Auditory Perception , Hearing , Humans , Phonetics , Speech/physiology , Speech Perception/physiology
3.
Front Cell Neurosci ; 16: 818703, 2022.
Article in English | MEDLINE | ID: mdl-35273479

ABSTRACT

There is considerable individual variability in the reported effectiveness of non-invasive brain stimulation. This variability has often been ascribed to differences in neuroanatomy and the resulting differences in the electric field induced inside the brain. In this study, we addressed the question of whether individual differences in the induced electric field can predict the neurophysiological and behavioral consequences of gamma-band tACS. In a within-subject experiment, bi-hemispheric gamma-band tACS and sham stimulation were applied in alternating blocks to the participants' superior temporal lobes, while task-evoked auditory brain activity was measured with concurrent functional magnetic resonance imaging (fMRI) and a dichotic listening task. Gamma tACS was applied with different interhemispheric phase lags. In a recent study, we showed that anti-phase tACS (180° interhemispheric phase lag), but not in-phase tACS (0° interhemispheric phase lag), selectively modulates interhemispheric brain connectivity. From a T1-weighted structural image of each participant's brain, an individual simulation of the induced electric field was computed. From these simulations, we derived two predictor variables: maximal strength (the average of the 10,000 voxels with the largest electric field values) and precision of the electric field (the spatial correlation between the electric field and the task-evoked brain activity during sham stimulation). We found considerable variability in the individual strength and precision of the electric fields. Importantly, the strength of the electric field over the right hemisphere predicted individual differences in tACS-induced brain connectivity changes. Moreover, in both hemispheres we found a statistical trend for an effect of electric field strength on tACS-induced BOLD signal changes. In contrast, the precision of the electric field did not predict any neurophysiological measure, and neither strength nor precision predicted interhemispheric integration. In conclusion, we found evidence for a dose-response relationship between individual differences in electric fields and tACS-induced activity and connectivity changes in concurrent fMRI. However, the fact that this relationship was stronger in the right hemisphere suggests that the relationship between electric field parameters, neurophysiology, and behavior may be more complex for bi-hemispheric tACS.
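A minimal sketch of how the two predictor variables could be derived from voxelwise maps; the arrays stand in for one participant's simulated field and sham-block activation maps, and the function name is hypothetical:

```python
import numpy as np

def field_predictors(efield, task_activity, k=10_000):
    """Two per-subject predictors from a simulated electric field.

    efield        -- 1D array, field magnitude per brain voxel (V/m)
    task_activity -- 1D array, task-evoked activation in the same voxels
    k             -- number of top voxels defining 'maximal strength'
    """
    # Maximal strength: mean of the k voxels with the largest field values.
    strength = np.sort(efield)[-k:].mean()

    # Precision: spatial (Pearson) correlation between the field map and
    # the task-evoked brain activity map during sham stimulation.
    precision = np.corrcoef(efield, task_activity)[0, 1]
    return strength, precision

# Illustrative random inputs standing in for one subject's voxel maps.
rng = np.random.default_rng(0)
print(field_predictors(rng.random(200_000), rng.random(200_000)))
```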

4.
Proc Natl Acad Sci U S A ; 118(7)2021 02 16.
Article in English | MEDLINE | ID: mdl-33568530

ABSTRACT

Brain connectivity plays a major role in the encoding, transfer, and integration of sensory information. Interregional synchronization of neural oscillations in the γ-frequency band has been suggested as a key mechanism underlying perceptual integration. In a recent study, we found evidence for this hypothesis, showing that modulating interhemispheric oscillatory synchrony by means of bihemispheric high-density transcranial alternating current stimulation (HD-tACS) affects binaural integration of dichotic acoustic features. Here, we aimed to establish a direct link between oscillatory synchrony, effective brain connectivity, and binaural integration. We experimentally manipulated oscillatory synchrony (using bihemispheric γ-tACS with different interhemispheric phase lags) and assessed the effect on effective brain connectivity and binaural integration (as measured with functional MRI and a dichotic listening task, respectively). We found that tACS reduced intrahemispheric connectivity within the auditory cortices and that antiphase (interhemispheric phase lag 180°) tACS modulated connectivity between the two auditory cortices. Importantly, the changes in intra- and interhemispheric connectivity induced by tACS were correlated with changes in perceptual integration. Our results indicate that γ-band synchronization between the two auditory cortices plays a functional role in binaural integration, supporting the proposed role of interregional oscillatory synchrony in perceptual integration.
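A minimal sketch of the brain-behavior correlation described above, with synthetic per-subject values standing in for the tACS-induced connectivity changes and the changes in binaural integration; the sample size and variable names are illustrative:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-subject values: change in interhemispheric connectivity
# (anti-phase tACS minus sham) and change in binaural integration in the
# dichotic listening task, for 24 simulated participants.
rng = np.random.default_rng(1)
delta_connectivity = rng.normal(size=24)
delta_integration = 0.5 * delta_connectivity + rng.normal(scale=0.8, size=24)

# A correlation of this form links the stimulation-induced connectivity
# change to the change in perceptual integration across participants.
r, p = pearsonr(delta_connectivity, delta_integration)
print(f"r = {r:.2f}, p = {p:.3f}")
```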


Subject(s)
Auditory Perception , Brain/physiology , Functional Laterality , Connectome , Female , Gamma Rhythm , Humans , Magnetic Resonance Imaging , Male , Transcranial Direct Current Stimulation , Young Adult
5.
J Cogn Neurosci ; 32(7): 1242-1250, 2020 07.
Article in English | MEDLINE | ID: mdl-31682569

ABSTRACT

Perceiving speech requires the integration of different speech cues, that is, formants. When the speech signal is split so that different cues are presented to the right and left ear (dichotic listening), comprehension requires the integration of binaural information. Based on prior electrophysiological evidence, we hypothesized that the integration of dichotically presented speech cues is enabled by interhemispheric phase synchronization between primary and secondary auditory cortex in the gamma frequency band. We tested this hypothesis by applying transcranial alternating current stimulation (tACS) bilaterally above the superior temporal lobe to induce or disrupt interhemispheric gamma-phase coupling. In contrast to our initial predictions, we found that gamma tACS applied in-phase above the two hemispheres (interhemispheric lag 0°) perturbs interhemispheric integration of speech cues, possibly because the applied stimulation disturbs an inherent phase lag between the left and right auditory cortex. We also observed this disruptive effect when applying antiphasic delta tACS (interhemispheric lag 180°). We conclude that interhemispheric phase coupling plays a functional role in interhemispheric speech integration, and that the direction of this effect may depend on the stimulation frequency.
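A minimal sketch of the interhemispheric phase-lag manipulation: two sinusoidal stimulation waveforms whose relative phase is either 0° (in-phase) or 180° (anti-phase). The frequencies and sampling rate are illustrative assumptions:

```python
import numpy as np

def tacs_waveforms(freq_hz, phase_lag_deg, duration_s=1.0, fs=10_000):
    """Left/right stimulation waveforms with a given interhemispheric lag.

    freq_hz       -- stimulation frequency (e.g., ~40 Hz gamma, ~2 Hz delta;
                     assumed values, not taken from the study)
    phase_lag_deg -- interhemispheric lag: 0 (in-phase) or 180 (anti-phase)
    """
    t = np.arange(0, duration_s, 1 / fs)
    left = np.sin(2 * np.pi * freq_hz * t)
    right = np.sin(2 * np.pi * freq_hz * t + np.deg2rad(phase_lag_deg))
    return left, right

# In-phase gamma tACS (the condition that perturbed integration here) ...
left_0, right_0 = tacs_waveforms(40, 0)
# ... versus anti-phase delta tACS, which also disrupted integration.
left_180, right_180 = tacs_waveforms(2, 180)
```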


Subject(s)
Auditory Cortex , Transcranial Direct Current Stimulation , Auditory Perception , Humans , Phonetics , Speech
6.
J Acoust Soc Am ; 145(3): EL190, 2019 03.
Article in English | MEDLINE | ID: mdl-31067965

ABSTRACT

The present study investigated whether speech-related spectral information benefits from initially predominant right- or left-hemisphere processing. Normal-hearing individuals categorized speech sounds composed of an ambiguous base (perceptually intermediate between /ga/ and /da/) presented to one ear and a disambiguating low or high F3 chirp presented to the other ear. Response times were shorter when the chirp was presented to the left ear (inducing initially right-hemisphere chirp processing) than to the right ear, but there were no between-ear differences in the strength of overall integration. The results are in line with the assumption of a right-hemispheric dominance for spectral processing.


Subject(s)
Functional Laterality , Speech Perception , Brain/physiology , Female , Humans , Learning , Male , Phonetics , Young Adult
8.
Front Hum Neurosci ; 12: 200, 2018.
Article in English | MEDLINE | ID: mdl-29962942

ABSTRACT

The role of nonverbal communication in patients with post-stroke language impairment (aphasia) is not yet fully understood. This study investigated how aphasic patients perceive and produce co-speech gestures during face-to-face interaction, and whether distinct brain lesions would predict the frequency of spontaneous co-speech gesturing. For this purpose, we recorded samples of conversations in patients with aphasia and healthy participants. Gesture perception was assessed by means of a head-mounted eye-tracking system, and the produced co-speech gestures were coded according to a linguistic classification system. The main results are that meaning-laden gestures (e.g., iconic gestures representing object shapes) are more likely to attract visual attention than meaningless hand movements, and that patients with aphasia are more likely to fixate co-speech gestures overall than healthy participants. This implies that patients with aphasia may benefit from the multimodal information provided by co-speech gestures. On the level of co-speech gesture production, we found that patients with damage to the anterior part of the arcuate fasciculus showed a higher frequency of meaning-laden gestures. This area lies in close vicinity to the premotor cortex and is considered to be important for speech production. This may suggest that the use of meaning-laden gestures depends on the integrity of patients' speech production abilities.

9.
J Cogn Neurosci ; 28(10): 1613-24, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27243612

ABSTRACT

The human turn-taking system regulates the smooth and precise exchange of speaking turns during face-to-face interaction. Recent studies have investigated the processing of ongoing turns during conversation by measuring the eye movements of non-involved observers. The findings suggest that humans shift their gaze to the next speaker in anticipation, before the start of the next turn. Moreover, there is evidence that the ability to detect turn transitions in a timely manner relies mainly on the lexico-syntactic content of the conversation. Consequently, patients with aphasia, who often experience deficits in both semantic and syntactic processing, might have difficulties detecting turn transitions and shifting their gaze in time. To test this assumption, we presented video vignettes of natural conversations to aphasic patients and healthy controls while their eye movements were measured. The frequency and latency of event-related gaze shifts, relative to the end of the current turn in the videos, were compared between the two groups. Our results suggest that, compared with healthy controls, aphasic patients are less likely to shift their gaze at turn transitions but do not show significantly increased gaze shift latencies. In healthy controls, but not in aphasic patients, the probability of a gaze shift at a turn transition increased when the current turn in the video had higher lexico-syntactic complexity. Furthermore, results from voxel-based lesion-symptom mapping indicate that the association between lexico-syntactic complexity and gaze shift latency in aphasic patients is predicted by brain lesions located in the posterior branch of the left arcuate fasciculus. Higher lexico-syntactic processing demands thus seem to reduce the gaze shift probability in aphasic patients. This finding may reflect missed opportunities for patients to place their own contributions in everyday conversation.
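A minimal sketch of how the latency of an event-related gaze shift relative to the turn end could be computed; the analysis window and function name are illustrative assumptions, not the authors' exact criteria:

```python
import numpy as np

def gaze_shift_latency(shift_times, turn_end, window=(-1.5, 2.5)):
    """Latency of the first gaze shift around a turn transition.

    shift_times -- times (s) at which the observer's gaze crossed from
                   the current speaker to the upcoming speaker
    turn_end    -- time (s) at which the current turn ends in the video
    window      -- interval around turn_end in which a shift counts as
                   event-related (negative = anticipatory)
    Returns the latency in seconds, or None if no shift occurred.
    """
    latencies = np.asarray(shift_times) - turn_end
    in_window = latencies[(latencies >= window[0]) & (latencies <= window[1])]
    return float(in_window[0]) if in_window.size else None

# A shift 0.3 s before the turn ends counts as anticipatory (latency -0.3).
print(gaze_shift_latency([2.1, 5.2], turn_end=5.5))
```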


Subject(s)
Aphasia/psychology , Eye Movements , Motion Perception , Social Perception , Speech Perception , Adult , Analysis of Variance , Aphasia/diagnostic imaging , Aphasia/physiopathology , Brain/diagnostic imaging , Eye Movement Measurements , Eye Movements/physiology , Female , Humans , Male , Motion Perception/physiology , Speech Perception/physiology , Video Recording
10.
PLoS One ; 11(1): e0146583, 2016.
Article in English | MEDLINE | ID: mdl-26735917

ABSTRACT

BACKGROUND: Co-speech gestures are omnipresent and, by facilitating language comprehension, a crucial element of human interaction. However, it is unclear whether gestures also support language comprehension in aphasic patients. Using visual exploration behavior analysis, the present study investigated how the congruence between speech and co-speech gestures influences comprehension, measured as accuracy in a decision task. METHOD: Twenty aphasic patients and 30 healthy controls watched videos in which speech was combined with meaningless (baseline condition), congruent, or incongruent gestures. Comprehension was assessed with a decision task, while remote eye-tracking allowed analysis of visual exploration. RESULTS: In aphasic patients, the incongruent condition resulted in a significant decrease in accuracy, while the congruent condition led to a significant increase in accuracy compared to baseline. In the control group, the incongruent condition resulted in a decrease in accuracy, while the congruent condition did not significantly increase accuracy. Visual exploration analysis showed that patients fixated significantly less on the face and tended to fixate more on the gesturing hands than controls. CONCLUSION: Co-speech gestures play an important role for aphasic patients, as they modulate comprehension. Incongruent gestures evoke significant interference and impair patients' comprehension. In contrast, congruent gestures enhance comprehension in aphasic patients, which might be valuable for clinical and therapeutic purposes.
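A minimal sketch of the accuracy-by-condition comparison with hypothetical trial-level data; the actual study had 20 patients, 30 controls, and many more trials:

```python
import pandas as pd

# Hypothetical results of the decision task: one row per trial, with the
# gesture condition and whether the response was correct (1) or not (0).
trials = pd.DataFrame({
    "group":     ["aphasia"] * 6 + ["control"] * 6,
    "condition": ["baseline", "congruent", "incongruent"] * 4,
    "correct":   [1, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1, 0],
})

# Mean accuracy per group and gesture condition; the reported pattern is
# higher accuracy for congruent and lower accuracy for incongruent
# gestures relative to baseline in the aphasic group.
print(trials.groupby(["group", "condition"])["correct"].mean().unstack())
```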


Subject(s)
Comprehension/physiology , Eye Movements/physiology , Gestures , Stroke/physiopathology , Adult , Aged , Brain/diagnostic imaging , Brain Mapping , Case-Control Studies , Demography , Female , Humans , Magnetic Resonance Imaging , Male , Mass Media , Middle Aged , Photic Stimulation , Radiography , Speech , Stroke/diagnostic imaging
11.
Neuropsychologia ; 71: 158-64, 2015 May.
Article in English | MEDLINE | ID: mdl-25841335

ABSTRACT

According to the direct matching hypothesis, perceived movements automatically activate existing motor components through matching of the perceived gesture to its execution. The aim of the present study was to test the direct matching hypothesis by assessing whether visual exploration behavior correlates with deficits in gestural imitation in left-hemisphere-damaged (LHD) patients. Eighteen LHD patients and twenty healthy control subjects took part in the study. Gesture imitation performance was measured with the test for upper limb apraxia (TULIA). Visual exploration behavior was measured with an infrared eye-tracking system. Short videos comprising forty gestures (20 meaningless and 20 communicative gestures) were presented. Cumulative fixation duration was measured in different regions of interest (ROIs), namely the face, the gesturing hand, the body, and the surrounding environment. Compared to healthy subjects, patients fixated significantly less on the ROIs comprising the face and the gesturing hand during the exploration of emblematic and tool-related gestures. Moreover, visual exploration of tool-related gestures correlated significantly with tool-related imitation as measured by the TULIA in LHD patients. Patients and controls did not differ in the visual exploration of meaningless gestures, and no significant relationships were found between visual exploration behavior and the imitation of emblematic and meaningless gestures in the TULIA. The present study thus suggests that altered visual exploration may lead to disturbed imitation of tool-related gestures, but not of emblematic and meaningless gestures. Consequently, our findings partially support the direct matching hypothesis.
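A minimal sketch of computing cumulative fixation duration per ROI from a fixation log; the table layout, labels, and values are hypothetical:

```python
import pandas as pd

# Hypothetical eye-tracking output: one row per fixation, labeled with
# the ROI it fell into (face, gesturing hand, body, or environment).
fixations = pd.DataFrame({
    "subject":  ["p01", "p01", "p01", "c01", "c01"],
    "gesture":  ["tool-related"] * 5,
    "roi":      ["face", "hand", "environment", "face", "hand"],
    "duration": [0.85, 0.40, 0.22, 1.30, 0.55],   # seconds
})

# Cumulative fixation duration per subject, gesture type, and ROI -- the
# dependent measure that was related to TULIA imitation scores.
cumulative = (fixations
              .groupby(["subject", "gesture", "roi"])["duration"]
              .sum())
print(cumulative)
```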


Subject(s)
Eye Movements , Gestures , Imitative Behavior , Motion Perception , Pattern Recognition, Visual , Stroke , Adult , Aged , Eye Movement Measurements , Female , Functional Laterality , Humans , Infrared Rays , Magnetic Resonance Imaging , Male , Middle Aged , Stroke/pathology , Stroke/psychology , Tomography, X-Ray Computed , Video Recording , Young Adult
12.
Cortex ; 64: 157-68, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25461716

ABSTRACT

BACKGROUND: Co-speech gestures are part of nonverbal communication during conversations. They either support the verbal message or provide the interlocutor with additional information. Furthermore, as nonverbal cues, they prompt the cooperative process of turn-taking. In the present study, we investigated the influence of co-speech gestures on the perception of dyadic dialogue in aphasic patients. In particular, we analysed the impact of co-speech gestures on gaze direction (towards the speaker or the listener) and the fixation of body parts. We hypothesized that aphasic patients, who are restricted in verbal comprehension, adapt their visual exploration strategies. METHODS: Sixteen aphasic patients and 23 healthy control subjects participated in the study. Visual exploration behaviour was measured by means of a contact-free infrared eye-tracker while subjects watched videos depicting spontaneous dialogues between two individuals. Cumulative fixation duration and mean fixation duration were calculated for the factors co-speech gesture (present or absent), gaze direction (towards the speaker or the listener), and region of interest (ROI), including hands, face, and body. RESULTS: Both aphasic patients and healthy controls mainly fixated the speaker's face. We found a significant co-speech gesture × ROI interaction, indicating that the presence of a co-speech gesture encouraged subjects to look at the speaker. Further, a significant gaze direction × ROI × group interaction revealed that aphasic patients showed reduced cumulative fixation duration on the speaker's face compared to healthy controls. CONCLUSION: Co-speech gestures guide the observer's attention towards the speaker, the source of semantic input. We discuss whether an underlying semantic processing deficit or a deficit in integrating audio-visual information may cause aphasic patients to explore the speaker's face less.


Subject(s)
Aphasia/psychology , Attention/physiology , Gestures , Speech Perception/physiology , Speech/physiology , Visual Perception/physiology , Adult , Aged , Aphasia/physiopathology , Communication , Comprehension/physiology , Female , Humans , Male , Middle Aged