Results 1 - 6 of 6
1.
Hear Res ; 447: 109023, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38733710

ABSTRACT

Limited auditory input, whether caused by hearing loss or by electrical stimulation through a cochlear implant (CI), can be compensated by the remaining senses. Specifically for CI users, previous studies reported not only improved visual skills, but also altered cortical processing of unisensory visual and auditory stimuli. However, in multisensory scenarios, it is still unclear how auditory deprivation (before implantation) and electrical hearing experience (after implantation) affect cortical audiovisual speech processing. Here, we present a prospective longitudinal electroencephalography (EEG) study which systematically examined the deprivation- and CI-induced alterations of cortical processing of audiovisual words by comparing event-related potentials (ERPs) in postlingually deafened CI users before and after implantation (five weeks and six months of CI use). A group of matched normal-hearing (NH) listeners served as controls. The participants performed a word-identification task with congruent and incongruent audiovisual words, focusing their attention on either the visual (lip movement) or the auditory speech signal. This allowed us to study the (top-down) attention effect on the (bottom-up) sensory cortical processing of audiovisual speech. When compared to the NH listeners, the CI candidates (before implantation) and the CI users (after implantation) exhibited enhanced lipreading abilities and an altered cortical response at the N1 latency range (90-150 ms) that was characterized by a decreased theta oscillation power (4-8 Hz) and a smaller amplitude in the auditory cortex. After implantation, however, the auditory-cortex response gradually increased and developed a stronger intra-modal connectivity. 
Nevertheless, task efficiency and activation in the visual cortex were significantly modulated in both groups by focusing attention on the visual as compared to the auditory speech signal, with the NH listeners additionally showing an attention-dependent decrease in beta oscillation power (13-30 Hz). In sum, these results suggest remarkable deprivation effects on audiovisual speech processing in the auditory cortex, which partially reverse after implantation. Although even experienced CI users still show distinct audiovisual speech processing compared to NH listeners, pronounced effects of (top-down) direction of attention on (bottom-up) audiovisual processing can be observed in both groups. However, NH listeners but not CI users appear to show enhanced allocation of cognitive resources in visually attended as compared to auditorily attended audiovisual speech conditions, which supports our behavioural observations of poorer lipreading abilities and reduced visual influence on audition in NH listeners as compared to CI users.


Subject(s)
Acoustic Stimulation , Attention , Cochlear Implantation , Cochlear Implants , Deafness , Electroencephalography , Persons With Hearing Impairments , Photic Stimulation , Speech Perception , Humans , Male , Female , Middle Aged , Cochlear Implantation/instrumentation , Adult , Prospective Studies , Longitudinal Studies , Persons With Hearing Impairments/psychology , Persons With Hearing Impairments/rehabilitation , Deafness/physiopathology , Deafness/rehabilitation , Deafness/psychology , Case-Control Studies , Aged , Visual Perception , Lipreading , Time Factors , Hearing , Evoked Potentials, Auditory , Auditory Cortex/physiopathology , Evoked Potentials
2.
Clin Neurophysiol ; 154: 141-156, 2023 10.
Article in English | MEDLINE | ID: mdl-37611325

ABSTRACT

OBJECTIVE: Hearing with a cochlear implant (CI) is difficult in noisy environments, but the use of noise reduction algorithms, specifically ForwardFocus, can improve speech intelligibility. The current event-related potential (ERP) study examined the electrophysiological correlates of this perceptual improvement. METHODS: Ten bimodal CI users performed a syllable-identification task in auditory and audiovisual conditions, with syllables presented from the front and stationary noise presented from the sides. Brainstorm was used for spatio-temporal evaluation of ERPs. RESULTS: CI users revealed an audiovisual benefit as reflected by shorter response times and greater activation in temporal and occipital regions at P2 latency. However, in auditory and audiovisual conditions, background noise hampered speech processing, leading to longer response times and delayed auditory-cortex activation at N1 latency. Nevertheless, activating ForwardFocus resulted in shorter response times, reduced listening effort and enhanced superior-frontal-cortex activation at P2 latency, particularly in audiovisual conditions. CONCLUSIONS: ForwardFocus enhances speech intelligibility in audiovisual speech conditions by potentially allowing the reallocation of attentional resources to relevant auditory speech cues. SIGNIFICANCE: This study shows for CI users that background noise and ForwardFocus differentially affect spatio-temporal cortical response patterns, both in auditory and audiovisual speech conditions.


Subject(s)
Cochlear Implantation , Cochlear Implants , Speech Perception , Humans , Speech Perception/physiology , Evoked Potentials , Noise/adverse effects
3.
Curr Res Neurobiol ; 3: 100059, 2022.
Article in English | MEDLINE | ID: mdl-36405629

ABSTRACT

Hearing with a cochlear implant (CI) is limited compared to natural hearing. Although CI users may develop compensatory strategies, it is currently unknown whether these extend from auditory to visual functions, and whether compensatory strategies vary between different CI user groups. To better understand the experience-dependent contributions to multisensory plasticity in audiovisual speech perception, the current event-related potential (ERP) study presented syllables in auditory, visual, and audiovisual conditions to CI users with unilateral or bilateral hearing loss, as well as to normal-hearing (NH) controls. Behavioural results revealed shorter audiovisual response times compared to unisensory conditions for all groups. Multisensory integration was confirmed by electrical neuroimaging, including topographic and ERP source analysis, showing a visual modulation of the auditory-cortex response at N1 and P2 latency. However, CI users with bilateral hearing loss showed a distinct pattern of N1 topography, indicating a stronger visual impact on auditory speech processing compared to CI users with unilateral hearing loss and NH listeners. Furthermore, both CI user groups showed a delayed auditory-cortex activation and an additional recruitment of the visual cortex, and a better lip-reading ability compared to NH listeners. In sum, these results extend previous findings by showing distinct multisensory processes not only between NH listeners and CI users in general, but even between CI users with unilateral and bilateral hearing loss. However, the comparably enhanced lip-reading ability and visual-cortex activation in both CI user groups suggest that these visual improvements are evident regardless of the hearing status of the contralateral ear.

4.
Brain Topogr ; 35(4): 431-452, 2022 07.
Article in English | MEDLINE | ID: mdl-35668310

ABSTRACT

Cochlear implants (CIs) can restore hearing function in profoundly deaf individuals. Due to the degradation of the stimulus by CI signal processing, implanted individuals with single-sided deafness (SSD) face the specific challenge that the input differs greatly between their ears. The present study compared normal-hearing (NH) listeners (N = 10) and left- and right-ear implanted SSD CI users (N = 10 left, N = 9 right) to evaluate cortical speech processing between CI and NH ears and to explore side-of-implantation effects. The participants performed a two-deviant oddball task, separately with the left and the right ear. Auditory event-related potentials (ERPs) in response to syllables were compared between proficient and non-proficient CI users, as well as between CI and NH ears. The effect of the side of implantation was analysed on the sensor and the source level. CI proficiency could be distinguished based on the ERP amplitudes of the N1 and the P3b. Moreover, syllable processing via the CI ear, when compared to the NH ear, resulted in attenuated and delayed ERPs. In addition, the left-ear implanted SSD CI users revealed a more pronounced functional asymmetry in the auditory cortex than right-ear implanted SSD CI users, regardless of whether the syllables were perceived via the CI or the NH ear. Our findings reveal that speech-discrimination proficiency in SSD CI users can be assessed by N1 and P3b ERPs. The results contribute to a better understanding of the rehabilitation success in SSD CI users by showing that cortical speech processing in SSD CI users is affected by CI-related stimulus degradation and experience-related functional changes in the auditory cortex.


Subject(s)
Auditory Cortex , Cochlear Implantation , Cochlear Implants , Deafness , Hearing Loss, Unilateral , Speech Perception , Cochlear Implantation/methods , Humans , Speech Perception/physiology
5.
Neuroimage Clin ; 34: 102982, 2022.
Article in English | MEDLINE | ID: mdl-35303598

ABSTRACT

A cochlear implant (CI) is an auditory prosthesis which can partially restore the auditory function in patients with severe to profound hearing loss. However, this bionic device provides only limited auditory information, and CI patients may compensate for this limitation by means of a stronger interaction between the auditory and visual system. To better understand the electrophysiological correlates of audiovisual speech perception, the present study used electroencephalography (EEG) and a redundant target paradigm. Postlingually deafened CI users and normal-hearing (NH) listeners were compared in auditory, visual and audiovisual speech conditions. The behavioural results revealed multisensory integration for both groups, as indicated by shortened response times for the audiovisual as compared to the two unisensory conditions. The analysis of the N1 and P2 event-related potentials (ERPs), including topographic and source analyses, confirmed a multisensory effect for both groups and showed a cortical auditory response which was modulated by the simultaneous processing of the visual stimulus. Nevertheless, the CI users in particular revealed a distinct pattern of N1 topography, pointing to a strong visual impact on auditory speech processing. Apart from these condition effects, the results revealed ERP differences between CI users and NH listeners, not only in N1/P2 ERP topographies, but also in the cortical source configuration. When compared to the NH listeners, the CI users showed an additional activation in the visual cortex at N1 latency, which was positively correlated with CI experience, and a delayed auditory-cortex activation with a reversed, rightward functional lateralisation. In sum, our behavioural and ERP findings demonstrate a clear audiovisual benefit for both groups, and a CI-specific alteration in cortical activation at N1 latency when auditory and visual input is combined. 
These cortical alterations may reflect a compensatory strategy to overcome the limited CI input, which allows CI users to improve their lip-reading skills and to approximate the behavioural performance of NH listeners in audiovisual speech conditions. Our results are clinically relevant, as they highlight the importance of assessing the CI outcome not only in auditory-only, but also in audiovisual speech conditions.


Subject(s)
Cochlear Implantation , Cochlear Implants , Speech Perception , Acoustic Stimulation/methods , Auditory Perception/physiology , Evoked Potentials , Humans , Speech , Speech Perception/physiology , Visual Perception/physiology
6.
Front Neurosci ; 16: 1005859, 2022.
Article in English | MEDLINE | ID: mdl-36620447

ABSTRACT

A cochlear implant (CI) can partially restore hearing in individuals with profound sensorineural hearing loss. However, electrical hearing with a CI is limited and highly variable. The current study aimed to better understand the different factors contributing to this variability by examining how age affects cognitive functions and cortical speech processing in CI users. Electroencephalography (EEG) was applied while two groups of CI users (young and elderly; N = 13 each) and normal-hearing (NH) listeners (young and elderly; N = 13 each) performed an auditory sentence categorization task, including semantically correct and incorrect sentences presented either with or without background noise. Event-related potentials (ERPs) representing earlier, sensory-driven processes (N1-P2 complex to sentence onset) and later, cognitive-linguistic integration processes (N400 to semantically correct/incorrect sentence-final words) were compared between the different groups and speech conditions. The results revealed reduced amplitudes and prolonged latencies of auditory ERPs in CI users compared to NH listeners, both at earlier (N1, P2) and later processing stages (N400 effect). In addition to this hearing-group effect, CI users and NH listeners showed a comparable background-noise effect, as indicated by reduced hit rates and reduced (P2) and delayed (N1/P2) ERPs in conditions with background noise. Moreover, we observed an age effect in CI users and NH listeners, with young individuals showing improved specific cognitive functions (working memory capacity, cognitive flexibility and verbal learning/retrieval), reduced latencies (N1/P2), decreased N1 amplitudes and an increased N400 effect when compared to the elderly. In sum, our findings extend previous research by showing that the CI users' speech processing is impaired not only at earlier (sensory) but also at later (semantic integration) processing stages, both in conditions with and without background noise. 
Using objective ERP measures, our study provides further evidence of strong age effects on cortical speech processing, which can be observed in both the NH listeners and the CI users. We conclude that elderly individuals require more effortful processing at sensory stages of speech processing, which, however, seems to come at the cost of the limited resources available for later semantic integration processes.
