Results 1 - 4 of 4
1.
Brain Commun ; 6(3): fcae175, 2024.
Article in English | MEDLINE | ID: mdl-38846536

ABSTRACT

Over the first years of life, the brain undergoes substantial organization in response to environmental stimulation. In a silent world, the brain may promote vision by (i) recruiting resources from the auditory cortex and (ii) making the visual cortex more efficient. It is unclear when such changes occur and how adaptive they are, questions that children with cochlear implants can help address. Here, we examined children aged 7-18 years: 50 had cochlear implants, with delayed or age-appropriate language abilities, and 25 had typical hearing and language. High-density electroencephalography and functional near-infrared spectroscopy were used to evaluate cortical responses to a low-level visual task. Evidence for a 'weaker visual cortex response' and 'less synchronized or less inhibitory activity of auditory association areas' in the implanted children with language delays suggests that cross-modal reorganization can be maladaptive and does not necessarily strengthen the dominant visual sense.

2.
Percept Mot Skills ; 131(1): 74-105, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37977135

ABSTRACT

Auditory-motor and visual-motor networks are often coupled in daily activities, such as when listening to music and dancing, but these networks are known to be highly malleable as a function of sensory input. Thus, congenital deafness may modify neural activity within the connections between the motor, auditory, and visual cortices. Here, we investigated whether the cortical responses of children with cochlear implants (CI) to a simple and repetitive motor task would differ from those of children with typical hearing (TH), and we sought to understand whether this response related to their language development. Participants were 75 school-aged children, including 50 with CI (with varying language abilities) and 25 controls with TH. We used functional near-infrared spectroscopy (fNIRS) to record cortical responses over the whole brain as children squeezed the back triggers of a joystick that did or did not vibrate with the squeeze. Motor cortex activity was reflected by an increase in oxygenated hemoglobin concentration (HbO) and a decrease in deoxygenated hemoglobin concentration (HbR) in all children, irrespective of their hearing status. Unexpectedly, the visual cortex (supposedly an irrelevant region) was deactivated in this task, particularly for children with CI who had good language skills when compared to those with CI who had language delays. Presence or absence of vibrotactile feedback made no difference in cortical activation. These findings support the potential of fNIRS to examine cognitive functions related to language in children with CI.


Subject(s)
Cochlear Implantation , Cochlear Implants , Deafness , Child , Humans , Spectroscopy, Near-Infrared/methods , Cochlear Implantation/methods , Deafness/surgery , Hemoglobins
3.
Brain Res Bull ; 205: 110817, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37989460

ABSTRACT

Sensory deprivation can offset the balance of auditory versus visual information in multimodal processing. Such a phenomenon could persist for children born deaf, even after they receive cochlear implants (CIs), and could potentially explain why one modality is given priority over the other. Here, we recorded cortical responses to a single speaker uttering two syllables, presented in audio-only (A), visual-only (V), and audio-visual (AV) modes. Electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) were successively recorded in seventy-five school-aged children. Twenty-five were children with normal hearing (NH) and fifty wore CIs, among whom 26 had relatively high language abilities (HL) comparable to those of NH children, while the other 24 had low language abilities (LL). In the EEG data, visual-evoked potentials were captured in occipital regions in response to V and AV stimuli, and they were accentuated in the HL group compared to the LL group (the NH group being intermediate). Close to the vertex, auditory-evoked potentials were captured in response to A and AV stimuli and reflected differential processing of the two syllables, but only in the NH group. None of the EEG metrics revealed any interaction between group and modality. In the fNIRS data, each modality induced a corresponding activity in visual or auditory regions, but no group difference was observed in A, V, or AV stimulation. The present study did not reveal any sign of abnormal AV integration in children with CIs. An efficient multimodal integrative network (at least for rudimentary speech materials) is clearly not a sufficient condition for good language and literacy.


Subject(s)
Cochlear Implants , Deafness , Speech Perception , Child , Humans , Speech Perception/physiology , Visual Perception/physiology , Acoustic Stimulation , Electroencephalography
4.
J Am Acad Audiol ; 32(7): 433-444, 2021 07.
Article in English | MEDLINE | ID: mdl-34847584

ABSTRACT

BACKGROUND: Considerable variability exists in the speech recognition abilities achieved by children with cochlear implants (CIs) due to varying demographic and performance variables including language abilities. PURPOSE: This article examines the factors associated with speech recognition performance of school-aged children with CIs who were grouped by language ability. RESEARCH DESIGN: This is a single-center cross-sectional study with repeated measures for subjects across two language groups. STUDY SAMPLE: Participants included two groups of school-aged children, ages 7 to 17 years, who received unilateral or bilateral CIs by 4 years of age. The High Language group (N = 26) had age-appropriate spoken-language abilities, and the Low Language group (N = 24) had delays in their spoken-language abilities. DATA COLLECTION AND ANALYSIS: Group comparisons were conducted to examine the impact of demographic characteristics on word recognition in quiet and sentence recognition in quiet and noise. RESULTS: Speech recognition in quiet and noise was significantly poorer in the Low Language compared with the High Language group. Greater hours of implant use and better adherence to auditory-verbal (AV) therapy appointments were associated with higher speech recognition in quiet and noise. CONCLUSION: To ensure maximal speech recognition in children with low-language outcomes, professionals should develop strategies to ensure that families support full-time CI use and have the means to consistently attend AV appointments.


Subject(s)
Cochlear Implants , Speech , Adolescent , Child , Cross-Sectional Studies , Humans , Schools