Results 1 - 3 of 3
1.
Ear Hear; 39(4): 783-794, 2018.
Article in English | MEDLINE | ID: mdl-29252979

ABSTRACT

OBJECTIVES: Visual information from talkers facilitates speech intelligibility for listeners when audibility is challenged by environmental noise and hearing loss. Less is known about how listeners actively process and attend to visual information from different talkers in complex multi-talker environments. This study tracked looking behavior in children with normal hearing (NH), mild bilateral hearing loss (MBHL), and unilateral hearing loss (UHL) in a complex multi-talker environment to examine the extent to which children look at talkers and whether looking patterns relate to performance on a speech-understanding task. It was hypothesized that performance would decrease as perceptual complexity increased and that children with hearing loss would perform more poorly than their peers with NH. Children with MBHL or UHL were expected to demonstrate greater attention to individual talkers during multi-talker exchanges, indicating that they were more likely to attempt to use visual information from talkers to assist in speech understanding in adverse acoustics. It also was of interest to examine whether MBHL, versus UHL, would differentially affect performance and looking behavior.

DESIGN: Eighteen children with NH, eight children with MBHL, and 10 children with UHL participated (8-12 years). They followed audiovisual instructions for placing objects on a mat under three conditions: a single talker providing instructions via a video monitor, four possible talkers alternately providing instructions on separate monitors in front of the listener, and the same four talkers providing both target and nontarget information. Multi-talker background noise was presented at a 5 dB signal-to-noise ratio during testing. An eye tracker monitored looking behavior while children performed the experimental task.

RESULTS: Behavioral task performance was higher for children with NH than for either group of children with hearing loss. There were no differences in performance between children with UHL and children with MBHL. Eye-tracker analysis revealed that children with NH looked more at the screens overall than did children with MBHL or UHL, though individual differences were greater in the groups with hearing loss. Listeners in all groups spent a small proportion of time looking at relevant screens as talkers spoke. Although looking was distributed across all screens, there was a bias toward the right side of the display. There was no relationship between overall looking behavior and performance on the task.

CONCLUSIONS: The present study examined the processing of audiovisual speech in the context of a naturalistic task. Results demonstrated that children distributed their looking to a variety of sources during the task, but that children with NH were more likely to look at screens than were those with MBHL/UHL. However, all groups looked at the relevant talkers as they were speaking only a small proportion of the time. Despite variability in looking behavior, listeners were able to follow the audiovisual instructions, and children with NH demonstrated better performance than children with MBHL/UHL. These results suggest that performance on some challenging multi-talker audiovisual tasks is not dependent on visual fixation to relevant talkers for children with NH or with MBHL/UHL.


Subject(s)
Fixation, Ocular; Hearing Loss, Bilateral/physiopathology; Hearing Loss, Unilateral/physiopathology; Speech Perception; Visual Perception; Case-Control Studies; Child; Child Behavior; Female; Humans; Male; Severity of Illness Index; Task Performance and Analysis
2.
J Speech Lang Hear Res; 59(5): 1218-1232, 2016 Oct 01.
Article in English | MEDLINE | ID: mdl-27784030

ABSTRACT

Purpose: This study examined the effects of stimulus type and hearing status on speech recognition and listening effort in children with normal hearing (NH) and children with mild bilateral hearing loss (MBHL) or unilateral hearing loss (UHL).

Method: Children (5-12 years of age) with NH (Experiment 1) and children (8-12 years of age) with MBHL, UHL, or NH (Experiment 2) performed consonant identification and word and sentence recognition in background noise. Percentage correct performance and verbal response time (VRT) were assessed (onset time, total duration).

Results: In general, speech recognition improved as signal-to-noise ratio (SNR) increased, both for children with NH and for children with MBHL or UHL. The groups did not differ on measures of VRT. Onset times were longer for incorrect than for correct responses. For correct responses only, there was a general increase in VRT with decreasing SNR.

Conclusions: Findings indicate poorer sentence recognition as SNR decreases, both in children with NH and in children with MBHL or UHL. VRT results suggest that greater effort was expended when processing stimuli that were incorrectly identified. Increasing VRT with decreasing SNR for correct responses also supports greater effort in poorer acoustic conditions. The absence of significant hearing status differences suggests that VRT was not differentially affected by MBHL, UHL, or NH for children in this study.


Subject(s)
Hearing Loss, Bilateral/psychology; Hearing Loss, Unilateral/psychology; Noise; Pattern Recognition, Physiological; Speech Perception; Child; Child, Preschool; Female; Humans; Linear Models; Male; Neuropsychological Tests
3.
Ear Hear; 36(1): 136-144, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25170780

ABSTRACT

OBJECTIVES: While classroom acoustics can affect educational performance for all students, the impact for children with minimal/mild hearing loss (MMHL) may be greater than for children with normal hearing (NH). The purpose of this study was to examine the effect of MMHL on children's speech recognition, comprehension, and looking behavior in a simulated classroom environment. It was hypothesized that children with MMHL would perform similarly to their peers with NH on the speech recognition task but would perform more poorly on the comprehension task. Children with MMHL also were expected to look toward talkers more often than children with NH.

DESIGN: Eighteen children with MMHL and 18 age-matched children with NH participated. In a simulated classroom environment, children listened to lines from an elementary-age-appropriate play read by a teacher and four students, reproduced over LCD monitors and loudspeakers located around the listener. A gyroscopic head-tracking device was used to monitor looking behavior during the task. At the end of the play, comprehension was assessed by asking a series of 18 factual questions. Children also were asked to repeat 50 meaningful sentences with three key words each, presented audio-only by a single talker either from the loudspeaker at 0 degrees azimuth or randomly from the five loudspeakers.

RESULTS: Both children with NH and those with MMHL performed at or near ceiling on the sentence recognition task. For the comprehension task, children with MMHL performed more poorly than those with NH. Assessment of looking behavior indicated that both groups of children looked at talkers while they were speaking less than 50% of the time. In addition, the pattern of overall looking behaviors suggested that, compared with older children with NH, a larger portion of older children with MMHL may demonstrate looking behaviors similar to those of younger children with or without MMHL.

CONCLUSIONS: The results of this study demonstrate that, under realistic acoustic conditions, it is difficult to differentiate performance between children with MMHL and children with NH using a sentence recognition task. The more cognitively demanding comprehension task identified performance differences between these two groups. The comprehension task represented a condition in which the persons talking change rapidly and are not readily visible to the listener. Examination of looking behavior suggested that, in this complex task, attempting to visualize the talker may inefficiently utilize cognitive resources that would otherwise be allocated for comprehension.


Subject(s)
Child Behavior; Hearing Loss/physiopathology; Noise; Schools; Speech Perception/physiology; Acoustics; Audiometry, Pure-Tone; Auditory Threshold; Case-Control Studies; Child; Humans; Severity of Illness Index; Sound Localization/physiology