1.
Acta Psychol (Amst) ; 218: 103336, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34020280

ABSTRACT

The current study examined how simple tones affect speeded visuomotor responses in a visual-spatial sequence learning task. Across the three reported experiments, participants were presented with a visual target that appeared in different locations on a touchscreen monitor, and they were instructed to touch the visual targets as quickly as possible. Visual sequences were either paired with sounds that correlated with the location of the target, paired with sounds that did not correlate with the location of the target, or presented in silence (baseline). Response times decreased across training, and participants were slower to respond to the visual stimuli when the sequences were paired with tones. Moreover, these interference effects were more pronounced early in training, and explicit instructions directing attention to the visual modality did little to eliminate auditory interference, suggesting that the interference may stem from bottom-up factors and does not appear to be under attentional control. These findings have implications for tasks that require the processing of simultaneously presented auditory and visual information and provide support for a proposed mechanism underlying auditory dominance on a task that is typically better suited to the visual modality.


Subject(s)
Sound , Visual Perception , Acoustic Stimulation , Auditory Perception , Humans , Photic Stimulation , Reaction Time
2.
Front Psychol ; 11: 1643, 2020.
Article in English | MEDLINE | ID: mdl-32849007

ABSTRACT

The current study used cross-modal oddball tasks to examine cardiac and behavioral responses to changing auditory and visual information. When participants were instructed to press the same button for auditory and visual oddballs, auditory dominance was found, with cross-modal presentation slowing down visual response times more than auditory response times (Experiment 1). When participants were instructed to make separate responses to auditory and visual oddballs, visual dominance was found, with cross-modal presentation decreasing auditory discrimination, and participants also made more visual-based than auditory-based errors on cross-modal trials (Experiment 2). Experiment 3 increased task demands while requiring a single button press and found evidence of auditory dominance, suggesting that increased task demands are unlikely to account for the reversal in Experiment 2. Auditory processing speed was the best predictor of auditory dominance, with auditory dominance being stronger in participants who were slower at processing the sounds, whereas auditory and visual processing speed and baseline heart rate variability did not predict visual dominance. Examination of cardiac responses time-locked to stimulus onset showed cross-modal facilitation effects, with auditory and visual discrimination occurring earlier in the course of processing in the cross-modal condition than in the unimodal conditions. The findings that response demand manipulations reversed modality dominance and that time-locked cardiac responses showed cross-modal facilitation, not interference, suggest that both auditory and visual dominance effects may arise later in the course of processing rather than from disrupted encoding.

3.
J Exp Psychol Hum Percept Perform ; 46(11): 1301-1312, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32730069

ABSTRACT

The current study used an eye tracker to examine how auditory input affects the latency of visual saccades, fixations, and response times in variations of a Serial Response Time (SRT) task. In Experiment 1, participants viewed a repeating sequence of visual stimuli that appeared in different locations on a computer monitor, and they had to quickly determine whether each visual stimulus was red or blue. The visual sequence was either presented in silence or paired with tones. Compared with the silent condition, the tones slowed down red/blue discriminations and delayed the latency of first fixations to the visual stimuli. To ensure that the interference was not occurring during the decision or response phase, and to better understand the nature of auditory interference, we removed the red/blue discrimination task in Experiment 2, manipulated cognitive load, and developed a gaze-contingent procedure in which the timing of each visual stimulus depended on a saccade crossing a gaze-contingent boundary surrounding the target. Participants were slower at initiating their saccades and fixations and made more fixations under high load. As in Experiment 1, auditory interference was found: participants were more likely to fixate on the visual stimuli, and were faster to fixate on them, in the unimodal condition. These findings suggest that auditory interference effects occur early in the course of processing and provide insights into potential mechanisms underlying modality dominance effects.
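
As a rough illustration of the gaze-contingent procedure described above, the sketch below tests each gaze sample against a square region around the current target and reports when that boundary is crossed. This is a minimal sketch under assumed details: the boundary size, coordinate units, and function names are illustrative and are not taken from the article.

    # Minimal sketch of a gaze-contingent trigger (boundary size is a hypothetical value).
    def crosses_boundary(gaze_x, gaze_y, target_x, target_y, half_width=100):
        """Return True if the gaze sample falls inside a square boundary centered on the target."""
        return abs(gaze_x - target_x) <= half_width and abs(gaze_y - target_y) <= half_width

    def trigger_index(gaze_samples, target_pos):
        """Return the index of the first gaze sample inside the boundary,
        i.e., the moment the next visual stimulus would be presented."""
        for i, (gx, gy) in enumerate(gaze_samples):
            if crosses_boundary(gx, gy, *target_pos):
                return i
        return None  # boundary never crossed on this trial

In a design like this, the onset of each stimulus is yoked to the participant's own saccade rather than to a fixed presentation schedule.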


Subject(s)
Auditory Perception/physiology , Fixation, Ocular/physiology , Psychomotor Performance/physiology , Saccades/physiology , Space Perception/physiology , Visual Perception/physiology , Adult , Eye-Tracking Technology , Female , Humans , Male , Reaction Time/physiology , Young Adult
4.
Vision (Basel) ; 4(1)2020 Feb 29.
Article in English | MEDLINE | ID: mdl-32121428

ABSTRACT

Investigations of multisensory integration have demonstrated that, under certain conditions, one modality is more likely to dominate the other. While the direction of this relationship typically favors the visual modality, the effect can be reversed to show auditory dominance under some conditions. The experiments presented here use an oddball detection paradigm with variable stimulus timings to test the hypothesis that a stimulus that is presented earlier will be processed first and therefore contribute to sensory dominance. Additionally, we compared two measures of sensory dominance (slowdown scores and error rates) to determine whether the type of measure used can affect which modality appears to dominate. When stimuli were presented asynchronously, analysis of slowdown scores and error rates yielded the same result; for both the 1- and 3-button versions of the task, participants were more likely to show auditory dominance when the auditory stimulus preceded the visual stimulus, whereas evidence for visual dominance was observed as the auditory stimulus was delayed. In contrast, for the simultaneous condition, slowdown scores indicated auditory dominance, whereas error rates indicated visual dominance. Overall, these results provide empirical support for the hypothesis that the modality that engages processing first is more likely to show dominance, and suggest that more explicit measures of sensory dominance may favor the visual modality.
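
To make the two dominance measures concrete, the sketch below computes a slowdown score for each modality as the cross-modal response time minus the unimodal baseline and labels the dominated modality as the one that slows more. The variable names and example values are assumptions for illustration, not the authors' analysis code.

    # Sketch: slowdown scores as a measure of sensory dominance (assumed formula).
    def slowdown_score(crossmodal_rt, unimodal_rt):
        """Positive values mean cross-modal presentation slowed this modality."""
        return crossmodal_rt - unimodal_rt

    def dominance_from_slowdown(aud_cross, aud_uni, vis_cross, vis_uni):
        """Label the dominated modality as the one with the larger slowdown."""
        aud_slow = slowdown_score(aud_cross, aud_uni)
        vis_slow = slowdown_score(vis_cross, vis_uni)
        if vis_slow > aud_slow:
            return "auditory dominance"  # vision slowed more than audition
        if aud_slow > vis_slow:
            return "visual dominance"    # audition slowed more than vision
        return "no clear dominance"

    # Hypothetical mean response times in ms: visual responses slow more than auditory responses.
    print(dominance_from_slowdown(510, 500, 620, 560))  # -> auditory dominance

An analogous difference score can be computed on error rates, which is how the two measures can end up pointing to different modalities in the simultaneous condition.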

5.
J Exp Child Psychol ; 178: 317-340, 2019 02.
Article in English | MEDLINE | ID: mdl-30384968

ABSTRACT

There are occasions when infants and children have difficulty processing arbitrary auditory-visual pairings, with auditory input sometimes attenuating visual processing (i.e., auditory dominance). The current research examined possible mechanisms underlying these auditory dominance effects in infants and 4-year-olds. Do auditory dominance effects stem from auditory input attenuating encoding of visual input, from the difficulty of inhibiting auditory-based responses, or from a combination of these factors? In five reported experiments, 4-year-olds (Experiments 1A, 1B, 2A, and 2B) and 14- and 22-month-olds (Experiment 3) were presented with a variety of tasks that required simultaneous processing of auditory and visual input, and memory for the visual items was assessed at test. Auditory dominance in young children resulted from response competition that children could not resolve. Infants' results were not as robust, but they provided some evidence that nonlinguistic sounds, and possibly spoken words, may attenuate encoding of visual input. The current findings shed light on mechanisms underlying cross-modal processing and auditory dominance and have implications for many tasks that hinge on the processing of arbitrary auditory-visual pairings.


Subject(s)
Auditory Perception , Sound , Visual Perception , Acoustic Stimulation , Child, Preschool , Female , Humans , Infant , Male , Memory , Photic Stimulation , Visual Perception/physiology
6.
Front Psychol ; 9: 2454, 2018.
Article in English | MEDLINE | ID: mdl-30568624

ABSTRACT

Many situations require the simultaneous processing of auditory and visual information; however, stimuli presented to one sensory modality can sometimes interfere with processing in a second sensory modality (i.e., modality dominance). The current study further investigated modality dominance by examining how task demands and bimodal presentation affect speeded auditory and visual discriminations. Participants had to quickly determine whether two words, two pictures, or two word-picture pairings were the same or different, and we manipulated task demands across three conditions. In an immediate recognition task, there was only one second between the two stimuli/stimulus pairs, and auditory dominance was found: compared to the respective unimodal baselines, pairing pictures and words together slowed down visual responses and sped up auditory responses. Increasing the interstimulus interval to four seconds and blocking verbal rehearsal weakened auditory dominance effects; however, conflicting and redundant visual cues sped up auditory discriminations. Thus, simultaneously presenting pictures and words had different effects on auditory and visual processing, with bimodal presentation slowing down visual processing and speeding up auditory processing. These findings are consistent with a proposed mechanism underlying auditory dominance, which posits that auditory stimuli automatically grab attention and attenuate or delay visual processing.

7.
Psychol Aging ; 33(3): 545-558, 2018 05.
Article in English | MEDLINE | ID: mdl-29756807

ABSTRACT

The study examined individual contributions of visual and auditory information to multisensory integration across the life span. Children, young adults, and older adults participated in a variant of the Sound-Induced Flash Illusion in which participants either ignored beeps and reported how many flashes they saw or ignored flashes and reported how many beeps they heard. Collapsed across age, auditory input had a stronger effect on visual processing than vice versa. However, the relative contributions of auditory and visual information interacted with age, with young adults showing evidence of auditory dominance (only auditory input affected visual processing), whereas multisensory integration effects were more symmetrical in children and older adults. The findings have implications for many tasks that require the processing of multisensory information.


Subject(s)
Auditory Perception/physiology , Longevity/physiology , Visual Perception/physiology , Adolescent , Adult , Aged , Aged, 80 and over , Aging , Child , Child, Preschool , Female , Humans , Male , Middle Aged , Young Adult
8.
Front Psychol ; 9: 358, 2018.
Article in English | MEDLINE | ID: mdl-29618996

ABSTRACT

Effects of linguistic labels on learning outcomes are well established; however, developmental research examining possible mechanisms underlying these effects has provided mixed results. We used a novel paradigm in which 8-year-olds and adults were simultaneously trained on three sparse categories (categories with many irrelevant or unique features and a single rule-defining feature). Category members were either associated with the same label, different labels, or no labels (silent baseline). Similar to infant paradigms, participants passively viewed individual exemplars, and we examined fixations to category-relevant features across training. While it is well established that adults can optimize their attention in forced-choice categorization tasks without linguistic input, the present findings provide support for label-induced attention optimization: simply hearing the same label associated with different exemplars was associated with increased attention to category-relevant features over time, and participants continued to focus on these features in a subsequent recognition task. Participants also viewed images longer and made more fixations when images were paired with unique labels. These findings support the claim that labels may facilitate categorization by directing attention to category-relevant features.

9.
Acta Psychol (Amst) ; 182: 154-165, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29179020

ABSTRACT

The present study sought to better understand how children, young adults, and older adults attend to and respond to multisensory information. In Experiment 1, young adults were presented with two spoken words, two pictures, or two word-picture pairings, and they had to determine whether the two stimuli/pairings were exactly the same or different. Pairing the words and pictures together slowed down visual but not auditory response times and delayed the latency of first fixations, both of which are consistent with a proposed mechanism underlying auditory dominance. Experiment 2 examined the development of modality dominance in children, young adults, and older adults. Cross-modal presentation attenuated visual accuracy and slowed down visual response times in children, whereas older adults showed the opposite pattern, with cross-modal presentation attenuating auditory accuracy and slowing down auditory response times. Cross-modal presentation also delayed first fixations in children and young adults. Mechanisms underlying modality dominance and multisensory processing are discussed.


Subject(s)
Auditory Perception/physiology , Dominance, Cerebral/physiology , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Child , Child, Preschool , Eye Movements/physiology , Female , Humans , Male , Photic Stimulation , Reaction Time , Young Adult
10.
Front Psychol ; 9: 2564, 2018.
Article in English | MEDLINE | ID: mdl-30618983

ABSTRACT

The current experiment examined changes in visual selective attention in young children, older children, young adults, and older adults who were instructed to ignore auditory and visual distractors. The aims of the study were to (a) determine whether the Perceptual Load Hypothesis (PLH), under which distraction is greater under low perceptual load, could predict which irrelevant stimuli would disrupt visual selective attention, and (b) determine whether the auditory-to-visual shifts found in modality dominance research extend to selective attention tasks. Overall, distractibility decreased with age, with incompatible distractors having larger costs in young and older children than in adults. With regard to accuracy, visual distractibility did not differ across age or load, whereas auditory interference was more pronounced early in development and correlated with age. Auditory and visual distractors also slowed down responses in young and older children more than in adults. Finally, the PLH did not predict performance. Rather, children often showed the opposite pattern, with visual distractors (older children) and auditory distractors (young children) having a greater cost in the high-load condition. These findings are consistent with research examining the development of modality dominance and shed light on changes in multisensory processing and selective attention across the lifespan.

11.
J Exp Psychol Hum Percept Perform ; 42(12): 1947-1958, 2016 12.
Article in English | MEDLINE | ID: mdl-27505224

ABSTRACT

Simultaneously presenting auditory and visual stimuli can hinder performance in one modality while the other dominates. For approximately 40 years, research with adults has primarily indicated visual dominance, while recent research with infants and young children has revealed auditory dominance. The current study further investigated modality dominance in adults, finding evidence for both auditory and visual dominance across three experiments. Using a simple discrimination task, Experiment 1 revealed that cross-modal presentation attenuated discrimination of auditory input while also slowing down visual processing. Even when participants were instructed to pay attention only to the visual stimuli, both spoken nonsense words and nonlinguistic sounds slowed down visual processing (Experiment 2). Experiment 3 used a similar discrimination task while using an eye tracker to examine how auditory input affects visual fixations. Cross-modal presentation attenuated auditory discrimination; however, it also slowed down visual response times. In addition, adults made longer fixations and were slower to make their first fixation when images were paired with sounds. The latter finding is novel and consistent with a proposed mechanism of auditory dominance: auditory stimuli automatically engage attention and attenuate or delay visual processing.


Subject(s)
Attention/physiology , Auditory Perception/physiology , Pattern Recognition, Visual/physiology , Psychomotor Performance/physiology , Adult , Eye Movement Measurements , Female , Humans , Male , Young Adult
12.
Atten Percept Psychophys ; 78(4): 1104-14, 2016 05.
Article in English | MEDLINE | ID: mdl-26832916

ABSTRACT

Approximately 40 years of research on modality dominance has shown that humans are inclined to focus on visual information when presented with compounded visual and auditory stimuli. The current paper reports a series of experiments showing evidence of both auditory and visual dominance effects. Using a behavioral oddball task, we found auditory dominance when examining response times to auditory and visual oddballs: simultaneously presenting pictures and sounds slowed down responses to visual but not auditory oddballs. However, when participants were required to make separate responses for auditory, visual, and bimodal oddballs, auditory dominance was eliminated and reversed to visual dominance (Experiment 2). Experiment 3 replicated auditory dominance and showed that increased task demands and asking participants to analyze cross-modal stimuli conjunctively (as opposed to disjunctively) cannot account for the reversal to visual dominance. Mechanisms underlying sensory dominance and factors that may modulate sensory dominance are discussed.


Subject(s)
Acoustic Stimulation/methods , Auditory Perception/physiology , Dominance, Cerebral , Photic Stimulation/methods , Visual Perception/physiology , Attention/physiology , Female , Humans , Male , Reaction Time/physiology , Young Adult
13.
Exp Psychol ; 60(2): 113-21, 2013.
Article in English | MEDLINE | ID: mdl-23047918

ABSTRACT

Presenting information to multiple sensory modalities sometimes facilitates and sometimes interferes with processing of this information. Research examining interference effects shows that auditory input often interferes with processing of visual input in young children (i.e., auditory dominance), whereas visual input often interferes with auditory processing in adults (i.e., visual dominance). The current study used a cross-modal statistical learning task to examine modality dominance in adults. Participants readily learned auditory and visual statistics when auditory and visual sequences were presented unimodally and when auditory and visual sequences were correlated during training. However, increasing task demands resulted in an important asymmetry: increased task demands attenuated visual statistical learning while having no effect on auditory statistical learning. These findings are consistent with auditory dominance effects reported in young children and have important implications for our understanding of how sensory modalities interact while learning the structure of cross-modal information.
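
For readers unfamiliar with what learning the "statistics" of a sequence involves, the sketch below estimates transitional probabilities (the probability that one element follows another) from a training stream; statistical learning is typically inferred when high-probability transitions are later treated differently from low-probability ones. The stream construction and element names are hypothetical and are not the study's materials.

    # Sketch: estimating transitional probabilities from a training sequence.
    import random
    from collections import Counter, defaultdict

    def transitional_probabilities(sequence):
        """Estimate P(next | current) for every adjacent pair in the sequence."""
        pair_counts = defaultdict(Counter)
        for current, nxt in zip(sequence, sequence[1:]):
            pair_counts[current][nxt] += 1
        return {cur: {nxt: n / sum(counts.values()) for nxt, n in counts.items()}
                for cur, counts in pair_counts.items()}

    # Hypothetical stream built from two-element "words" (AB and CD) in random order.
    words = [["A", "B"], ["C", "D"]]
    stream = [item for _ in range(200) for item in random.choice(words)]
    tp = transitional_probabilities(stream)
    print(tp["A"]["B"])            # within-word transition, close to 1.0
    print(tp["B"].get("C", 0.0))   # between-word transition, close to 0.5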


Subject(s)
Auditory Perception/physiology , Discrimination, Psychological/physiology , Psychomotor Performance/physiology , Transfer, Psychology/physiology , Visual Perception/physiology , Acoustic Stimulation/methods , Adult , Humans , Photic Stimulation/methods , Reaction Time/physiology , Task Performance and Analysis
14.
Cognition ; 126(2): 156-64, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23142036

ABSTRACT

Many objects and events can be categorized in different ways, and learning multiple categories in parallel often requires flexibly attending to different stimulus dimensions in different contexts. Although infants and young children often exhibit poor attentional control, several theoretical proposals argue that such flexibility can be achieved without selective attention. If this is the case, then even young infants should be able to learn multiple dimension-context contingencies in parallel. This possibility was tested in four experiments with 14- and 22-month-olds. Learning of contingencies succeeded as long as there were multiple correlations between the context and the to-be-learned dimension. These findings suggest that infants can learn multiple dimension-context contingencies in parallel, but only when there is sufficient redundancy in the input.


Subject(s)
Attention/physiology , Child Development/physiology , Concept Formation/physiology , Learning/physiology , Discrimination Learning/physiology , Female , Humans , Infant , Male
15.
Front Psychol ; 3: 95, 2012.
Article in English | MEDLINE | ID: mdl-22514543

ABSTRACT

The current review focuses on how exposure to linguistic input, and count nouns in particular, affects performance on various cognitive tasks, including individuation, categorization and category learning, and inductive inference. We review two theoretical accounts of the effects of words. Proponents of one account argue that words have top-down effects on cognitive tasks and, as such, function as supervisory signals. Proponents of the other account suggest that early in development words, just like any other perceptual feature, are first and foremost part of the stimulus input and influence cognitive tasks in a bottom-up, non-supervisory fashion. We then review evidence supporting each account. We conclude that, although much research is still needed, a large body of evidence indicates that words start out like other perceptual features and become supervisory signals over the course of development.

16.
J Exp Child Psychol ; 107(3): 351-8, 2010 Nov.
Article in English | MEDLINE | ID: mdl-20553691

ABSTRACT

Two experiments examined the effects of multimodal presentation and stimulus familiarity on auditory and visual processing. In Experiment 1, 10-month-olds were habituated to an auditory stimulus, a visual stimulus, or an auditory-visual multimodal stimulus. Processing time was assessed during the habituation phase, and discrimination of the auditory and visual stimuli was assessed during a subsequent testing phase. In Experiment 2, the familiarity of the auditory or visual stimulus was systematically manipulated by prefamiliarizing infants with either the auditory or visual stimulus prior to the experiment proper. With the exception of the prefamiliarized auditory condition in Experiment 2, infants in the multimodal conditions failed to increase looking when the visual component changed at test. This finding is noteworthy given that infants discriminated the same visual stimuli when they were presented unimodally, and there was no evidence that multimodal presentation attenuated auditory processing. Possible factors underlying these effects are discussed.
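
Infant-controlled habituation procedures like the one referred to here are commonly implemented with a looking-time decrement criterion. The sketch below illustrates one generic convention, a 50% decrement over a sliding three-trial window; the specific rule and window size are assumptions and are not details reported in the abstract.

    # Sketch: a generic infant-controlled habituation criterion (assumed 50% rule, 3-trial windows).
    def habituation_trial(looking_times, window=3, criterion=0.5):
        """Return the index of the trial on which mean looking over the last `window` trials
        first drops below `criterion` times mean looking on the first `window` trials,
        or None if the criterion is never met."""
        if len(looking_times) < 2 * window:
            return None
        baseline = sum(looking_times[:window]) / window
        for end in range(2 * window, len(looking_times) + 1):
            if sum(looking_times[end - window:end]) / window < criterion * baseline:
                return end - 1
        return None

    # Hypothetical looking times (seconds) across habituation trials.
    print(habituation_trial([12.0, 11.5, 10.8, 9.0, 7.5, 5.0, 4.2, 3.9]))  # -> 6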


Subject(s)
Auditory Perception/physiology , Child Development/physiology , Discrimination, Psychological/physiology , Pattern Recognition, Visual/physiology , Recognition, Psychology/physiology , Acoustic Stimulation/methods , Attention/physiology , Female , Habituation, Psychophysiologic/physiology , Humans , Infant , Male , Mental Processes/physiology , Photic Stimulation/methods , Reaction Time/physiology
17.
Wiley Interdiscip Rev Cogn Sci ; 1(1): 135-141, 2010 Jan.
Article in English | MEDLINE | ID: mdl-26272846

ABSTRACT

The ability to process and integrate cross-modal input is important for many everyday tasks. The current paper reviews theoretical and empirical work examining cross-modal processing with a focus on recent findings examining infants' and children's processing of arbitrary auditory-visual pairings. The current paper puts forward a potential mechanism that may account for modality dominance effects found in a variety of cognitive tasks. The mechanism assumes that although early processing of auditory and visual input is parallel, attention is allocated in a serial manner with the modality that is faster to engage attention dominating later processing. Details of the mechanism, factors influencing processing of arbitrary auditory-visual pairings, and implications for higher-order tasks are discussed.

18.
Dev Sci ; 11(6): 869-81, 2008 Nov.
Article in English | MEDLINE | ID: mdl-19046156

ABSTRACT

Under many conditions auditory input interferes with visual processing, especially early in development. These interference effects are often more pronounced when the auditory input is unfamiliar than when it is familiar (e.g., human speech or pre-familiarized sounds). The current study extends this research by examining how auditory input affects 8- and 14-month-olds' performance on individuation tasks. The results indicate that both unfamiliar sounds and words interfered with infants' performance on an individuation task, with cross-modal interference effects being numerically stronger for unfamiliar sounds. The effects of auditory input on a variety of lexical tasks are discussed.


Subject(s)
Individuation , Recognition, Psychology/physiology , Speech Perception/physiology , Speech/physiology , Visual Perception/physiology , Acoustic Stimulation , Female , Humans , Infant , Language , Language Development , Male
19.
Cogn Sci ; 32(2): 342-65, 2008 Mar.
Article in English | MEDLINE | ID: mdl-21635339

ABSTRACT

Although it is well documented that language plays an important role in cognitive development, there are different views concerning the mechanisms underlying these effects. Some argue that, even early in development, effects of words stem from top-down knowledge, whereas others argue that these effects stem from auditory input affecting attention allocated to visual input. Previous research (e.g., Robinson & Sloutsky, 2004a) demonstrated that non-speech sounds attenuate processing of corresponding visual input at 8, 12, and 16 months of age, whereas the current study demonstrates that words attenuate visual processing at 10 months but not at 16 months (Experiment 1). Furthermore, prefamiliarization with non-speech sounds (Experiment 2) resulted in successful processing of visual input by 16-month-olds. These findings suggest that some effects of labels found early in development may stem from familiarity with human speech. The possibility of general-auditory factors underlying the effects of words on cognitive development is discussed.

20.
Dev Sci ; 10(6): 734-40, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17973789

ABSTRACT

The ability to process simultaneously presented auditory and visual information is a necessary component underlying many cognitive tasks. While this ability is often taken for granted, there is evidence that under many conditions auditory input attenuates processing of corresponding visual input. The current study investigated infants' processing of visual input under unimodal and cross-modal conditions. The results of the three reported experiments indicate that different types of auditory input had different effects on infants' processing of visual information. In particular, unfamiliar auditory input slowed down visual processing, whereas more familiar auditory input did not. These results elucidate mechanisms underlying auditory overshadowing in the course of cross-modal processing and have implications for a variety of cognitive tasks that depend on cross-modal processing.


Subject(s)
Acoustic Stimulation , Visual Perception/physiology , Female , Humans , Infant , Kinetics , Male , Photic Stimulation , Recognition, Psychology/physiology