Results 1 - 13 of 13
1.
J Exp Psychol Gen ; 153(6): 1568-1581, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38647455

ABSTRACT

People excel at learning the statistics of their environments. For instance, people rapidly learn to pay attention to locations that frequently contain visual search targets. Here, we investigated how frequently finding specific objects as search targets influences attentional selection during real-world object search. We investigated how learning that a specific object (e.g., a coat) is task-relevant affects searching for that object and whether a previously frequent target would influence search more broadly for all items of that target's category (e.g., all coats). Across five experiments, one or more objects from a single category were likely targets during a training phase, after which objects from many categories became equally likely to be targets in a neutral testing phase. Participants learned to find a single frequent target object faster than other objects (Experiment 1, N = 44). This learning was specific to that object, with no advantage in finding a novel category-matched object (Experiment 2, N = 32). In contrast, learning to prioritize multiple exemplars from one category spread to untrained objects from the same category, and this spread was comparable whether participants learned to find two, four, or six exemplars (Experiment 3, N = 72). These differences in the breadth of attention were due to variability in the learning environment and not differences in task (Experiment 4, N = 24). Finally, a set-size manipulation showed that learning affects attentional guidance itself, not only postselective processing (Experiment 5, N = 96). These experiments demonstrate that the breadth of attentional tuning is flexibly adjusted based on recent experience to effectively address task demands. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
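The set-size logic behind Experiment 5 can be made concrete: if learned prioritization improves attentional guidance, the response-time cost of each additional display item (the search slope) shrinks for frequent targets; if learning only sped post-selective processing, slopes would match and only intercepts would differ. A minimal sketch of the slope computation, with hypothetical RT values that are illustrative only, not the study's data:

```python
import numpy as np

# Hypothetical mean RTs (ms) at set sizes 4, 8, 12 for a frequently
# vs. an infrequently searched target; values are illustrative only.
set_sizes = np.array([4, 8, 12])
rt_frequent = np.array([620, 680, 740])
rt_infrequent = np.array([650, 770, 890])

# Fit RT = slope * set_size + intercept for each condition.
slope_freq, icpt_freq = np.polyfit(set_sizes, rt_frequent, 1)
slope_infreq, icpt_infreq = np.polyfit(set_sizes, rt_infrequent, 1)

# A shallower slope for frequent targets indicates better attentional
# guidance, not merely faster post-selective processing.
print(f"frequent slope:   {slope_freq:.1f} ms/item")    # 15.0 ms/item
print(f"infrequent slope: {slope_infreq:.1f} ms/item")  # 30.0 ms/item
```

In this hypothetical pattern the frequent-target slope (15 ms/item) is half the infrequent-target slope (30 ms/item), the kind of slope reduction that would point to guidance rather than post-selective speedup.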


Subject(s)
Attention; Humans; Attention/physiology; Male; Female; Adult; Young Adult; Learning/physiology; Pattern Recognition, Visual/physiology; Visual Perception/physiology
2.
Invest Ophthalmol Vis Sci ; 64(12): 23, 2023 Sep 01.
Article in English | MEDLINE | ID: mdl-37703039

ABSTRACT

Purpose: In the United States, age-related macular degeneration (AMD) is a leading cause of low vision; it produces central vision loss and frequently co-occurs with hearing loss. The impact of central vision loss on the daily functioning of older individuals cannot be fully addressed without considering their hearing status. We investigated the impact of combined central vision loss and hearing loss on spatial localization, an ability critical for social interactions and navigation.

Methods: Sixteen older adults with central vision loss, primarily due to AMD, with or without co-occurring hearing loss, completed a spatial perimetry task in which they verbally reported the directions of visual or auditory targets. Auditory testing was done with eyes open in a dimly lit room or with a blindfold. Twenty-three normally sighted, age-matched, and hearing-matched control subjects also completed the task.

Results: Subjects with central vision loss missed visual targets more often, and their visual localization biases deviated more from those of control subjects as scotoma size increased. These deficits did not, however, generalize to sound localization. As hearing loss became more severe, sound localization variability increased, and this relationship was unaltered by coexisting central vision loss. For both control and central vision loss subjects, sound localization was less reliable when subjects wore blindfolds, possibly due to the absence of visual contextual cues.

Conclusions: Although central vision loss impairs visual localization, it does not impair sound localization, nor does it prevent vision from providing useful contextual cues for sound localization.


Subject(s)
Deafness; Hearing Loss; Sound Localization; Humans; Aged; Scotoma; Hearing Loss/diagnosis; Eye
3.
Atten Percept Psychophys ; 85(3): 834-844, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36229632

ABSTRACT

Explicit knowledge about upcoming target or distractor features can improve performance in tasks like visual search. However, explicit distractor cues generally yield smaller performance benefits than target cues, suggesting that suppressing irrelevant information is less effective than enhancing relevant information. Is this asymmetry a general principle of feature-based attention? Across four experiments (N = 75 each), we compared the efficiency of target selection and distractor ignoring through either incidental experience or explicit instructions. Participants searched for an orientation-defined target amidst seven distractors: three in the target color and four in another color. In Experiment 1, either targets (Exp. 1a) or distractors (Exp. 1b) appeared more often in a specific color than in the other possible search colors. Response times showed comparable benefits of learned attention towards (Exp. 1a) and away from (Exp. 1b) the frequent color, suggesting that learned target selection and distractor ignoring can be equally effective. In Experiment 2, participants completed a nearly identical task, but with explicit cues to the target (Exp. 2a) or distractor color (Exp. 2b), inducing voluntary attention. Both target and distractor cues benefited search performance, but distractor cues much less so than target cues, consistent with previous results. Cross-experiment analyses verified that the relative inefficiency of distractor ignoring versus target selection is a unique characteristic of voluntary attention that is not shared by incidentally learned attention, pointing to dissociable mechanisms by which voluntary and learned attention support distractor ignoring.


Subject(s)
Cues; Learning; Humans; Reaction Time/physiology; Knowledge
4.
Front Aging Neurosci ; 14: 838194, 2022.
Article in English | MEDLINE | ID: mdl-35493928

ABSTRACT

Visual and auditory localization abilities are crucial in real-life tasks such as navigation and social interaction. Aging is frequently accompanied by vision and hearing loss, affecting spatial localization. The purpose of the current study is to elucidate the effect of typical aging on spatial localization and to establish a baseline for older individuals with pathological sensory impairment. Using a verbal report paradigm, we investigated how typical aging affects visual and auditory localization performance, the reliance on vision during sound localization, and sensory integration strategies when localizing audiovisual targets. Younger adults (N = 15, mean age = 26 years) and older adults (N = 13, mean age = 68 years) participated in this study, all with age-adjusted normal vision and hearing based on clinical standards. Localization differed significantly between groups: older adults missed peripheral visual stimuli at higher rates, localized central stimuli as more peripheral, and were less precise in localizing sounds from central locations than younger subjects. Both groups localized auditory targets better when the test space was visible than when blindfolded. The two groups also exhibited similar patterns of audiovisual integration, showing optimal integration in central locations that was consistent with a Maximum-Likelihood Estimation model, but non-optimal integration in peripheral locations. These findings suggest that, despite the age-related changes in auditory and visual localization, the interactions between vision and hearing are largely preserved in older individuals without pathological sensory impairments.
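The Maximum-Likelihood Estimation benchmark referred to here is the standard inverse-variance-weighting model of cue combination: each cue's estimate is weighted by its reliability, and optimal integration predicts a lower variance for the bimodal estimate than for either unimodal estimate. A minimal sketch, in which the function name and the example estimates and variances are hypothetical illustrations, not values from the study:

```python
def mle_integration(est_v, var_v, est_a, var_a):
    """Combine visual and auditory location estimates by
    inverse-variance (reliability) weighting."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_a)  # visual weight
    w_a = 1 - w_v                                # auditory weight
    combined = w_v * est_v + w_a * est_a
    # Optimal integration predicts a combined variance lower than
    # either unimodal variance.
    combined_var = (var_v * var_a) / (var_v + var_a)
    return combined, combined_var

# Hypothetical unimodal estimates (degrees azimuth) and variances:
loc, var = mle_integration(est_v=10.0, var_v=4.0, est_a=14.0, var_a=16.0)
```

With these numbers vision is four times as reliable as audition (w_v = 0.8), so the combined estimate (10.8 deg) sits close to the visual one while its predicted variance (3.2) falls below both unimodal variances; comparing observed bimodal variance against this prediction is how optimality is assessed.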

5.
Prog Neurobiol ; 213: 102269, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35427732

ABSTRACT

Distractor suppression refers to the ability to filter out distracting and task-irrelevant information. Distractor suppression is essential for survival and considered a key aspect of selective attention. Despite the recent and rapidly evolving literature on distractor suppression, we still know little about how the brain suppresses distracting information. What limits progress is that we lack mutually agreed upon principles of how to study the neural basis of distractor suppression and its manifestation in behavior. Here, we offer ten simple rules that we believe are fundamental when investigating distractor suppression. We provide guidelines on how to design conclusive experiments on distractor suppression (Rules 1-3), discuss different types of distractor suppression that need to be distinguished (Rules 4-6), and provide an overview of models of distractor suppression and considerations of how to evaluate distractor suppression statistically (Rules 7-10). Together, these rules provide a concise and comprehensive synopsis of promising advances in the field of distractor suppression. Following these rules will propel research on distractor suppression in important ways, not only by highlighting prominent issues to both new and more advanced researchers in the field, but also by facilitating communication between sub-disciplines.


Subject(s)
Attention; Brain; Humans
7.
Psychon Bull Rev ; 29(4): 1338-1346, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35318583

ABSTRACT

Visual search benefits from advance knowledge of nontarget features. However, it is unknown whether these negatively cued features are suppressed in advance (proactively) or during search (reactively). To test this, we presented color cues, varying from trial to trial, that predicted target or nontarget colors. Experiment 1 (N = 96) showed that both target and nontarget cues speeded search. To test whether attention proactively modified cued feature representations, in Experiment 2 (N = 200) we interleaved color probe and search trials and had participants detect the color of a briefly presented ring that either matched the cued color or not. Participants detected positively cued colors better than other colors, whereas negatively cued colors were detected neither better nor worse than other colors. These results demonstrate that nontarget features are not suppressed proactively and instead suggest that anticipated nontarget features are ignored via reactive mechanisms.


Subject(s)
Attention; Cues; Color Perception; Humans; Reaction Time; Visual Perception
8.
Atten Percept Psychophys ; 84(6): 1901-1912, 2022 Aug.
Article in English | MEDLINE | ID: mdl-34921336

ABSTRACT

Central vision loss disrupts voluntary shifts of spatial attention during visual search. Recently, we reported that a simulated scotoma impaired learned spatial attention towards regions likely to contain search targets. In that task, search items were overlaid on natural scenes. Because natural scenes can induce explicit awareness of learned biases, leading to voluntary shifts of attention, here we used a search display with a blank background less likely to induce awareness of target location probabilities. Participants searched both with and without a simulated central scotoma: a training phase contained targets more often in one screen quadrant, and a testing phase contained targets equally often in all quadrants. In Experiment 1, training used no scotoma, while testing alternated between blocks of scotoma and no-scotoma search. In Experiment 2, training included the scotoma, and testing again alternated between scotoma and no-scotoma search. Response times and saccadic behaviors in both experiments showed attentional biases towards the high-probability target quadrant during scotoma and no-scotoma search. Whereas simulated central vision loss impairs learned spatial attention in the context of natural scenes, our results show that this impairment may not arise from disruption of the basic mechanisms of attentional learning indexed by visual search tasks without scenes.


Subject(s)
Probability Learning; Scotoma; Attention/physiology; Humans; Reaction Time; Saccades
9.
Cortex ; 138: 241-252, 2021 May.
Article in English | MEDLINE | ID: mdl-33735796

ABSTRACT

Some eye diseases, especially macular degeneration, can cause central vision loss (CVL), impairing goal-driven guidance of attention. Does CVL also affect implicit, experience-driven attention? We investigated how simulated central scotomas affected young adults' ability to prioritize locations frequently containing visual search targets (location probability learning). Participants searched among distractor letter 'L's for a target 'T' that appeared more often in one screen quadrant than in the others. To dissociate potential impairments to statistical learning of target locations from impairments to attentional guidance, two experiments each included search with and without simulated scotomas. Experiment 1 successfully induced probability learning in a no-scotoma phase. When participants later searched both with and without simulated scotomas, they showed persistent, statistically equivalent spatial biases in both no-scotoma and scotoma search. Experiment 2 trained participants with a central scotoma. While Experiment 1's participants acquired probability learning regardless of their self-reported awareness of the target's location probability, in Experiment 2 only aware participants learned to bias attention to the high-probability region. Similarly, learning with a scotoma affected search with no scotoma in aware but not unaware participants. Together, these results show that simulated central vision loss interferes with the implicit acquisition of location probability learning, supporting a role for central vision in implicit spatial attentional biases.


Subject(s)
Probability Learning; Scotoma; Bias; Humans; Learning; Probability; Young Adult
10.
Trends Cogn Sci ; 23(11): 927-937, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31521482

ABSTRACT

In addition to conscious goals and stimulus salience, an observer's prior experience also influences selective attention. Early studies demonstrated experience-driven effects on attention mainly in the visual modality, but increasing evidence shows that experience drives auditory selection as well. We review evidence for a multiple-levels framework of auditory attention, in which experience-driven attention relies on mechanisms that acquire control settings and mechanisms that guide attention towards selected stimuli. Mechanisms of acquisition include cue-target associative learning, reward learning, and sensitivity to prior selection history. Once acquired, implementation of these biases can occur either consciously or unconsciously. Future research should more fully characterize the sources of experience-driven auditory attention and investigate the neural mechanisms used to acquire and implement experience-driven auditory attention.


Subject(s)
Attention/physiology; Auditory Perception/physiology; Learning/physiology; Humans
11.
Psychon Bull Rev ; 26(2): 552-558, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30887446

ABSTRACT

We tested whether implicit learning causes shifts of spatial attention in advance of or in response to stimulus onset. Participants completed randomly interspersed trials of letter search, which involved reporting the orientation of a T among Ls, and scene search, which involved identifying which of four scenes was from a target category (e.g., forest). In Experiment 1, an initial phase more often contained target letters in one screen quadrant, while the target scenes appeared equally often in all quadrants. Participants persistently prioritized letter targets in the more probable region, but the implicitly learned preference did not affect the unbiased scene task. In Experiment 2, the spatial probabilities of the scene and letter tasks reversed. Participants unaware of the probability manipulation acquired only a spatial bias to scene targets in the more probable region, with no effect on letter search. Instead of recruiting baseline shifts of spatial attention prior to stimulus onset, implicit learning of target probability yields task-dependent shifts of spatial attention following stimulus onset. Such shifts may involve attentional behaviors unique to certain task contexts.


Subject(s)
Attention; Orientation, Spatial; Probability Learning; Adolescent; Bias; Female; Humans; Male; Reaction Time; Space Perception; Task Performance and Analysis; Visual Perception; Young Adult
12.
J Exp Psychol Hum Percept Perform ; 45(4): 474-488, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30816786

ABSTRACT

Evidence suggests that prior attentional selection guides visuospatial attention without conscious intent. Yet few studies have examined whether selection history influences auditory spatial attention. Using a novel auditory search task, we investigated two selection history effects: short-term intertrial location priming and long-term location probability learning. Participants reported whether a spoken number, occurring simultaneously with three spoken letter distractors presented from different locations, was odd or even. We first showed that endogenous attention guided by informative arrows facilitated search in our paradigm. Next, intertrial location priming was assessed by comparing reaction time when target location repeated across recent trials to when target location changed. Unlike visual search, auditory search showed little evidence of intertrial location priming. In a separate experiment, we investigated location probability learning by making targets disproportionately likely to appear in one location. Results showed location probability learning: participants were faster when targets occurred in the high-probability location than in the low-probability locations. To our knowledge, this is the first study of intertrial location priming or long-term location probability learning in auditory search. The findings have implications for the role of spatial relevance in auditory attention and suggest that long-term attentional learning and short-term priming rely on separate mechanisms. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Subject(s)
Attention/physiology; Auditory Perception/physiology; Probability Learning; Space Perception/physiology; Adolescent; Adult; Female; Humans; Male; Young Adult
13.
J Exp Psychol Hum Percept Perform ; 44(3): 356-366, 2018 Mar.
Article in English | MEDLINE | ID: mdl-28795835

ABSTRACT

To what degree does spatial attention for one task spread to all stimuli in the attended region, regardless of task relevance? Most models imply that spatial attention acts through a unitary priority map in a task-general manner. We show that implicit learning, unlike endogenous spatial cuing, can bias spatial attention within one task without biasing attention to a spatially overlapping secondary task. Participants completed a visual search task superimposed on a background containing scenes, which they were told to encode for a later memory task. Experiments 1 and 2 used explicit instructions to bias spatial attention to one region for visual search; Experiment 3 used location probability cuing to implicitly bias spatial attention. In location probability cuing, a target appeared in one region more than others despite participants not being told of this. In all experiments, search performance was better in the cued region than in uncued regions. However, scene memory was better in the cued region only following endogenous guidance, not after implicit biasing of attention. These data support a dual-system view of top-down attention that dissociates goal-driven and implicitly learned attention. Goal-driven attention is task general, amplifying processing of a cued region across tasks, whereas implicit statistical learning is task-specific.


Subject(s)
Attention/physiology; Goals; Probability Learning; Space Perception/physiology; Visual Perception/physiology; Adult; Cues; Female; Humans; Male; Young Adult