1.
Behav Res Methods ; 55(2): 583-599, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35353316

ABSTRACT

Many applied screening tasks (e.g., medical image or baggage screening) involve challenging searches to which standard laboratory search is rarely equivalent. For example, whereas laboratory search frequently requires observers to look for precisely defined targets among isolated, non-overlapping images randomly arrayed on clean backgrounds, medical images present unspecified targets in noisy, yet spatially regular scenes. Those unspecified targets are typically oddities: elements that do not belong. To develop a closer laboratory analogue to such tasks, we created a database of scenes containing subtle, ill-specified "oddity" targets. These scenes have perceptual densities and spatial regularities similar to those found in expert search tasks, and each includes 16 variants of the unedited scene wherein an oddity (a subtle deformation of the scene) is hidden. In Experiment 1, eight volunteers searched thousands of scene variants for an oddity. Regardless of their search accuracy, they were then shown the highlighted anomaly and rated its subtlety. Subtlety ratings reliably predicted search performance (accuracy and response times), and did so better than image statistics. In Experiment 2, we conducted a conceptual replication in which a larger group of naïve searchers scanned subsets of the scene variants. Prior subtlety ratings reliably predicted search outcomes. Whereas medical image targets are difficult for naïve searchers to detect, our database contains thousands of interior and exterior scenes that vary in difficulty but are nevertheless searchable by novices. In this way, the stimuli will be useful for studying visual search as it typically occurs in expert domains: ill-specified search for anomalies in noisy displays.
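
The central analytic claim here, that rated subtlety predicts search accuracy and response times better than image statistics do, amounts to a model comparison. A minimal Python sketch on synthetic data follows; the column names (subtlety, edge_density, luminance_contrast, rt) are hypothetical placeholders, not fields from the published database.

    # Illustrative model comparison: does rated subtlety predict search RT
    # better than simple image statistics? Synthetic data, hypothetical columns.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "subtlety": rng.uniform(1, 7, n),           # rated subtlety of the hidden oddity
        "edge_density": rng.normal(0, 1, n),        # example image statistic
        "luminance_contrast": rng.normal(0, 1, n),  # example image statistic
    })
    # Simulate RTs that depend mostly on subtlety, mirroring the reported pattern.
    df["rt"] = 2000 + 400 * df["subtlety"] + 50 * df["edge_density"] + rng.normal(0, 300, n)

    rt_on_subtlety = smf.ols("rt ~ subtlety", data=df).fit()
    rt_on_imagestats = smf.ols("rt ~ edge_density + luminance_contrast", data=df).fit()
    print(rt_on_subtlety.rsquared, rt_on_imagestats.rsquared)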


Subject(s)
Pattern Recognition, Visual , Visual Perception , Humans , Reaction Time/physiology , Databases, Factual , Visual Perception/physiology , Pattern Recognition, Visual/physiology
2.
Atten Percept Psychophys ; 83(7): 2753-2783, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34089167

ABSTRACT

Examining eye-movement behavior during visual search is an increasingly popular approach for gaining insights into the moment-to-moment processing that takes place when we look for targets in our environment. In this tutorial review, we describe a set of pitfalls and considerations that are important for researchers, both experienced and new to the field, when conducting eye-movement and visual search experiments. We walk the reader through the research cycle of a visual search and eye-movement experiment, from choosing the right predictions, through data collection, reporting of methodology, analytic approaches, and the choice of dependent variables, to drawing conclusions from patterns of results. Overall, our hope is that this review can serve as a guide, a talking point, a reflection on the practices and potential problems in the current literature on this topic, and ultimately a first step towards standardizing research practices in the field.


Subject(s)
Eye Movements , Movement , Humans
3.
Cognition ; 210: 104587, 2021 May.
Article in English | MEDLINE | ID: mdl-33508577

ABSTRACT

The label-feedback hypothesis (Lupyan, 2012) proposes that language modulates low- and high-level visual processing, such as priming visual object perception. Lupyan and Swingley (2012) found that repeating target names facilitates visual search, resulting in shorter response times (RTs) and higher accuracy. In the present investigation, we conceptually replicated and extended their study, using additional control conditions and recording eye movements during search. Our goal was to evaluate whether self-directed speech influences how targets are located (i.e., attentional guidance) or how objects are perceived (i.e., distractor rejection and target appreciation). In three experiments, during object search, people spoke target names, nonwords, irrelevant (absent) object names, or irrelevant (present) object names (all within-participants). Experiments 1 and 2 examined search RTs and accuracy: Speaking target names improved performance, without differences among the remaining conditions. Experiment 3 incorporated eye-tracking: Gaze fixation patterns suggested that language does not affect attentional guidance, but instead affects both distractor rejection and target appreciation. When search trials were conditionalized according to distractor fixations, language effects became more orderly: Search was fastest while people spoke target names, followed in linear order by the nonword, distractor-absent, and distractor-present conditions. We suggest that language affects template maintenance during search, allowing fluent differentiation of targets and distractors. Materials, data, and analyses can be retrieved here: https://osf.io/z9ex2/.
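
The conditionalization step described here, restricting analysis to trials on which a distractor was actually fixated before comparing the speech conditions, can be sketched in a few lines of pandas. The frame below uses synthetic trials and hypothetical column names; it illustrates the logic only and is not the authors' analysis (their materials are at the OSF link above).

    # Compare RTs across speech conditions, conditionalized on distractor fixations.
    # Synthetic data; condition labels and columns are hypothetical.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    conditions = ["target_name", "nonword", "absent_object_name", "present_object_name"]
    n = 400
    trials = pd.DataFrame({
        "condition": rng.choice(conditions, n),
        "distractor_fixated": rng.random(n) < 0.5,  # was any distractor fixated?
        "rt": rng.normal(1200, 250, n),             # response time in ms
    })

    # Keep only trials with at least one distractor fixation, then summarize per condition.
    conditionalized = (trials[trials["distractor_fixated"]]
                       .groupby("condition")["rt"]
                       .agg(["mean", "count"]))
    print(conditionalized)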


Subject(s)
Attention , Eye Movements , Feedback , Humans , Reaction Time , Visual Perception
4.
J Exp Psychol Hum Percept Perform ; 46(3): 274-291, 2020 Mar.
Article in English | MEDLINE | ID: mdl-32077742

ABSTRACT

Research by Rajsic, Wilson, and Pratt (2015, 2017) suggests that people are biased to use a target-confirming strategy when performing simple visual search. In 3 experiments, we sought to determine whether another stubborn phenomenon in visual search, the low-prevalence effect (Wolfe, Horowitz, & Kenner, 2005), would modulate this confirmatory bias. We varied the reliability of the initial cue: For some people, targets usually occurred in the cued color (high prevalence). For others, targets rarely matched the cues (low prevalence). High cue-target prevalence exacerbated the confirmation bias, indexed via search response times (RTs) and eye-tracking measures. Surprisingly, given low cue-target prevalence, people remained biased to examine cue-colored letters, even though cue-colored targets were exceedingly rare. At the same time, people were more fluent at detecting the more common, cue-mismatching targets. The findings suggest that attention is guided to "confirm" the more available cued target template, but prevalence learning over time determines how fluently objects are perceptually appreciated.


Subject(s)
Attention/physiology , Color Perception/physiology , Cues , Pattern Recognition, Visual/physiology , Space Perception/physiology , Adult , Eye Movement Measurements , Female , Humans , Male , Reaction Time/physiology , Young Adult
5.
Atten Percept Psychophys ; 79(6): 1695-1725, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28508116

ABSTRACT

Recent research has suggested that bilinguals show advantages over monolinguals in visual search tasks, although these findings have been derived from global behavioral measures of accuracy and response times. In the present study we sought to explore the bilingual advantage by using more sensitive eye-tracking techniques across three visual search experiments. These spatially and temporally fine-grained measures allowed us to carefully investigate any nuanced attentional differences between bilinguals and monolinguals. Bilingual and monolingual participants completed visual search tasks that varied in difficulty. The experiments required participants to make careful discriminations in order to detect target Landolt Cs among similar distractors. In Experiment 1, participants performed both feature and conjunction search. In Experiments 2 and 3, participants performed visual search while making different types of speeded discriminations, after either locating the target or mentally updating a constantly changing target. The results across all experiments revealed that bilinguals and monolinguals were equally efficient at guiding attention and generating responses. These findings suggest that the bilingual advantage does not reflect a general benefit in attentional guidance, but could reflect more efficient guidance only under specific task demands.


Subject(s)
Attention/physiology , Eye Movements , Multilingualism , Visual Perception/physiology , Adult , Female , Humans , Male , Reaction Time/physiology , Young Adult
6.
Atten Percept Psychophys ; 78(8): 2633-2654, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27531018

ABSTRACT

During visual search, people are distracted by objects that visually resemble search targets; search is impaired when targets and distractors share overlapping features. In this study, we examined whether a nonvisual form of similarity, overlapping object names, can also affect search performance. In three experiments, people searched for images of real-world objects (e.g., a beetle) among items whose names either all shared the same phonological onset (/bi/) or were phonologically varied. Participants searched for either 1 or 3 potential targets per trial, with search targets designated either visually or verbally. We examined standard visual search (Experiments 1 and 3) and a self-paced serial search task wherein participants manually rejected each distractor (Experiment 2). We hypothesized that people would maintain visual templates when searching for single targets, but would rely more on object names when searching for multiple items and when targets were verbally cued. This reliance on target names would make performance susceptible to interference from similar-sounding distractors. Experiments 1 and 2 showed the predicted interference effect in conditions with high memory load and verbal cues. In Experiment 3, eye-movement results showed that phonological interference resulted from small increases in dwell time to all distractors. The results suggest that distractor names are implicitly activated during search, slowing attention disengagement when targets and distractors share similar names.
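
Dwell time here is the summed duration of the fixations an object receives within a trial. The pandas sketch below shows that aggregation on made-up fixation records with hypothetical column names; it illustrates how per-trial distractor dwell times could be computed and averaged by condition, not the authors' actual pipeline.

    # Aggregate fixation durations into per-trial distractor dwell times,
    # then average by phonological condition. Synthetic records, hypothetical columns.
    import pandas as pd

    fixations = pd.DataFrame({
        "trial":       [1, 1, 1, 2, 2, 2, 2],
        "condition":   ["phon_overlap"] * 3 + ["phon_varied"] * 4,
        "object_role": ["distractor", "distractor", "target",
                        "distractor", "distractor", "distractor", "target"],
        "duration_ms": [180, 210, 320, 150, 175, 160, 300],
    })

    distractor_dwell = (fixations[fixations["object_role"] == "distractor"]
                        .groupby(["condition", "trial"])["duration_ms"].sum()
                        .groupby(level="condition").mean())
    print(distractor_dwell)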


Subject(s)
Attention/physiology , Names , Visual Perception/physiology , Adult , Analysis of Variance , Association , Cues , Eye Movements/physiology , Female , Humans , Linguistics , Male , Memory/physiology , Neuropsychological Tests , Perceptual Masking/physiology , Photic Stimulation/methods , Reaction Time/physiology
7.
J Exp Psychol Hum Percept Perform ; 41(4): 1007-20, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25938253

ABSTRACT

When engaged in a visual search for two targets, participants are slower and less accurate in their responses, relative to their performance when searching for single targets. Previous work on this "dual-target cost" has primarily focused on the breakdown of attentional guidance when looking for two items. Here, we investigated how object identification processes are affected by dual-target search. Our goal was to chart the speed at which distractors could be rejected, to assess whether dual-target search impairs object identification. To do so, we examined the capacity coefficient, which measures the speed at which decisions can be made and provides a baseline of parallel performance for comparison. We found that participants could search at or above this baseline, suggesting that dual-target search does not impair object identification abilities. We also found substantial differences in performance when participants were asked to search for simple versus complex images. Somewhat paradoxically, participants were able to reject complex images more rapidly than simple images. We suggest that this reflects the greater number of features that can be used to identify complex images, a finding that has important consequences for understanding object identification in visual search more generally.
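
The capacity coefficient mentioned here comes from the workload-capacity literature (Townsend and colleagues). In its standard OR-design form it compares integrated hazard functions, C(t) = H_dual(t) / (H_A(t) + H_B(t)), where H(t) = -ln S(t) is the cumulative hazard of the response-time distribution and C(t) = 1 marks the unlimited-capacity parallel baseline. The sketch below estimates that generic statistic from raw RTs; it illustrates the textbook definition and is not necessarily the exact variant the authors applied to distractor rejection.

    # Generic OR-design capacity coefficient estimated from raw correct RTs.
    import numpy as np

    def cumulative_hazard(rts, t_grid):
        """H(t) = -log S(t), with S(t) the empirical survivor function of the RTs."""
        rts = np.asarray(rts, dtype=float)
        surv = np.array([(rts > t).mean() for t in t_grid])
        surv = np.clip(surv, 1e-6, 1.0)  # avoid log(0) in the right tail
        return -np.log(surv)

    def capacity_or(rt_dual, rt_single_a, rt_single_b, t_grid):
        """C(t) = H_dual(t) / (H_A(t) + H_B(t)); values near 1 match the parallel baseline."""
        h_dual = cumulative_hazard(rt_dual, t_grid)
        h_sum = cumulative_hazard(rt_single_a, t_grid) + cumulative_hazard(rt_single_b, t_grid)
        return np.divide(h_dual, h_sum, out=np.full_like(h_dual, np.nan), where=h_sum > 0)

    # Toy usage with simulated RTs (ms).
    rng = np.random.default_rng(2)
    t_grid = np.linspace(400, 2000, 50)
    c_t = capacity_or(rng.normal(900, 150, 300),
                      rng.normal(850, 150, 300),
                      rng.normal(880, 150, 300), t_grid)
    print(np.nanmean(c_t))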


Subject(s)
Attention/physiology , Pattern Recognition, Visual/physiology , Psychomotor Performance/physiology , Adult , Humans , Young Adult
8.
J Exp Psychol Hum Percept Perform ; 41(4): 977-94, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25915073

ABSTRACT

In visual search, rare targets are missed disproportionately often. This low-prevalence effect (LPE) is a robust problem with demonstrable societal consequences. What is the source of the LPE? Is it a perceptual bias against rare targets or a later process, such as premature search termination or motor response errors? In 4 experiments, we examined the LPE using standard visual search (with eye tracking) and 2 variants of rapid serial visual presentation (RSVP) in which observers made present/absent decisions after sequences ended. In all experiments, observers looked for 2 target categories (teddy bear and butterfly) simultaneously. To minimize simple motor errors caused by repetitive absent responses, we held overall target prevalence at 50%, with 1 low-prevalence and 1 high-prevalence target type. Across conditions, observers either searched for targets among other real-world objects or searched for specific bears or butterflies among within-category distractors. We report 4 main results: (a) In standard search, high-prevalence targets were found more quickly and accurately than low-prevalence targets. (b) The LPE persisted in RSVP search, even though observers never terminated search on their own. (c) Eye-tracking analyses showed that high-prevalence targets elicited better attentional guidance and faster perceptual decisions. And (d) even when observers looked directly at low-prevalence targets, they often (12%-34% of trials) failed to detect them. These results strongly argue that low-prevalence misses represent failures of perception when early search termination or motor errors are controlled.
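
The prevalence manipulation described here, 50% overall target prevalence split unevenly across the two target categories, is straightforward to mock up. The sketch below builds such a trial list; the low/high split used (20% vs. 80% of target-present trials) is an assumption for illustration, since the abstract does not report the exact ratio.

    # Build a trial list with 50% overall target prevalence, split unevenly
    # between a low-prevalence and a high-prevalence target category.
    import random

    def build_trials(n_trials=400, p_target=0.50, p_low_given_target=0.20, seed=0):
        """Return a shuffled list of trial targets; None marks target-absent trials.
        The 20/80 within-target split is illustrative, not taken from the paper."""
        rng = random.Random(seed)
        trials = []
        for _ in range(n_trials):
            if rng.random() < p_target:
                trials.append("butterfly" if rng.random() < p_low_given_target else "teddy_bear")
            else:
                trials.append(None)
        rng.shuffle(trials)
        return trials

    trials = build_trials()
    print(trials.count("teddy_bear"), trials.count("butterfly"), trials.count(None))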


Subject(s)
Eye Movements/physiology , Pattern Recognition, Visual/physiology , Psychomotor Performance/physiology , Adult , Humans , Young Adult