1.
Biol Psychol; 177: 108512, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36724810

ABSTRACT

Past work has shown that when a peripheral sound captures our attention, it activates the contralateral visual cortex, as revealed by an event-related potential component labelled the auditory-evoked contralateral occipital positivity (ACOP). This cross-modal activation of the visual cortex has been observed even when the sounds were irrelevant to the ongoing task (visual or auditory), suggesting that peripheral sounds activate the visual cortex automatically. However, it is unclear whether top-down factors such as visual working memory (VWM) load and endogenous attention, which are known to gate the impact of task-irrelevant information, also modulate this spatially specific component. Here, we asked participants to perform a lateralized VWM task (change detection), whose performance is supported by both endogenous spatial attention and VWM storage. A peripheral sound unrelated to the ongoing task was delivered during the retention interval. The amplitude of the sound-elicited ACOP was analyzed as a function of its spatial correspondence with the cued hemifield and of the memory array set-size. The typical ACOP modulation was observed over parieto-occipital sites in the 280-500 ms time window after sound onset. Its amplitude was not affected by VWM load but was modulated by whether the location of the sound corresponded to the hemifield (right or left) cued for the change detection task. Our results suggest that the sound-elicited activation of visual cortices, as reflected in the ACOP modulation, is unaffected by visual working memory load. However, endogenous spatial attention does affect the ACOP, challenging the hypothesis that it reflects an automatic process.


Subject(s)
Evoked Potentials, Auditory; Visual Cortex; Adult; Female; Humans; Male; Young Adult; Attention/physiology; Evoked Potentials, Auditory/physiology; Memory, Short-Term/physiology; Visual Cortex/physiology
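The contralateral-minus-ipsilateral logic behind the ACOP measure can be sketched numerically. The sketch below is a minimal illustration, not the paper's analysis pipeline: the epoch array, the electrode pair (PO7/PO8), the sampling rate, and the trial labels are all hypothetical, and random noise stands in for real EEG; only the 280-500 ms measurement window over parieto-occipital sites comes from the abstract.

```python
import numpy as np

# Hypothetical setup: epochs of shape (n_trials, n_channels, n_times),
# sampled at 500 Hz, from -100 ms to +600 ms around sound onset.
rng = np.random.default_rng(0)
sfreq = 500
times = np.arange(-0.1, 0.6, 1 / sfreq)       # seconds
n_trials, n_channels = 200, 2                 # chan 0 = left PO7, chan 1 = right PO8
epochs = rng.normal(0.0, 1.0, (n_trials, n_channels, times.size))

# Hypothetical sound side per trial: 0 = left hemifield, 1 = right hemifield.
side = rng.integers(0, 2, n_trials)

# Contralateral = electrode opposite the sound; ipsilateral = same side.
contra = np.where(side[:, None] == 0, epochs[:, 1, :], epochs[:, 0, :])
ipsi = np.where(side[:, None] == 0, epochs[:, 0, :], epochs[:, 1, :])

# ACOP index: mean contralateral-minus-ipsilateral difference wave, then
# the mean amplitude in the 280-500 ms window reported in the abstract.
diff_wave = (contra - ipsi).mean(axis=0)
window = (times >= 0.280) & (times <= 0.500)
acop_amplitude = diff_wave[window].mean()
print(f"ACOP mean amplitude (a.u.): {acop_amplitude:.3f}")
```

With pure noise input the amplitude hovers near zero; a real dataset would show the positive deflection the abstract describes.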
2.
Cogn Sci; 45(6): e13009, 2021 Jun.
Article in English | MEDLINE | ID: mdl-34170027

ABSTRACT

The investigation of visual categorization has recently been aided by the introduction of deep convolutional neural networks (CNNs), which achieve unprecedented accuracy in picture classification after extensive training. Even though the architecture of CNNs is inspired by the organization of the visual brain, the similarity between CNN and human visual processing remains unclear. Here, we investigated this issue by engaging humans and CNNs in a two-class visual categorization task. To this end, pictures containing animals or vehicles were modified to contain only low (LSF) or high spatial frequency (HSF) information, or were scrambled in the phase of their spatial frequency spectrum. For all three types of degradation, accuracy increased as degradation was reduced for both humans and CNNs; however, the thresholds for accurate categorization differed between humans and CNNs. The most marked differences, relative to the other two types of degradation, were observed for HSF information, both in overall accuracy and in image-level agreement between humans and CNNs. The CNNs' difficulty in categorizing high-pass-filtered natural scenes was reduced by picture whitening, a procedure inspired by how biological visual systems process natural images. The results are discussed in relation to adaptation to regularities in the visual environment (scene statistics): if the visual characteristics of the environment are not learned by CNNs, their visual categorization may depend on only a subset of the visual information on which humans rely, for example, low spatial frequency information.


Subject(s)
Neural Networks, Computer; Visual Perception; Animals; Brain; Brain Mapping; Humans
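The image degradations described above (low-pass filtering, high-pass filtering, and phase scrambling), as well as whitening, can all be expressed as manipulations of the 2D Fourier spectrum. The sketch below is a minimal illustration under assumed parameters: the image, its size, and the cutoff frequency are hypothetical, and the paper's exact filtering and normalization choices are not reproduced.

```python
import numpy as np

# Hypothetical 128x128 grayscale image standing in for an animal/vehicle photo.
rng = np.random.default_rng(1)
img = rng.random((128, 128))

# Centered 2D spectrum and a radial spatial-frequency map (cycles/image).
fft = np.fft.fftshift(np.fft.fft2(img))
h, w = img.shape
yy, xx = np.mgrid[-h // 2:h // 2, -w // 2:w // 2]
radius = np.sqrt(xx**2 + yy**2)

cutoff = 8  # hypothetical cutoff; the paper's actual cutoffs are not given here

# Low-pass / high-pass: keep only frequencies below / above the cutoff.
low_pass = np.real(np.fft.ifft2(np.fft.ifftshift(fft * (radius <= cutoff))))
high_pass = np.real(np.fft.ifft2(np.fft.ifftshift(fft * (radius > cutoff))))

# Phase scrambling: keep the amplitude spectrum, randomize the phases.
# (Taking the real part is a common shortcut; a symmetric phase field
# would keep the inverse transform exactly real.)
random_phase = np.exp(1j * rng.uniform(-np.pi, np.pi, fft.shape))
scrambled = np.real(np.fft.ifft2(np.fft.ifftshift(np.abs(fft) * random_phase)))

# Whitening: flatten the amplitude spectrum so all frequencies contribute
# equally, keeping only the phase structure.
whitened = np.real(np.fft.ifft2(np.fft.ifftshift(fft / (np.abs(fft) + 1e-12))))
```

Because the low-pass and high-pass masks partition the spectrum, the two filtered images sum back to the original, which is a handy sanity check for this kind of pipeline.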
3.
J Cogn Neurosci; 31(1): 109-125, 2019 Jan.
Article in English | MEDLINE | ID: mdl-30188778

ABSTRACT

Understanding natural scenes involves the contribution of bottom-up analysis and top-down modulatory processes. However, the interaction of these processes during the categorization of natural scenes is not well understood. In the current study, we approached this issue using ERPs together with behavioral and computational data. We presented pictures of natural scenes and asked participants to categorize them in response to different questions (Is it an animal/vehicle? Is it indoors/outdoors? Are there one or two foreground elements?). ERPs for target scenes requiring a "yes" response started to differ from those for nontarget scenes at 250 msec from picture onset, and this ERP difference was unmodulated by the categorization questions. Earlier ERPs showed category-specific differences (e.g., between animals and vehicles), which were associated with the processing of scene statistics. From 180 msec after scene onset, these category-specific ERP differences were modulated by the categorization question that was asked. Categorization goals thus modulate not only later stages associated with the target/nontarget decision but also earlier perceptual stages involved in the processing of scene statistics.


Subject(s)
Brain/physiology; Decision Making/physiology; Goals; Pattern Recognition, Visual/physiology; Adult; Attention/physiology; Electroencephalography; Evoked Potentials; Female; Humans; Machine Learning; Male; Young Adult
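A common way to quantify the onset latency at which two ERP waveforms begin to differ (here, targets vs. nontargets around 250 msec) is to find the first sample where their difference exceeds a threshold for a sustained run of samples. The sketch below illustrates this on synthetic waveforms; the sampling rate, noise level, threshold, and run length are all hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical grand-average ERPs for target and nontarget scenes,
# sampled at 500 Hz from 0 to 600 ms after picture onset.
sfreq = 500
times = np.arange(0.0, 0.6, 1 / sfreq)
rng = np.random.default_rng(2)
noise = rng.normal(0.0, 0.05, (2, times.size))
target = noise[0] + np.where(times >= 0.250, 1.0, 0.0)  # diverges at 250 ms
nontarget = noise[1]

# Onset of the target/nontarget difference: first sample where the absolute
# difference exceeds the threshold for 10 consecutive samples (20 ms).
diff = np.abs(target - nontarget)
above = diff > 0.5
run = np.convolve(above.astype(int), np.ones(10, dtype=int), mode="valid")
onset_idx = int(np.argmax(run == 10))
onset_ms = times[onset_idx] * 1000
print(f"estimated divergence onset: {onset_ms:.0f} ms")
```

The sustained-run criterion guards against single noisy samples crossing the threshold; real ERP analyses typically add cluster-based or permutation statistics on top of this idea.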