1.
Front Hum Neurosci; 18: 1331253, 2024.
Article in English | MEDLINE | ID: mdl-38566999

ABSTRACT

Introduction: The concept of affordance refers to the opportunities for action provided by the environment, often conveyed through visual information. It has been applied to explain visuomotor processing and movement planning. Because emotion modulates both visual perception and the motor system, it is reasonable to ask whether emotion can also influence affordance judgments. If present, this relationship would have important ontological implications for affordances. We therefore investigated whether the emotional value of manipulable objects affected judgments of the appropriate grasp for interacting with them (i.e., their affordance). Methods: Volunteers used a numerical scale to report how an observed object should be grasped. We compared these judgments across emotional categories of objects (pleasant, unpleasant, and neutral) while also accounting for the expected effect of object size. Results: Unpleasant objects were rated as more appropriately grasped by a precision grip than pleasant and neutral objects. Smaller object size also favored this judgment, and the size effect was of equal magnitude across all emotional categories examined. Discussion: Our findings suggest that the emotional value of objects modulates affordance judgments in a way that favors careful manipulation and minimal physical contact with aversive stimuli. Finally, we discuss how this affective aspect of our experience of objects overlaps with what affordances are conceptualized to be, calling for further reexamination of the relationship between affordances and emotions.
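The design implies a two-factor comparison (emotional category x object size) of grasp ratings. The following is a minimal sketch of that kind of analysis, not the authors' code: the data, column names, and scale direction (1 = precision grip appropriate, 9 = power grip appropriate) are illustrative assumptions.

```python
# Hypothetical two-way ANOVA sketch for grasp-type ratings; all data simulated.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Each row is one judgment on an assumed 1-9 scale (1 = precision grip).
df = pd.DataFrame({
    "category": ["unpleasant"] * 4 + ["neutral"] * 4 + ["pleasant"] * 4,
    "size":     ["small", "small", "large", "large"] * 3,
    "rating":   [2, 1, 4, 3, 4, 5, 6, 5, 5, 6, 8, 7],
})

model = ols("rating ~ C(category) * C(size)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and their interaction
```

An equal-magnitude size effect across categories, as the abstract reports, would show up here as significant main effects without a significant interaction term.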

2.
J Neurosci; 42(45): 8556-8568, 2022 Nov 9.
Article in English | MEDLINE | ID: mdl-36150889

ABSTRACT

An increasing number of studies have shown that cross-modal interaction can occur in early sensory cortices. Yet how neurons in sensory cortices integrate multisensory cues in perceptual tasks, and to what extent this influences behavior, remains largely unclear. To investigate, we examined visual modulation of auditory responses in the primary auditory cortex (A1) during a two-alternative forced-choice task. During the task, male rats were required to make a behavioral choice based on the pure-tone frequency (low vs. high) of a self-triggered stimulus to obtain a water reward. The results showed that the presence of a noninformative visual cue did not uniformly influence auditory responses, frequently enhancing the response to only one of the two tones. Closely correlated with behavioral choice, the visual cue mainly enhanced responsiveness to the auditory cue that indicated a movement direction contralateral to the recorded A1. Operating in this fashion gave A1 neurons a superior capability to discriminate sounds during multisensory trials. Concomitantly, behavioral data and decoding analysis revealed that the presence of the visual cue could speed sound discrimination. We also observed this differential multisensory integration effect in well-trained rats tested with passive stimulation and under anesthesia, albeit to a much lesser extent, but we did not see it when recording in A1 of a similar group of rats performing a free-choice task. These data suggest that the auditory cortex can engage in meaningful audiovisual processing, and that perceptual learning can modify its multisensory integration mechanism to meet task requirements. SIGNIFICANCE STATEMENT: In the natural environment, visual stimuli are frequently accompanied by auditory cues. Although multisensory integration has traditionally been seen as a feature of associational cortices, recent studies have shown that cross-modal inputs can also influence neuronal activity in primary sensory cortices. However, exactly how neurons in sensory cortices integrate multisensory cues to guide behavioral choice is still unclear. Here, we describe a novel mode of multisensory integration by which A1 neurons shape auditory representations while rats perform a cue-guided task. We found that a task-irrelevant visual cue could specifically enhance neuronal responses to the sound guiding the contralateral choice. This differential integration facilitated sound discrimination and behavioral choice, indicating that task engagement can modulate multisensory integration.
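Multisensory enhancement of the kind described here is conventionally quantified with the classic enhancement index from the multisensory integration literature; the sketch below illustrates that standard metric on simulated firing rates, not the paper's exact analysis.

```python
# Illustrative sketch of the standard multisensory enhancement index
# (percent change relative to the best unisensory response); data simulated.
import numpy as np

def enhancement_index(multisensory_rate, best_unisensory_rate):
    """Positive values indicate enhancement, negative values inhibition."""
    return 100.0 * (multisensory_rate - best_unisensory_rate) / best_unisensory_rate

# Hypothetical trial-averaged firing rates (spikes/s) for one A1 neuron:
auditory_only = np.mean([12.0, 14.5, 13.2])   # tone alone
audiovisual   = np.mean([18.1, 17.4, 19.0])   # tone + task-irrelevant light
print(f"enhancement: {enhancement_index(audiovisual, auditory_only):+.1f}%")
```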


Subject(s)
Auditory Cortex, Male, Rats, Animals, Auditory Cortex/physiology, Acoustic Stimulation, Photic Stimulation, Auditory Perception/physiology, Discrimination, Psychological, Visual Perception/physiology
3.
Cereb Cortex; 32(5): 1040-1054, 2022 Feb 19.
Article in English | MEDLINE | ID: mdl-34378017

ABSTRACT

Sensory cortices, classically considered to represent modality-specific sensory information, are also found to engage in multisensory processing. However, how sensory processing in sensory cortices is cross-modally modulated remains an open question. Specifically, little is known about cross-modal representation in sensory cortices during perceptual tasks and how perceptual learning modifies this process. Here, we recorded neural responses in the primary auditory cortex (A1) both while freely moving rats discriminated stimuli in Go/No-Go tasks and under anesthesia. Our data show that cross-modal representation in auditory cortex varies with task context. In a task in which an audiovisual cue was the target associated with water reward, a significantly higher proportion of auditory neurons showed a visually evoked response, and the vast majority of auditory neurons processing auditory-visual interactions exhibited significant multisensory enhancement. However, when the rats performed tasks in which a unisensory cue was the target, cross-modal inhibition, rather than enhancement, predominated. In addition, multisensory associational learning appeared to leave a trace of plastic change in A1, as a larger proportion of A1 neurons showed multisensory enhancement under anesthesia. These findings indicate that multisensory processing in primary sensory cortices is not static, and that including cross-modal interaction in the task requirements can substantially enhance multisensory processing in sensory cortices.
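Counting proportions of "enhanced" vs. "inhibited" neurons, as above, requires a per-neuron statistical test. The abstract does not specify the criteria used, so the following is a hedged sketch of one common approach, a permutation test on simulated trial firing rates.

```python
# Hypothetical sketch: labeling a neuron's cross-modal effect by permutation
# test (multisensory vs. best unisensory trials); all data simulated.
import numpy as np

def classify_neuron(multi, uni, n_perm=10_000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    observed = multi.mean() - uni.mean()
    pooled = np.concatenate([multi, uni])
    null = np.empty(n_perm)
    for i in range(n_perm):
        rng.shuffle(pooled)  # break the trial labels
        null[i] = pooled[:len(multi)].mean() - pooled[len(multi):].mean()
    p = np.mean(np.abs(null) >= abs(observed))  # two-sided p-value
    if p >= alpha:
        return "no interaction"
    return "enhanced" if observed > 0 else "inhibited"

multi = np.random.default_rng(2).normal(15, 3, 40)  # audiovisual trials
uni   = np.random.default_rng(3).normal(11, 3, 40)  # best unisensory trials
print(classify_neuron(multi, uni))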


Subject(s)
Auditory Cortex, Acoustic Stimulation, Animals, Auditory Cortex/physiology, Auditory Perception/physiology, Cues, Photic Stimulation, Rats, Visual Perception/physiology
4.
Behav Brain Res; 406: 113223, 2021 May 21.
Article in English | MEDLINE | ID: mdl-33677014

ABSTRACT

Most everyday actions involve interactions with meaningful, emotionally laden stimuli. This study aimed to select pictures of graspable objects for use as emotional affordance stimuli. Participants' depression trait was also assessed to examine its effect on the judgment of these pictures, and the time spent on classification was recorded. Sixty-three participants took part in the study. The Self-Assessment Manikin scale was used to classify the object pictures, and the Beck Depression Inventory was applied to stratify the sample according to depression trait. Cluster analysis was used to classify 123 objects based on valence and arousal values and returned 102 classified pictures in three categories: pleasant (21), neutral (48), and unpleasant (33); the 21 pictures on which the cluster analyses did not agree were excluded from further use. Pleasant pictures presented the highest valence values and unpleasant pictures the lowest, and both categories evoked the highest arousal levels; in the middle of the valence range, the neutral category evoked the lowest arousal levels. Participants were slower to classify unpleasant pictures on the valence subscale and faster to classify neutral pictures on the arousal subscale. There was no effect of depression on the response time needed to score the pictures. Thus, agreement between high-performance soft clustering algorithms emerged as a good tool for classifying pictures of objects along the valence and arousal dimensions, and depression trait did not significantly affect the accuracy or speed of emotional classification. Finally, we present a set of emotional stimuli that can be employed to examine distinct effects of emotion on physiology and behavior.
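The abstract names "soft clustering" in valence-arousal space without specifying the algorithms. As a minimal sketch under that assumption, a Gaussian mixture model (one standard soft clusterer) can assign membership probabilities and flag ambiguous pictures for exclusion; the ratings below are simulated, not the study's data.

```python
# Hypothetical soft-clustering sketch in valence-arousal space; data simulated.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated mean SAM ratings for 123 pictures: columns = (valence, arousal).
ratings = np.vstack([
    rng.normal([7.5, 5.5], 0.8, (40, 2)),   # pleasant-like
    rng.normal([5.0, 3.0], 0.8, (43, 2)),   # neutral-like
    rng.normal([2.5, 6.0], 0.8, (40, 2)),   # unpleasant-like
])

gmm = GaussianMixture(n_components=3, random_state=0).fit(ratings)
posterior = gmm.predict_proba(ratings)       # soft cluster memberships
labels = posterior.argmax(axis=1)
confident = posterior.max(axis=1) > 0.9      # drop ambiguous pictures
print(f"kept {confident.sum()} of {len(ratings)} pictures")
print("cluster sizes:", np.bincount(labels[confident]))
```

In the study, the analogous exclusion step (pictures on which the clusterings disagreed) reduced the set from 123 to 102 pictures.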


Subject(s)
Concept Formation/physiology, Depression/physiopathology, Depressive Disorder/physiopathology, Emotions/physiology, Pattern Recognition, Visual/physiology, Adult, Cluster Analysis, Female, Humans, Male, Pleasure/physiology, Young Adult
5.
Mol Brain; 14(1): 13, 2021 Jan 15.
Article in English | MEDLINE | ID: mdl-33446258

ABSTRACT

Cross-modal interaction (CMI) can significantly influence perceptual and decision-making processes in many circumstances, yet the integrative strategies the brain employs in different task contexts remain poorly understood. To explore this, we examined neural activity in the medial prefrontal cortex (mPFC) of rats performing cue-guided two-alternative forced-choice tasks. In a task requiring rats to discriminate stimuli based on an auditory cue, the simultaneous presentation of an uninformative visual cue substantially strengthened mPFC neurons' capability for auditory discrimination, mainly by enhancing the response to the preferred cue; it also increased the number of neurons showing a cue preference. When the task was changed slightly so that a visual cue, like the auditory one, denoted a specific behavioral direction, mPFC neurons frequently showed a different CMI pattern, with cross-modal enhancement best evoked in information-congruent multisensory trials. In a free-choice task, however, the majority of neurons showed neither a cross-modal enhancement effect nor a cue preference. These results indicate that CMI at the neuronal level is context dependent, in a way that differs from what has been shown in previous studies.
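A neuron's "capability for auditory discrimination" is often summarized with ROC analysis of trial firing rates. The sketch below illustrates that generic approach on simulated data, comparing auditory-only against audiovisual trials; it is an assumption-laden stand-in, not the paper's analysis.

```python
# Illustrative ROC sketch: single-neuron cue discrimination with and without
# an uninformative visual cue; firing rates are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# Trial firing rates for the preferred vs. non-preferred auditory cue.
pref_a,  nonpref_a  = rng.normal(12, 3, 50), rng.normal(10, 3, 50)  # auditory only
pref_av, nonpref_av = rng.normal(16, 3, 50), rng.normal(10, 3, 50)  # + visual cue

def discrimination(pref, nonpref):
    y = np.r_[np.ones(len(pref)), np.zeros(len(nonpref))]
    return roc_auc_score(y, np.r_[pref, nonpref])  # 0.5 = chance, 1 = perfect

print(f"auditory only: AUC = {discrimination(pref_a, nonpref_a):.2f}")
print(f"audiovisual:   AUC = {discrimination(pref_av, nonpref_av):.2f}")
```

Enhancing only the preferred-cue response, as the abstract describes, widens the separation between the two trial distributions and thus raises the AUC.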


Subject(s)
Choice Behavior, Prefrontal Cortex/physiology, Acoustic Stimulation, Animals, Behavior, Animal, Cues, Male, Neurons/physiology, Perception/physiology, Photic Stimulation, Rats, Sprague-Dawley, Task Performance and Analysis
6.
J Physiol; 596(20): 5033-5050, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30144059

ABSTRACT

KEY POINTS: It has been known for some time that sensory information of one modality can bias the spatial perception of another. However, evidence that this occurs in individual neurons has been lacking. In the present study, we found that the spatial receptive field of multisensory neurons in the superior colliculus could be dynamically shifted by a preceding stimulus of a different modality. The extent of the receptive field shift depended on the temporal and spatial gaps between the preceding and following stimuli, as well as on the salience of the preceding stimulus. This result provides a neural mechanism that could underlie cross-modal spatial calibration. ABSTRACT: Psychophysical studies have shown that the different senses can be spatially entrained by each other, as in ventriloquism, in which a visual stimulus attracts the perceived location of a spatially discordant sound. However, the neural mechanism underlying this cross-modal spatial recalibration, and whether it takes place dynamically, has remained unclear. We explored these issues in multisensory neurons of the cat superior colliculus (SC), a midbrain structure involved in both cross-modal and sensorimotor integration. Sequential cross-modal stimulation showed that a preceding stimulus can shift the receptive field (RF) of the response to the lagging stimulus. This cross-modal spatial calibration took place in both auditory and visual RFs, although auditory RFs shifted slightly more. By contrast, a preceding stimulus of the same modality failed to induce a similarly substantial RF shift. The extent of the RF shift depended on the temporal and spatial gaps between the preceding and following stimuli, as well as on the salience of the preceding stimulus: a narrow time gap and high stimulus salience induced larger RF shifts. In addition, when visual and auditory stimuli were presented simultaneously, a substantial RF shift toward the location-fixed stimulus was also induced. Taken together, these results reveal an online cross-modal calibration process and detail how inter-sensory spatial calibration is organized in the SC.
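One common way to quantify an RF shift like this is to compare the response-weighted centroid of the spatial response map before and after the preceding stimulus. The sketch below illustrates that generic computation on simulated one-dimensional maps; the grid, tuning shapes, and shift size are assumptions.

```python
# Hypothetical RF-shift sketch: displacement of the response-weighted
# centroid of a 1-D spatial response map; all maps simulated.
import numpy as np

def rf_centroid(response_map, azimuths):
    """Response-weighted center of a 1-D spatial receptive field."""
    w = response_map - response_map.min()   # baseline-subtract
    return np.sum(w * azimuths) / np.sum(w)

azimuths = np.linspace(-90, 90, 37)         # stimulus positions (deg)
baseline = np.exp(-0.5 * ((azimuths - 10) / 20) ** 2)  # RF centered at +10 deg
shifted  = np.exp(-0.5 * ((azimuths - 22) / 20) ** 2)  # after preceding stimulus

shift = rf_centroid(shifted, azimuths) - rf_centroid(baseline, azimuths)
print(f"RF shift: {shift:+.1f} deg toward the preceding stimulus")
```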


Subject(s)
Superior Colliculi/physiology, Animals, Auditory Perception, Cats, Evoked Potentials, Male, Superior Colliculi/cytology, Visual Perception
7.
Cereb Cortex; 27(12): 5568-5578, 2017 Dec 1.
Article in English | MEDLINE | ID: mdl-27797831

ABSTRACT

Physiological and behavioral studies in cats show that corticotectal inputs play a critical role in the information-processing capabilities of neurons in the deeper layers of the superior colliculus (SC). Among these inputs, those from functionally related associational cortices are especially critical for SC multisensory integration, but the mechanism underlying this influence has remained unclear. Here, we demonstrate that deactivating the relevant cortices both displaces SC visual and auditory spatial receptive fields (RFs) and decreases their overall size, resulting in reduced alignment. Further analysis showed that this RF separation is significantly correlated with the decrement in neurons' multisensory enhancement and is most pronounced under low stimulus intensities. In addition, cortical deactivation could influence the degree of stimulus effectiveness, illustrating a means by which higher-order cortices may modify the multisensory activity of the SC.
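RF alignment between modalities can be summarized with a simple overlap index (intersection over union of the two RFs). The sketch below is an illustrative version of such a measure on simulated binarized RFs; the RF positions and the deactivation effect shown are assumptions, not the paper's data.

```python
# Hypothetical RF-alignment sketch: intersection-over-union of binarized
# visual and auditory spatial RFs, intact vs. after cortical deactivation.
import numpy as np

def overlap_index(rf_a, rf_b):
    """Intersection over union of two binarized 1-D receptive fields."""
    inter = np.logical_and(rf_a, rf_b).sum()
    union = np.logical_or(rf_a, rf_b).sum()
    return inter / union if union else 0.0

azimuths = np.linspace(-90, 90, 181)
visual         = np.abs(azimuths - 5) < 25    # visual RF: -20 to +30 deg
auditory       = np.abs(azimuths - 10) < 30   # auditory RF: -20 to +40 deg
auditory_deact = np.abs(azimuths - 30) < 20   # displaced, shrunken auditory RF

print(f"alignment (intact):      {overlap_index(visual, auditory):.2f}")
print(f"alignment (deactivated): {overlap_index(visual, auditory_deact):.2f}")
```

A drop in this index after deactivation would correspond to the reduced RF alignment that the study links to decreased multisensory enhancement.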


Subject(s)
Auditory Perception/physiology, Cerebral Cortex/physiology, Neurons/physiology, Superior Colliculi/physiology, Visual Perception/physiology, Acoustic Stimulation, Action Potentials, Animals, Cats, Cold Temperature, Electrodes, Implanted, Female, Male, Photic Stimulation, Visual Fields/physiology, Water