Results 1 - 6 of 6
1.
Sci Rep ; 13(1): 6478, 2023 Apr 20.
Article in English | MEDLINE | ID: mdl-37081084

ABSTRACT

We investigated whether 'invisible' visual information, i.e., visual information that is not consciously perceived, can affect auditory speech perception. Repeated exposure to McGurk stimuli (auditory /ba/ with visual [ga]) temporarily changes the perception of the auditory /ba/ into a 'da' or 'ga'. This altered auditory percept persists even after the McGurk stimuli end, when the auditory stimulus is presented alone (the McGurk aftereffect). We exploited this aftereffect and presented the auditory /ba/ either alone (No Face) or together with a masked face articulating a visual [ba] (Congruent Invisible) or a visual [ga] (Incongruent Invisible). In this way, we measured the extent to which the invisible faces could undo or prolong the McGurk aftereffects. In a further control condition, the incongruent faces remained unmasked and thus visible, resulting in four conditions in total. Visibility was defined by the participants' subjective dichotomous reports ('visible' or 'invisible'). The results showed that the Congruent Invisible condition reduced the McGurk aftereffects compared with the other conditions, whereas the Incongruent Invisible condition did not differ from the No Face condition. These results suggest that visual information that is not consciously perceived can affect phonetic perception, but only when it is congruent with the auditory information.


Subject(s)
Phonetics , Speech Perception , Humans , Lip , Visual Perception , Auditory Perception , Photic Stimulation/methods , Acoustic Stimulation/methods
2.
Brain Res Bull ; 85(5): 245-59, 2011 Jun 30.
Article in English | MEDLINE | ID: mdl-20193747

ABSTRACT

Spring compliance is perceived by combining the sensed force exerted by the spring with the displacement caused by the action (sensed through vision and proprioception). We investigated the effect of delaying the visual and force information relative to proprioception in order to understand how visual-haptic perception of compliance is achieved. First, we confirm an earlier result that a force delay increases perceived compliance. Furthermore, we find that perceived compliance decreases with a delay in the visual information. These effects would not occur if the perceptual system utilized all of the force-displacement information available during the interaction. Both delays generate a bias in perceived compliance that is opposite in sign between the loading and unloading phases of the interaction. To explain these findings, we propose that information obtained during the loading phase of the spring displacement is weighted more heavily than information obtained during unloading. We confirm this hypothesis by showing that sensitivity to compliance is much higher during loading movements than during unloading movements. Moreover, we show that visual and proprioceptive information about hand position are used for compliance perception to a degree that depends on the sensitivity to compliance each affords. Finally, by analyzing participants' movements, we show that these two factors (loading/unloading weighting and cue reliability) account for the changes in perceived compliance caused by visual and force delays.
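
A minimal sketch of why a delayed force signal biases perceived compliance in opposite directions during loading and unloading, under the simple assumption that compliance is estimated as displacement divided by sensed force (all parameter values below are illustrative, not taken from the study):

import numpy as np

# Hypothetical press-and-release interaction with an ideal spring.
true_compliance = 0.5                            # m/N (assumed)
t = np.linspace(0.0, 1.0, 1000)                  # one cycle, in seconds
displacement = 0.05 * np.sin(np.pi * t)          # 0 -> 5 cm -> 0
force = displacement / true_compliance           # ideal spring: F = x / C
delay = 0.05                                     # 50 ms force delay (assumed)
delayed_force = np.interp(t - delay, t, force)   # force signal arrives late

loading = t < 0.5                                # displacement increasing
estimate = displacement / np.maximum(delayed_force, 1e-6)

print(f"median estimated compliance, loading:   {np.median(estimate[loading]):.3f} m/N")
print(f"median estimated compliance, unloading: {np.median(estimate[~loading]):.3f} m/N")
# During loading the delayed force is smaller than the true force, so the
# spring seems more compliant; during unloading the bias reverses. Weighting
# the loading phase more heavily, as proposed above, then yields the reported
# net increase in perceived compliance under a force delay.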


Subject(s)
Movement , Proprioception/physiology , Psychomotor Performance/physiology , User-Computer Interface , Visual Perception/physiology , Adult , Compliance , Female , Hand/physiology , Humans , Male , Models, Theoretical , Stress, Mechanical , Young Adult
3.
IEEE Trans Haptics ; 3(1): 63-72, 2010.
Article in English | MEDLINE | ID: mdl-27788091

ABSTRACT

In this study, we investigate the influence of visual feedback on haptic exploration. We designed a haptic search task in which subjects explored a virtual display with a force-feedback device and determined whether a target was present among distractor items. Although the target was recognizable only haptically, visual feedback on finger position or on possible target positions could be given. Our results show that subjects could use visual feedback on possible target positions even in the absence of feedback on finger position. When there was no feedback on possible target locations, subjects scanned the whole display systematically. When feedback on finger position was present, subjects could make well-directed movements back to areas of interest; this was not the case without such feedback, indicating that showing finger position helps subjects form a spatial representation of the display. In addition, we show that response-time models of visual serial search do not generally apply to haptic serial search. In teleoperation systems, for instance, it is therefore helpful to show the position of the probe even when visual information about the scene is poor.
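
For reference, the response-time models mentioned above predict that in a serial, self-terminating search, response time grows linearly with the number of items, with target-absent slopes roughly twice target-present slopes. A hedged sketch of that standard visual-search model, shown only to make explicit the pattern the haptic data deviate from (slope and intercept values are illustrative assumptions):

def predicted_rt_ms(n_items: int, target_present: bool,
                    base_ms: float = 400.0, per_item_ms: float = 50.0) -> float:
    """Expected response time for a serial, self-terminating search."""
    if target_present:
        # On average, the target is found after inspecting (n + 1) / 2 items.
        return base_ms + per_item_ms * (n_items + 1) / 2
    # Target absent: every item must be inspected before responding.
    return base_ms + per_item_ms * n_items

for n in (4, 8, 16):
    print(n, predicted_rt_ms(n, True), predicted_rt_ms(n, False))
# Target-absent slopes come out ~2x target-present slopes; the study reports
# that haptic serial search does not generally follow this pattern.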

4.
Science ; 298(5598): 1627-30, 2002 Nov 22.
Article in English | MEDLINE | ID: mdl-12446912

ABSTRACT

Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate shape using cues such as binocular disparity and perspective projection. The hands supply haptic shape information by means of tactile and proprioceptive cues. Combining information across cues can improve the estimation of object properties, but it may come at a cost: the loss of single-cue information. We report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when cues from different modalities (vision and haptics) are combined.
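
The standard quantitative account of why combining cues improves estimation is reliability-weighted averaging: each cue is weighted in proportion to its inverse variance, and the combined estimate is less variable than either cue alone. A minimal simulation of that model with assumed noise levels (not the paper's data):

import numpy as np

rng = np.random.default_rng(0)
true_value = 10.0                        # arbitrary object property
sigma_vision, sigma_haptic = 1.0, 2.0    # assumed single-cue noise levels

n = 100_000
vision = true_value + rng.normal(0.0, sigma_vision, n)
haptic = true_value + rng.normal(0.0, sigma_haptic, n)

# Maximum-likelihood combination: weights proportional to inverse variance.
w_vision = sigma_haptic**2 / (sigma_vision**2 + sigma_haptic**2)
combined = w_vision * vision + (1.0 - w_vision) * haptic

print(f"vision SD:   {vision.std():.3f}")     # ~1.0
print(f"haptic SD:   {haptic.std():.3f}")     # ~2.0
print(f"combined SD: {combined.std():.3f}")   # ~0.894, below either cue alone
# Predicted combined SD: sqrt(1 / (1/sigma_v**2 + 1/sigma_h**2)) = 0.894 here.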


Subject(s)
Cues , Touch , Visual Perception , Form Perception , Humans , Mathematics , Stereognosis , Vision Disparity
5.
Psychol Sci ; 12(1): 37-42, 2001 Jan.
Article in English | MEDLINE | ID: mdl-11294226

ABSTRACT

On the whole, people recognize objects best when they see them from a familiar view and worst when they see them from views that were previously occluded from sight. Unexpectedly, we found haptic object recognition to be viewpoint-specific as well, even though hand movements were unrestricted. This viewpoint dependence was due to the hands preferring the back "view" of the objects. Furthermore, when the sensory modality (visual vs. haptic) differed between learning an object and recognizing it, recognition performance was best when the objects were rotated back-to-front between learning and recognition. Our data indicate that the visual system recognizes the front view of objects best, whereas the hands recognize objects best from the back.


Subject(s)
Cognition/physiology , Environment , Touch/physiology , Visual Perception/physiology , Adult , Female , Hand/physiology , Humans , Male , Movement/physiology , Random Allocation
6.
Nat Neurosci ; 3(1): 69-73, 2000 Jan.
Article in English | MEDLINE | ID: mdl-10607397

ABSTRACT

The visual system uses several signals to deduce the three-dimensional structure of the environment, including binocular disparity, texture gradients, shading, and motion parallax. Although each of these sources of information is by itself insufficient to yield reliable three-dimensional structure in everyday scenes, the visual system combines them by weighting the available information; altering the weights would therefore change the perceived structure. We report that haptic feedback (active touch) increases the weight of a consistent surface-slant signal relative to inconsistent signals. Thus, the appearance of a subsequently viewed surface changes: it appears slanted in the direction specified by the haptically reinforced signal.
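
A hedged sketch of the reweighting idea (cue values and weights are illustrative assumptions, not the study's data): if perceived slant is a weighted average of a disparity-specified slant and a texture-specified slant, then shifting weight toward the haptically reinforced cue shifts the percept toward that cue's value.

# Perceived slant as a weighted average of two conflicting slant signals.
slant_disparity = 30.0   # deg, specified by binocular disparity (assumed)
slant_texture = 20.0     # deg, specified by the texture gradient (assumed)

def perceived_slant(w_disparity: float) -> float:
    """Weighted-average percept; w_disparity is the disparity cue's weight."""
    return w_disparity * slant_disparity + (1.0 - w_disparity) * slant_texture

print(perceived_slant(0.5))   # 25.0 deg with equal weights
# If haptic feedback consistent with disparity raises its weight to 0.7:
print(perceived_slant(0.7))   # 27.0 deg: the percept shifts toward disparity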


Subject(s)
Biofeedback, Psychology/physiology , Depth Perception/physiology , Touch/physiology , Data Display , Distance Perception/physiology , Humans , Photic Stimulation , Space Perception/physiology , Surface Properties