1.
Front Psychol ; 6: 819, 2015.
Article in English | MEDLINE | ID: mdl-26124739

ABSTRACT

Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to a body representation. The various attributes of the body, such as shape, proportion, posture, and movement, can be derived from the various sensory systems and can in turn affect perception of the world (including of the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including, but not limited to, body perception). First, we show that body orientation affects visual distance perception and perceived object orientation. Visual-auditory crossmodal correspondences also depend on the orientation of the body: auditory "high" frequencies correspond to a visual "up" defined by both gravity and body coordinates. Next, we show that the perceived location of a touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to the coding of body locations. Additionally, the reference frame used for coding touch locations seems to depend on whether gaze is static or moves relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.

2.
Proc Natl Acad Sci U S A ; 112(23): 7321-6, 2015 Jun 09.
Article in English | MEDLINE | ID: mdl-26015584

ABSTRACT

Despite decades of research, there is still uncertainty about how people make simple decisions about perceptual stimuli. Most theories assume that perceptual decisions are based on decision variables, which are internal variables that encode task-relevant information. However, decision variables are usually considered to be theoretical constructs that cannot be measured directly, and this often makes it difficult to test theories of perceptual decision making. Here we show how to measure decision variables on individual trials, and we use these measurements to test theories of perceptual decision making more directly than has previously been possible. We measure classification images, which are estimates of templates that observers use to extract information from stimuli. We then calculate the dot product of these classification images with the stimuli to estimate observers' decision variables. Finally, we reconstruct each observer's "decision space," a map that shows the probability of the observer's responses for all values of the decision variables. We use this method to examine decision strategies in two-alternative forced choice (2AFC) tasks, for which there are several competing models. In one experiment, the resulting decision spaces support the difference model, a classic theory of 2AFC decisions. In a second experiment, we find unexpected decision spaces that are not predicted by standard models of 2AFC decisions, and that suggest intrinsic uncertainty or soft thresholding. These experiments give new evidence regarding observers' strategies in 2AFC tasks, and they show how measuring decision variables can answer long-standing questions about perceptual decision making.
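The analysis pipeline described in this abstract is concrete enough to sketch. The Python fragment below illustrates one way to implement it; the data layout, the particular classification-image estimator, and the binning are assumptions made for illustration, not the authors' exact implementation.

```python
import numpy as np

def estimate_decision_space(stimuli_a, stimuli_b, responses, n_bins=20):
    """Estimate a classification image, per-trial decision variables,
    and a 2AFC "decision space" map of P(respond A).

    stimuli_a, stimuli_b : (n_trials, n_pixels) noise stimuli for the
        two intervals of each trial (hypothetical data layout).
    responses : (n_trials,) booleans, True = "chose A".
    """
    responses = np.asarray(responses, dtype=bool)

    # One common 2AFC classification-image estimator: the difference of
    # stimulus averages conditioned on the observer's response.
    template = (stimuli_a[responses].mean(axis=0)
                + stimuli_b[~responses].mean(axis=0)
                - stimuli_a[~responses].mean(axis=0)
                - stimuli_b[responses].mean(axis=0))

    # Decision variables: dot product of the template with each stimulus.
    dv_a = stimuli_a @ template
    dv_b = stimuli_b @ template

    # Decision space: P(respond "A") in each (dv_a, dv_b) bin.
    bins = [np.linspace(dv_a.min(), dv_a.max(), n_bins + 1),
            np.linspace(dv_b.min(), dv_b.max(), n_bins + 1)]
    chose_a, _, _ = np.histogram2d(dv_a, dv_b, bins=bins,
                                   weights=responses.astype(float))
    total, _, _ = np.histogram2d(dv_a, dv_b, bins=bins)
    with np.errstate(invalid="ignore"):
        p_choose_a = chose_a / total  # NaN where a bin has no trials

    return template, dv_a, dv_b, p_choose_a
```

Under the classic difference model, the response depends only on dv_a - dv_b, so the iso-probability contours of this map should run parallel to the diagonal; departures from that pattern are the kind of unexpected structure the second experiment reports.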


Subject(s)
Choice Behavior , Humans
3.
J Exp Psychol Hum Percept Perform ; 41(1): 42-9, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25485660

ABSTRACT

To interpret tactile information accurately, the brain needs an accurate representation of the body to which sensations can be referred. Despite this, body representation has only recently been incorporated into the study of tactile perception. Here, we investigate whether distortions of body representation affect tactile sensations. We perceptually altered the length of the arm and the width of the waist using a tendon vibration illusion and measured tactile spatial acuity and sensitivity. Surprisingly, we found a reduction in both tactile acuity and sensitivity thresholds when the arm or waist was perceptually altered, which indicates a general disruption of low-level tactile processing. We postulate that these disruptive changes correspond to a preliminary stage in which the body representation starts to change. These results may give new insight into sensory processing in people with long-term or sudden abnormalities of body representation, such as those found in eating disorders or following amputation.
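The abstract does not spell out the psychophysical procedure behind the threshold measurements, so the following is only a generic sketch: a 1-up/2-down adaptive staircase, one standard way to estimate a tactile detection threshold of the kind measured here. The function names and parameters are hypothetical.

```python
def staircase(detects, start=1.0, step=0.1, n_reversals=8, max_trials=200):
    """detects(intensity) -> True if the participant reports the touch.

    1-up/2-down rule: step down after two consecutive detections,
    step up after any miss; the threshold is estimated as the mean
    intensity at the reversal points.
    """
    intensity, last_dir, streak = start, 0, 0
    reversals = []
    for _ in range(max_trials):
        if len(reversals) >= n_reversals:
            break
        if detects(intensity):
            streak += 1
            if streak < 2:
                continue          # need two detections before stepping down
            streak, move = 0, -1
        else:
            streak, move = 0, +1
        if last_dir and move != last_dir:   # direction change = reversal
            reversals.append(intensity)
        last_dir = move
        intensity = max(intensity + move * step, step)
    return sum(reversals) / len(reversals) if reversals else None
```

A 1-up/2-down rule converges on the intensity detected about 70.7% of the time, so comparing the reversal averages before and after the illusion would expose a change in sensitivity.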


Subject(s)
Body Image , Illusions/physiology , Sensory Thresholds/physiology , Touch Perception/physiology , Touch/physiology , Adult , Female , Humans , Male , Young Adult
4.
Multisens Res ; 26(1-2): 3-18, 2013.
Article in English | MEDLINE | ID: mdl-23713197

ABSTRACT

Previous research showing systematic localisation errors in touch perception related to eye and head position has suggested that touch is at least partially localised in a visual reference frame. However, many previous studies had participants report the location of tactile stimuli relative to a visual probe, which may force coding into a visual reference frame. The visual probe could also itself be subject to an effect of eye or head position. Thus, to draw definitive conclusions about the coordinate system in which touch is coded, the perceived position of a tactile stimulus must be assessed using a within-modality measure. Here, we present a novel method for measuring the perceived location of a touch in body coordinates: the Segmented Space Method (SSM). In the SSM, participants imagine the region within which the stimulus could be presented divided into several equally spaced, numbered segments. Participants then simply report the number of the segment in which they perceived the stimulus. The SSM is a simple method that can easily be extended to other modalities by dividing any response space into numbered segments centred on an appropriate reference point (e.g., the head, the torso, the hand, or a point in space off the body). Here we apply the SSM to the forearm during eccentric viewing and find localisation errors for touch similar to those previously reported using crossmodal comparisons. The data collected with the SSM strengthen the theory that tactile localisation is generally coded in a visual reference frame, even when visual coding is not required by the task.
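Scoring SSM reports reduces to mapping segment numbers onto positions. A minimal sketch, assuming a forearm response region of known extent divided into equal segments (the region bounds, segment count, and units below are illustrative, not the paper's values):

```python
import numpy as np

def ssm_locations(reports, region_start_cm=0.0, region_end_cm=24.0,
                  n_segments=8):
    """Map 1-based SSM segment reports to segment-centre positions (cm)."""
    reports = np.asarray(reports, dtype=float)
    width = (region_end_cm - region_start_cm) / n_segments
    return region_start_cm + (reports - 0.5) * width

# Localisation error: reported segment centre minus true stimulus position.
true_positions = np.array([7.0, 9.5, 10.0])
errors = ssm_locations([3, 4, 4]) - true_positions  # [0.5, 1.0, 0.5]
```

Because the segments are anchored to the body part itself rather than to a visual probe, any systematic error that survives this measure cannot be an artifact of a visually coded response.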


Subject(s)
Body Image , Neurosciences/methods , Space Perception/physiology , Touch Perception/physiology , Visual Perception/physiology , Arm , Cues , Female , Functional Laterality/physiology , Head Movements/physiology , Humans , Male , Orientation/physiology , Somatosensory Cortex/physiology , Touch/physiology , Visual Cortex/physiology , Young Adult
5.
Exp Brain Res ; 222(4): 437-45, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22941315

ABSTRACT

The position of gaze (eye plus head position) relative to the body is known to alter the perceived locations of sensory targets, suggesting that perceptual space is at least partially coded in a gaze-centered reference frame. However, the direction of the reported effects has not been consistent. Here, we investigate the cause of a discrepancy between the reported directions of shifts in tactile localization related to head position. We demonstrate that head eccentricity can cause errors in touch localization either in the same direction as the head turn or opposite to it, depending on the procedure used. When head position is held eccentric during both the presentation of a touch and the response, the shift is in the direction opposite to the head. When the head is returned to center before reporting, the shift is in the same direction as the head eccentricity. We rule out a number of possible explanations for the difference and conclude that when the head is moved between a touch and the response, the touch is coded in a predominantly gaze-centered reference frame, whereas when the head remains stationary, a predominantly body-centered reference frame is used. We propose that the mechanism underlying these displacements in perceived location involves an underestimated gaze signal, and we present a model demonstrating how this single neural error could cause localization errors in either direction, depending on whether gaze or the body midline is used as a reference. This model may be useful in explaining gaze-related localization errors in other modalities.
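One illustrative parameterization of that idea: a single gaze gain below one produces errors of equal magnitude but opposite sign, depending on which reference frame is in play. The linear form, the gain value, and the sign conventions below are assumptions made for illustration, not the paper's fitted model.

```python
# A single underestimated gaze signal (gain g < 1) yields shifts of
# opposite sign depending on the reference frame used at response time.
def predicted_shift(head_eccentricity_deg, gaze_gain=0.8, head_moved=False):
    """Predicted tactile localisation error (deg).

    head_moved=False: head eccentric at touch and response
        (body-centred coding) -> shift opposite to the head turn.
    head_moved=True: head returned to centre before the response
        (gaze-centred coding) -> shift with the head turn.
    """
    error = (1.0 - gaze_gain) * head_eccentricity_deg
    return error if head_moved else -error

print(predicted_shift(30.0))                   # -6.0: opposite to the head
print(predicted_shift(30.0, head_moved=True))  # +6.0: with the head
```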


Subject(s)
Head Movements/physiology , Photic Stimulation/methods , Psychomotor Performance/physiology , Reaction Time/physiology , Touch/physiology , Vibration , Adult , Female , Humans , Male
6.
Exp Brain Res ; 213(2-3): 229-34, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21559744

ABSTRACT

The location of a touch to the skin, first coded in body coordinates, may be transformed into retinotopic coordinates to facilitate visual-tactile integration. For the touch location to be transformed into a retinotopic reference frame, the positions of the eyes and head must be taken into account. Previous studies have found eye position-related errors (Harrar and Harris in Exp Brain Res 203:615-620, 2009) and head position-related errors (Ho and Spence in Brain Res 1144:136-141, 2007) in tactile localization, indicating that imperfect versions of the eye and head signals may be used in the body-to-visual coordinate transformation. Here, we investigated the combined effects of head and eye position on the perceived location of a mechanical touch to the arm. Subjects reported the perceived position of a touch presented while their head was turned to the left, right, or center relative to the body and their eyes were directed to the left, right, or center in their orbits. The perceived location of the touch shifted in the direction of both the head and the eyes, by approximately the same amount for each. We interpret these shifts as consistent with the touch location being coded in a visual reference frame, with a gaze signal used to compute the transformation.
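The finding that head and eye eccentricity shift the percept by comparable amounts is consistent with a simple additive model in which a single gaze signal (eye-in-orbit plus head-on-body) drives the shift. The shared gain and the exact linear form below are assumptions for illustration, not values reported in the paper.

```python
def perceived_touch_position(true_pos_deg, eye_deg, head_deg, gain=0.15):
    """Perceived touch location under a toy additive gaze model (deg)."""
    gaze_deg = eye_deg + head_deg          # gaze = eye-in-orbit + head-on-body
    return true_pos_deg + gain * gaze_deg  # shift toward the gaze direction

# Touch at the body midline; eyes 15 deg right, head 30 deg right:
print(perceived_touch_position(0.0, eye_deg=15.0, head_deg=30.0))  # 6.75
```

Because a single gain multiplies the summed gaze signal, the model reproduces the abstract's key observation: equal eye and head eccentricities produce approximately equal shifts.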


Subject(s)
Attention/physiology , Fixation, Ocular/physiology , Orientation , Touch Perception/physiology , Touch/physiology , Adult , Analysis of Variance , Female , Head Movements , Humans , Male , Physical Stimulation/methods , Vestibule, Labyrinth/physiology