Results 1 - 20 of 23
1.
J Exp Psychol Gen ; 148(4): 713-727, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30973263

ABSTRACT

Brain plasticity is a key mechanism for learning and recovery. A striking example of plasticity in the adult brain occurs following input loss, for example, following amputation, whereby the deprived zone is "invaded" by new representations. Although it has long been assumed that such reorganization leads to functional benefits for the invading representation, the behavioral evidence is controversial. Here, we investigate whether a temporary period of somatosensory input loss to one finger, induced by anesthetic block, is sufficient to cause improvements in touch perception ("direct" effects of deafferentation). Further, we determine whether this deprivation can improve touch perception by enhancing sensory learning processes, for example, by training ("interactive" effects). Importantly, we explore whether direct and interactive effects of deprivation are dissociable by directly comparing their effects on touch perception. Using psychophysical thresholds, we found brief deprivation alone caused improvements in tactile perception of a finger adjacent to the blocked finger but not to non-neighboring fingers. Two additional groups underwent minimal tactile training to one finger either during anesthetic block of the neighboring finger or a sham block with saline. Deprivation significantly enhanced the effects of tactile perceptual training, causing greater learning transfer compared with sham block. That is, following deafferentation and training, learning gains were seen in fingers normally outside the boundaries of topographic transfer of tactile perceptual learning. Our results demonstrate that sensory deprivation can improve perceptual abilities, both directly and interactively, when combined with sensory learning. This dissociation provides novel opportunities for future clinical interventions to improve sensation. (PsycINFO Database Record (c) 2019 APA, all rights reserved).
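The tactile thresholds reported above were measured psychophysically, but the abstract does not specify the procedure. A minimal sketch of one common approach, a 1-up-2-down adaptive staircase run against a simulated observer (all parameter values here are hypothetical, not the study's):

```python
import math
import random

def run_staircase(true_threshold, slope=0.5, start=10.0, step=1.0,
                  n_reversals=12, seed=0):
    """1-up-2-down staircase: estimates the ~70.7%-correct stimulus level
    of a simulated observer with a logistic psychometric function."""
    def p_correct(level):  # guess rate 0.5, as in a 2AFC-style task
        return 0.5 + 0.5 / (1.0 + math.exp(-(level - true_threshold) / slope))

    rng = random.Random(seed)
    level, direction = start, -1
    streak, reversals = 0, []
    while len(reversals) < n_reversals:
        if rng.random() < p_correct(level):    # correct response
            streak += 1
            move = -1 if streak == 2 else 0    # harder after 2 correct
            if move:
                streak = 0
        else:                                  # incorrect response
            streak, move = 0, +1               # easier after 1 error
        if move and move != direction:         # direction flip = reversal
            reversals.append(level)
            direction = move
        level += move * step
    return sum(reversals[4:]) / len(reversals[4:])  # mean of late reversals
```

The mean of the late reversals approximates the 70.7%-correct point of the simulated observer's psychometric function.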


Subject(s)
Fingers/physiology , Learning/physiology , Touch Perception/physiology , Touch/physiology , Adult , Anesthetics/pharmacology , Female , Humans , Learning/drug effects , Male , Psychophysics/methods , Touch/drug effects , Touch Perception/drug effects , Transfer, Psychology/drug effects , Transfer, Psychology/physiology , Young Adult
2.
J Vis ; 18(9): 13, 2018 09 04.
Article in English | MEDLINE | ID: mdl-30208432

ABSTRACT

Video-based eye trackers have enabled major advancements in our understanding of eye movements through their ease of use and their non-invasiveness. One necessity to obtain accurate eye recordings using video-based trackers is calibration. The aim of the current study was to determine the feasibility and reliability of alternative calibration methods for scenarios in which standard visual calibration is not possible. Fourteen participants were tested using the EyeLink 1000 Plus video-based eye tracker, and each completed the following four 5-point calibration methods: 1) standard visual-target calibration, 2) described calibration where participants were provided with verbal instructions about where to direct their eyes (without vision of the screen), 3) proprioceptive calibration where participants were asked to look at their hidden finger, 4) replacement calibration, where the visual calibration was performed by 3 different people; the calibrators were temporary substitutes for the participants. Following calibration, participants performed a simple visually-guided saccade task to 16 randomly presented targets on a grid. We found that precision errors were comparable across the alternative calibration methods. In terms of accuracy, compared to the standard calibration, non-visual calibration methods (described and proprioception) led to significantly larger errors, whilst the replacement calibration method produced much smaller errors than the non-visual methods. In conditions where calibration is not possible, for example when testing blind or visually impaired people who are unable to foveate the calibration targets, we suggest that using a single stand-in to perform the calibration is a simple and easy alternative calibration method, which should only cause a minimal decrease in accuracy.
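Accuracy and precision, as contrasted above, are standard eye-tracking quantities: accuracy is the offset of the mean gaze position from the target, precision the dispersion of the samples themselves (here, RMS of sample-to-sample distances). A minimal sketch, not the tracker's own computation, with invented gaze samples:

```python
import math

def accuracy_and_precision(samples, target):
    """samples: list of (x, y) gaze positions recorded during fixation.
    Accuracy  = distance from the mean gaze position to the target.
    Precision = RMS of successive sample-to-sample distances."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    accuracy = math.hypot(mx - target[0], my - target[1])
    d2 = [(samples[i][0] - samples[i - 1][0]) ** 2 +
          (samples[i][1] - samples[i - 1][1]) ** 2
          for i in range(1, n)]
    precision = math.sqrt(sum(d2) / len(d2))
    return accuracy, precision
```

With this decomposition, a stand-in calibrator mainly shifts the mean (hurting accuracy) while leaving sample dispersion (precision) largely unchanged.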


Subject(s)
Calibration , Diagnostic Techniques, Ophthalmological , Eye Movement Measurements/instrumentation , Eye Movements , Video Recording , Adolescent , Adult , Eye Movement Measurements/standards , Female , Humans , Male , Proprioception/physiology , Reproducibility of Results , Young Adult
3.
J Vis ; 18(9): 4, 2018 09 04.
Article in English | MEDLINE | ID: mdl-30193346

ABSTRACT

Previous studies have shown that eye and arm movements tend to be intrinsically coupled in their behavior. There is, however, no consensus on whether planning of eye and arm movements is based on shared or independent representations. One way to gain insight into these processes is to compare how exogenous attentional modulation influences the temporal and spatial characteristics of the eye and the arm during single or combined movements. Thirteen participants (M = 22.8 years old, SD = 1.5) performed single or combined movements to an eccentric target. A behaviorally irrelevant cue flashed just before the target at different locations. There was no effect of the cue on the saccade or reach amplitudes, whether they were performed alone or together. We found no differences in overall reaction times (RTs) between single and combined movements. With respect to the effect of the cue, both saccades and reaches followed a similar pattern with the shortest RTs when the cue was closest to the target, which we propose reflects effector-independent processes. Compared to when no cue was presented before the target, saccade RTs were generally inhibited by the irrelevant cue with increasing cue-target distance. In contrast, reach RTs showed strong facilitation at the target location and less facilitation at farther distances. We propose that this reflects the presence of effector-dependent processes. The similarities and differences in RTs between the saccades and reaches are consistent with effector-dependent and -independent processes working in parallel.


Subject(s)
Attention/physiology , Cues , Movement/physiology , Psychomotor Performance/physiology , Saccades/physiology , Adult , Analysis of Variance , Arm , Female , Humans , Male , Reaction Time/physiology , Young Adult
4.
Invest Ophthalmol Vis Sci ; 58(14): 6282-6291, 2017 12 01.
Article in English | MEDLINE | ID: mdl-29242902

ABSTRACT

Purpose: Fetal alcohol spectrum disorder (FASD) is a developmental disease characterized by behavioral problems and physical defects including malformations of the eye and associated optical defects. How these malformations affect retinal functioning is not well known, although animal models have suggested that scotopic vision is particularly deficient. Age is also known to affect scotopic vision. Here, we determined the combined effects of age and fetal alcohol exposure (FAE) on retinal function using full-field electroretinograms (ERGs) in monkeys (Chlorocebus sabaeus). Methods: ERGs were recorded in monkeys aged 3 to 12 years, at multiple flash intensities under scotopic and photopic conditions, and functions were fit to the amplitudes of the a- and b-waves. Results: We found that both age and alcohol exposure affected ERGs. In photopic ERGs, amplitudes increased with age, and were higher in FAE animals than in controls, for data related to the OFF- and ON-pathways. In scotopic ERGs, amplitudes were decreased in young FAE compared with age-matched controls but only for the rod-dominated responses, while at brighter flashes, alcohol exposure led to an increase in the amplitude of the a- and b-waves. Conclusions: The ERGs from the FAE animals closely resembled the data from the older sucrose-control monkeys. This suggests that the FAE monkey retina ages more quickly than that of control monkeys. This large sample of nonhuman primates, with carefully monitored ethanol exposure, demonstrates the critical interplay between age and alcohol when assessing the integrity of the retina. We suggest that ERGs might be an important adjunct to diagnosing human FASD.
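The abstract says only that "functions were fit to the amplitudes of the a- and b-waves" without naming them; a common choice for ERG intensity-response data is the Naka-Rushton function. A sketch under that assumption, fitting only the semi-saturation constant K by grid search (Vmax, the exponent, and all data values are hypothetical):

```python
def naka_rushton(i, vmax, k, n=1.0):
    """Naka-Rushton intensity-response function, commonly fitted to
    ERG b-wave amplitude vs. flash intensity data."""
    return vmax * i ** n / (i ** n + k ** n)

def fit_k(intensities, amplitudes, vmax, k_grid):
    """Grid-search the semi-saturation constant K, with Vmax assumed known,
    by minimizing the sum of squared errors."""
    def sse(k):
        return sum((a - naka_rushton(i, vmax, k)) ** 2
                   for i, a in zip(intensities, amplitudes))
    return min(k_grid, key=sse)
```

A shift in the fitted K (sensitivity) or Vmax (saturated amplitude) between FAE and control groups is the kind of summary such fits provide.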


Subject(s)
Chlorocebus aethiops/embryology , Ethanol/toxicity , Maternal Exposure/adverse effects , Night Vision/drug effects , Pregnancy, Animal , Prenatal Exposure Delayed Effects/diagnosis , Retina/embryology , Animals , Disease Models, Animal , Electroretinography/drug effects , Female , Pregnancy , Retina/drug effects , Retina/physiopathology
5.
Exp Brain Res ; 235(3): 763-775, 2017 03.
Article in English | MEDLINE | ID: mdl-27872958

ABSTRACT

The importance of multisensory integration for perception and action has long been recognised. Integrating information from individual senses increases the chance of survival by reducing the variability in the incoming signals, thus allowing us to respond more rapidly. Reaction times (RTs) are fastest when the components of the multisensory signals are simultaneous. This response facilitation is traditionally attributed to multisensory integration. However, it is unclear if facilitation of RTs occurs when stimuli are perceived as synchronous or are actually physically synchronous. Repeated exposure to audiovisual asynchrony can change the delay at which multisensory stimuli are perceived as simultaneous, thus changing the delay at which the stimuli are integrated perceptually. Here we set out to determine how such changes in multisensory integration for perception affect our ability to respond to multisensory events. If stimuli perceived as simultaneous were reacted to most rapidly, it would suggest a common system for multisensory integration for perception and action. If not, it would suggest separate systems. We measured RTs to auditory, visual, and audiovisual stimuli following exposure to audiovisual asynchrony. Exposure affected the variability of the unisensory RT distributions; in particular, the slowest RTs were either sped up or slowed down (in the direction predicted from shifts in perceived simultaneity). Additionally, the multisensory facilitation of RTs (beyond statistical summation) only occurred when audiovisual onsets were physically synchronous, rather than when they appeared simultaneous. We conclude that the perception of synchrony is therefore independent of multisensory integration and suggest a division between multisensory processes that are fast (automatic and unaffected by temporal adaptation) and those that are slow (perceptually driven and adaptable).
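Facilitation "beyond statistical summation" is conventionally tested against Miller's race-model inequality, F_AV(t) <= F_A(t) + F_V(t), where F is the cumulative RT distribution; this abstract does not state which bound was used, so treat that as an assumption. A minimal sketch with invented RT samples:

```python
def ecdf(rts, t):
    """Empirical cumulative distribution: proportion of RTs at or below t."""
    return sum(rt <= t for rt in rts) / len(rts)

def race_model_violations(rt_a, rt_v, rt_av, times):
    """Return the time points where the multisensory CDF exceeds Miller's
    race-model bound F_A(t) + F_V(t) (capped at 1), i.e. facilitation
    beyond what statistical summation of unisensory channels predicts."""
    return [t for t in times
            if ecdf(rt_av, t) > min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))]
```

In the hypothetical data below, every multisensory RT beats both unisensory distributions, so the bound is violated at all probed time points.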


Subject(s)
Auditory Perception/physiology , Reaction Time/physiology , Visual Perception/physiology , Acoustic Stimulation , Adult , Female , Humans , Male , Models, Psychological , Photic Stimulation , Young Adult
6.
Vis Neurosci ; 33: E006, 2016 01.
Article in English | MEDLINE | ID: mdl-27485069

ABSTRACT

The endogenous cannabinoid system plays important roles in the retina of mice and monkeys via the classic CB1 and CB2 receptors. We have previously reported that the G protein-coupled receptor 55 (GPR55), a putative cannabinoid receptor, is exclusively expressed in rod photoreceptors in the monkey retina, suggesting its possible role in scotopic vision. To test this hypothesis, we recorded full-field electroretinograms (ERGs) after the intravitreal injection of the GPR55 agonist lysophosphatidylglucoside (LPG) or the selective GPR55 antagonist CID16020046 (CID), under light- and dark-adapted conditions. Thirteen vervet monkeys (Chlorocebus sabaeus) were used in this study: four controls (injected with the vehicle dimethyl sulfoxide, DMSO), four injected with LPG and five with CID. We analyzed amplitudes and latencies of the a-wave (photoreceptor responses) and the b-wave (rod and cone system responses) of the ERG. Our results showed that after injection of LPG, the amplitude of the scotopic b-wave was significantly higher, whereas after the injection of CID, it was significantly decreased, compared to the vehicle (DMSO). On the other hand, the a-wave amplitude, and the a-wave and b-wave latencies, of the scotopic ERG responses were not significantly affected by the injection of either compound. Furthermore, the photopic ERG waveforms were not affected by either drug. These results support the hypothesis that GPR55 plays an instrumental role in mediating scotopic vision.


Subject(s)
Night Vision/physiology , Photoreceptor Cells, Vertebrate/physiology , Receptors, Cannabinoid/physiology , Receptors, G-Protein-Coupled/physiology , Animals , Azabicyclo Compounds/pharmacology , Benzoates/pharmacology , Cannabinoid Receptor Agonists/pharmacology , Cannabinoid Receptor Antagonists/pharmacology , Chlorocebus aethiops , Electroretinography , Female , Glycerophosphates/pharmacology , Intravitreal Injections , Male , Photic Stimulation , Receptors, G-Protein-Coupled/agonists , Receptors, G-Protein-Coupled/antagonists & inhibitors
7.
Neural Plast ; 2016: 1253245, 2016.
Article in English | MEDLINE | ID: mdl-27069692

ABSTRACT

The expression patterns of the cannabinoid receptor type 1 (CB1R) and the cannabinoid receptor type 2 (CB2R) are well documented in rodents and primates. In vervet monkeys, CB1R is present in the retinal neurons (photoreceptors, horizontal cells, bipolar cells, amacrine cells, and ganglion cells) and CB2R is exclusively found in the retinal glia (Müller cells). However, the role of these cannabinoid receptors in normal primate retinal function remains elusive. Using full-field electroretinography in adult vervet monkeys, we recorded changes in neural activity following the blockade of CB1R and CB2R by the intravitreal administration of their antagonists (AM251 and AM630, respectively) in photopic and scotopic conditions. Our results show that AM251 increases the photopic a-wave amplitude at high flash intensities, whereas AM630 increases the amplitude of both the photopic a- and b-waves. In scotopic conditions, both blockers increased the b-wave amplitude but did not change the a-wave amplitude. These findings suggest an important role of CB1R and CB2R in primate retinal function.


Subject(s)
Membrane Potentials/physiology , Receptor, Cannabinoid, CB1/metabolism , Receptor, Cannabinoid, CB2/metabolism , Retina/metabolism , Retinal Neurons/metabolism , Animals , Chlorocebus aethiops , Electroretinography , Ependymoglial Cells/drug effects , Ependymoglial Cells/metabolism , Indoles/pharmacology , Membrane Potentials/drug effects , Photic Stimulation , Piperidines/pharmacology , Pyrazoles/pharmacology , Receptor, Cannabinoid, CB1/antagonists & inhibitors , Receptor, Cannabinoid, CB2/antagonists & inhibitors , Retina/drug effects , Retinal Neurons/drug effects
8.
J Neurophysiol ; 115(3): 1088-97, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26631145

ABSTRACT

Tactile learning transfers from trained to untrained fingers in a pattern that reflects overlap between the representations of fingers in the somatosensory system (e.g., neurons with multifinger receptive fields). While physical proximity on the body is known to determine the topography of somatosensory representations, tactile coactivation is also an established organizing principle of somatosensory topography. In this study we investigated whether tactile coactivation, induced by habitual inter-finger cooperative use (use pattern), shapes inter-finger overlap. To this end, we used psychophysics to compare the transfer of tactile learning from the middle finger to its adjacent fingers. This allowed us to compare transfer to two fingers that are both physically and cortically adjacent to the middle finger but have differing use patterns. Specifically, the middle finger is used more frequently with the ring than with the index finger. We predicted this should lead to greater representational overlap between the former than the latter pair. Furthermore, this difference in overlap should be reflected in differential learning transfer from the middle to index vs. ring fingers. Subsequently, we predicted temporary learning-related changes in the middle finger's representation (e.g., cortical magnification) would cause transient interference in perceptual thresholds of the ring, but not the index, finger. Supporting this, longitudinal analysis revealed a divergence where learning transfer was fast to the index finger but relatively delayed to the ring finger. Our results support the theory that tactile coactivation patterns between digits affect their topographic relationships. Our findings emphasize how action shapes perception and somatosensory organization.


Subject(s)
Fingers/physiology , Learning , Touch Perception , Adult , Female , Fingers/innervation , Humans , Male , Somatosensory Cortex/physiology
9.
Exp Brain Res ; 234(5): 1189-98, 2016 May.
Article in English | MEDLINE | ID: mdl-25869739

ABSTRACT

The ability to estimate a filled interval of time is affected by numerous non-temporal factors, such as the sensory modality, duration, and the intensity of the stimulus. Here we explore the role of modality (auditory or visual), stimulus intensity (low vs. high), and motor response speed on the ability to reproduce the duration of short (<1 s) filled intervals. In accordance with the literature, the reproduced duration was affected by both the modality and the intensity of the stimulus; longer reproduction times were generally observed for visual as compared to auditory stimuli, and for low as compared to high-intensity stimuli. We used generalized estimating equations (GEE) in order to determine whether these factors independently affected participants' ability to reproduce a given duration, after eliminating the variability associated with reaction time, since it covaried with the reproduced durations. This analysis revealed that stimulus duration, modality, and intensity were all significant independent predictors of the reproduced durations. Additionally, duration interacted with intensity when reproducing auditory intervals. That is, after taking into account the general speeding-up effect that high-intensity stimuli have on responses, they seem to have an additional effect on the rate of the internal clock. These results support previous evidence suggesting that auditory and visual clocks run at different speeds.


Subject(s)
Auditory Perception/physiology , Psychophysics , Time Perception/physiology , Visual Perception/physiology , Acoustic Stimulation , Adult , Female , Humans , Male , Models, Theoretical , Photic Stimulation , Reaction Time/physiology , Time Factors
10.
Front Psychol ; 6: 819, 2015.
Article in English | MEDLINE | ID: mdl-26124739

ABSTRACT

Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to body representation. The various attributes of the body such as shape, proportion, posture, and movement can be both derived from the various sensory systems and can affect perception of the world (including the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including but not limited to body perception). First, we show that body orientation affects visual distance perception and object orientation. Also, visual-auditory crossmodal correspondences depend on the orientation of the body: audio "high" frequencies correspond to a visual "up" defined by both gravity and body coordinates. Next, we show that the perceived location of touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to coding body locations. Additionally, the reference frame used for coding touch locations seems to depend on whether gaze is static or moved relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, all of these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.

11.
Curr Biol ; 24(5): 531-5, 2014 Mar 03.
Article in English | MEDLINE | ID: mdl-24530067

ABSTRACT

Developmental dyslexia affects 5%-10% of the population, resulting in poor spelling and reading skills. While there are well-documented differences in the way dyslexics process low-level visual and auditory stimuli, it is mostly unknown whether there are similar differences in audiovisual multisensory processes. Here, we investigated audiovisual integration using the redundant target effect (RTE) paradigm. Some conditions demonstrating audiovisual integration appear to depend upon magnocellular pathways, and dyslexia has been associated with deficits in this pathway; so, we postulated that developmental dyslexics ("dyslexics" hereafter) would show differences in audiovisual integration compared with controls. Reaction times (RTs) to multisensory stimuli were compared with predictions from Miller's race model. Dyslexics showed difficulty shifting their attention between modalities, but such "sluggish attention shifting" (SAS) appeared only when dyslexics shifted their attention from the visual to the auditory modality. These results suggest that dyslexics distribute their crossmodal attention resources differently from controls, producing different patterns of multisensory responses. From this, we propose that dyslexia training programs should take into account the asymmetric shifts of crossmodal attention.


Subject(s)
Attention/physiology , Dyslexia , Reaction Time , Acoustic Stimulation , Case-Control Studies , Humans , Photic Stimulation , Visual Perception
12.
J Exp Psychol Hum Percept Perform ; 40(1): 15-23, 2014 Feb.
Article in English | MEDLINE | ID: mdl-23855526

ABSTRACT

Perceptual learning can improve our sensory abilities. Understanding its underlying mechanisms, in particular, when perceptual learning generalizes, has become a focus of research and controversy. Specifically, there is little consensus regarding the extent to which tactile perceptual learning generalizes across fingers. We measured tactile orientation discrimination abilities on 4 fingers (index and middle fingers of both hands), using psychophysical measures, before and after 4 training sessions on 1 finger. Given the somatotopic organization of the hand representation in the somatosensory cortex, the topography of the cortical areas underlying tactile perceptual learning can be inferred from the pattern of generalization across fingers; only fingers sharing cortical representation with the trained finger ought to improve with it. Following training, performance improved not only for the trained finger but also for its adjacent and homologous fingers. Although these fingers were not exposed to training, they nevertheless demonstrated similar levels of learning as the trained finger. Conversely, the performance of the finger that was neither adjacent nor homologous to the trained finger was unaffected by training, despite the fact that our procedure was designed to enhance generalization, as described in recent visual perceptual learning research. This pattern of improved performance is compatible with previous reports of neuronal receptive fields (RFs) in the primary somatosensory cortex (SI) spanning adjacent and homologous digits. We conclude that perceptual learning rooted in low-level cortex can still generalize, and suggest potential applications for the neurorehabilitation of syndromes associated with maladaptive plasticity in SI.


Subject(s)
Fingers/physiology , Learning/physiology , Touch Perception/physiology , Adult , Discrimination, Psychological/physiology , Female , Generalization, Psychological/physiology , Humans , Male , Psychophysics/methods , Somatosensory Cortex/physiology , Young Adult
13.
Multisens Res ; 26(3): 307-16, 2013.
Article in English | MEDLINE | ID: mdl-23964482

ABSTRACT

Humans are equipped with multiple sensory channels that provide both redundant and complementary information about the objects and events in the world around them. A primary challenge for the brain is therefore to solve the 'correspondence problem', that is, to bind those signals that likely originate from the same environmental source, while keeping separate those unisensory inputs that likely belong to different objects/events. Whether multiple signals have a common origin or not must, however, be inferred from the signals themselves through a causal inference process. Recent studies have demonstrated that cross-correlation, that is, the similarity in temporal structure between unimodal signals, represents a powerful cue for solving the correspondence problem in humans. Here we provide further evidence for the role of the temporal correlation between auditory and visual signals in multisensory integration. Capitalizing on the well-known fact that sensitivity to crossmodal conflict is inversely related to the strength of coupling between the signals, we measured sensitivity to crossmodal spatial conflicts as a function of the cross-correlation between the temporal structures of the audiovisual signals. Observers' performance was systematically modulated by the cross-correlation, with lower sensitivity to crossmodal conflict being measured for correlated as compared to uncorrelated audiovisual signals. These results therefore provide support for the claim that cross-correlation promotes multisensory integration. A Bayesian framework is proposed to interpret the present results, whereby stimulus correlation is represented on the prior distribution of expected crossmodal co-occurrence.
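The cross-correlation cue discussed above can be computed, at zero lag, as the Pearson correlation between the two temporal structures (e.g., the auditory envelope and the visual intensity profile). A minimal sketch with invented signal values; a full lagged cross-correlation would additionally slide one signal relative to the other:

```python
import math

def normalized_xcorr(a, b):
    """Normalized cross-correlation at zero lag (Pearson correlation)
    between two equal-length discretized signals; +1 = identical temporal
    structure, -1 = inverted, ~0 = unrelated."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den
```

On this account, audiovisual pairs with high values of this statistic should be bound more strongly, and hence show lower sensitivity to spatial conflict.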


Subject(s)
Auditory Perception/physiology , Bayes Theorem , Cues , Visual Perception/physiology , Acoustic Stimulation/methods , Brain Mapping/methods , Humans , Photic Stimulation/methods
14.
Multisens Res ; 26(1-2): 3-18, 2013.
Article in English | MEDLINE | ID: mdl-23713197

ABSTRACT

Previous research showing systematic localisation errors in touch perception related to eye and head position has suggested that touch is at least partially localised in a visual reference frame. However, many previous studies had participants report the location of tactile stimuli relative to a visual probe, which may force coding into a visual reference. Also, the visual probe could itself be subject to an effect of eye or head position. Thus, it is necessary to assess the perceived position of a tactile stimulus using a within-modality measure in order to make definitive conclusions about the coordinate system in which touch might be coded. Here, we present a novel method for measuring the perceived location of a touch in body coordinates: the Segmented Space Method (SSM). In the SSM participants imagine the region within which the stimulus could be presented divided into several equally spaced, and numbered, segments. Participants then simply report the number corresponding to the segment in which they perceived the stimulus. The SSM represents a simple and novel method that can be easily extended to other modalities by dividing any response space into numbered segments centred on some appropriate reference point (e.g. the head, the torso, the hand, or some point in space off the body). Here we apply SSM to the forearm during eccentric viewing and report localisation errors for touch similar to those previously reported using a crossmodal comparison. The data collected with the SSM strengthen the theory that tactile spatial localisation is generally coded in a visual reference frame even when visual coding is not required by the task.
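The SSM response itself is just a mapping from a position in the response space to a numbered segment. A minimal sketch, assuming a forearm response space parameterized in centimetres; the segment count and limits below are hypothetical, not the study's:

```python
def ssm_segment(position, start, end, n_segments):
    """Map a stimulus position within [start, end] to a 1-based segment
    number, as a participant would report in the Segmented Space Method."""
    if not (start <= position <= end):
        raise ValueError("position outside the response space")
    frac = (position - start) / (end - start)
    # int() floors the fractional position into a segment index;
    # min() keeps the upper boundary inside the last segment.
    return min(int(frac * n_segments) + 1, n_segments)
```

Because the report is a segment number centred on a body-based reference, the measure stays within the tactile modality, which is the method's point.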


Subject(s)
Body Image , Neurosciences/methods , Space Perception/physiology , Touch Perception/physiology , Visual Perception/physiology , Arm , Cues , Female , Functional Laterality/physiology , Head Movements/physiology , Humans , Male , Orientation/physiology , Somatosensory Cortex/physiology , Touch/physiology , Visual Cortex/physiology , Young Adult
15.
Perception ; 40(7): 880-2, 2011.
Article in English | MEDLINE | ID: mdl-22128561

ABSTRACT

The flavour and pleasantness of food and drinks are affected by their colour, their texture or crunch, and even by the shape and weight of the plate or glass. But, can the colour of the bowl also affect the taste of the food it contains? To answer this question we served popcorn in four different coloured bowls, and participants rated sweetness, saltiness, and overall liking. The sweet popcorn, in addition to being sweet, was perceived as saltier when eaten out of a coloured (as compared to a white) bowl, and vice versa for the salty popcorn. These results demonstrate that colour in bowl design can be used to elicit perceptions of sweetness and saltiness in real foods.


Subject(s)
Color Perception/physiology , Taste Perception/physiology , Female , Humans , Male , Young Adult
16.
Exp Brain Res ; 214(3): 351-6, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21858502

ABSTRACT

Choosing what to eat is a complex activity for humans. Determining a food's pleasantness requires us to combine information about what is available at a given time with knowledge of the food's palatability, texture, fat content, and other nutritional information. It has been suggested that humans may have an implicit knowledge of a food's fat content based on its appearance; Toepel et al. (Neuroimage 44:967-974, 2009) reported visual-evoked potential modulations after participants viewed images of high-energy, high-fat food (HF), as compared to viewing low-fat food (LF). In the present study, we investigated whether there are any immediate behavioural consequences of these modulations for human performance. HF, LF, or non-food (NF) images were used to exogenously direct participants' attention to either the left or the right. Next, participants made speeded elevation discrimination responses (up vs. down) to visual targets presented either above or below the midline (and at one of three stimulus onset asynchronies: 150, 300, or 450 ms). Participants responded significantly more rapidly following the presentation of a HF image than following the presentation of either LF or NF images, despite the fact that the identity of the images was entirely task-irrelevant. Similar results were found when comparing response speeds following images of high-carbohydrate (HC) food items to low-carbohydrate (LC) food items. These results support the view that people rapidly process (i.e. within a few hundred milliseconds) the fat/carbohydrate/energy value or, perhaps more generally, the pleasantness of food. Potentially as a result of HF/HC food items being more pleasant and thus having a higher incentive value, it seems as though seeing these foods results in a response readiness, or an overall alerting effect, in the human brain.


Subject(s)
Dietary Fats/analysis , Discrimination, Psychological/physiology , Feeding Behavior/psychology , Food Analysis/methods , Form Perception/physiology , Psychomotor Performance/physiology , Adult , Female , Humans , Male , Neuropsychological Tests/standards , Reaction Time/physiology , Space Perception/physiology , Young Adult
17.
Exp Brain Res ; 203(3): 615-20, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20428854

ABSTRACT

The perceived location of touch on the skin is affected by the position of the eyes in the head, suggesting that it is at least partially coded in a visual reference frame. This observation was made by comparing the perceived location of a touch to a visual reference. Here, we ask whether the location of a touch is coded differently when it guides an action. We tested the perceived position of four touches on the arm (approximately 5 cm apart) while participants adopted one of four eccentric fixations. A touch-sensitive screen was positioned over the stimulated left arm and subjects pointed, using their right arm, to the perceived touch location. The location that subjects pointed to varied with eye position, shifting by 0.016 cm/deg in the direction of eye eccentricity. The dependence on eye position suggests that tactile coding for action is also at least partially coded in a visual reference frame, as it is for perception.
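The 0.016 cm/deg figure above is a slope: pointed location regressed on eye eccentricity. An ordinary least-squares slope recovers it; the data below are invented and constructed to have exactly that slope:

```python
def slope_per_degree(eye_deg, pointed_cm):
    """Ordinary least-squares slope: shift in pointed touch location (cm)
    per degree of eye eccentricity."""
    n = len(eye_deg)
    mx = sum(eye_deg) / n
    my = sum(pointed_cm) / n
    num = sum((x - mx) * (y - my) for x, y in zip(eye_deg, pointed_cm))
    den = sum((x - mx) ** 2 for x in eye_deg)
    return num / den
```

A nonzero slope is the signature of eye-centred coding: if touch were coded purely in body coordinates, pointing responses would not vary with fixation.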


Subject(s)
Psychomotor Performance, Touch Perception, Visual Perception, Adult, Analysis of Variance, Arm, Female, Fixation, Ocular, Humans, Linear Models, Male, Physical Stimulation, Psychophysics, Task Performance and Analysis
18.
Exp Brain Res ; 198(2-3): 403-10, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19533110

ABSTRACT

Here, we demonstrate a systematic shift in the perceived location of a tactile stimulus on the arm toward where the eye is looking. Participants reported the perceived position of touches presented between the elbow and the wrist while maintaining eye positions at various eccentricities. The perceived location of the touch was shifted by between 1 and 5 cm (1.9-9.5 degrees of visual angle) by a change in eye position of ±25 degrees from straight ahead. In a control condition, we repeated the protocol with the eyes fixating straight ahead. Changes in attention accounted for only 17% of the shift due to eye position. The pattern of tactile shifts due to eye position was comparable whether or not the arm was visible. However, touches at locations along the forearm were perceived as being farther apart when the arm was visible than when it was covered. These results are discussed in terms of the coding of tactile space, which seems to require the integration of tactile, visual and eye-position information.


Subject(s)
Eye, Fixation, Ocular, Touch Perception, Visual Perception, Adult, Attention, Elbow, Female, Forearm, Humans, Linear Models, Male, Physical Stimulation, Psychophysics, Wrist
19.
Percept Psychophys ; 70(5): 807-17, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18613629

ABSTRACT

This article compares the properties of apparent motion between a light and a touch with apparent motion between either two lights or two touches. Visual and tactile stimulators were attached to the tips of the two index fingers that were held apart at different distances. Subjects rated the quality of apparent motion between each stimulus combination for a range of stimulus onset asynchronies (SOAs). Subjects reported perceiving apparent motion between all three stimulus combinations. For light-light visual apparent motion, the preferred SOA and the direction threshold SOAs increased as the distance between the stimuli increased (consistent with Korte's third law of apparent motion). Touch-touch apparent motion also obeyed Korte's third law, but over a smaller range of distances, showing that proprioceptive information concerning the position of the fingers is integrated into the tactile motion system. The threshold and preferred SOAs for visuotactile apparent motion did not vary with distance, suggesting a different mechanism for multimodal apparent motion.


Subject(s)
Motion Perception, Touch, Visual Perception, Adult, Female, Humans, Male
20.
Brain Res ; 1242: 54-8, 2008 Nov 25.
Article in English | MEDLINE | ID: mdl-18634764

ABSTRACT

Tactile stimulation usually occurs as a combination of an active movement (reaching out to touch a surface) and a sensation (actually feeling the surface against the skin). The brain has information about the active component (the motor command) before it occurs, because of efference copy, while the passive component must be transduced before it can be processed. Since the active and passive tactile components are available to the brain at different times, determining the time of a touch requires working backwards from the passive sensation and/or forwards from the active motor command. In order to determine which touch process is perceived more quickly, we varied the relative delay between an active and a passive touch signal and determined the relative timing perceived as simultaneous. A passive touch needed to be presented before an active key was pressed in order for the two touches to be perceived as simultaneous, but this timing difference was not significant. In order to test the plasticity of the active and passive touch systems, we exploited the fact that the point of subjective simultaneity between two stimuli can sometimes be altered by repeated exposure to asynchronous presentation. We exposed subjects to an active key-press/passive-touch pair delayed by 250 ms. This exposure increased the range of relative delays between active and passive touches at which the pairs were judged as simultaneous. This is consistent with an adaptive change in the processing of active touch.


Subject(s)
Brain/physiology, Touch Perception/physiology, Touch/physiology, Adult, Female, Humans, Male, Sensory Thresholds, Time