Results 1 - 20 of 24
1.
Curr Biol ; 34(10): 2238-2246.e5, 2024 05 20.
Article in English | MEDLINE | ID: mdl-38718799

ABSTRACT

To sense and interact with objects in the environment, we effortlessly configure our fingertips at desired locations. It is therefore reasonable to assume that the underlying control mechanisms rely on accurate knowledge about the structure and spatial dimensions of our hand and fingers. This intuition, however, is challenged by years of research showing drastic biases in the perception of finger geometry [1-5]. This perceptual bias has been taken as evidence that the brain's internal representation of the body's geometry is distorted [6], leading to an apparent paradox regarding the skillfulness of our actions [7]. Here, we propose an alternative explanation of the biases in hand perception: they are the result of the Bayesian integration of noisy, but unbiased, somatosensory signals about finger geometry and posture. To address this hypothesis, we combined Bayesian reverse engineering with behavioral experimentation on joint and fingertip localization of the index finger. We modeled the Bayesian integration either in sensory or in space-based coordinates, showing that the latter model variant led to biases in finger perception despite accurate representation of finger length. Behavioral measures of joint and fingertip localization responses showed similar biases, which were well fitted by the space-based, but not the sensory-based, model variant. The space-based model variant also outperformed a distorted hand model with built-in geometric biases. In total, our results suggest that perceptual distortions of finger geometry do not reflect a distorted hand model but originate from near-optimal Bayesian inference on somatosensory signals.
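The Bayesian integration the abstract describes can be sketched with a toy Gaussian cue-combination model (all numbers below are hypothetical illustrations, not the paper's fitted parameters): even when the sensory likelihood is unbiased, fusing it with a prior pulls the estimate and produces a systematic perceptual bias.

```python
# Bayesian integration of a noisy, unbiased sensory estimate with a prior.
# All parameter values are hypothetical, chosen only to illustrate the idea.

def integrate(sensory_mean, sensory_var, prior_mean, prior_var):
    """Posterior mean/variance for two Gaussian cues (precision weighting)."""
    w = prior_var / (prior_var + sensory_var)  # weight given to the sensory cue
    post_mean = w * sensory_mean + (1 - w) * prior_mean
    post_var = (sensory_var * prior_var) / (sensory_var + prior_var)
    return post_mean, post_var

# A fingertip truly at 10 cm, sensed without bias but with high noise,
# is pulled toward a prior at 8 cm: a systematic perceptual bias arises
# even though the sensory signal itself is unbiased.
mean, var = integrate(10.0, 4.0, 8.0, 2.0)
print(round(mean, 2))  # 8.67, between the prior and the true location
```

The posterior is always less variable than either cue alone, which is why this kind of integration is "near-optimal" despite the bias it introduces.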


Subject(s)
Bayes Theorem , Fingers , Hand , Humans , Hand/physiology , Fingers/physiology , Female , Male , Adult , Young Adult , Touch Perception/physiology
2.
iScience ; 27(3): 109092, 2024 Mar 15.
Article in English | MEDLINE | ID: mdl-38405611

ABSTRACT

It has been suggested that our brain re-uses body-based computations to localize touch on tools, but the neural implementation of this process remains unclear. Neural oscillations in the alpha and beta frequency bands are known to map touch on the body in external and skin-centered coordinates, respectively. Here, we pinpointed the role of these oscillations during tool-extended sensing by delivering tactile stimuli to either participants' hands or the tips of hand-held rods. To disentangle brain responses related to each coordinate system, we had participants' hands/tool tips crossed or uncrossed at their body midline. We found that midline crossing modulated alpha (but not beta) band activity similarly for hands and tools, also involving a similar network of cortical regions. Our findings strongly suggest that the brain uses similar oscillatory mechanisms for mapping touch on the body and tools, supporting the idea that body-based neural processes are repurposed for tool use.

3.
eNeuro ; 10(11)2023 Nov.
Article in English | MEDLINE | ID: mdl-37848289

ABSTRACT

It is often claimed that tools are embodied by their user, but whether the brain actually repurposes its body-based computations to perform similar tasks with tools is not known. A fundamental computation for localizing touch on the body is trilateration. Here, the location of touch on a limb is computed by integrating estimates of the distance between sensory input and its boundaries (e.g., elbow and wrist of the forearm). As evidence of this computational mechanism, tactile localization on a limb is most precise near its boundaries and lowest in the middle. Here, we show that the brain repurposes trilateration to localize touch on a tool, despite large differences in initial sensory input compared with touch on the body. In a large sample of participants, we found that localizing touch on a tool produced the signature of trilateration, with highest precision close to the base and tip of the tool. A computational model of trilateration provided a good fit to the observed localization behavior. To further demonstrate the computational plausibility of repurposing trilateration, we implemented it in a three-layer neural network that was based on principles of probabilistic population coding. This network determined hit location in tool-centered coordinates by using a tool's unique pattern of vibrations when contacting an object. Simulations demonstrated the expected signature of trilateration, in line with the behavioral patterns. Our results have important implications for how trilateration may be implemented by somatosensory neural populations. We conclude that trilateration is likely a fundamental spatial computation that unifies limbs and tools.
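The trilateration signature described above, highest precision near the boundaries and lowest in the middle, can be sketched with a toy noise model (the noise constant and limb length below are hypothetical, not the paper's fits): each boundary-referenced distance estimate gets noisier with distance, and inverse-variance fusion of the two produces the characteristic precision profile.

```python
import numpy as np

# Trilateration sketch: touch location on a limb/tool of length L is
# estimated from two distances, one to each boundary, whose noise grows
# linearly with distance. Parameters are illustrative only.

def localization_sd(x, L=1.0, k=0.2):
    """SD of the fused location estimate at position x in [0, L]."""
    v1 = (k * x) ** 2 + 1e-9           # variance of distance-from-base estimate
    v2 = (k * (L - x)) ** 2 + 1e-9     # variance of distance-from-tip estimate
    # Optimal (inverse-variance) fusion of the two estimates:
    return np.sqrt(v1 * v2 / (v1 + v2))

xs = np.linspace(0, 1, 11)
sds = [localization_sd(x) for x in xs]
# Signature of trilateration: precision is best near both ends,
# worst at the middle of the tool or limb.
print(sds.index(max(sds)))  # 5, i.e., the midpoint has the largest SD
```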


Subject(s)
Touch Perception , Touch , Humans , Hand , Brain , Wrist
4.
J Neurophysiol ; 129(4): 793-798, 2023 04 01.
Article in English | MEDLINE | ID: mdl-36812143

ABSTRACT

The spatial limits of sensory acquisition (its sensory horizon) are a fundamental property of any sensorimotor system. In the present study, we sought to determine whether there is a sensory horizon for the human haptic modality. At first blush, it seems obvious that the haptic system is bounded by the space where the body can interact with the environment (e.g., the arm span). However, the human somatosensory system is exquisitely tuned to sensing with tools, blind-cane navigation being a classic example. The horizon of haptic perception therefore extends beyond body space, but to what extent is unknown. We first used neuromechanical modeling to determine the theoretical horizon, which we pinpointed as 6 m. We then used a psychophysical localization paradigm to behaviorally confirm that humans can haptically localize objects using a 6-m rod. This finding underscores the incredible flexibility of the brain's sensorimotor representations, as they can be adapted to sense an object many times longer than the user's own body.
NEW & NOTEWORTHY: There are often spatial limits to where an active sensory system can sample information from the environment. Hand-held tools can extend human haptic perception beyond the body, but the limits of this extension are unknown. We used theoretical modeling and psychophysics to determine these spatial limits. We find that the ability to spatially localize objects through a tool extends at least 6 m beyond the user's body.


Subject(s)
Stereognosis , Touch Perception , Humans , Psychophysics , Touch , Visual Perception
5.
Cortex ; 153: 207-219, 2022 08.
Article in English | MEDLINE | ID: mdl-35696732

ABSTRACT

To investigate the relationship between the sense of body ownership and motor control, we capitalized on a rare, bizarre disorder in which another person's hand is misattributed to one's own body, i.e., a pathological form of embodiment (E+). Importantly, although E+ is usually associated with motor deficits, we had the opportunity to test two E+ patients with spared motor function, who were thus able to perform a reaching task. Crucially, these patients had proprioceptive deafferentation, allowing us to isolate the embodiment-dependent effect from the proprioception-dependent effects that usually accompany experimental manipulations of body ownership in healthy participants. Previous evidence suggests that the reaching movement vector is attracted towards an embodied hand during the rubber hand illusion (RHI). However, those results are confounded by spared proprioception, whose modulation alone could explain the effects on reach planning. The neuropsychological approach employed here provides unambiguous evidence about the role of body ownership in reach planning. Three brain-damaged patients with proprioceptive deafferentation (two E+ and a well-matched control patient without pathological embodiment, E-) and 10 age-matched healthy controls underwent a reaching task in which they had to reach for a target from a fixed starting point while an alien hand (the co-experimenter's) was placed on the table. Although proprioception was damaged in all patients, only in the E+ patients were reaching errors shifted consistently with the pathological belief, i.e., as if they planned movements from the position of the alien (embodied) hand, compared with controls. Furthermore, in an additional experiment on healthy participants, we demonstrated that reaching errors observed during the RHI correlate with changes in ownership.
In conclusion, our neuropsychological approach suggests that when planning a reach, we do so from where our owned hand is, not from its physical location.


Subject(s)
Brain Injuries , Illusions , Touch Perception , Body Image , Hand , Humans , Movement , Proprioception , Visual Perception
7.
J Cogn Neurosci ; 34(4): 675-686, 2022 03 05.
Article in English | MEDLINE | ID: mdl-35061032

ABSTRACT

The sense of touch is not restricted to the body but can also extend to external objects. When we use a handheld tool to contact an object, we feel the touch on the tool and not in the hand holding the tool. The ability to perceive touch on a tool actually extends along its entire surface, allowing the user to localize where it is touched about as accurately as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been extensively investigated, those supporting touch localization on a tool remain unknown. We aimed to fill this gap by recording the electroencephalography signal of participants while they localized tactile stimuli on a handheld rod. We focused on oscillatory activity in the alpha (7-14 Hz) and beta (15-30 Hz) ranges, as they have been previously linked to distinct spatial codes used to localize touch on the body. Beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that alpha activity was solely modulated by the location of tactile stimuli applied on a handheld rod. Source reconstruction suggested that this alpha power modulation was localized in a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for processing touch in external space when localizing touch on a tool.


Subject(s)
Spatial Processing , Touch Perception , Hand , Humans , Parietal Lobe , Space Perception , Touch
8.
Sci Rep ; 12(1): 109, 2022 01 07.
Article in English | MEDLINE | ID: mdl-34996925

ABSTRACT

Physical proximity is important in social interactions. Here, we assessed whether simulated physical proximity modulates the perceived intensity of facial emotional expressions and their associated physiological signatures during observation or imitation of these expressions. Forty-four healthy volunteers rated the intensities of dynamic angry or happy facial expressions presented at two simulated locations, proximal (0.5 m) and distant (3 m) from the participants. We tested whether simulated physical proximity affected the spontaneous (in the observation task) and voluntary (in the imitation task) physiological responses (activity of the corrugator supercilii face muscle and pupil diameter) as well as subsequent ratings of emotional intensity. Angry expressions provoked relative activation of the corrugator supercilii muscle and pupil dilation, whereas happy expressions induced a decrease in corrugator supercilii muscle activity. In the proximal condition, these responses were enhanced during both observation and imitation of the facial expressions and were accompanied by an increase in subsequent affective ratings. In addition, individual variations in condition-related EMG activation during imitation of angry expressions predicted the increase in subsequent emotional ratings. In sum, our results reveal novel insights into the impact of physical proximity on the perception of emotional expressions, with early proximity-induced enhancements of physiological responses followed by increased intensity ratings of facial emotional expressions.


Subject(s)
Emotions , Facial Expression , Facial Recognition , Social Interaction , Visual Perception , Adult , Electromyography , Facial Muscles/physiology , Female , Humans , Imitative Behavior , Male , Photic Stimulation , Pupil/physiology , Young Adult
9.
Proc Natl Acad Sci U S A ; 119(1)2022 01 04.
Article in English | MEDLINE | ID: mdl-34983835

ABSTRACT

Perhaps the most recognizable sensory map in all of neuroscience is the somatosensory homunculus. Although it seems straightforward, this simple representation belies the complex link between an activation in a somatotopic map and the associated touch location on the body. Any isolated activation is spatially ambiguous without a neural decoder that can read its position within the entire map, but how this is computed by neural networks is unknown. We propose that the somatosensory system implements multilateration, a common computation used by surveying and global positioning systems to localize objects. Specifically, to decode touch location on the body, multilateration estimates the relative distance between the afferent input and the boundaries of a body part (e.g., the joints of a limb). We show that a simple feedforward neural network, which captures several fundamental receptive field properties of cortical somatosensory neurons, can implement a Bayes-optimal multilateral computation. Simulations demonstrated that this decoder produced a pattern of localization variability between two boundaries that was unique to multilateration. Finally, we identify this computational signature of multilateration in actual psychophysical experiments, suggesting that it is a candidate computational mechanism underlying tactile localization.
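As a complement to the feedforward network in the abstract, a minimal population-coding sketch illustrates the decoding problem it solves: an isolated activation is only interpretable by a readout over the whole somatotopic map. The tuning-curve parameters below are invented for illustration and are not the paper's model.

```python
import numpy as np

# Toy somatotopic map: units with Gaussian receptive fields tile a body
# part; a touch evokes a population response whose centroid, computed
# across the whole map, recovers the stimulus location.

centers = np.linspace(0.0, 1.0, 21)  # preferred locations of 21 units
width = 0.1                          # receptive-field width (hypothetical)

def population_response(x):
    """Activity of every unit for a touch at normalized location x."""
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

def decode(response):
    """Centroid (center-of-mass) readout over the entire map."""
    return float(np.sum(response * centers) / np.sum(response))

touch = 0.37
print(round(decode(population_response(touch)), 2))  # ≈ 0.37
```

The point of the sketch is that no single unit's activity is spatially unambiguous; only the map-wide readout is, which is the gap the authors propose multilateration fills.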


Subject(s)
Neural Networks, Computer , Touch Perception/physiology , Touch/physiology , Adult , Animals , Bayes Theorem , Brain Mapping , Female , Humans , Mice , Models, Neurological , Neurons/physiology , Physical Stimulation , Somatosensory Cortex/physiology , Young Adult
10.
Sci Rep ; 10(1): 17275, 2020 10 14.
Article in English | MEDLINE | ID: mdl-33057121

ABSTRACT

Following tool use, the kinematics of free-hand movements are altered. This modified kinematic pattern has been taken as a behavioral hallmark of the modification that tool use induces in the effector representation. Proprioceptive inputs appear central to updating the estimated effector state. Here we asked whether online proprioception, accessed in real time, or offline, memory-based proprioception is responsible for this update. Since normal aging affects offline proprioception only, we examined a group of 60-year-old adults for proprioceptive acuity and movement kinematics when grasping an object before and after tool use. As a control, participants performed the same movements with a weight (equivalent to the tool's weight) attached to their wrist. Despite hampered offline proprioceptive acuity, the 60-year-old participants exhibited the typical kinematic signature of tool incorporation: the peaks of the transport components occurred later and their amplitude was reduced after tool use. In contrast, we observed no kinematic modifications in the control condition. In addition, online proprioceptive acuity correlated with tool incorporation, as indexed by the amount of kinematic change observed after tool use. Altogether, these findings point to the prominent role played by online proprioception in updating the body estimate for the motor control of tools.


Subject(s)
Arm/physiology , Healthy Aging/physiology , Aged , Biomechanical Phenomena , Female , Hand Strength , Humans , Male , Middle Aged , Proprioception , Tool Use Behavior , Wrist/physiology
11.
Neurol Clin Pract ; 10(1): 29-39, 2020 Feb.
Article in English | MEDLINE | ID: mdl-32190418

ABSTRACT

OBJECTIVE: To assess the role of visual measures and retinal volume to predict the risk of Parkinson disease (PD) dementia. METHODS: In this cohort study, we collected visual, cognitive, and motor data in people with PD. Participants underwent ophthalmic examination, retinal imaging using optical coherence tomography, and visual assessment including acuity and contrast sensitivity and high-level visuoperception measures of skew tolerance and biological motion. We assessed the risk of PD dementia using a recently described algorithm that combines age at onset, sex, depression, motor scores, and baseline cognition. RESULTS: One hundred forty-six people were included in the study (112 with PD and 34 age-matched controls). The mean disease duration was 4.1 (±2.5) years. None of these participants had dementia. Higher risk of dementia was associated with poorer performance in visual measures (acuity: ρ = 0.29, p = 0.0024; contrast sensitivity: ρ = -0.37, p < 0.0001; skew tolerance: ρ = -0.25, p = 0.0073; and biological motion: ρ = -0.26, p = 0.0054). In addition, higher risk of PD dementia was associated with thinner retinal structure in layers containing dopaminergic cells, measured as ganglion cell layer (GCL) and inner plexiform layer (IPL) thinning (ρ = -0.29, p = 0.0021; ρ = -0.33, p = 0.00044). These relationships were not seen for the retinal nerve fiber layer that does not contain dopaminergic cells and were not seen in unaffected controls. CONCLUSION: Visual measures and retinal structure in dopaminergic layers were related to risk of PD dementia. Our findings suggest that visual measures and retinal GCL and IPL volumes may be useful to predict the risk of dementia in PD.

12.
Curr Biol ; 29(24): 4276-4283.e5, 2019 12 16.
Article in English | MEDLINE | ID: mdl-31813607

ABSTRACT

The extent to which a tool is an extension of its user is a question that has fascinated writers and philosophers for centuries [1]. Despite two decades of research [2-7], it remains unknown how this could be instantiated at the neural level. To this aim, the present study combined behavior, electrophysiology and neuronal modeling to characterize how the human brain could treat a tool like an extended sensory "organ." As with the body, participants localize touches on a hand-held tool with near-perfect accuracy [7]. This behavior reflects the ability of the somatosensory system to rapidly and efficiently use the tool as a tactile extension of the body. Using electroencephalography (EEG), we found that where a hand-held tool was touched was immediately coded in the neural dynamics of primary somatosensory and posterior parietal cortices of healthy participants. We found similar neural responses in a proprioceptively deafferented patient with spared touch perception, suggesting that location information is extracted from the rod's vibrational patterns. Simulations of mechanoreceptor responses [8] suggested that the speed at which these patterns are processed is highly efficient. A second EEG experiment showed that touches on the tool and arm surfaces were localized by similar stages of cortical processing. Multivariate decoding algorithms and cortical source reconstruction provided further evidence that early limb-based processes were repurposed to map touch on a tool. We propose that an elementary strategy the human brain uses to sense with tools is to recruit primary somatosensory dynamics otherwise devoted to the body.


Subject(s)
Somatosensory Cortex/physiology , Touch Perception/physiology , Touch/physiology , Adult , Brain/physiology , Electroencephalography , Female , Humans , Male , Mechanoreceptors/physiology , Middle Aged , Neuronal Plasticity/physiology , Tool Use Behavior/physiology , Visual Perception/physiology
13.
J Cogn Neurosci ; 31(12): 1782-1795, 2019 12.
Article in English | MEDLINE | ID: mdl-31368823

ABSTRACT

Tool use leads to plastic changes in sensorimotor body representations underlying tactile perception. The neural correlates of this tool-induced plasticity in humans have not been adequately characterized. This study used ERPs to investigate the stage of sensory processing modulated by tool use. Somatosensory evoked potentials, elicited by median nerve stimulation, were recorded before and after two forms of object interaction: tool use and hand use. Compared with baseline, tool use, but not use of the hand alone, modulated the amplitude of the P100. The P100 is a mid-latency component that indexes the construction of multisensory models of the body and has generators in secondary somatosensory and posterior parietal cortices. These results mark one of the first demonstrations of the neural correlates of tool-induced plasticity in humans and suggest that tool use modulates relatively late stages of somatosensory processing outside primary somatosensory cortex. This finding is consistent with what has been observed in tool-trained monkeys and suggests that the mechanisms underlying tool-induced plasticity have been preserved across primate evolution.


Subject(s)
Evoked Potentials, Somatosensory/physiology , Exoskeleton Device , Somatosensory Cortex/physiology , Tool Use Behavior/physiology , Adult , Electroencephalography , Electroshock , Female , Hand/physiology , Humans , Male , Median Nerve/physiology , Neuronal Plasticity , Parietal Lobe/physiology , Young Adult
14.
Nature ; 561(7722): 239-242, 2018 09.
Article in English | MEDLINE | ID: mdl-30209365

ABSTRACT

The ability to extend sensory information processing beyond the nervous system [1] has been observed throughout the animal kingdom; for example, when rodents palpate objects using whiskers [2] and spiders localize prey using webs [3]. We investigated whether the ability to sense objects with tools [4-9] represents an analogous information processing scheme in humans. Here we provide evidence from behavioural psychophysics, structural mechanics and neuronal modelling, which shows that tools are treated by the nervous system as sensory extensions of the body rather than as simple distal links between the hand and the environment [10,11]. We first demonstrate that tool users can accurately sense where an object contacts a wooden rod, just as is possible on the skin. We next demonstrate that the impact location is encoded by the modal response of the tool upon impact, reflecting a pre-neuronal stage of mechanical information processing akin to sensing with whiskers [2] and webs [3]. Lastly, we use a computational model of tactile afferents [12] to demonstrate that impact location can be rapidly re-encoded into a temporally precise spiking code. This code predicts the behaviour of human participants, providing evidence that the information encoded in motifs shapes localization. Thus, we show that this sensory capability emerges from the functional coupling between the material, biomechanical and neural levels of information processing [13,14].


Subject(s)
Biomechanical Phenomena/physiology , Perception/physiology , Somatosensory Cortex/physiology , Wood , Action Potentials , Adult , Animals , Blindness/physiopathology , Female , Hand/physiology , Humans , Male , Mechanoreceptors/metabolism , Touch/physiology , Vibration , Vibrissae/physiology , Young Adult
15.
Mov Disord ; 33(4): 544-553, 2018 04.
Article in English | MEDLINE | ID: mdl-29473691

ABSTRACT

BACKGROUND: People with Parkinson's disease (PD) who develop visuo-perceptual deficits are at higher risk of dementia, but we lack tests that detect subtle visuo-perceptual deficits and can be performed by untrained personnel. Hallucinations are associated with cognitive impairment and typically involve perception of complex objects. Changes in object perception may therefore be a sensitive marker of visuo-perceptual deficits in PD. OBJECTIVE: We developed an online platform to test visuo-perceptual function. We hypothesised that (1) visuo-perceptual deficits in PD could be detected using online tests, (2) object perception would be preferentially affected, and (3) these deficits would be caused by changes in perception rather than response bias. METHODS: We assessed 91 people with PD and 275 controls. Performance was compared using classical frequentist statistics. We then fitted a hierarchical Bayesian signal detection theory model to a subset of tasks. RESULTS: People with PD were worse than controls at object recognition, showing no deficits in other visuo-perceptual tests. Specifically, they were worse at identifying skewed images (P < .0001); at detecting hidden objects (P = .0039); at identifying objects in peripheral vision (P < .0001); and at detecting biological motion (P = .0065). In contrast, people with PD were not worse at mental rotation or subjective size perception. Using signal detection modelling, we found this effect was driven by change in perceptual sensitivity rather than response bias. CONCLUSIONS: Online tests can detect visuo-perceptual deficits in people with PD, with object recognition particularly affected. Ultimately, visuo-perceptual tests may be developed to identify at-risk patients for clinical trials to slow PD dementia. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society.
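The signal detection logic used in this study, separating perceptual sensitivity from response bias, can be sketched in its simplest non-hierarchical form. The trial counts below are hypothetical illustrations, not the study's data.

```python
from statistics import NormalDist

# Signal detection sketch: sensitivity (d') vs. response bias (criterion c).
# Counts are invented for illustration; the paper fit a hierarchical
# Bayesian version of this model, not this point-estimate form.

def sdt(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hr) - z(far)
    criterion = -0.5 * (z(hr) + z(far))
    return d_prime, criterion

# A "worse at object recognition" pattern appears as lower d' at a
# comparable criterion: a perceptual effect rather than a bias effect.
d_ctrl, c_ctrl = sdt(45, 5, 10, 40)
d_pd, c_pd = sdt(35, 15, 20, 30)
print(d_ctrl > d_pd)  # True: the control group is more sensitive
```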


Subject(s)
Cognition Disorders/diagnosis , Cognition Disorders/etiology , Online Systems , Parkinson Disease/complications , Perceptual Disorders/etiology , Visual Perception/physiology , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged , Motion Perception/physiology , Neuropsychological Tests , Perceptual Disorders/diagnosis , Psychomotor Performance/physiology , Recognition, Psychology , Signal Detection, Psychological , Visual Acuity/physiology
16.
Exp Brain Res ; 235(10): 2917-2926, 2017 10.
Article in English | MEDLINE | ID: mdl-28702834

ABSTRACT

Two decades of research have demonstrated that using a tool modulates spatial representations of the body. Whether this embodiment is specific to representations of the tool-using limb or extends to representations of other body parts has received little attention. Several studies of other perceptual phenomena have found that modulations to the primary somatosensory representation of the hand transfer to the face, due in part to their close proximity in primary somatosensory cortex. In the present study, we investigated whether tool-induced recalibration of tactile perception on the hand transfers to the cheek. Participants verbally estimated the distance between two tactile points applied to either their hand or face, before and after using a hand-shaped tool. Tool use recalibrated tactile distance perception on the hand, in line with previous findings, but left perception on the cheek unchanged. This finding provides support for the idea that embodiment is body-part specific. Furthermore, it suggests that tool-induced perceptual recalibration occurs at a level of somatosensory processing where representations of the hand and face have become functionally disentangled.


Subject(s)
Cheek/physiology , Distance Perception/physiology , Hand/physiology , Motor Activity/physiology , Psychomotor Performance/physiology , Somatosensory Cortex/physiology , Touch Perception/physiology , Adult , Calibration , Female , Humans , Male , Young Adult
17.
Cognition ; 162: 32-40, 2017 05.
Article in English | MEDLINE | ID: mdl-28196765

ABSTRACT

Brief use of a tool recalibrates multisensory representations of the user's body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception was recalibrated on this stationary arm, and to the same magnitude as after physical use. Recalibration was not found following illusory passive tool holding and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest that visual tool-use signals play a critical role in driving tool embodiment.


Subject(s)
Body Image , Illusions , Psychomotor Performance , Tool Use Behavior , Touch Perception , Adult , Conflict, Psychological , Female , Humans , Male , Proprioception , Young Adult
18.
Conscious Cogn ; 40: 17-25, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26741857

ABSTRACT

Mental body representations underlying tactile perception do not accurately reflect the body's true morphology. For example, perceived tactile distance is dependent on both the body part being touched and the stimulus orientation, a phenomenon called Weber's illusion. These findings suggest the presence of size and shape distortions, respectively. However, whereas each morphological feature is typically measured in isolation, a complete morphological characterization requires the concurrent measurement of both size and shape. We did so in three experiments, manipulating both the stimulated body parts (hand; forearm) and stimulus orientation while requiring participants to make tactile distance judgments. We found that the forearm was significantly more distorted than the hand lengthwise but not widthwise. Effects of stimulus orientation are thought to reflect receptive field anisotropies in primary somatosensory cortex. The results of the present study therefore suggest that mental body representations retain homuncular shape distortions that characterize early stages of somatosensory processing.


Subject(s)
Body Image , Distance Perception/physiology , Illusions/physiology , Touch Perception/physiology , Adult , Female , Forearm/physiology , Hand/physiology , Humans , Male , Young Adult
20.
Front Hum Neurosci ; 8: 982, 2014.
Article in English | MEDLINE | ID: mdl-25538604

ABSTRACT

Language comprehension requires rapid and flexible access to information stored in long-term memory, likely influenced by activation of rich world knowledge and by brain systems that support the processing of sensorimotor content. We hypothesized that while literal language about biological motion might rely on neurocognitive representations of biological motion specific to the details of the actions described, metaphors rely on more generic representations of motion. In a priming and self-paced reading paradigm, participants saw video clips or images of (a) an intact point-light walker or (b) a scrambled control and read sentences containing literal or metaphoric uses of biological motion verbs either closely or distantly related to the depicted action (walking). We predicted that reading times for literal and metaphorical sentences would show differential sensitivity to the match between the verb and the visual prime. In Experiment 1, we observed interactions between the prime type (walker or scrambled video) and the verb type (close or distant match) for both literal and metaphorical sentences, but with strikingly different patterns. We found no difference in the verb region of literal sentences for Close-Match verbs after walker or scrambled motion primes, but Distant-Match verbs were read more quickly following walker primes. For metaphorical sentences, the results were roughly reversed, with Distant-Match verbs being read more slowly following a walker compared to scrambled motion. In Experiment 2, we observed a similar pattern following still image primes, though critical interactions emerged later in the sentence. We interpret these findings as evidence for shared recruitment of cognitive and neural mechanisms for processing visual and verbal biological motion information. Metaphoric language using biological motion verbs may recruit neurocognitive mechanisms similar to those used in processing literal language but be represented in a less-specific way.
