1.
Soc Neurosci ; 6(1): 31-47, 2011.
Article in English | MEDLINE | ID: mdl-20379900

ABSTRACT

Previous evidence indicates that we understand others' actions not only by perceiving their visual features but also by their sound. This raises the possibility that brain regions responsible for action understanding respond to cues coming from different sensory modalities. Yet no studies, to date, have examined whether this extends to olfaction. Here we addressed this issue using functional magnetic resonance imaging. We searched for brain activity related to the observation of an action executed towards an object that was smelled rather than seen. The results show that temporal, parietal, and frontal areas were activated when individuals observed a hand grasping a smelled object. This activity differed from that evoked during the observation of a mimed grasp. Furthermore, superadditive activity was revealed when the target object of the action was both seen and smelled. Together these findings indicate the influence of olfaction on action understanding and its contribution to multimodal action representations.
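The superadditivity criterion mentioned in this abstract (a bimodal response larger than the sum of the two unimodal responses) can be illustrated with a minimal numerical sketch; the condition labels and beta values below are hypothetical and are not taken from the study.

```python
import numpy as np

# Hypothetical GLM beta estimates (one value per voxel) for three conditions:
# V  = grasp towards a seen object, O = grasp towards a smelled object,
# VO = grasp towards an object that is both seen and smelled.
beta_V  = np.array([0.8, 1.1, 0.6])
beta_O  = np.array([0.5, 0.7, 0.4])
beta_VO = np.array([1.6, 2.2, 0.9])

# Superadditivity criterion: the bimodal response exceeds the sum of the
# two unimodal responses (VO > V + O), tested voxel by voxel.
superadditive = beta_VO > (beta_V + beta_O)
print(superadditive)  # [ True  True False]
```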


Subject(s)
Brain Mapping , Comprehension/physiology , Olfactory Perception/physiology , Psychomotor Performance/physiology , Adult , Brain , Female , Hand Strength/physiology , Humans , Image Interpretation, Computer-Assisted , Magnetic Resonance Imaging , Male , Odorants , Visual Perception/physiology , Young Adult
2.
Exp Neurol ; 217(2): 252-7, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19285072

ABSTRACT

Recent fMRI evidence indicates that both the execution and the observation of hand actions in multiple sclerosis (MS) patients increase recruitment of a portion of the so-called mirror neuron system. However, it remains unclear whether this is the expression of a compensatory mechanism for the coding of observed actions or whether it represents a rather unspecific functional adaptation process. Here we used fMRI on early relapsing-remitting MS (RRMS) patients to clarify this issue. Functional images of 15 right-handed early RRMS patients and of 15 sex- and age-matched right-handed healthy controls were acquired using a 1.5 T scanner. During scanning, participants simply observed images depicting a human hand either grasping an object or resting alongside an object. As shown by a between-group analysis, when compared to controls, RRMS patients revealed a robust increase in activation in an extensive network of brain regions including frontal, parietal, temporal and visual areas usually activated during action observation. However, this pattern of hemodynamic activity was completely independent of the type of observed hand-object interaction, as revealed by the lack of any significant between-group interaction. Our findings are in line with previous fMRI evidence demonstrating cortical reorganization in MS patients during action observation. However, based on our findings we go one step further and suggest that such functional cortical changes may be the expression of a generalized and unspecific compensatory mechanism that is not necessarily involved in action understanding.
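The logic of this between-group analysis (a main effect of group without a group-by-condition interaction) can be sketched as a 2 x 2 cell-means comparison; the numbers below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical ROI activation means (arbitrary units);
# rows = group (RRMS patients, healthy controls),
# columns = observed stimulus (hand grasping object, hand resting by object).
means = np.array([[1.40, 1.25],    # patients
                  [0.90, 0.80]])   # controls

group_effect = means[0].mean() - means[1].mean()                      # patients > controls
interaction = (means[0, 0] - means[0, 1]) - (means[1, 0] - means[1, 1])

print(f"group main effect: {group_effect:.2f}")              # clearly non-zero
print(f"group x condition interaction: {interaction:.2f}")   # near zero -> unspecific increase
```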


Subject(s)
Imitative Behavior/physiology , Movement Disorders/physiopathology , Movement/physiology , Multiple Sclerosis/physiopathology , Neuronal Plasticity/physiology , Psychomotor Performance/physiology , Adult , Brain Mapping , Cerebral Cortex/anatomy & histology , Cerebral Cortex/physiology , Cerebrovascular Circulation , Female , Functional Laterality/physiology , Hand/innervation , Hand/physiology , Hand Strength/physiology , Humans , Magnetic Resonance Imaging , Male , Movement Disorders/complications , Movement Disorders/psychology , Multiple Sclerosis/complications , Multiple Sclerosis/psychology , Nerve Net/anatomy & histology , Nerve Net/physiology , Neuropsychological Tests , Photic Stimulation , Young Adult
3.
Cereb Cortex ; 19(2): 367-74, 2009 Feb.
Article in English | MEDLINE | ID: mdl-18534989

ABSTRACT

Previous neuroimaging research on healthy humans has provided evidence for a neural system underlying the observation of another person's hand actions. However, whether the neural processes involved in this capacity are activated by the observation of other transitive hand actions such as pointing remains unknown. Therefore, using functional magnetic resonance imaging we investigated the neural mechanisms underlying the observation of static images representing the hand of a human model pointing to an object (pointing condition), grasping an object (grasping condition), or resting in proximity to an object (control condition). The results indicated that activity within portions of the lateral occipitotemporal and somatosensory cortices modulates according to the type of observed transitive action. Specifically, these regions were more activated for the grasping than for the pointing condition. In contrast, the premotor cortex, a neural marker of action observation, did not show any differential activity when contrasting the considered experimental conditions. Our findings may provide novel insights regarding a possible role of extrastriate and somatosensory brain areas in the perception of distinct types of human hand-object interactions.


Subject(s)
Brain/physiology , Psychomotor Performance/physiology , Space Perception/physiology , Adult , Arm/innervation , Arm/physiology , Female , Hand/innervation , Hand/physiology , Hand Strength/physiology , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Male , Motor Cortex/physiology , Oxygen/blood , Parietal Lobe/physiology , Photic Stimulation , Stereotaxic Techniques
4.
Brain Lang ; 108(1): 10-21, 2009 Jan.
Article in English | MEDLINE | ID: mdl-18082250

ABSTRACT

The widely known discovery of mirror neurons in macaques shows that premotor and parietal cortical areas are not only involved in executing one's own movement, but are also active when observing the action of others. The goal of this essay is to critically evaluate the substance of functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) studies whose aim has been to reveal the presence of a parallel system in humans. An inspection of this literature suggests that there is relatively weak evidence for the existence of a circuit with 'mirror' properties in humans, such as that described in monkeys.


Subject(s)
Cerebral Cortex/physiology , Functional Laterality/physiology , Nerve Net/physiology , Neurons/physiology , Animals , Cerebral Cortex/anatomy & histology , Haplorhini , Humans , Magnetic Resonance Imaging/methods , Nerve Net/anatomy & histology , Neurons/cytology , Positron-Emission Tomography/methods
5.
Soc Neurosci ; 3(1): 51-9, 2008.
Article in English | MEDLINE | ID: mdl-18633846

ABSTRACT

Our social abilities depend on specialized brain systems that allow us to perform crucial operations such as interpreting the actions of others. This functional magnetic resonance imaging (fMRI) study investigated whether human brain activity evoked by the observation of social interactions is modulated by gaze. During scanning participants observed social or individual actions performed by agents whose gaze could be either available or masked. Results demonstrated that the observation of social interactions evoked activity within a dorsal sector of the medial prefrontal cortex (MPFC), an area classically involved in social cognition. Importantly, activity within this area was modulated by whether the gaze of the agents performing the observed action was or was not available. The implications of these findings for a role played by the dorsal medial prefrontal cortex (dMPFC) in terms of inferential processes concerned with social interactions are considered.


Subject(s)
Fixation, Ocular/physiology , Interpersonal Relations , Adolescent , Adult , Brain/physiology , Brain Mapping/methods , Female , Humans , Male , Photic Stimulation/methods
6.
Neuropsychologia ; 46(2): 448-54, 2008 Jan 31.
Article in English | MEDLINE | ID: mdl-17920641

ABSTRACT

The ability to understand another person's action and, if needed, to imitate that action, is a core component of human social behaviour. Imitation skills have attracted particular attention in the search for the underlying causes of the social difficulties that characterize autism. In recent years, it has been reported that people with autism can bypass some of their social deficits by interacting with robots. However, whether robots are also preferentially imitated has yet to be demonstrated. Here we provide empirical evidence that interaction with robots can trigger imitative behaviour in children with autism. We compared a group of high-functioning children with autism with a group of typically developing children in a visuomotor priming experiment. Participants were requested to observe either a human or a robotic arm model performing a reach-to-grasp action towards a spherical object. Subsequently, the observers were asked to perform the same action towards the same object. Two 'control' conditions in which participants performed the movement in the presence of either the static human or robot model were also included. Kinematic analysis was conducted on the reach-to-grasp action performed by the observer. Our results show that children with autism were facilitated - as revealed by a shorter movement duration and an earlier peak velocity - when primed by a robotic but not by a human arm movement. The opposite pattern was found for typically developing children. The present results show that interaction with robots has an effect on visuomotor priming processes. These findings suggest that in children with autism the neural mechanism underlying the coding of observed actions might be tailored to process socially simpler stimuli.
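The kinematic markers reported here (movement duration and the timing of peak velocity) can be derived from sampled wrist positions roughly as follows; the sampling rate, trajectory, and movement threshold are assumptions for illustration, not the study's analysis pipeline.

```python
import numpy as np

fs = 200.0                                   # assumed motion-capture sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
x = 0.3 * (1 - np.cos(np.pi * t)) / 2        # toy 30 cm reach along one axis (m)

speed = np.abs(np.gradient(x, 1.0 / fs))     # tangential velocity profile (m/s)
moving = speed > 0.05 * speed.max()          # simple 5%-of-peak movement threshold
movement_duration = moving.sum() / fs        # time spent above threshold (s)
time_of_peak_velocity = t[np.argmax(speed)]  # an earlier value = anticipated peak

print(movement_duration, time_of_peak_velocity)
```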


Subject(s)
Autistic Disorder/psychology , Imitative Behavior/physiology , Psychomotor Performance/physiology , Robotics , Social Perception , Adolescent , Analysis of Variance , Asperger Syndrome/psychology , Attention/physiology , Biomechanical Phenomena , Case-Control Studies , Child , Eye Movements , Female , Humans , Male , Matched-Pair Analysis , Movement , Reaction Time/physiology , Reference Values
7.
Neurosci Lett ; 430(3): 246-51, 2008 Jan 17.
Article in English | MEDLINE | ID: mdl-18063476

ABSTRACT

Event-related functional magnetic resonance imaging (fMRI) was used to explore how the human brain models gaze-object relations. During scanning participants observed a human model gazing towards or away from a target object presented either in isolation or flanked by a distractor object. In two further conditions the model's gaze was shifted and subsequently maintained away from the stimulus or stimuli. These four conditions were implemented within a factorial design in which the main factors were "type of observed behavior" (gaze vs. gaze-away) and "context" (target alone vs. target flanked by a distractor). Results revealed that premotor, parietal and temporal areas, known to subserve the understanding of other people's actions, were significantly more activated by the observation of the model gazing towards rather than away from the stimulus or stimuli. In addition, a significant interaction indicated that, when the target was presented in isolation, neural activity within the inferior frontal gyrus, another key area for action understanding, was influenced by gaze-object relations. Our findings suggest that this area is important for the establishment of intentional gaze-object relations and indicate that the presence of a distractor interferes with the representation of such relations.
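The 2 x 2 factorial design described above maps onto standard GLM contrast vectors over the four conditions; the condition ordering and beta values in the sketch below are assumptions used only to show the arithmetic.

```python
import numpy as np

# Assumed condition order: [gaze / target alone, gaze / with distractor,
#                           gaze-away / target alone, gaze-away / with distractor]
c_behavior    = np.array([ 1,  1, -1, -1])   # main effect: gaze vs. gaze-away
c_context     = np.array([ 1, -1,  1, -1])   # main effect: target alone vs. distractor
c_interaction = np.array([ 1, -1, -1,  1])   # behavior x context interaction

betas = np.array([2.0, 1.2, 0.9, 0.8])       # hypothetical ROI beta estimates
print("gaze > gaze-away :", c_behavior @ betas)
print("interaction      :", c_interaction @ betas)
```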


Subject(s)
Attention/physiology , Cerebral Cortex/physiology , Eye Movements/physiology , Fixation, Ocular/physiology , Psychomotor Performance/physiology , Space Perception/physiology , Adult , Brain Mapping , Cerebral Cortex/anatomy & histology , Female , Functional Laterality/physiology , Humans , Magnetic Resonance Imaging , Male , Nerve Net/anatomy & histology , Nerve Net/physiology , Neuropsychological Tests , Pattern Recognition, Visual/physiology , Photic Stimulation
8.
Neurosci Lett ; 417(2): 171-5, 2007 May 01.
Article in English | MEDLINE | ID: mdl-17412509

ABSTRACT

The selection of objects in the visual environment is important in everyday life when acting in a goal-directed manner. Here we used functional magnetic resonance imaging (fMRI) to investigate brain activity while healthy subjects (N=15) selectively reached to grasp a three-dimensional (3D) target stimulus presented either in isolation or in the presence of 3D non-target stimuli. A pneumatic, MRI-compatible apparatus was designed to precisely control the presentation of 3D graspable stimuli within the scanner. During scanning subjects were instructed to reach towards and grasp a target presented at an unknown location either in isolation or flanked by two distractor objects. Results indicated that reaching towards and grasping the target object in the presence of other non-target stimuli was associated with greater activation within the contralateral primary motor cortex and the precuneus as compared to the execution of reach-to-grasp movements towards the target presented in isolation. We conclude that the presence of non-targets evokes a differential level of neural activity within areas responsible for the planning and execution of selective reach-to-grasp movements.


Subject(s)
Brain/physiology , Decision Making/physiology , Hand Strength/physiology , Movement/physiology , Psychomotor Performance/physiology , Space Perception/physiology , Adolescent , Adult , Brain/anatomy & histology , Brain Mapping , Female , Functional Laterality/physiology , Hand/innervation , Hand/physiology , Humans , Magnetic Resonance Imaging , Male , Motor Cortex/anatomy & histology , Motor Cortex/physiology , Neuropsychological Tests , Parietal Lobe/anatomy & histology , Parietal Lobe/physiology , Photic Stimulation
9.
J Cogn Neurosci ; 18(12): 2130-7, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17129195

ABSTRACT

Previous research has provided evidence for a neural system underlying the observation of another person's hand actions. Is the neural system involved in this capacity also important in inferring another person's motor intentions toward an object from their eye gaze? In real-life situations, humans use eye movements to catch and direct the attention of others, often without any accompanying hand movements or speech. In an event-related functional magnetic resonance imaging study, subjects observed videos showing a human model either grasping a target object (grasping condition) or simply gazing (gaze condition) at the same object. These two conditions were contrasted with each other and against a control condition in which the human model was standing behind the object without performing any gazing or grasping action. The results revealed activations within the dorsal premotor cortex, the inferior frontal gyrus, the inferior parietal lobule, and the superior temporal sulcus in both "grasping" and "gaze" conditions. These findings suggest that signaling the presence of an object through gaze elicits in an observer a similar neural response to that elicited by the observation of a reach-to-grasp action performed on the same object.


Subject(s)
Fixation, Ocular/physiology , Hand Strength/physiology , Psychomotor Performance/physiology , Adult , Brain Mapping , Cerebral Cortex/physiology , Data Interpretation, Statistical , Female , Humans , Magnetic Resonance Imaging , Male , Oxygen/blood
10.
Eur J Neurosci ; 23(7): 1949-55, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16623852

ABSTRACT

Previous behavioural and neuroimaging data on humans demonstrated that kinematics and the level of brain activity vary according to whether participants reach towards and grasp a target object presented in isolation or flanked by a distractor object. Here we sought to explore whether a differential activation can be revealed by the mere observation of another person grasping an object in isolation or alongside a distractor. To this end we used event-related functional magnetic resonance imaging to localize neural activity related to action observation that was influenced by the presence of a distractor object. We found that observing a human model reaching to grasp a three-dimensional target alongside a distractor elicits a differential level of activation in a network of areas typically involved during action observation: the dorsal sectors of the premotor cortex and the inferior frontal gyrus. Whereas our previous understanding of the human action observation system has been restricted to actions directed at single objects, we provide compelling evidence that areas within this network modulate with respect to the context in which the observed action takes place. This may prove to be a fundamental process for our understanding of how others' actions can be represented at a neural level.


Subject(s)
Brain Mapping , Motor Cortex/physiology , Visual Perception , Adult , Humans , Magnetic Resonance Imaging , Nerve Net
11.
Appl Ergon ; 36(3): 335-43, 2005 May.
Article in English | MEDLINE | ID: mdl-15854577

ABSTRACT

The aim of the present study was to investigate interactions between vision and audition during a target acquisition task performed in a virtual environment. We measured the time taken to locate a visual target (acquisition time) signalled by auditory and/or visual cues in conditions of variable visual load. Visual load was increased by introducing a secondary visual task. The auditory cue was constructed using virtual three-dimensional (3D) sound techniques. The visual cue was constructed in the form of a 3D updating arrow. The results suggested that both auditory and visual cues reduced acquisition time as compared to an uncued condition. Whereas the visual cue elicited faster acquisition times than the auditory cue, the combination of the two cues produced the fastest acquisition times. The introduction of a secondary visual task differentially affected acquisition time depending on cue modality. In conditions of high visual load, acquiring a target signalled by the auditory cue led to slower and more error-prone performance than acquiring a target signalled by either the visual cue alone or by both the visual and auditory cues.
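The pattern of results reported here (a load cost that depends on cue modality) can be summarized as a simple two-factor table of acquisition times; the millisecond values below are illustrative, not the study's measurements.

```python
import numpy as np

cues = ["auditory", "visual", "audio-visual"]
# Hypothetical mean acquisition times (ms): rows = visual load (low, high).
rt = np.array([[ 820, 640, 600],
               [1150, 700, 660]])

load_cost = rt[1] - rt[0]   # slowing caused by the secondary visual task
print(dict(zip(cues, load_cost.tolist())))
# -> the auditory-only cue shows the largest cost under high visual load
```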


Subject(s)
Cues , Pattern Recognition, Visual , Sound Localization/physiology , Space Perception/physiology , Task Performance and Analysis , Adolescent , Adult , Analysis of Variance , Female , Humans , Male , Reaction Time , User-Computer Interface
12.
Hum Factors ; 46(4): 728-37, 2004.
Article in English | MEDLINE | ID: mdl-15709333

ABSTRACT

The aim of the present study was to investigate interactions between vision and audition during a visual target acquisition task performed in a virtual environment. In two experiments, participants were required to perform an acquisition task guided by auditory and/or visual cues. In both experiments the auditory cues were constructed using virtual 3-D sound techniques based on nonindividualized head-related transfer functions. In Experiment 1 the visual cue was constructed in the form of a continuously updated 2-D arrow. In Experiment 2 the visual cue was a nonstereoscopic, perspective-based 3-D arrow. The results suggested that virtual spatial auditory cues reduced acquisition time but were not as effective as the virtual visual cues. Experiencing the 3-D perspective-based arrow rather than the 2-D arrow produced a faster acquisition time not only in the visually aided conditions but also when the auditory cues were presented in isolation. Suggested novel applications include providing 3-D nonstereoscopic, perspective-based visual information on radar displays, which may lead to a better integration with spatial virtual auditory information.
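Virtual 3-D auditory cues such as those used in this study are normally rendered by convolving a source with head-related transfer functions; the sketch below replaces that with a crude interaural time and level difference approximation (a Woodworth ITD plus a fixed-range ILD), purely to illustrate the idea, and does not reproduce the nonindividualized HRTFs used in the study.

```python
import numpy as np

def spatialize(mono, azimuth_deg, fs=44100, head_radius=0.0875, c=343.0):
    """Pan a mono cue to a given azimuth using a simple ITD/ILD approximation."""
    az = np.radians(abs(azimuth_deg))
    itd = head_radius / c * (az + np.sin(az))      # Woodworth ITD model (s)
    delay = int(round(itd * fs))                   # interaural delay in samples
    far_gain = 10 ** (-6.0 * np.sin(az) / 20.0)    # crude ILD, up to -6 dB at 90 degrees

    near = mono.copy()
    far = np.concatenate([np.zeros(delay), mono])[:len(mono)] * far_gain
    left, right = (near, far) if azimuth_deg < 0 else (far, near)
    return np.stack([left, right], axis=1)         # stereo signal, shape (n, 2)

fs = 44100
beep = np.sin(2 * np.pi * 1000 * np.arange(0, 0.2, 1.0 / fs))  # 200 ms, 1 kHz cue
stereo = spatialize(beep, azimuth_deg=60, fs=fs)               # cue rendered to the right
```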


Subject(s)
Auditory Perception/physiology , Cues , Pattern Recognition, Visual , User-Computer Interface , Visual Perception/physiology , Adolescent , Adult , Cohort Studies , Female , Humans , Male , Orientation/physiology , Reaction Time , Sensitivity and Specificity , Task Performance and Analysis