Results 1 - 4 of 4
1.
Iperception ; 14(6): 20416695231211699, 2023.
Article in English | MEDLINE | ID: mdl-37969571

ABSTRACT

Visuomotor synchrony in time and space induces a sense of embodiment towards virtual bodies experienced in first-person view using Virtual Reality (VR). Here, we investigated whether temporal visuomotor synchrony affects avatar embodiment even when the movements of the virtual arms are spatially altered from those of the user in a non-human-like manner. In a within-subjects VR experiment, participants performed a reaching task controlling an avatar whose lower arms bent from the elbow joints in inverted, biomechanically impossible directions. They performed the reaching task using this "unnatural avatar" as well as a "natural avatar," whose arm movements and positions spatially matched those of the user. The reaching tasks were performed with and without a one-second delay between the real and virtual movements. While the senses of body ownership and agency towards the unnatural avatar were significantly lower than those towards the natural avatar, temporal visuomotor synchrony significantly increased the sense of embodiment towards both the unnatural and the natural avatar. These results suggest that temporal visuomotor synchrony is crucial for inducing embodiment even when the spatial match between the real and virtual limbs is disrupted by movements that fall outside pre-existing cognitive representations of the human body.

2.
Sci Rep ; 13(1): 13914, 2023 09 12.
Article in English | MEDLINE | ID: mdl-37699941

ABSTRACT

Humans feel empathic embarrassment when witnessing others go through embarrassing situations. We examined whether we feel such empathic embarrassment even towards robot avatars. Participants observed a human avatar and a robot avatar face a series of embarrassing and non-embarrassing scenarios, and rated their empathic embarrassment and cognitive empathy on a 7-point Likert scale. Both empathic embarrassment and cognitive empathy were significantly higher in the embarrassed condition than in the non-embarrassed condition for both avatars, and cognitive empathy was significantly higher for the human avatar. Participants also tended to show higher skin conductance while watching the human avatar go through embarrassing situations than while watching the robot avatar. A follow-up experiment showed that the average plausibility of the embarrassed condition was significantly higher for the human avatar than for the robot avatar, whereas plausibility scores for emotion did not differ significantly among the conditions. These results suggest that humans can feel empathic embarrassment, as well as cognitive empathy, towards robot avatars, although cognitive empathy towards robot avatars is comparatively lower, and that part of the empathic difference between human and robot avatars may be due to the difference in their plausibility.


Subject(s)
Embarrassment, Empathy, Humans, Emotions
3.
PLoS One ; 18(1): e0278022, 2023.
Article in English | MEDLINE | ID: mdl-36602991

ABSTRACT

Even if we cannot control them, and even when they give us no tactile or proprioceptive feedback, limbs attached to our bodies can still provide indirect proprioceptive and haptic stimulation to the body parts they are attached to, simply because of the physical connection. In this study, we investigated whether such indirect movement and haptic feedback from a limb contributes to a feeling of embodiment towards it. To investigate this issue, we developed a "Joint Avatar" setup in which two individuals were each given full control over the limbs on one side (left or right) of a shared avatar during a reaching task. The backs of the two individuals were connected with a pair of solid braces through which they could exchange forces and match their upper-body postures with one another. Coupled with the first-person view, this simulated the experience of the upper body being synchronously dragged by the partner-controlled virtual arm when it moved. We observed that this passive, synchronized upper-body movement significantly reduced the feeling that the partner-controlled limb was owned or controlled by another. In summary, our results suggest that even in the total absence of control, connection-induced upper-body movements synchronized with the visible limb movements can positively affect the sense of embodiment towards partner-controlled or autonomous limbs.


Subject(s)
Human Body, Movement, Humans, Posture, Touch, Extremities
4.
Sci Rep ; 12(1): 11453, 2022 07 26.
Article in English | MEDLINE | ID: mdl-35882868

ABSTRACT

We explored a concept called "virtual co-embodiment", which enables users to share their virtual avatars with others. Co-embodiment of avatars and robots can be applied to collaboratively performing complicated tasks, skill training, rehabilitation, and aiding disabled users. We conducted an experiment in which two users co-embodied one "joint avatar" in first-person view, each controlling a different arm, to collaboratively perform three types of reaching tasks. We measured their senses of agency and ownership towards the two arms of the avatar, as well as changes in skin conductance levels in response to visual stimuli threatening the two virtual arms. We found that the senses of agency and ownership, and skin conductance responses, were significantly higher for the virtual arm under the participant's own control than for the arm controlled by the partner. Furthermore, the senses of agency and ownership towards the arm controlled by the partner were significantly higher when the participant dyads shared a common intention, or when they were allowed to see their partner's target, compared to when the partner's target was invisible. These results show that while embodiment towards partner-controlled limbs is lower than towards limbs under one's own control, visual information necessary for predicting the partner's intentions can significantly enhance embodiment towards partner-controlled limbs during virtual co-embodiment.


Subject(s)
Intention, User-Computer Interface, Humans, Movement/physiology, Ownership