1.
IEEE Trans Cybern ; 51(12): 5954-5968, 2021 Dec.
Article in English | MEDLINE | ID: mdl-32149676

ABSTRACT

For social robots to effectively engage in human-robot interaction (HRI), they need to be able to interpret human affective cues and to respond appropriately via display of their own emotional behavior. In this article, we present a novel multimodal emotional HRI architecture to promote natural and engaging bidirectional emotional communications between a social robot and a human user. User affect is detected using a unique combination of body language and vocal intonation, and multimodal classification is performed using a Bayesian Network. The Emotionally Expressive Robot utilizes the user's affect to determine its own emotional behavior via an innovative two-layer emotional model consisting of deliberative (hidden Markov model) and reactive (rule-based) layers. The proposed architecture has been implemented via a small humanoid robot to perform diet and fitness counseling during HRI. In order to evaluate the Emotionally Expressive Robot's effectiveness, a Neutral Robot, which can detect user affect but does not display emotional behavior, was also developed. A between-subjects HRI experiment was conducted with both types of robots. Extensive results have shown that both robots can effectively detect user affect during real-time HRI. However, the Emotionally Expressive Robot can appropriately determine its own emotional response based on the situation at hand and, therefore, induce more positive valence and less negative arousal in users than the Neutral Robot.
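To illustrate the two-layer emotional model described in the abstract, the sketch below shows one plausible way a deliberative hidden Markov model layer and a reactive rule-based layer could be combined. It is a minimal illustration, not the authors' implementation: the emotional states, the observation categories, and all transition and emission probabilities are hypothetical placeholders, and the actual system fuses body language and vocal intonation with a Bayesian Network before this stage.

import numpy as np

# Hypothetical emotional states for the deliberative (HMM) layer.
STATES = ["happy", "neutral", "concerned"]          # robot emotional states (illustrative)
OBSERVATIONS = ["positive", "neutral", "negative"]  # classified user affect (illustrative)

# Illustrative transition matrix: A[i, j] = P(next state j | current state i)
A = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Illustrative emission matrix: B[i, k] = P(observed affect k | state i)
B = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

belief = np.full(len(STATES), 1.0 / len(STATES))  # uniform prior over robot states


def deliberative_update(belief, obs_index):
    """One HMM filtering step: predict with A, correct with B, then normalize."""
    predicted = belief @ A
    corrected = predicted * B[:, obs_index]
    return corrected / corrected.sum()


def reactive_override(user_affect):
    """Rule-based layer: respond immediately to strongly negative user affect."""
    if user_affect == "negative":
        return "concerned"  # fires regardless of the current HMM belief
    return None


def select_behavior(belief, user_affect):
    """Combine layers: a reactive rule takes priority, otherwise use the most likely HMM state."""
    override = reactive_override(user_affect)
    if override is not None:
        return override
    return STATES[int(np.argmax(belief))]


# Example interaction: a short sequence of classified user-affect observations.
for user_affect in ["positive", "neutral", "negative", "positive"]:
    obs_index = OBSERVATIONS.index(user_affect)
    belief = deliberative_update(belief, obs_index)
    print(user_affect, "->", select_behavior(belief, user_affect), belief.round(2))

In this toy run, the deliberative belief shifts gradually with each observation, while the rule-based layer reacts instantly to negative affect, which is the general division of labor a deliberative/reactive split is meant to capture.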


Subject(s)
Robotics , Bayes Theorem , Communication , Emotions , Humans , Social Interaction