Results 1 - 2 of 2
1.
IEEE Trans Cybern ; 51(12): 5954-5968, 2021 Dec.
Article in English | MEDLINE | ID: mdl-32149676

ABSTRACT

For social robots to effectively engage in human-robot interaction (HRI), they need to be able to interpret human affective cues and to respond appropriately by displaying their own emotional behavior. In this article, we present a novel multimodal emotional HRI architecture to promote natural and engaging bidirectional emotional communication between a social robot and a human user. User affect is detected using a unique combination of body language and vocal intonation, and multimodal classification is performed using a Bayesian network. The Emotionally Expressive Robot uses the user's affect to determine its own emotional behavior via an innovative two-layer emotional model consisting of deliberative (hidden Markov model) and reactive (rule-based) layers. The proposed architecture has been implemented on a small humanoid robot to perform diet and fitness counseling during HRI. To evaluate the Emotionally Expressive Robot's effectiveness, a Neutral Robot that can detect user affect but lacks an emotional display was also developed. A between-subjects HRI experiment was conducted with both types of robots. Extensive results show that both robots can effectively detect user affect during real-time HRI. However, the Emotionally Expressive Robot can appropriately determine its own emotional response based on the situation at hand and, therefore, induces greater positive valence and less negative arousal in users than the Neutral Robot.


Subject(s)
Robotics , Bayes Theorem , Communication , Emotions , Humans , Social Interaction
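The two-layer emotional model summarized in the abstract above can be illustrated with a small sketch. This is a hypothetical simplification, not the authors' implementation: the emotion labels, transition/emission probabilities, and the specific override rule are all illustrative assumptions. The deliberative layer is modeled as one forward-filtering step of a hidden Markov model over the robot's emotional state, with the detected user affect as the observation; the reactive layer is a rule that preempts deliberation for strongly negative affect.

```python
# Hedged sketch of a two-layer (deliberative HMM + reactive rule-based)
# emotion selection scheme. All labels and probabilities are illustrative.

EMOTIONS = ["happy", "neutral", "sad"]   # hypothetical robot emotional states
AFFECTS = ["positive", "neutral", "negative"]  # hypothetical detected user affects

# Transition probabilities P(next emotion | current emotion) -- assumed values.
TRANS = {
    "happy":   {"happy": 0.6, "neutral": 0.3, "sad": 0.1},
    "neutral": {"happy": 0.3, "neutral": 0.4, "sad": 0.3},
    "sad":     {"happy": 0.1, "neutral": 0.3, "sad": 0.6},
}

# Emission probabilities P(observed user affect | robot emotion) -- assumed values.
EMIT = {
    "happy":   {"positive": 0.7, "neutral": 0.2, "negative": 0.1},
    "neutral": {"positive": 0.3, "neutral": 0.4, "negative": 0.3},
    "sad":     {"positive": 0.1, "neutral": 0.3, "negative": 0.6},
}

def deliberative_step(belief, observed_affect):
    """One HMM forward-filter step: predict via TRANS, weight by EMIT, renormalize."""
    predicted = {e: sum(belief[p] * TRANS[p][e] for p in EMOTIONS) for e in EMOTIONS}
    weighted = {e: predicted[e] * EMIT[e][observed_affect] for e in EMOTIONS}
    total = sum(weighted.values())
    return {e: w / total for e, w in weighted.items()}

def reactive_override(observed_affect):
    """Rule-based layer: respond immediately to strongly negative user affect."""
    if observed_affect == "negative":
        return "sad"  # mirror the user's distress without waiting for deliberation
    return None

def choose_emotion(belief, observed_affect):
    """Reactive rule takes priority; otherwise pick the HMM's most likely state."""
    override = reactive_override(observed_affect)
    if override is not None:
        return override, belief
    belief = deliberative_step(belief, observed_affect)
    return max(belief, key=belief.get), belief

# Example: starting from a uniform belief, a positive user affect nudges the
# deliberative layer toward a happy display.
belief = {e: 1.0 / len(EMOTIONS) for e in EMOTIONS}
emotion, belief = choose_emotion(belief, "positive")
print(emotion)  # -> happy
```

The design choice sketched here is that the reactive layer short-circuits the HMM, so urgent cues get an immediate response while the deliberative belief persists across turns for slower, context-sensitive behavior.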
2.
Stud Health Technol Inform ; 184: 465-7, 2013.
Article in English | MEDLINE | ID: mdl-23400203

ABSTRACT

With recent advances in electronics and mechanics, a new trend in interaction is taking place, changing how we interact with our environment, our daily tasks, and other people. Even though sensor-based technologies and tracking systems have been around for several years, they have recently become affordable and are used in several areas, such as physical and mental rehabilitation, educational applications, physical exercise, and natural interaction, among others. This work presents the integration of two mainstream videogame interfaces as tools for developing an interactive lower- and upper-limb therapy tool. The goal is to study the potential of these devices as complementary didactic elements for improving and tracking user performance during a series of exercises with virtual and real devices.


Subject(s)
Actigraphy/instrumentation , Lower Extremity/physiopathology , Movement Disorders/rehabilitation , Therapy, Computer-Assisted/instrumentation , Upper Extremity/physiopathology , User-Computer Interface , Video Games , Equipment Design , Equipment Failure Analysis , Humans , Man-Machine Systems , Transducers