1.
Sensors (Basel); 22(24), 2022 Dec 17.
Article in English | MEDLINE | ID: mdl-36560330

ABSTRACT

Artificial Intelligence (AI) for estimating human emotions, such as facial emotion estimation, has been actively studied. In contrast, there has been little research on the unconscious cognitive and psychological phenomena (i.e., cognitive biases) induced by viewing AI emotion-estimation information. This study therefore examines the research question (RQ): do people exhibit a cognitive bias in which their impressions of others (i.e., how they see and feel about others) are changed by viewing a biased AI's emotion estimates, and if so, can impression-manipulation methods that intentionally exploit this bias be realized? The proposed verification method biases the emotion estimation system so that it estimates emotions more positively or negatively than an unbiased AI. A prototype system was implemented. An evaluation using video showed that presenting biased emotion-estimation information quickly and unconsciously changes how people perceive and feel about others, supporting the RQ. Specifically, viewing information that estimated others' emotions more positively or negatively overrode the user's own judgment, so that others' emotions, words, and actions were perceived more positively or negatively. The existence of this phenomenon and method indicates that biased emotion-estimation AI has the potential both to harm people and, when used for good purposes, to support them by manipulating their impressions. This study provides helpful insights for the design and use of emotion estimation AI that takes cognitive biases into account.
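The abstract states only that the prototype biases the system's output to be more positive or negative than an unbiased AI; no implementation details are given. The following Python sketch is a hypothetical illustration of that idea, not the authors' code: it shifts per-class emotion scores (such as those produced by any facial emotion classifier) toward positive or negative valence before they are displayed to the viewer. The class names, valence table, function name bias_estimates, and example scores are assumptions made for illustration only.

# Hypothetical illustration (not the paper's implementation): skew emotion
# scores toward positive or negative valence, then renormalize them.

# Assumed valence of each emotion class: +1 positive, -1 negative, 0 neutral.
EMOTION_VALENCE = {
    "happy": +1, "surprised": +1,
    "neutral": 0,
    "sad": -1, "angry": -1, "fearful": -1,
}

def bias_estimates(scores: dict[str, float], bias: float) -> dict[str, float]:
    """Shift class scores toward positive (bias > 0) or negative (bias < 0)
    emotions, clip at zero, and renormalize so they still sum to 1."""
    shifted = {
        emotion: max(score + bias * EMOTION_VALENCE[emotion], 0.0)
        for emotion, score in scores.items()
    }
    total = sum(shifted.values()) or 1.0
    return {emotion: value / total for emotion, value in shifted.items()}

# Example: an unbiased estimate that is mostly neutral.
unbiased = {"happy": 0.15, "surprised": 0.05, "neutral": 0.55,
            "sad": 0.15, "angry": 0.05, "fearful": 0.05}

print(bias_estimates(unbiased, bias=+0.10))  # skewed toward happy/surprised
print(bias_estimates(unbiased, bias=-0.10))  # skewed toward sad/angry/fearful

Under these assumptions, the same underlying observation is presented to the viewer as noticeably more positive or more negative depending on the sign of the bias, which is the kind of manipulation the study evaluates.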


Subject(s)
Artificial Intelligence, Emotions, Humans, Attitude, Cognition, Bias