1. Hum Factors. 2023 Sep 21:187208231202572.
Article in English | MEDLINE | ID: mdl-37734726

ABSTRACT

OBJECTIVE: The objective of this research is to advance the understanding of behavioral responses to a system's errors. By treating trust as a dynamic variable and drawing on attribution theory, we explain the underlying mechanism and suggest how terminology can be used to mitigate so-called algorithm aversion. In doing so, we show that different terms may shape consumers' perceptions and provide guidance on how these differences can be mitigated.

BACKGROUND: Previous research has used various terms interchangeably to refer to a system, and findings regarding trust in systems have been ambiguous.

METHODS: Across three studies, we examine the effect of different system terminology on consumer behavior following a system failure.

RESULTS: Our results show that terminology crucially affects user behavior. Describing a system as "AI" (i.e., self-learning and perceived as more complex) rather than as "algorithmic" (i.e., a less complex, rule-based system) leads to more favorable behavioral responses from users when a system error occurs.

CONCLUSION: When a system's characteristics do not allow it to be called "AI," users should be given an explanation of why the system's error occurred, and the complexity of the task should be pointed out. We highlight the importance of terminology, as it can unintentionally affect the robustness and replicability of research findings.

APPLICATION: This research offers insights for industries using AI and algorithmic systems, highlighting how strategic use of terminology can shape user trust and responses to errors, thereby enhancing system acceptance.
