1.
J Exp Psychol Appl ; 2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38602790

ABSTRACT

Many decisions rely on intuitive predictions based on time series data showing a trend. For instance, the current upward trend in global temperatures might lead to specific predictions about the extent to which global temperatures will rise in the future, and these predictions might be used to inform judgments about the urgency with which climate change must be addressed. However, those predictions often need to be revised to incorporate the effects of unexpected events that might accelerate a trend (i.e., increase its rate of change), such as an unanticipated increase in CO2 emissions, or decelerate a trend (i.e., decrease its rate of change), such as an unanticipated reduction in CO2 emissions. In this work, we uncover a new cognitive bias by which people neglect how much a trend can accelerate (vs. decelerate) due to unexpected events. We explain this bias in terms of momentum theory and a naive understanding of physics. These findings have important implications for businesses and policymakers seeking to communicate information about topics such as climate change, stock market prices, or disease prevention. (PsycInfo Database Record (c) 2024 APA, all rights reserved).

2.
J Exp Psychol Gen ; 151(9): 2250-2258, 2022 Sep.
Article in English | MEDLINE | ID: mdl-35143248

ABSTRACT

As algorithms increasingly replace human decision-makers, concerns have been voiced about the black-box nature of algorithmic decision-making. These concerns raise an apparent paradox. In many cases, human decision-makers are just as much of a black box as the algorithms that are meant to replace them. Yet, the inscrutability of human decision-making seems to raise fewer concerns. We suggest that one reason for this paradox is that people foster an illusion of understanding human decision-making better than algorithmic decision-making, when in fact both are black boxes. We further propose that this occurs, at least in part, because people project their own intuitive understanding of a decision-making process onto other humans more than onto algorithms, and as a result believe that they understand human decision-making better than algorithmic decision-making, when in fact this is merely an illusion. (PsycInfo Database Record (c) 2022 APA, all rights reserved).


Subject(s)
Illusions , Decision Making , Humans
3.
J Exp Psychol Appl ; 27(2): 447-459, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33749300

ABSTRACT

Algorithms have been the subject of a heated debate regarding their potential to yield biased decisions. Prior research has focused on documenting algorithmic bias and discussing its origins from a technical standpoint. We look at algorithmic bias from a psychological perspective, raising a fundamental question that has received little attention: are people more or less likely to perceive decisions that yield disparities as biased, when such decisions stem from algorithms as opposed to humans? We find that algorithmic decisions that yield gender or racial disparities are less likely to be perceived as biased than human decisions. This occurs because people believe that algorithms, unlike humans, decontextualize decision-making by neglecting individual characteristics and blindly applying rules and procedures irrespective of whom they are judging. In situations that entail the potential for discrimination, this belief leads people to think that algorithms are more likely than humans to treat everyone equally, thus less likely to yield biased decisions. This asymmetrical perception of bias, which occurs both in the general population and among members of stigmatized groups, leads people to endorse stereotypical beliefs that fuel discrimination and reduces their willingness to act against potentially discriminatory outcomes. (PsycInfo Database Record (c) 2021 APA, all rights reserved).


Subject(s)
Algorithms , Bias , Humans