1.
Nature ; 600(7889): 478-483, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34880497

ABSTRACT

Policy-makers are increasingly turning to behavioural science for insights about how to improve citizens' decisions and outcomes[1]. Typically, different scientists test different intervention ideas in different samples using different outcomes over different time intervals[2]. The lack of comparability of such individual investigations limits their potential to inform policy. Here, to address this limitation and accelerate the pace of discovery, we introduce the megastudy--a massive field experiment in which the effects of many different interventions are compared in the same population on the same objectively measured outcome for the same duration. In a megastudy targeting physical exercise among 61,293 members of an American fitness chain, 30 scientists from 15 different US universities worked in small independent teams to design a total of 54 different four-week digital programmes (or interventions) encouraging exercise. We show that 45% of these interventions significantly increased weekly gym visits by 9% to 27%; the top-performing intervention offered microrewards for returning to the gym after a missed workout. Only 8% of interventions induced behaviour change that was significant and measurable after the four-week intervention. Conditioning on the 45% of interventions that increased exercise during the intervention, we detected carry-over effects that were proportionally similar to those measured in previous research[3-6]. Forecasts by impartial judges failed to predict which interventions would be most effective, underscoring the value of testing many ideas at once and, therefore, the potential for megastudies to improve the evidentiary value of behavioural science.


Subject(s)
Behavioral Sciences/methods , Clinical Trials as Topic/methods , Exercise/psychology , Health Promotion/methods , Research Design , Adult , Female , Humans , Male , Motivation , Regression Analysis , Reward , Time Factors , United States , Universities
2.
Proc Natl Acad Sci U S A ; 117(13): 7103-7107, 2020 Mar 31.
Article in English | MEDLINE | ID: mdl-32179683

ABSTRACT

Honest reporting is essential for society to function well. However, people frequently lie when asked to provide information, such as misrepresenting their income to save money on taxes. A landmark finding published in PNAS [L. L. Shu, N. Mazar, F. Gino, D. Ariely, M. H. Bazerman, Proc. Natl. Acad. Sci. U.S.A. 109, 15197-15200 (2012)] provided evidence for a simple way of encouraging honest reporting: asking people to sign a veracity statement at the beginning instead of at the end of a self-report form. Since this finding was published, various government agencies have adopted this practice. However, in this project, we failed to replicate this result. Across five conceptual replications (n = 4,559) and one highly powered, preregistered, direct replication (n = 1,235) conducted with the authors of the original paper, we observed no effect of signing first on honest reporting. Given the policy applications of this result, it is important to update the scientific record regarding the veracity of these results.


Subject(s)
Contracts , Deception , Humans
3.
Proc Natl Acad Sci U S A ; 116(48): 23989-23995, 2019 Nov 26.
Article in English | MEDLINE | ID: mdl-31719198

ABSTRACT

The "veil of ignorance" is a moral reasoning device designed to promote impartial decision making by denying decision makers access to potentially biasing information about who will benefit most or least from the available options. Veil-of-ignorance reasoning was originally applied by philosophers and economists to foundational questions concerning the overall organization of society. Here, we apply veil-of-ignorance reasoning in a more focused way to specific moral dilemmas, all of which involve a tension between the greater good and competing moral concerns. Across 7 experiments (n = 6,261), 4 preregistered, we find that veil-of-ignorance reasoning favors the greater good. Participants first engaged in veil-of-ignorance reasoning about a specific dilemma, asking themselves what they would want if they did not know who among those affected they would be. Participants then responded to a more conventional version of the same dilemma with a moral judgment, a policy preference, or an economic choice. Participants who first engaged in veil-of-ignorance reasoning subsequently made more utilitarian choices in response to a classic philosophical dilemma, a medical dilemma, a real donation decision between a more vs. less effective charity, and a policy decision concerning the social dilemma of autonomous vehicles. These effects depend on the impartial thinking induced by veil-of-ignorance reasoning and cannot be explained by anchoring, probabilistic reasoning, or generic perspective taking. These studies indicate that veil-of-ignorance reasoning may be a useful tool for decision makers who wish to make more impartial and/or socially beneficial choices.


Subject(s)
Decision Making/ethics , Morals , Problem Solving/ethics , Female , Humans , Judgment , Male , Motor Vehicles , Policy Making
4.
Harv Bus Rev ; 92(7-8): 116-9, 131, 2014.
Article in English | MEDLINE | ID: mdl-25065255

ABSTRACT

We'd like to think that no smart, upstanding manager would ever overlook or turn a blind eye to threats or wrongdoing that ultimately imperil his or her business. Yet it happens all the time. We fall prey to obstacles that obscure or drown out important signals that things are amiss. Becoming a "first-class noticer," says Max H. Bazerman, a professor at Harvard Business School, requires conscious effort to fight ambiguity, motivated blindness, conflicts of interest, the slippery slope, and efforts of others to mislead us. As a manager, you can develop your noticing skills by acknowledging responsibility when things go wrong rather than blaming external forces beyond your control. Bazerman also advises taking an outsider's view to challenge the status quo. Given the string of ethical failures of corporations around the world in recent years--from BP to GM to JP Morgan Chase--it's clear that leaders not only need to act more responsibly themselves but also must develop keen noticing skills in their employees and across their organizations.


Subject(s)
Ethics, Business , Personnel Management/methods , United States
5.
Proc Natl Acad Sci U S A ; 109(38): 15197-200, 2012 Sep 18.
Article in English | MEDLINE | ID: mdl-22927408

ABSTRACT

Many written forms required by businesses and governments rely on honest reporting. Proof of honest intent is typically provided through signature at the end of, e.g., tax returns or insurance policy forms. Still, people sometimes cheat to advance their financial self-interests--at great costs to society. We test an easy-to-implement method to discourage dishonesty: signing at the beginning rather than at the end of a self-report, thereby reversing the order of the current practice. Using laboratory and field experiments, we find that signing before--rather than after--the opportunity to cheat makes ethics salient when they are needed most and significantly reduces dishonesty.


Subject(s)
Ethics , Fraud , Adult , Behavior , Commerce , Deception , Female , Humans , Male , Motivation , Public Policy , Self Report , Young Adult
6.
Harv Bus Rev ; 89(4): 58-65, 137, 2011 Apr.
Article in English | MEDLINE | ID: mdl-21510519

ABSTRACT

Companies are spending a great deal of time and money to install codes of ethics, ethics training, compliance programs, and in-house watchdogs. If these efforts worked, the money would be well spent. But unethical behavior appears to be on the rise. The authors observe that even the best-intentioned executives may be unaware of their own or their employees' unethical behavior. Drawing from extensive research on cognitive biases, they offer five reasons for this blindness and suggest what to do about them. Ill-conceived goals may actually encourage negative behavior. Brainstorm unintended consequences when devising your targets. Motivated blindness makes us overlook unethical behavior when remaining ignorant would benefit us. Root out conflicts of interest. Indirect blindness softens our assessment of unethical behavior when it's carried out by third parties. Take ownership of the implications when you outsource work. The slippery slope mutes our awareness when unethical behavior develops gradually. Be alert for even trivial infractions and investigate them immediately. Overvaluing outcomes may lead us to give a pass to unethical behavior. Examine good outcomes to ensure they're not driven by unethical tactics.


Subject(s)
Administrative Personnel/ethics , Commerce/ethics , Humans , United States
7.
Pers Soc Psychol Bull ; 37(3): 330-49, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21307176

ABSTRACT

People routinely engage in dishonest acts without feeling guilty about their behavior. When and why does this occur? Across four studies, people justified their dishonest deeds through moral disengagement and exhibited motivated forgetting of information that might otherwise limit their dishonesty. Using hypothetical scenarios (Studies 1 and 2) and real tasks involving the opportunity to cheat (Studies 3 and 4), the authors find that one's own dishonest behavior increased moral disengagement and motivated forgetting of moral rules. Such changes did not occur in the case of honest behavior or consideration of the dishonest behavior of others. In addition, increasing moral saliency by having participants read or sign an honor code significantly reduced unethical behavior and prevented subsequent moral disengagement. Although dishonest behavior motivated moral leniency and led to forgetting of moral rules, honest behavior motivated moral stringency and diligent recollection of moral rules.


Subject(s)
Attitude , Conscience , Deception , Morals , Analysis of Variance , Decision Making , Female , Fraud , Humans , Male , Motivation , Social Behavior , Young Adult
8.
Perspect Psychol Sci ; 5(2): 209-12, 2010 Mar.
Article in English | MEDLINE | ID: mdl-26162128

ABSTRACT

Bennis, Medin, and Bartels (2010, this issue) have contributed an interesting article on the comparative benefit of moral rules versus cost-benefit analysis (CBA). Many of their specific comments are accurate, useful, and insightful. At the same time, we believe they have misrepresented CBA and have reached a set of conclusions that are misguided and, if adopted wholesale, potentially dangerous. Overall, they offer wise suggestions for making CBA more effective, rather than eliminating CBA as a decision-making tool.

9.
J Am Diet Assoc ; 109(6): 1088-91, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19465193

ABSTRACT

Research during the last several decades indicates the failure of existing nutritional labels to substantially improve the healthfulness of consumers' food/beverage choices. The present study aims to fill this void by developing a nutrition metric that is more comprehensible to the average shopper. The healthfulness ratings of 205 sample foods/beverages by leading nutrition experts formed the basis for a linear regression that places weights on 12 nutritional components (ie, total fat, saturated fat, cholesterol, sodium, total carbohydrate, dietary fiber, sugars, protein, vitamin A, vitamin C, calcium, and iron) to predict the average healthfulness rating that experts would give to any food/beverage. Major benefits of the model include its basis in expert judgment, its straightforward application, the flexibility of transforming its output ratings to any linear scale, and its ease of interpretation. This metric serves the purpose of distilling expert knowledge into a form usable by consumers so that they are empowered to make more healthful decisions.


Subject(s)
Expert Testimony , Food/standards , Models, Biological , Nutritional Sciences , Nutritive Value , Humans , Least-Squares Analysis , Linear Models
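
The abstract above describes fitting a linear regression that places weights on 12 nutritional components to predict the healthfulness rating experts would give any food or beverage. A minimal sketch of that kind of model follows; the data are synthetic and the variable names are illustrative, not taken from the study itself.

```python
import numpy as np

# The 12 nutritional components named in the abstract (values per serving).
NUTRIENTS = [
    "total_fat", "saturated_fat", "cholesterol", "sodium",
    "total_carbohydrate", "dietary_fiber", "sugars", "protein",
    "vitamin_a", "vitamin_c", "calcium", "iron",
]

rng = np.random.default_rng(0)
n_foods = 205  # the study collected expert ratings for 205 sample foods/beverages

# Synthetic stand-ins: nutrient values scaled to [0, 1], an unknown expert
# weighting, and noisy expert ratings generated from that weighting.
X = rng.random((n_foods, len(NUTRIENTS)))
true_w = rng.normal(size=len(NUTRIENTS))
y = X @ true_w + rng.normal(scale=0.05, size=n_foods)

# Ordinary least squares: recover the 12 component weights plus an intercept.
A = np.column_stack([X, np.ones(n_foods)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
weights, intercept = coef[:-1], coef[-1]

def predict_rating(nutrients: np.ndarray) -> float:
    """Predicted expert healthfulness rating for one food/beverage."""
    return float(nutrients @ weights + intercept)
```

Because the model is linear, its output can be rescaled to any linear scale (one of the benefits the abstract notes), and each fitted weight is directly interpretable as a component's contribution to the rating.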
10.
Perspect Psychol Sci ; 4(4): 379-83, 2009 Jul.
Article in English | MEDLINE | ID: mdl-26158985

ABSTRACT

The optimal moment to address the question of how to improve human decision making has arrived. Thanks to 50 years of research by judgment and decision-making scholars, psychologists have developed a detailed picture of the ways in which human judgment is bounded. This article argues that the time has come to focus attention on the search for strategies that will improve bounded judgment because decision-making errors are costly and are growing more costly, decision makers are receptive, and academic insights are sure to follow from research on improvement. In addition to calling for research on improvement strategies, this article organizes the existing literature pertaining to improvement strategies and highlights promising directions for future research.

11.
Perspect Psychol Sci ; 3(4): 324-38, 2008 Jul.
Article in English | MEDLINE | ID: mdl-26158952

ABSTRACT

Although observers of human behavior have long been aware that people regularly struggle with internal conflict when deciding whether to behave responsibly or indulge in impulsivity, psychologists and economists did not begin to empirically investigate this type of want/should conflict until recently. In this article, we review and synthesize the latest research on want/should conflict, focusing our attention on the findings from an empirical literature on the topic that has blossomed over the last 15 years. We then turn to a discussion of how individuals and policy makers can use what has been learned about want/should conflict to help decision makers select far-sighted options.

12.
Harv Bus Rev ; 85(9): 72-6, 78, 148, 2007 Sep.
Article in English | MEDLINE | ID: mdl-17886485

ABSTRACT

Negotiators often fail to achieve results because they channel too much effort into selling their own position and too little into understanding the other party's perspective. To get the best deal--or, sometimes, any deal at all--negotiators need to think like detectives, digging for information about why the other side wants what it does. This investigative approach entails a mind-set and a methodology, say Harvard Business School professors Malhotra and Bazerman. Inaccurate assumptions about the other side's motivations can lead negotiators to propose solutions to the wrong problems, needlessly give away value, or derail deals altogether. Consider, for example, the pharmaceutical company that deadlocked with a supplier over the issue of exclusivity in an ingredient purchase. Believing it was a ploy to raise the price, the drugmaker upped its offer--unsuccessfully. In fact, the supplier was balking because a relative's company needed a small amount of the ingredient to make a local product. Once the real motivation surfaced, a compromise quickly followed. Understanding the other side's motives and goals is the first principle of investigative negotiation. The second is to figure out what constraints the other party faces. Often when your counterpart's behavior appears unreasonable, his hands are tied somehow, and you can reach agreement by helping overcome those limitations. The third is to view onerous demands as a window into what the other party prizes most--and use that information to create opportunities. The fourth is to look for common ground; even fierce competitors may have complementary interests that lead to creative agreements. Finally, if a deal appears lost, stay at the table and keep trying to learn more. Even if you don't win, you can gain insights into a customer's future needs, the interests of similar customers, or the strategies of competitors.


Subject(s)
Negotiating/methods , Administrative Personnel , Humans , United States
13.
J Pers Soc Psychol ; 91(5): 857-71, 2006 Nov.
Article in English | MEDLINE | ID: mdl-17059306

ABSTRACT

Individuals working in groups often egocentrically believe they have contributed more of the total work than is logically possible. Actively considering others' contributions effectively reduces these egocentric assessments, but this research suggests that undoing egocentric biases in groups may have some unexpected costs. Four experiments demonstrate that members who contributed much to the group outcome are actually less satisfied and less interested in future collaborations after considering others' contributions compared with those who contributed little. This was especially true in cooperative groups. Egocentric biases in responsibility allocation can create conflict, but this research suggests that undoing these biases can have some unfortunate consequences. Some members who look beyond their own perspective may not like what they see.


Subject(s)
Cooperative Behavior , Decision Making/physiology , Ethics , Group Processes , Personal Satisfaction , Altruism , Authorship , Competitive Behavior/ethics , Competitive Behavior/physiology , Conflict, Psychological , Decision Making/ethics , Ego , Humans , Interpersonal Relations , Judgment/ethics , Judgment/physiology , Manuscripts as Topic , Peer Group , Perception/ethics , Perception/physiology , Prejudice , Social Behavior , Students/psychology , Surveys and Questionnaires
14.
J Pers Soc Psychol ; 91(5): 872-89, 2006 Nov.
Article in English | MEDLINE | ID: mdl-17059307

ABSTRACT

Group members often reason egocentrically, believing that they deserve more than their fair share of group resources. Leading people to consider other members' thoughts and perspectives can reduce these egocentric (self-centered) judgments such that people claim that it is fair for them to take less; however, the consideration of others' thoughts and perspectives actually increases egoistic (selfish) behavior such that people actually take more of available resources. A series of experiments demonstrates this pattern in competitive contexts in which considering others' perspectives activates egoistic theories of their likely behavior, leading people to counter by behaving more egoistically themselves. This reactive egoism is attenuated in cooperative contexts. Discussion focuses on the implications of reactive egoism in social interaction and on strategies for alleviating its potentially deleterious effects.


Subject(s)
Altruism , Ethics , Interpersonal Relations , Social Behavior , Analysis of Variance , Competitive Behavior/ethics , Competitive Behavior/physiology , Conflict, Psychological , Decision Making/ethics , Decision Making/physiology , Ego , Group Processes , Humans , Judgment/ethics , Judgment/physiology , Negotiating/psychology , Perception/ethics , Perception/physiology , Students/psychology , Surveys and Questionnaires
15.
Harv Bus Rev ; 84(1): 88-97, 133, 2006 Jan.
Article in English | MEDLINE | ID: mdl-16447372

ABSTRACT

By the time Merck withdrew its pain relief drug Vioxx from the market in 2004, more than 100 million prescriptions had been filled in the United States alone. Yet researchers now estimate that Vioxx may have been associated with as many as 25,000 heart attacks and strokes. Evidence of the drug's risks was available as early as 2000, so why did so many doctors keep prescribing it? The answer, say the authors, involves the phenomenon of bounded awareness--when cognitive blinders prevent a person from seeing, seeking, using, or sharing highly relevant, easily accessible, and readily perceivable information during the decision-making process. Doctors prescribing Vioxx, for instance, more often than not received positive feedback from patients. So, despite having access to information about the risks, physicians may have been blinded to the actual extent of the risks. Bounded awareness can occur at three points in the decision-making process. First, executives may fail to see or seek out the important information needed to make a sound decision. Second, they may fail to use the information that they do see because they aren't aware of its relevance. Third, executives may fail to share information with others, thereby bounding the organization's awareness. Drawing on examples such as the Challenger disaster and Citibank's failures in Japan, this article examines what prevents executives from seeing what's right in front of them and offers advice on how to increase awareness. Of course, not every decision requires executives to consciously broaden their focus. Collecting too much information for every decision would waste time and other valuable resources. The key is being mindful. If executives think an error could generate almost irrecoverable damage, then they should insist on getting all the information they need to make a wise decision.


Subject(s)
Awareness , Decision Making , Commerce/organization & administration , Humans , United States
16.
Perspect Psychol Sci ; 1(2): 123-32, 2006 Jun.
Article in English | MEDLINE | ID: mdl-26151467

ABSTRACT

We offer a psychological perspective to explain the failure of governments to create near-Pareto improvements. Our tools for analyzing these failures reflect the difficulties people have trading small losses for large gains: the fixed-pie approach to negotiations, the omission bias and status quo bias, parochialism and dysfunctional competition, and the neglect of secondary effects. We examine the role of human judgment in the failure to find wise trade-offs by discussing diverse applications of citizen and government decision making, including AIDS treatment, organ-donation systems, endangered-species protection, subsidies, and free trade. Our overall goal is to offer a psychological approach for understanding suboptimality in government decision making.

17.
Harv Bus Rev ; 81(3): 72-80, 140, 2003 Mar.
Article in English | MEDLINE | ID: mdl-12632806

ABSTRACT

Think hard about the problems in your organization or about potential upheavals in the markets in which you operate. Could some of those problems--ones no one is attending to--turn into disasters? If you're like most executives, you'll sheepishly answer yes. As Harvard Business School professors Michael Watkins and Max Bazerman illustrate in this timely article, most of the "unexpected" events that buffet companies should have been anticipated--they're "predictable surprises." Such disasters take many forms, from financial scandals to disruptions in operations, from organizational upheavals to product failures. Some result in short-term losses or distractions, while others cause damage that takes years to repair. Some are truly catastrophic--the events of September 11, 2001, are a tragic example of a predictable surprise. The bad news is that all companies, including your own, are vulnerable to predictable surprises. The good news is that recent research helps explain why that's so and what companies can do to minimize their risk. The authors contend that organizations' inability to prepare for predictable surprises can be traced to three sets of vulnerabilities: psychological, organizational, and political. To address these vulnerabilities, the authors recommend the RPM approach. More than just the usual environmental scanning and contingency planning, RPM requires a chain of actions--recognizing, prioritizing, and mobilizing--that companies must meticulously adhere to. Failure to apply any one of these steps, the authors say, can leave an organization vulnerable. Given the extraordinarily high stakes involved, it should be every business leader's core responsibility to apply the RPM approach, the authors conclude.


Subject(s)
Commerce/economics , Decision Making, Organizational , Disaster Planning/organization & administration , Leadership , Commerce/trends , Financial Management , Humans , Organizational Culture , Organizational Innovation , United States
18.
Harv Bus Rev ; 81(12): 56-64, 125, 2003 Dec.
Article in English | MEDLINE | ID: mdl-14712544

ABSTRACT

Answer true or false: "I am an ethical manager." If you answered "true," here's an uncomfortable fact: You're probably wrong. Most of us believe we can objectively size up a job candidate or a venture deal and reach a fair and rational conclusion that's in our, and our organization's, best interests. But more than two decades of psychological research indicates that most of us harbor unconscious biases that are often at odds with our consciously held beliefs. The flawed judgments arising from these biases are ethically problematic and undermine managers' fundamental work--to recruit and retain superior talent, boost individual and team performance, and collaborate effectively with partners. This article explores four related sources of unintentional unethical decision making. If you're surprised that a female colleague has poor people skills, you are displaying implicit bias--judging according to unconscious stereotypes rather than merit. Companies that give bonuses to employees who recommend their friends for open positions are encouraging ingroup bias--favoring people in their own circles. If you think you're better than the average worker in your company (and who doesn't?), you may be displaying the common tendency to overclaim credit. And although many conflicts of interest are overt, many more are subtle. Who knows, for instance, whether the promise of quick and certain payment figures into an attorney's recommendation to settle a winnable case rather than go to trial? How can you counter these biases if they're unconscious? Traditional ethics training is not enough. But by gathering better data, ridding the work environment of stereotypical cues, and broadening your mind-set when you make decisions, you can go a long way toward bringing your unconscious biases to light and submitting them to your conscious will.


Subject(s)
Administrative Personnel/ethics , Commerce/organization & administration , Decision Making/ethics , Administrative Personnel/psychology , Commerce/ethics , Conflict of Interest , Culture , Data Collection , Group Processes , Humans , Personnel Management/standards , Prejudice , United States
19.
Harv Bus Rev ; 80(11): 96-102, 134, 2002 Nov.
Article in English | MEDLINE | ID: mdl-12422793

ABSTRACT

On July 30, President Bush signed into law the Sarbanes-Oxley Act addressing corporate accountability. A response to recent financial scandals, the law tightened federal controls over the accounting industry and imposed tough new criminal penalties for fraud. The president proclaimed, "The era of low standards and false profits is over." If only it were that easy. The authors don't think corruption is the main cause of bad audits. Rather, they claim, the problem is unconscious bias. Without knowing it, we all tend to discount facts that contradict the conclusions we want to reach, and we uncritically embrace evidence that supports our positions. Accountants might seem immune to such distortions because they work with seemingly hard numbers and clear-cut standards. But the corporate-auditing arena is particularly fertile ground for self-serving biases. Because of the often subjective nature of accounting and the close relationships between accounting firms and their corporate clients, even the most honest and meticulous of auditors can unintentionally massage the numbers in ways that mask a company's true financial status, thereby misleading investors, regulators, and even management. Solving this problem will require far more aggressive action than the U.S. government has taken thus far. What's needed are practices and regulations that recognize the existence of bias and moderate its effects. True auditor independence will entail fundamental changes to the way the accounting industry operates, including full divestiture of consulting and tax services, rotation of auditing firms, and fixed-term contracts that prohibit client companies from firing their auditors. Less tangibly, auditors must come to appreciate the profound impact of self-serving biases on their judgment.


Subject(s)
Accounting/standards , Commerce/economics , Financial Audit/standards , Accounting/legislation & jurisprudence , Commerce/legislation & jurisprudence , Financial Audit/legislation & jurisprudence , Fraud , Judgment , Prejudice , Social Responsibility , United States
20.
J Appl Psychol ; 87(1): 87-95, 2002 Feb.
Article in English | MEDLINE | ID: mdl-11916219

ABSTRACT

This study investigated whether cognitions and behavior in an asymmetric social dilemma can be predicted by national culture. Results indicated that, as predicted, groups of decision makers from Japan--a collectivist, hierarchical culture--were more cooperative, expected others to be more cooperative, and were more likely to adopt an equal allocation distribution rule to resolve the dilemma than were groups of decision makers from the United States--an individualist, egalitarian culture. An opportunity for communication had a greater impact on expectations of others' behavior in groups of U.S. decision makers than in groups of Japanese decision makers.


Subject(s)
Cognition , Decision Making , Social Conditions , Adult , Communication , Cultural Characteristics , Female , Humans , Japan/ethnology , Leadership , Male , United States/ethnology