The ‘knowledge curse’: More isn’t necessarily better
By Tom Fleischman, Cornell Chronicle
A year ago, economics professor Kaushik Basu was in his office hosting a colleague from Sweden, mapping ideas out on a blackboard.
Among those ideas: Can an increase in knowledge ever be a bad thing? They theorized that yes, it could – when people use it to act in their own self-interest rather than in the best interests of the larger group.
“We were standing in front of a blackboard talking about different themes in game theory and Kaushik, chalk in hand, drew up a striking example of the phenomenon,” said Jörgen Weibull, professor emeritus at the Stockholm School of Economics.
Even for a group of rational individuals, greater knowledge can backfire. And, they said, enhanced knowledge about an existing reality – such as the cost-benefit of wearing a face mask to help prevent the spread of disease – may hinder cooperation among purely self-interested individuals.
“We assume that a scientific breakthrough that gives us a deeper understanding of the world can only help,” said Basu, the Carl Marks Professor of International Studies at the Mario Einaudi Center for International Studies, and professor in the College of Arts and Sciences and the Cornell SC Johnson College of Business. “Our paper shows that in the real world, where many people live and strive individually or in small groups to do well for themselves, this intuition may not hold. Science may not be the panacea we take it to be.”
Basu and Weibull are co-authors of “A Knowledge Curse: How Knowledge Can Reduce Human Welfare,” published Aug. 7 in Royal Society Open Science.
Their paper ranges widely across history. From Robert Oppenheimer’s dread following the first atomic bomb test, to 17th-century astronomer Galileo’s studies of the moon, to the classic theoretical game Prisoner’s Dilemma, to the famous 1990 tapper-listener experiment by then-Stanford graduate student Elizabeth Newton, Basu and Weibull build the case – with modeling in a theoretical two-player Base Game – that the “knowledge curse” can happen if, at first, only a few people are privy to the greater knowledge.
In the Base Game, each player has two actions to choose from. Hence, there are four combinations of actions, each with expected payoffs to both players. Each player chooses so as to maximize their own payoff.
However, if a new set of options is added – one that introduces the chance that the other player gets nothing, along with an option of a very small payoff for both – the guaranteed small payoff becomes each player’s rational choice. This is a form of the Prisoner’s Dilemma, in which two “prisoners” can either cooperate for mutual benefit or betray each other for individual reward. In other words, more “knowledge” can lead to worse overall outcomes.
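The dynamic described above can be sketched in a few lines of Python. The payoff numbers below are illustrative assumptions, not figures from Basu and Weibull’s paper: in a hypothetical base game, cooperating is best whatever the other player does, but once a “defect” option is added that lets one player profit while the other gets nothing, the only equilibrium is the small mutual payoff.

```python
from itertools import product

def pure_nash(payoffs, actions):
    """Find pure-strategy Nash equilibria of a two-player game.

    payoffs[(a, b)] = (row player's payoff, column player's payoff)
    when the row player picks a and the column player picks b.
    """
    equilibria = []
    for a, b in product(actions, repeat=2):
        u1, u2 = payoffs[(a, b)]
        # (a, b) is an equilibrium if neither player gains by deviating alone.
        row_best = all(u1 >= payoffs[(x, b)][0] for x in actions)
        col_best = all(u2 >= payoffs[(a, y)][1] for y in actions)
        if row_best and col_best:
            equilibria.append(((a, b), (u1, u2)))
    return equilibria

# Illustrative base game: two actions each; cooperating is best no
# matter what the other player does, so both get the high payoff.
base = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "abstain"):   (2, 2),
    ("abstain",   "cooperate"): (2, 2),
    ("abstain",   "abstain"):   (1, 1),
}

# Expanded game: a third option lets one player profit while leaving
# the other with nothing. Defecting now dominates every alternative,
# so the only equilibrium is the small mutual payoff (1, 1).
expanded = dict(base)
expanded.update({
    ("cooperate", "defect"):    (0, 4),
    ("defect",    "cooperate"): (4, 0),
    ("abstain",   "defect"):    (0, 3),
    ("defect",    "abstain"):   (3, 0),
    ("defect",    "defect"):    (1, 1),
})

print(pure_nash(base, ["cooperate", "abstain"]))
print(pure_nash(expanded, ["cooperate", "abstain", "defect"]))
```

Running this shows the base game’s unique equilibrium paying (3, 3) to the players, while the expanded game’s unique equilibrium pays only (1, 1): the extra options make both players worse off, even though each acts rationally.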
The paper goes further, however: it shows that even a scientific breakthrough that adds no new options, but simply deepens the players’ understanding of the payoffs and their fluctuations, can make the players worse off.
The authors extend their theoretical calculations into real-world dilemmas, such as crafting policy without knowing the full contours of a problem. The drafting of a nation’s constitution, for instance, must anticipate and address problems likely to occur well into a future with unknowable sets of circumstances. “Such preemptive laws have conferred large benefits to humankind,” the authors wrote.
The authors make the case that just because a bad situation is human-caused doesn’t mean it can be prevented.
“Game theory is a reminder,” they wrote, “that when a group of people interact, and each person is held responsible for her actions, we cannot always carry this logic over to the group, holding the group responsible for the combined action profile that the group ‘chooses.’ As analysts, we have to design rules to guard against the adversities.”
At the end of a highly theoretical exercise, Basu offers a concrete takeaway from their work, the seeds of which were planted in front of a blackboard.
“By drawing attention to this paradoxical result,” he said, “the paper urges policymakers and even the lay person to think of preemptive actions, agreements and moral commitments that we as human beings should take and make to avert disasters that future scientific advances can cause.
“Science can yield huge benefits, but we need safeguards,” he said. “What those are, we do not know. But the paper urges us to pay attention to this.”
Funding for this research came from the Jan Wallander and Tom Hedelius Foundation.