Imagine that someone offers you and some other anonymous person $100 to share. The rules are strict and known to both players. The two of you are in separate rooms and cannot exchange information. A coin toss decides which of you will propose how to share the money. If you are the proposer you can make a single offer of how to split the sum, and the other person—the responder—can say yes or no. If the responder’s answer is yes, the deal goes ahead. If the answer is no, neither of you gets anything. In both cases, the game is over and will not be repeated. What will you do?
Instinctively, many people feel they should offer 50 percent, because such a division is “fair” and therefore likely to be accepted. More daring people, however, think they might get away with offering somewhat less than half of the sum.
You may not be surprised to learn that two-thirds of the offers are between 40 and 50 percent. Only four in 100 people offer less than 20 percent. Proposing such a small amount is risky, because it might be rejected: more than half of all responders turn down offers below 20 percent. But why should anyone reject an offer as "too small"? The responder has just two choices: take what is offered or receive nothing. The only rational option for a selfish responder is therefore to accept any positive offer, and a selfish proposer who is sure the responder is also selfish will offer the smallest possible amount and keep the rest. Game theory, assuming that both players are selfish and rational, thus predicts a minimal offer and an acceptance. But this is not how most people play the game.
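The contrast between the game-theoretic prediction and observed behavior can be sketched in a few lines of code. The function and policy names below (`play_ultimatum`, `selfish`, `fairness_minded`) are illustrative inventions, and the deterministic 20 percent rejection threshold is a stylized assumption loosely based on the article's finding that most responders reject offers below that level, not a fitted model of real behavior.

```python
def play_ultimatum(total, offer, accepts):
    """One round: the proposer offers `offer` out of `total`; the responder's
    policy `accepts(offer, total)` says yes or no. Returns the pair of
    payoffs (proposer, responder); a rejection leaves both with nothing."""
    if accepts(offer, total):
        return total - offer, offer
    return 0, 0

def selfish(offer, total):
    # Homo economicus: any positive amount beats walking away with zero.
    return offer > 0

def fairness_minded(offer, total):
    # Stylized assumption: reject anything below 20 percent of the pot.
    return offer >= 0.2 * total

# The game-theoretic prediction: a minimal offer is accepted.
print(play_ultimatum(100, 1, selfish))           # (99, 1)
# The same stingy offer fails against a fairness-minded responder.
print(play_ultimatum(100, 1, fairness_minded))   # (0, 0)
# A near-even split, like most real proposers make, goes through.
print(play_ultimatum(100, 45, fairness_minded))  # (55, 45)
```

Passing the responder's policy in as a function makes it easy to compare the purely rational benchmark against behavioral variants side by side.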
The scenario just described, called the Ultimatum Game, was devised some twenty years ago. Experimenters subsequently studied the Ultimatum Game intensively in many places using diverse sums. The results proved remarkably robust. Behavior in the game did not appreciably depend on the players’ sex, age, schooling, or numeracy. Moreover, the amount of money involved had surprisingly little effect on results. Yet the range of players remained limited, because the studies primarily involved people in more developed countries and often university students.
Recently, a cross-cultural study in fifteen small-scale societies showed that there were sizable differences in the way some people play the Ultimatum Game. Within the Machiguenga tribe (from the Amazon) the mean offer was considerably lower than in typical Western-type civilizations—26 percent instead of 45 percent. Conversely, many members of the Au tribe (from Papua New Guinea) offered more than 50 percent. Cultural traditions in gift giving, and the strong obligations that result from accepting a gift, play a major role among some tribes, such as the Au. Yet despite these cultural variations, the outcome was always far from what rational analysis would dictate for selfish players. Most people all over the world place a high value on fair outcomes.
For a long time, theoretical economists postulated a being called Homo economicus—a rational individual relentlessly bent on maximizing a purely selfish reward. But the lesson from the Ultimatum Game and similar experiments is that real people are a crossbreed of H. economicus and H. emoticus, a complicated hybrid species that can be ruled as much by emotion as by cold logic and selfishness. An interesting challenge is to understand how Darwinian evolution would produce creatures instilled with emotions and behaviors that do not immediately seem geared toward reaping the greatest benefit for individuals or their genes.
Adapted from K. Sigmund, E. Fehr, and M.A. Nowak, “The economics of fair play.” ©2002 by Scientific American, Inc.