Beat Me, Whip Me, Make Me Honest!
How punishment and reputation combine to create cooperation
What is the most amazing thing about our species? Upright stature? Toolmaking? Orgasms for both sexes? Language? I think that it is the fact that we voluntarily cooperate with non-kin members of our species. This is a trick that few other species have been able to pull off. The result: vast webs of cooperation among human strangers that now span the globe.
Cooperation depends on reciprocal altruism—I scratch your back and you scratch mine. I give you something you want and in return you give me something I desire. So far so good, but sustained cooperation is a puzzle because the temptation to cheat—gain benefits without paying costs—is great. Why don't you just take what I give you and not return the favor? Part of the answer is that you fear that I might "go postal" on you and make you lose more than you dishonestly took. Researchers call such people altruistic punishers and it turns out that they are very important in maintaining cooperation in groups. Somebody has to be willing to bear the costs of beating up free riders from time to time so that everyone gets the message that cheating won't pay. But new research suggests that there is more to encouraging cooperation than the threat of punishment.
In the current issue of Nature, University of Erfurt economist Bettina Rockenbach and Max Planck Institute evolutionary ecologist Manfred Milinski publish an article, "The efficient interaction of indirect reciprocity and costly punishment" (not available online), that looks at how reputation and punishment work together to create even greater levels of cooperation. These German researchers set up experimental public goods games with 144 students. In a typical public goods game, every participant is given an initial endowment that he is free to keep for himself or contribute to a common pool, and whatever the players contribute to the pool is doubled. So if four players are given $1 each and all four contribute, the $4 pool doubles to $8 and each player receives $2. Now suppose that only three contribute: the $3 pool is doubled to $6, from which each of the four players receives $1.50. That means the non-cooperator keeps his $1 and gets $1.50 besides, for a total of $2.50, while the other players get only $1.50 each. Not surprisingly, experiments show that without the ability to punish chiselers, cooperation in this situation soon breaks down.
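For readers who want the arithmetic spelled out, here is a minimal Python sketch of that dollar example. The function and its names are my own illustration, not anything taken from the paper:

```python
# Illustrative sketch of the dollar example above: each player keeps
# whatever he doesn't contribute, the pooled contributions are doubled,
# and the doubled pool is split equally among all players.
def payoffs(contributions, endowment=1.0, multiplier=2.0):
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

print(payoffs([1, 1, 1, 1]))  # everyone contributes: [2.0, 2.0, 2.0, 2.0]
print(payoffs([1, 1, 1, 0]))  # one free rider: [1.5, 1.5, 1.5, 2.5]
```

The free rider walks away with $2.50 while his trusting partners get $1.50 each, which is exactly why cooperation unravels without some way to hit back.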
The German researchers wanted to look at how reputation affects cooperation. They note that "the mechanisms of punishment and reputation substantially differ in their means of 'disciplining' non-cooperators. Direct punishment incurs salient costs for both the punisher and the punished, whereas reputation mechanisms discipline by withholding action, immediately saving costs for the 'punisher.'" Rockenbach and Milinski wondered whether costly punishment might become extinct in environments in which effective reputation building offers a cheaper way to sustain cooperation. To probe this question the researchers designed two sets of public goods games. In the first set, at the beginning of each period of play, participants could choose to join a group in which the public goods game was followed by both a punishment game and then a reciprocity game that establishes a player's reputation (the punishment option game), or a group in which it was followed solely by the reciprocity game (the punishment free game). As a control, they ran a second set of public goods games in which participants could choose between punishment and punishment free options, with no reputation game involved.
In both sets of games, at the beginning of the public goods rounds, each player received 10 monetary units (MUs), which they could contribute to the common pool. All MUs contributed to the common pool were multiplied by 1.6 and the resulting benefit was distributed equally to all; all MUs held back from the pool were retained by the players. In a second round, all players received an additional 7 MUs. In the punishment free game, players simply kept these for their private accounts.
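Here is the same arithmetic for the experiment's contribution round, again as a rough sketch of my own rather than the researchers' code:

```python
# Contribution round as described: 10 MU endowments, the common pool
# is multiplied by 1.6 and distributed equally among all players.
def contribution_round(contributions, endowment=10, factor=1.6):
    pool = sum(contributions) * factor
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

# Four players, three of whom contribute everything:
print(contribution_round([10, 10, 10, 0]))  # [12.0, 12.0, 12.0, 22.0]
```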
In the punishment option game, however, players could use their 7 MUs to punish other players: a punisher could spend 1 MU to reduce the gains of a non-cooperator by 3 MUs. Any MUs not spent punishing low-down cheaters were transferred to the player's private account.
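A quick sketch of how that punishment stage plays out, with hypothetical spending choices of my own invention:

```python
# Punishment stage: spending 1 MU of the 7 MU budget strips 3 MUs
# from the targeted player; unspent MUs go to the punisher's account.
def punishment_stage(balances, spending, budget=7, penalty=3):
    """spending[i][j] = MUs player i spends to punish player j."""
    new = list(balances)
    for i, row in enumerate(spending):
        new[i] += budget - sum(row)       # punisher keeps what he doesn't spend
        for j, mus in enumerate(row):
            new[j] -= mus * penalty       # each MU spent costs the target 3 MUs
    return new

# Two cooperators each spend 1 MU punishing the free rider (player 3):
print(punishment_stage([12, 12, 12, 22],
                       [[0, 0, 0, 1], [0, 0, 0, 1],
                        [0, 0, 0, 0], [0, 0, 0, 0]]))  # [18, 18, 19, 23]
```

Punishing costs the punishers a little, but it cuts the free rider's lead over the cooperators by half.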
The first set of common pool games then moved to a reputation round, in which each player got 3 MUs. In this round each player could send his 3 MUs to another player; if he chose to send, the MUs were tripled, so the receiver gained 9 MUs. A player who refused to send kept his 3 MUs and might still gain 9 MUs from a trusting sender, for a total of 12 MUs. At the start of each round the experimenters mixed up the sender/receiver pairs to prevent long-term reciprocity from being established between any two players. Before sending began, each player was informed of how much the receiving player had previously contributed to the common pool and how much he had previously sent to other receivers.
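The payoff logic of that reputation round, sketched the same way (the pairing details are simplified here, and the function is my own illustration):

```python
# Reputation round: each player holds 3 MUs and decides whether to pass
# them to an assigned receiver; sent MUs are tripled to 9 for the receiver.
def reputation_payoff(i_send, my_sender_sends):
    kept = 0 if i_send else 3                # sending gives up my 3 MUs
    received = 9 if my_sender_sends else 0   # tripled gift from my own sender
    return kept + received

print(reputation_payoff(False, True))   # withhold but still receive: 12 MUs
print(reputation_payoff(True, True))    # both send: 9 MUs
print(reputation_payoff(True, False))   # send but get shunned: 0 MUs
```

That last case is the "withholding" discipline at work: a player with a bad reputation simply doesn't get sent anything.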
So what did they find? Punishment does not go "extinct," but it is used less when people can also rely on reputation. In line with previous findings, the punishment option games encouraged people to contribute a lot more to the common good, almost double the contributions seen in the punishment free options. Combining punishment and reputation boosted overall contributions to the common pool even more and raised the benefits to players to almost their theoretical maximum. On the other hand, relying solely on withholding benefits based on reputation boosted cooperation only a little, not much above the level seen in games that included no punishment at all. In other words, merely shunning cheaters doesn't deter them all that much.
The good news is that the researchers found that allowing players both to punish and to withhold benefits based on reputation reduced punishing behavior by almost two-thirds compared to games in which only punishment was permitted. "This marked effect was achieved by concentrating the remaining punishing acts on the smaller number of free riders in the [combined punishment and reputation games] which were thus more heavily punished than the free riders in [solely punishment games]," report the researchers. In addition, the effect of punishment on free riders was further amplified by the withholding of benefits during the reputation rounds. They found that this double punishment dramatically boosted cooperation from even recalcitrant free riders.
The researchers conclude, "Reputation mechanisms generate an environment in which the execution of punishment has to be less frequent, however, without taking away its deterring force. The interaction of both mechanisms not only comes closer to real life but also provides a convenient, low-cost solution to social dilemmas, keeping the teeth of costly punishment covered but ready to bite." In other words, being known as a nice person helps people get along, but you will still need to carry a big stick, even if you get to use it less often.
Ronald Bailey is Reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.