February 08, 2004

Reading: Axelrod's "The Evolution of Cooperation" (1984)

Cooperation serves as the lifeblood of distributed, self-organizing systems. Without a Hobbesian central command, such a system's collective success depends upon the emergence and persistence of cooperative behavior. Axelrod's seminal analysis of "durable iterated Prisoner's Dilemma" simulations proposes a theory of cooperation "based upon an investigation of individuals who pursue their own self-interest without the aid of a central authority to force them to cooperate with each other." Axelrod (1984) p. 6.

Emergence of cooperation is not dependent upon consciousness or friendship, as Axelrod illustrates through documented examples and reasoning in his book. It does depend upon what Axelrod dubbed "the shadow of the future" (the likelihood and importance of future interaction) and upon the power of reputation, which rests on recognizing other players and recalling their past behavior. He provides four guidelines for individual success in such encounters, as well as a prescription for shaping one's environment to promote cooperation.

Durable, Iterated Prisoner's Dilemma

A staple of game theory experiments, the Prisoner's Dilemma represents the situation in which two allied suspects are separated and their cooperation with each other tested. Their captors (and the prisoners) know that if both remain mute, they both obtain the modest Reward (R) of making the prosecution's job harder so that they might (or might not) go free. So the captors offer each one a substantial Temptation (T) for being the first and only of the pair to testify against the other, but make clear that if both squeal, both get only Punishment (P). If only one succumbs to temptation and talks, his faithfully mute confederate gets the Sucker's payoff (S).

Give the four outcomes numerical weights satisfying T > R > P > S (Axelrod used T=5, R=3, P=1 and S=0), and you can represent the game mathematically and simulate it on a computer. When you run it through thousands of instances using various strategies competing against each other, it is said to be "iterated." When the various strategies that represent simulated players repeatedly go against each other and are allowed to recognize and remember the preceding behavior of the other simulated players, the iterated game is said to be "durable." Combined, the simulation is a "durable, iterated Prisoner's Dilemma."
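The mechanics of such a simulation can be sketched in a few lines of code. Below is a minimal illustration, not code from Axelrod's book: the payoff table uses his values (T=5, R=3, P=1, S=0), and each strategy is a function that sees the opponent's full history of moves, which is what makes the iterated game "durable."

```python
# Payoffs per round, keyed by (my move, opponent's move),
# using Axelrod's values: T=5, R=3, P=1, S=0.
PAYOFFS = {
    ("C", "C"): (3, 3),  # both cooperate: Reward for each
    ("C", "D"): (0, 5),  # lone cooperator gets Sucker's payoff; defector gets Temptation
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # both defect: Punishment for each
}

def iterate(strategy_a, strategy_b, rounds):
    """Run an iterated game; each strategy sees the other's past moves."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(history_b)  # each player recalls the other's behavior
        b = strategy_b(history_a)
        pa, pb = PAYOFFS[(a, b)]
        score_a += pa
        score_b += pb
        history_a.append(a)
        history_b.append(b)
    return score_a, score_b
```

For example, an unconditional cooperator (`lambda h: "C"`) playing ten rounds against an unconditional defector (`lambda h: "D"`) scores 0 to the defector's 50, while two cooperators score 30 each.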

In his book, Axelrod analyzes the elements that determined the success or failure of various strategies in the durable, iterated Prisoner's Dilemma simulation. He used his results to develop a Cooperation Theory "based upon an investigation of individuals who pursue their own self-interest without the aid of a central authority to force them to cooperate with each other." Axelrod (1984) p. 6.

Cooperation Can Emerge if the "Shadow of the Future" is Great Enough

Axelrod organized several open tournaments in which he invited game theorists, psychologists, computer scientists, game playing youths and the general public to write and submit strategic algorithms. He then ran those strategies against each other in durable iterated Prisoner's Dilemma simulations.

Axelrod found distinct differences in the relative success of various strategies based upon the importance given to future encounters with the same players. If the players expect to meet again, they are more likely to cooperate than when dealing with a "one time only" counterpart. "The future can therefore cast a shadow back upon the present," writes Axelrod, "and thereby affect the current strategic situation." For purposes of mathematical representation in his simulations, he gave the relative weight of the next move a "discount parameter" represented by the letter "w" in his equations.

Strategies based upon cooperation can be successful, he found; so can strategies based upon refusal to cooperate. Which succeeds depends on the circumstances and the values of the game's parameters.

The Success of "Tit for Tat"

In the simulation tournament Axelrod organized, the "Tit for Tat" strategy consistently got the highest total score after many iterations. That strategy is simply: start out cooperating, then do whatever the other player did (cooperate or defect) on the last move.
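The strategy is short enough to state as code. A sketch, using "C" for cooperate and "D" for defect and taking the opponent's move history as input (conventions mine, not the book's):

```python
def tit_for_tat(opponent_history):
    """Cooperate on the first move, then echo the opponent's last move."""
    if not opponent_history:
        return "C"
    return opponent_history[-1]
```

Its brevity is part of its strength: other players can quickly see what it is doing and that defection will be answered in kind.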

Axelrod's analysis of the data from the tournaments (detailed in the book) resulted in his identification of four characteristics that proved advantageous for all strategies, and in which Tit for Tat excelled.

1) They were "nice," never the first to defect.
2) They were "forgiving," able to cooperate after a defection.
3) They were "provocable," retaliating for defection with defection.
4) They were "clear," their strategy easy for other players to understand.

"The Gear Wheels of Social Evolution Have A Ratchet"

Axelrod also analyzed the chronology of the emergence of the success of the cooperative "Tit for Tat" strategy, developing theoretical propositions for when various strategies are "collectively stable." A strategy is "collectively stable" if it survives by continuing to outscore new strategies that attempt to invade its environment.

His analysis led him to the conclusion that the primeval "All Defect" strategy can be collectively stable. But his data showed that cooperation can emerge, even in an environment characterized by "All Defect" players ("a world of 'meanies'"), if "nice" players (such as Tit for Tat players) enter in clusters, and the players in those clusters can distinguish "meanies" from "nice" players. If a "nice" strategy is provocable (like Tit for Tat), it can successfully "invade" a hostile environment, become stable and established, then defend against the invasion of clusters or individuals using another strategy. "Thus, the gear wheels of social evolution have a ratchet." Axelrod (1984) p. 21.
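The cluster argument can be illustrated numerically (my sketch of the style of calculation in the book, not Axelrod's code). Suppose each Tit for Tat newcomer has a fraction p of its games against fellow cluster members and the rest against "meanies," with n moves per game and Axelrod's payoffs:

```python
T, R, P, S = 5, 3, 1, 0  # Axelrod's payoff values

def tft_score(p, n):
    """Average score of a clustered Tit for Tat player: a fraction p of
    games against fellow TFT players (mutual cooperation throughout),
    the rest against All Defect (one Sucker's payoff, then mutual
    Punishment for the remaining n-1 moves)."""
    vs_tft = n * R
    vs_alld = S + (n - 1) * P
    return p * vs_tft + (1 - p) * vs_alld

def alld_score(n):
    """Score of a native All Defect player, whose games are all
    against other meanies."""
    return n * P
```

Even a tiny cluster suffices: with n = 10 moves per game, Tit for Tat outscores the natives once p exceeds about 5 percent, while the natives, unable to cluster usefully among themselves, gain nothing in return.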

His analysis derives from the inherent logic of game theory. Cooperation does not depend upon affinity between the parties: it has been observed emerging between opposing armies in trench warfare. Nor does it depend upon consciousness: it emerges among lower forms of biological organisms.

The Biological Realm and Dawkins' "Selfish Gene"

For support in the biological realm, he points to the Kinship Theory represented by Richard Dawkins' "The Selfish Gene" (1976) and Reciprocity Theory illustrated by studies of symbiotic relationships between unrelated biological species. As Dawkins has demonstrated, biologically related animals tend to cluster together, and they tend to instinctively behave altruistically among their own family group. These characteristics are adaptive and tend to be naturally selected, becoming genetically "hard wired." This provides the clustering of cooperators necessary for game theory to predict success and "collective stability," even in a hostile environment, given the right balance of payoff parameters.

Axelrod's Prescription for Shaping the Environment

Axelrod provides five suggestions for transforming a strategic setting to foster the emergence of cooperation.
1) Enlarge the shadow of the future by making interactions between players more frequent and more durable. This can be done by keeping others away (exclusive clubs are one example), by establishing hierarchy and bureaucracy that concentrates interaction between specialists, and by decomposing issues into smaller, more frequent encounters rather than a few large ones.
2) Change the payoffs. Governments and gangs change payoffs by increasing the penalties for defection. Even a small change can tip the balance between the value of the reward for cooperation and the penalty for defection.
3) Teach altruism. Valuing cooperation for its own sake tends to be self-reinforcing.
4) Teach reciprocity. Always turning the other cheek can encourage exploitative behavior. Discouraging exploitation by being provocable promotes overall cooperation.
5) Improve recognition capabilities. Reliable identification of players is essential to the ability to verify which have cooperated or defected in the past and to act accordingly.

If cooperation is "hard wired" into our universe by the mathematical laws of game theory, Axelrod's work and writing are an important element in the understanding of that phenomenon.

Robert Axelrod, "The Evolution of Cooperation" (1984).
The author was Professor of Political Science and Public Policy at the University of Michigan when this book was first published.


Posted by dougsimpson at February 8, 2004 02:06 PM