I greatly enjoyed reading Robert Axelrod’s 1984 book The Evolution of Cooperation.
Making great science takes two steps. First you need to find a simple model for a large
class of real-world situations. Second, and this is the fun part, as you
tease out the implications of your model the grand hope is that it
will cast an exciting light on how the real world works.
Axelrod managed to do just that. He takes an old and venerable model – the Prisoner’s Dilemma
from game theory – and makes it new again.
The Prisoner’s Dilemma takes its name from a story. Imagine yourself
a crook. You’ve been hauled in by the cops, along with your accomplice.
The cops offer you a deal. “Look,” they say, “you’re going up the river.
A year in the slammer. But wait! Here’s the deal: rat on your buddy
and we’ll let you go! He’ll get 7 years.” But there’s a catch. If
he rats too, you both get 3 years.
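To make the numbers concrete, here is the story’s payoff structure written out as a tiny Python table. This is just my own sketch of the deal described above (the move names are my labels, and lower is better since these are jail terms):

```python
# Jail terms, in years, from the story above; lower is better.
# Each entry maps (your move, his move) to (your sentence, his sentence).
PAYOFFS = {
    ("quiet", "quiet"): (1, 1),  # both keep mum: a year each
    ("rat",   "quiet"): (0, 7),  # you rat, he stays quiet: you walk, he gets 7
    ("quiet", "rat"):   (7, 0),  # he rats, you stay quiet: the reverse
    ("rat",   "rat"):   (3, 3),  # you both rat: 3 years each
}

# Whatever he does, ratting leaves you personally better off (0 < 1 and 3 < 7),
# yet if you both follow that logic you end up at (3, 3) instead of (1, 1).
# That's the dilemma.
```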
This framework is amazingly common in the real world. Cooperate and
you both do OK. Defect and you stand to gain, but if everybody
defects everybody is worse off. Two examples. Imagine you’re driving
and you come to a four-way stop. You could defect, and not bother to stop.
If the other guy cooperates you’re golden. Or consider voting.
You could not bother, but if everybody skips out then only the nuts
are left voting.
Over the years thousands of undergraduates have been given this puzzle
in various forms by psychologists, economists, game theorists,
and other assorted researchers. Hundreds of papers have been written.
Axelrod’s innovation was to see what happens if the same two
crooks play the game numerous times. He asked: what would be a good
strategy for you, in your role as crook, to adopt?
The first thing I loved about this book is that Axelrod avoided
the hard work. Instead of puzzling out a whole range of possible
strategies himself, he let the Internet do the work. He held a contest and
invited experts all over the planet to submit strategies. Then
he published the results and did it again. In effect he tapped
into some of that Open Source karma: create a good binding surface
and the talent will come and do the work for the fun of it.
The second thing I loved is his conclusion about what the best
strategy looks like:
- be Nice, Cooperative, so you play nice with others,
- be Quick to Reciprocate, so you clearly signal nice
or mean as appropriate,
- be Forgiving and Patient, so you don’t get stuck being mean,
- be Simple, so the other players can understand what you’re doing.
It is a notable consequence of those rules that envy isn’t much
use, since it is likely to engender unproductive actions that
will be seen as mean by other players.
A good example of such a strategy is simple tit-for-tat. If
you’re nice to it on one turn it’s nice to you on the next one, and
vice versa. In fact this simple rule dominated both contests.
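Here is a minimal sketch of the idea in Python (my own toy version, not Axelrod’s code), using the scoring from his tournaments: 3 points each for mutual cooperation, 5 for defecting on a cooperator, 0 for being the sucker, and 1 each for mutual defection.

```python
def tit_for_tat(my_moves, their_moves):
    """Cooperate on the first move, then copy whatever the opponent did last."""
    return "C" if not their_moves else their_moves[-1]

def always_defect(my_moves, their_moves):
    return "D"

# Tournament scoring: both cooperate = 3 each, defect on a cooperator = 5,
# get defected on = 0, both defect = 1 each.
SCORE = {("C", "C"): (3, 3), ("D", "C"): (5, 0),
         ("C", "D"): (0, 5), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=200):
    """Play an iterated game and return both players' total scores."""
    moves_a, moves_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        a = strategy_a(moves_a, moves_b)
        b = strategy_b(moves_b, moves_a)
        points_a, points_b = SCORE[(a, b)]
        total_a += points_a
        total_b += points_b
        moves_a.append(a)
        moves_b.append(b)
    return total_a, total_b

print(play(tit_for_tat, tit_for_tat))    # (600, 600): steady cooperation
print(play(tit_for_tat, always_defect))  # (199, 204): loses a little, never badly
```

Notice that tit-for-tat never actually beats anyone head to head; it won the tournaments by piling up decent scores against everybody rather than by crushing anybody.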
Notice that tit-for-tat is so amazingly simple that you don’t need a
lot of machinery to make it work. For example he hypothesizes that even
a species of bacteria could implement it, which might explain a range
of symbiotic relationships found in nature. It’s simple enough that
even adversaries can implement it. He demonstrates that in the First
World War, trench warfare regularly transitioned to cooperation using
just such a strategy.
Notice that the key requirement is multiple rounds. If you know that
this is the last round then you might be better off being mean. In the
tournaments the chance the game would continue was controlled by a
parameter W. W was 98–99%, so the games tended to go on for
quite a while. Of course in the real world we never know the value of
W. We have all kinds of rituals to try and keep W high: loyalty
oaths, stock options, contracts, etc. I was struck that whenever
somebody leaves a community, it signals to all the existing players
that W might just be lower than they thought.
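To get a feel for what a high W buys you, here is a quick back-of-the-envelope sketch (again my own illustration, not from the book): when the game continues with probability W after every round, match lengths follow a geometric distribution with mean 1/(1-W).

```python
import random

def match_length(w):
    """Flip a coin after each round; the game continues with probability w."""
    rounds = 1
    while random.random() < w:
        rounds += 1
    return rounds

for w in (0.98, 0.99):
    expected = 1 / (1 - w)  # mean of the geometric distribution
    simulated = sum(match_length(w) for _ in range(10_000)) / 10_000
    print(f"W={w}: expected {expected:.0f} rounds, simulated ~{simulated:.0f}")

# At W = 98-99% a match runs 50-100 rounds on average, so the one-time gain
# from defecting is swamped by the long shadow of the future.
```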
I see this dynamic a lot in Open Source projects, where contributors come to
learn that if they are in for the long term it is very much to their
advantage to cooperate rather than act selfishly. Though it may take
a while for the advantages of that strategy to get through some people’s
thick heads.
On this foundation Axelrod then teases out some very delightful
conclusions. He asks a question about culture: if a culture
has adopted a certain strategy, how stable is it? First he shows
that a diverse culture of strategies will trend toward nice.
For a while some mean strategies may do OK, but as the players
they prey on die out the nice players begin to dominate.
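You can watch that happen with a toy “ecological” run (my own sketch, not Axelrod’s actual data, reusing the 200-round expected scores from the earlier snippet plus an always-cooperate pushover): each generation, a strategy’s share of the population grows in proportion to how well it scores against the current mix.

```python
# Expected score of the row strategy against the column strategy over a
# 200-round match with the 3/5/0/1 scoring used above.
AVG_SCORE = {
    "tit_for_tat":   {"tit_for_tat": 600, "always_defect": 199, "always_coop": 600},
    "always_defect": {"tit_for_tat": 204, "always_defect": 200, "always_coop": 1000},
    "always_coop":   {"tit_for_tat": 600, "always_defect": 0,   "always_coop": 600},
}

shares = {name: 1 / 3 for name in AVG_SCORE}  # start with an even mix
for generation in range(60):
    fitness = {name: sum(AVG_SCORE[name][other] * shares[other] for other in shares)
               for name in shares}
    mean_fitness = sum(fitness[name] * shares[name] for name in shares)
    shares = {name: shares[name] * fitness[name] / mean_fitness for name in shares}

print({name: round(share, 2) for name, share in shares.items()})
# The defectors feast on the pushovers at first and their share creeps up,
# but once the pushovers are nearly gone there is nobody left to exploit:
# tit-for-tat ends up with the largest share and the meanies all but vanish.
```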
He shows, sadly, that a culture purely of mean strategies is in fact
stable, and that even the arrival of individual nice players won’t
disrupt it. But, happily, a small community of nice players can
enter such a culture and slowly come to dominate it. This is
particularly true if the small community tends to play with each
other and hence captures more of the advantage of cooperation. To me
this speaks volumes about the human tendency to form semi-closed
communities. It’s suggestive (though perhaps a confusion of cause and effect)
of why people tend to assume that those on the outside are mean.
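Here is a rough sketch of that cluster argument with numbers attached (my own illustration: the 3/5/0/1 payoffs again, and a continuation probability of W = 0.9 picked to keep the arithmetic readable). Compare what a native meanie earns against what a member of a small nice cluster earns, as a function of how often the cluster members get to play each other.

```python
R, T, S, P = 3, 5, 0, 1   # reward, temptation, sucker, punishment payoffs
W = 0.9                   # chance the game continues after each round

# Expected total score of one iterated match, discounting future rounds by W.
tft_vs_tft   = R / (1 - W)          # endless mutual cooperation
alld_vs_alld = P / (1 - W)          # endless mutual defection
tft_vs_alld  = S + W * P / (1 - W)  # suckered once, then mutual defection

# p is the fraction of a cluster member's games played against fellow members.
# A tiny cluster barely changes the natives' lives, so a native still scores
# roughly what it gets against other natives.
for p in (0.00, 0.05, 0.10, 0.25):
    cluster_member = p * tft_vs_tft + (1 - p) * tft_vs_alld
    native = alld_vs_alld
    print(f"p={p:.2f}: cluster member {cluster_member:.2f} vs native {native:.2f}")

# With these numbers the crossover sits just under p = 5%: once a nice player
# spends even one game in twenty with other nice players, the cluster pulls
# ahead of the all-mean natives.
```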
There is a lot to chew on in this book. Fascinating things about
minorities, hierarchies, enclaves, etc. But in the end what’s
marvelous about this book is its message of hope. It lays an
exciting foundation. In short, it strongly suggests that nice, cooperative
behavior (with players willing to react when they are misused) is a
dominant strategy over the selfish behavior of those imaginary
rational men who populate so much of pop-economics.