Saturday, January 16, 2010

Prisoner's Dilemma

I am having great fun reading William Poundstone's book on John von Neumann, game theory, the RAND Corporation, and so on. So much so that I have to try some of this myself! So I bring up a fresh project space in KDE on Linux, and away I go.

How about a parameterized TIT FOR TAT? For example, the strategy can carry a couple of probabilities: depending on whether the opponent cooperated or defected on the last move, it uses one probability or the other to decide whether to cooperate or defect this turn.
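In Python it might look something like this (the class and parameter names are just placeholders I made up; classic TIT FOR TAT is the special case with probabilities 1.0 and 0.0):

    import random

    class ProbTitForTat:
        """A TIT FOR TAT parameterized by two cooperation probabilities,
        one for each possible last move by the opponent."""
        def __init__(self, p_after_cooperate, p_after_defect):
            self.p_after_cooperate = p_after_cooperate  # P(cooperate | opponent cooperated)
            self.p_after_defect = p_after_defect        # P(cooperate | opponent defected)

        def move(self, opponent_last):
            # opponent_last is 'C' or 'D'; treat the first move as if the opponent cooperated.
            p = self.p_after_cooperate if opponent_last == 'C' else self.p_after_defect
            return 'C' if random.random() < p else 'D'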

Next, suppose we have a whole population of these TIT FOR TAT players, each with its own parameter values. We can play every strategy against every other and accumulate scores. Then we can keep the strategies with the best scores: maybe the top half, or whatever fraction we like.
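A minimal sketch of the tournament and selection, assuming the standard 3/0/5/1 Prisoner's Dilemma payoffs (other values would work too) and a fixed number of rounds per match:

    # Standard Prisoner's Dilemma payoffs for the row player:
    # both cooperate -> 3, both defect -> 1, lone defector -> 5, lone cooperator -> 0.
    PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

    def play_match(a, b, rounds=200):
        """Play an iterated game between two strategies; return their total scores."""
        last_a = last_b = 'C'          # pretend both cooperated before the first round
        score_a = score_b = 0
        for _ in range(rounds):
            move_a, move_b = a.move(last_b), b.move(last_a)
            score_a += PAYOFF[(move_a, move_b)]
            score_b += PAYOFF[(move_b, move_a)]
            last_a, last_b = move_a, move_b
        return score_a, score_b

    def tournament(population, rounds=200):
        """Round robin: every strategy plays every other; accumulate total scores."""
        scores = [0] * len(population)
        for i in range(len(population)):
            for j in range(i + 1, len(population)):
                s_i, s_j = play_match(population[i], population[j], rounds)
                scores[i] += s_i
                scores[j] += s_j
        return scores

    def select_fittest(population, scores, fraction=0.5):
        """Keep the top fraction of strategies ranked by tournament score."""
        ranked = sorted(zip(scores, population), key=lambda pair: pair[0], reverse=True)
        keep = max(2, int(len(population) * fraction))
        return [strategy for _, strategy in ranked[:keep]]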

The idea is to use this as a model for evolution. I will have to read Axelrod's The Evolution of Cooperation, but sometimes it's just more fun to jump in and try it before getting too bogged down reading about all the approaches everybody else has tried!

Now that I have selected the fittest strategies, I will reproduce them with a bit of variation to create a new generation. My current idea is to mimic biology: each new strategy is built from a pair of existing strategies. For each parameter, the two parents define a range of values; the new strategy's value is picked at random from that range, with a little extra mutation allowed, a little running outside the range.
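A sketch of that breeding step, with the mutation size as a made-up knob:

    def breed(parent_a, parent_b, mutation=0.05):
        """Build a child strategy from two parents: each parameter is drawn
        uniformly from the range the parents define, widened a little by
        mutation, then clamped back to a valid probability."""
        def mix(x, y):
            low, high = min(x, y), max(x, y)
            value = random.uniform(low - mutation, high + mutation)
            return min(1.0, max(0.0, value))
        return ProbTitForTat(
            mix(parent_a.p_after_cooperate, parent_b.p_after_cooperate),
            mix(parent_a.p_after_defect, parent_b.p_after_defect))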

So: select the fittest in each generation, reproduce with variation to create the next generation, repeat for many generations... what will happen?
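Putting it all together, the whole experiment is just a loop (the population size and generation count here are arbitrary placeholders):

    def evolve(pop_size=40, generations=500):
        """Start from random strategies; select and breed, generation after generation."""
        population = [ProbTitForTat(random.random(), random.random())
                      for _ in range(pop_size)]
        for _ in range(generations):
            scores = tournament(population)
            parents = select_fittest(population, scores, fraction=0.5)
            population = [breed(random.choice(parents), random.choice(parents))
                          for _ in range(pop_size)]
        return population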

My first experiment gives a curious result: there seems to be a kind of cycle. The strategies gradually evolve into a polite TIT FOR TAT with little cheating, then there is a sudden burst of cheating, which slowly settles back into politeness.

Is this a real property of this kind of system or just a bug? More investigation is needed! What fun!
