What organizations can learn from the Prisoner's Dilemma to structure their cooperation
Two bank robbers have been caught. The police interrogate the criminals - separately, of course. This is the starting point of a story that has entered the (social) psychology literature as the "prisoner's dilemma". It can serve as a thought experiment for designing structures that promote cooperation within organizations.
Back to the two crooks: If neither confesses, no evidence can be held against them, and both bank robbers receive only a small fine for illegal gun possession. If one confesses but the other remains silent, the traitor gets off penalty-free as a key witness. If both confess, their sentences are reduced, but they still go to jail. How would you decide? Would you keep your mouth shut and risk your partner bailing on you?
(drawn by Anna-Giulia Deutsch, CHC)
The dilemma is turned into a game
The political scientist Robert Axelrod* studied the mathematical side of the prisoner's dilemma and turned the situation in the interrogation room into a game: If both bank robbers keep their mouths shut, they reach their mutually best outcome: each gets three points. If one confesses and the other does not, the confessor gets five points and the silent one is left empty-handed. If both betray their partner, each receives only one point. The aim of the game is simple: as a player you want to take home as many points as possible (or in other words: to spend as few years behind bars as possible).
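The scoring described above can be written down as a small payoff table. This is a minimal sketch in Python; the names `PAYOFFS` and `play_round` are my own, not Axelrod's.

```python
# One round of the prisoner's dilemma as described in the text.
# 'C' = cooperate (stay silent), 'D' = defect (confess).
# Each value is (points for player A, points for player B).
PAYOFFS = {
    ('C', 'C'): (3, 3),  # both keep quiet: the mutually best outcome
    ('C', 'D'): (0, 5),  # A stays silent, B betrays and takes the maximum
    ('D', 'C'): (5, 0),  # A betrays, B is left empty-handed
    ('D', 'D'): (1, 1),  # both confess: each gets only one point
}

def play_round(move_a, move_b):
    """Return the points each player earns for one round."""
    return PAYOFFS[(move_a, move_b)]
```

Note the shape of the dilemma in the numbers: whatever the other player does, betraying earns more in a single round (5 beats 3, 1 beats 0), yet mutual betrayal (1, 1) is worse for both than mutual silence (3, 3).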
Axelrod asked himself a simple question: What is the best strategy if the game continues beyond a single round? That is, each player makes several moves and can decide in every round whether to cooperate with their partner or try to grab the five points. Which strategy would be most successful then?
Computer competition: may the best strategy win!
To find out, Axelrod had various strategies compete against each other in a computer tournament. One of the most successful strategies under a wide range of conditions proved to be "Tit for Tat". The strategy is very simple: Whoever plays "Tit for Tat" always starts by cooperating and then simply imitates the opponent's previous move.
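The iterated game and the "Tit for Tat" rule are easy to sketch in code. The following is an illustrative Python sketch, not Axelrod's original tournament code; all function names are my own.

```python
# Payoffs for one round, as in the text: C = cooperate, D = defect.
PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_history, opponent_history):
    """Cooperate on the first move, then mirror the opponent's last move."""
    if not opponent_history:
        return 'C'
    return opponent_history[-1]

def always_defect(my_history, opponent_history):
    """A rival strategy for comparison: betray in every round."""
    return 'D'

def run_match(strategy_a, strategy_b, rounds):
    """Play an iterated prisoner's dilemma and return the total scores."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        pts_a, pts_b = PAYOFFS[(move_a, move_b)]
        score_a += pts_a
        score_b += pts_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b
```

Over ten rounds, two "Tit for Tat" players each earn 30 points (they cooperate throughout), while "Tit for Tat" against a permanent defector loses only the first round and then punishes every betrayal: `run_match(tit_for_tat, always_defect, 10)` yields (9, 14).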
What does this have to do with the workplace? According to Axelrod, a strategy like "Tit for Tat", which relies on the principle of cooperation, works best if nobody knows when the game is going to end. If there is a good chance that I will meet my co-player again, it is often worthwhile to rely on cooperation – and that goes for both parties. From this insight we can derive indications of how organizations can create structures that promote cooperation. One is to promote long-term working relationships.
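Why the chance of meeting again matters can be shown with a simplified back-of-the-envelope calculation (my own illustration, assuming the opponent plays "Tit for Tat"; it is not Axelrod's exact derivation).

```python
def expected_rounds(w):
    """If the game continues with probability w after each round,
    it lasts 1 / (1 - w) rounds on average."""
    return 1 / (1 - w)

# Against a Tit for Tat opponent over n rounds:
# - always defecting earns 5 in round one, then 1 per round: 5 + (n - 1)
# - always cooperating earns 3 per round: 3 * n
def defect_score(n):
    return 5 + 1 * (n - 1)

def cooperate_score(n):
    return 3 * n
```

Cooperating wins as soon as 3n > n + 4, i.e. once the expected game length exceeds two rounds (a continuation probability above 0.5). The shorter the expected relationship, the more tempting the one-off betrayal becomes.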
Restructuring is poison for people's willingness to cooperate
Given this knowledge, frequent restructuring is poison for people's willingness to cooperate, because employees lose their knowledge of "how the other person plays". By Axelrod's findings, transferring managers to a different position every few years is counterproductive, because the manager is, so to speak, pulled out of the game.
In a new position, past decisions no longer have any effect. The present counts for more than the future, which decreases the willingness to cooperate. Trust in the other party's willingness to cooperate must first be established - and that takes time. In other words: I have to play the game a few times to understand how my team-mate acts.
Fixed team structures, on the other hand, have what it takes to strengthen cooperative behavior. According to Axelrod, the same applies to customer relationships: the more often client and provider come together, the more stable their cooperative relationship becomes.
Divide the subject of negotiation - increase the frequency of interaction
Another piece of advice Axelrod derives from this: When negotiating, it is best to break the subject of negotiation into small units. The negotiating partners then see each other more often - ideally, both sides approach each other step by step, each responding to the willingness to cooperate the other signaled in the previous move. If our two crooks have known each other for decades and have often pulled shady tricks together, both know that the other will keep their mouth shut.
Ideally, the other party's strategy is predictable, or at least recognizable: Consistency, transparency and openness thus become core elements of organizational communication. They reduce the risk of misinterpreting the other party's behavior. Overly complex game strategies prevent the other person from recognizing a clear, reliable pattern. As Axelrod puts it: "Don't be too clever."
Reward cooperation more strongly than individual success
In the long run, cooperation pays off for both parties. Executives who establish a company culture that rewards cooperation more strongly than individual success have a good chance of fostering successful cooperative relationships among their employees. This insight can be applied directly, for example, when designing bonus systems within organizations. According to Axelrod, rewarding the performance of a team rather than that of individuals should increase the willingness to cooperate.
If non-cooperation leads to a clearly negative outcome for a player (for example, if both criminals know that every confession will be answered with a bullet in the head), the dilemma described above disappears: the crooks can face the interrogation at ease. So all companies have to do is raise the cost of non-cooperation - just refrain from using violence, please. Sounds simple, doesn't it?
*cf. Robert Axelrod: The Evolution of Cooperation. Revised Edition. New York, 2006