I have written a lot about connectivity, as has my colleague Tom Barnett. An extremely interesting New York Times article addressed the related subject of cooperation [“In Games, an Insight into the Rules of Evolution,” by Carl Zimmer, 31 July 2007]. You obviously cannot cooperate if you don’t connect, but cooperation goes far beyond connectivity. Zimmer points out, however, that scientists define cooperation in broader terms than most of us use.
“When biologists speak of cooperation, they speak more broadly than the rest of us. Cooperation is what happens when someone or something gets a benefit because someone or something else pays a cost. The benefit can take many forms, like money or reproductive success. A friend takes off work to pick you up from the hospital. A sterile worker bee tends to eggs in a hive. Even the cells in the human body cooperate. Rather than reproducing as fast as it can, each cell respects the needs of the body, helping to form the heart, the lungs or other vital organs. Even the genes in a genome cooperate, to bring an organism to life.”
The term “cost” is relative. Most people don’t think in terms of “cost” when they pick a friend up from the hospital or perform some other act of service. Why not? A Washington Post article by Shankar Vedantam offers an explanation [“If It Feels Good to Be Good, It Might Be Only Natural,” 28 May 2007]. Vedantam discusses experiments by neuroscientists at the National Institutes of Health in which participants thought about scenarios where they either donated money to good causes or kept it for themselves. The results were surprising.
“The results were showing that when the volunteers placed the interests of others before their own, the generosity activated a primitive part of the brain that usually lights up in response to food or sex. Altruism, the experiment suggested, was not a superior moral faculty that suppresses basic selfish urges but rather was basic to the brain, hard-wired and pleasurable. Their 2006 finding that unselfishness can feel good lends scientific support to the admonitions of spiritual leaders such as Saint Francis of Assisi, who said, ‘For it is in giving that we receive.’ But it is also a dramatic example of the way neuroscience has begun to elbow its way into discussions about morality and has opened up a new window on what it means to be good.”
This kind of hardwired behavior facilitates cooperation in humans. The question is whether bacteria or other living organisms are also hardwired this way. Vedantam reports:
“No one can say whether giraffes and lions experience moral qualms in the same way people do because no one has been inside a giraffe’s head, but it is known that animals can sacrifice their own interests: One experiment found that if each time a rat is given food, its neighbor receives an electric shock, the first rat will eventually forgo eating.”
So at least the possibility exists that intrinsic mechanisms that foster cooperation are hardwired throughout nature. That’s good news for those of us interested in fostering cooperation in the area of development. Let’s get back to Zimmer’s article, which discusses the work being done by Harvard professor Martin Nowak.
“In recent papers, Dr. Nowak has argued that cooperation is one of the three basic principles of evolution. The other two are mutation and selection. On their own, mutation and selection can transform a species, giving rise to new traits like limbs and eyes. But cooperation is essential for life to evolve to a new level of organization. Single-celled protozoa had to cooperate to give rise to the first multicellular animals. Humans had to cooperate for complex societies to emerge. ‘We see this principle everywhere in evolution where interesting things are happening,’ Dr. Nowak said. While cooperation may be central to evolution, however, it poses questions that are not easy to answer. How can competing individuals start to cooperate for the greater good? And how do they continue to cooperate in the face of exploitation? To answer these questions, Dr. Nowak plays games.”
What Nowak is exploring, in a larger sense, is resilience; that is, how cooperation can be used to make organizations or societies more resilient. One need look no further than the situation in Iraq to understand how important it is to find a strategy that permits competing individuals to cooperate for the greater good. In fact, some would say you needn’t look any further than Washington, DC!
“[Nowak’s] games are the intellectual descendants of a puzzle known as the Prisoner’s Dilemma. Imagine two prisoners are separately offered the same deal: if one of them testifies and the other doesn’t talk, the talker will go free and the holdout will go to jail for 10 years. If both refuse to talk, the prosecutor will only be able to put them in jail for six months. If each prisoner rats out the other, they will both get five-year sentences. Not knowing what the other prisoner will do, how should each one act?

“The way the Prisoner’s Dilemma pits cooperation against defection distills an important feature of evolution. In any encounter between two members of the same species, each one may cooperate or defect. Certain species of bacteria, for example, spray out enzymes that break down food, which all the bacteria can then suck up. It costs energy to make these enzymes. If one of the microbes stops cooperating and does not make the enzymes, it can still enjoy the meal. It can gain a potential reproductive edge over bacteria that cooperate.

“The Prisoner’s Dilemma may be abstract, but that’s why Dr. Nowak likes it. It helps him understand fundamental rules of evolution, just as Isaac Newton discovered that objects in motion tend to stay in motion. ‘If you were obsessed with friction, you would have never discovered this law,’ Dr. Nowak said. ‘In the same sense, I try to get rid of what is inessential to find the essential. Truth is simple.’

“Dr. Nowak found his first clues to the origin of cooperation in graduate school, collaborating with his Ph.D. adviser, Karl Sigmund. They built a version of the Prisoner’s Dilemma that captured more of the essence of how organisms behave and evolve. In their game, an entire population of players enters a round-robin competition. The players are paired up randomly, and each one chooses whether to cooperate or defect. To make a choice, they can recall their past experiences with other individual players. Some players might use a strategy in which they had a 90-percent chance of cooperating with a player with whom they have cooperated in the past. The players get rewarded based on their choices. The most successful players get to reproduce. Each new player had a small chance of randomly mutating its strategy. If that strategy turned out to be more successful, it could dominate the population, wiping out its ancestors.

“Dr. Nowak and Dr. Sigmund observed this tournament through millions of rounds. Often the winners used a strategy that Dr. Nowak called ‘win-stay, lose-shift.’ If they did well in the previous round, they did the same thing again. If they did not do so well, they shifted. Under some conditions, this strategy caused cooperation to become common among the players, despite the short-term payoff of defecting.”
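For readers who want to see how such a tournament works mechanically, here is a minimal Python sketch in the spirit of the game Zimmer describes. It is only an illustration, not Nowak and Sigmund’s actual model: the payoff values, the population size, the selection rule, and the two competing strategies (“win-stay, lose-shift” versus always defecting) are assumptions chosen for clarity.

```python
import random

# Illustrative payoff table (assumed values, not from the article):
# mutual cooperation pays 3 each, mutual defection 1 each,
# a lone defector gets 5 while the exploited cooperator gets 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def win_stay_lose_shift(last_move, last_payoff):
    """Repeat the previous move after a good round; switch after a bad one."""
    if last_move is None:
        return "C"
    return last_move if last_payoff >= 3 else ("D" if last_move == "C" else "C")

def always_defect(last_move, last_payoff):
    """A pure defector, used here as the exploitative rival strategy."""
    return "D"

def play_round(population):
    """Pair players at random and add this round's payoffs to their scores."""
    random.shuffle(population)
    for a, b in zip(population[::2], population[1::2]):
        move_a = a["rule"](a["last_move"], a["last_payoff"])
        move_b = b["rule"](b["last_move"], b["last_payoff"])
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        a.update(last_move=move_a, last_payoff=pay_a, score=a["score"] + pay_a)
        b.update(last_move=move_b, last_payoff=pay_b, score=b["score"] + pay_b)

def evolve(population, mutation_rate=0.01):
    """Selection step: the higher-scoring half reproduces; offspring may mutate."""
    population.sort(key=lambda p: p["score"], reverse=True)
    survivors = population[: len(population) // 2]
    children = []
    for parent in survivors:
        rule = parent["rule"]
        if random.random() < mutation_rate:  # mutation picks a random strategy
            rule = random.choice([win_stay_lose_shift, always_defect])
        children.append(dict(rule=rule, last_move=None, last_payoff=0, score=0))
    for p in survivors:
        p["score"] = 0
    return survivors + children

population = [dict(rule=random.choice([win_stay_lose_shift, always_defect]),
                   last_move=None, last_payoff=0, score=0) for _ in range(100)]
for generation in range(200):
    for _ in range(10):          # several random pairings per generation
        play_round(population)
    population = evolve(population)

cooperators = sum(p["rule"] is win_stay_lose_shift for p in population)
print(f"win-stay, lose-shift players after 200 generations: {cooperators}/100")
```

Whether the cooperators take over depends on the payoffs, the mutation rate, and the luck of the draw, which echoes Zimmer’s point that the strategy produces widespread cooperation only “under some conditions.”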
Nowak’s results are interesting, but getting adversaries to adopt cooperative strategies is difficult because rational thought is often overtaken by pride, greed, anger, or some other common human emotion. Recalcitrant leaders more often think in terms of personal, party, or tribal gains than in terms of societal good. That is what is happening in Iraq. It turns out that reputation can play a role in turning things around.
“Dr. Nowak and his colleagues found that when they put players into a network, the Prisoner’s Dilemma played out differently. Tight clusters of cooperators emerge, and defectors elsewhere in the network are not able to undermine their altruism. ‘Even if outside our network there are cheaters, we still help each other a lot,’ Dr. Nowak said. That is not to say that cooperation always emerges. Dr. Nowak identified the conditions when it can arise with a simple equation: B/C > K. That is, cooperation will emerge if the benefit-to-cost (B/C) ratio of cooperation is greater than the average number of neighbors (K). ‘It’s the simplest possible thing you could have expected, and it’s completely amazing,’ he said.

“Another boost for cooperation comes from reputations. When we decide whether to cooperate, we don’t just rely on our past experiences with that particular person. People can gain reputations that precede them. Dr. Nowak and his colleagues pioneered a version of the Prisoner’s Dilemma in which players acquire reputations. They found that if reputations spread quickly enough, they could increase the chances of cooperation taking hold. Players were less likely to be fooled by defectors and more likely to benefit from cooperation.

“In experiments conducted by other scientists with people and animals, Dr. Nowak’s mathematical models seem to fit. Reputation has a powerful effect on how people play games. People who gain a reputation for not cooperating tend to be shunned or punished by other players. Cooperative players get rewarded.”
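The B/C > K rule in that passage is simple enough to check in a few lines. The sketch below, again in Python, just evaluates the inequality on a made-up four-person network; the benefit and cost figures are illustrative assumptions, not numbers from Nowak’s papers.

```python
def average_degree(graph):
    """Mean number of neighbors per node in an adjacency-list graph."""
    return sum(len(neighbors) for neighbors in graph.values()) / len(graph)

def cooperation_favored(benefit, cost, graph):
    """The simple rule from the quoted passage: B/C must exceed K."""
    return benefit / cost > average_degree(graph)

# A sparse "village" network in which each person knows only a couple of others.
village = {
    "ana":  ["ben", "caro"],
    "ben":  ["ana"],
    "caro": ["ana", "dan"],
    "dan":  ["caro"],
}

# A fully connected version of the same four people.
crowd = {name: [other for other in village if other != name] for name in village}

print(average_degree(village))                                # 1.5 neighbors on average
print(cooperation_favored(benefit=2, cost=1, graph=village))  # True: 2/1 > 1.5
print(cooperation_favored(benefit=2, cost=1, graph=crowd))    # False: 2/1 < 3.0
```

The toy numbers make the quoted intuition concrete: the same act of generosity that satisfies the rule in a small, tightly knit cluster fails it once everyone is connected to everyone else, because the threshold K rises with the average number of neighbors.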
Some of Nowak’s findings fit neatly with the Development-in-a-Box™ approach, particularly the development of communities of practice that increase the benefit-to-cost ratio of cooperation and the adoption of recognized standards that help improve reputations and increase cooperation. Nowak’s work also supports the conclusions of a RAND study indicating that the U.S. military needs to rebrand itself in Iraq to create better cooperation [see my post Putting on a Good Face]. Good cooperation results in better connectivity because it generates trust. That is one of the results I’m hoping to achieve in Kurdistan: helping businesses here develop trust with international customers. Cooperation research is an area that should continue to enlighten us about how to become more resilient and how to help others do the same.