Sunday, April 12, 2009

The Science Of The Future Of War

By Betsy Mason
November 21, 2008 | 7:19:18 PM

Categories: Biology, Military, Survival
Courtesy Of Wired Blog Network


The new book by Malcolm Potts and Thomas Hayden will be widely available December 1, and is currently available on Amazon. Hear more about the book from the authors in a Q&A with Wired.com.

TODAY’S MOST BRUTAL WARS are also the most primal. They are fought with machetes in West Africa, with fire and rape and fear in Darfur, and with suicide bombs and improvised explosive devices in Israel, Iraq, and elsewhere. But as horrifying as these conflicts are, they are not the greatest threat to our survival as a species. We humans are a frightening animal. Throughout our species’s existence, we have used each new technology we have developed to boost the destructive power of our ancient predisposition for killing members of our own species. From hands and teeth tearing at isolated individuals, to coordinated raids with clubs and bows and arrows, to pitched battles, prolonged sieges, and on into the age of firearms, the impulse has remained the same, but as the efficiency of our weapons has increased, the consequences have grown ever more extreme.

The evidence of history is that no advance which can be applied to the killing of other human beings goes unused. As scientific knowledge continues to explode, it would be naïve to expect any different. As if we needed any more reasons to confront the role of warfare in our lives, the present supply and future potential of WMDs should convince us that the time has come once and for all to bring our long, violent history of warring against each other to an end.

The nineteenth century was dominated by discoveries in chemistry, from dyes to dynamite. The twentieth century belonged to physics, from subatomic particles and black holes to nuclear weapons. The twenty-first century is set to see great advances in biological knowledge, from our growing understanding of the genome and stem cells to, it’s a shame to say, new and expanded forms of biological warfare. In the past, each iteration of the application of scientific discovery to warfare has produced more horrible and destructive weapons. Sometimes temporary restraint is exercised, as in the successful ban on poison gas in the Second World War, but such barriers burst easily, as the deliberate bombing of civilians in the same war attests. Human beings have always appropriated new ideas to build increasingly formidable weapons, and there is no reason to think that competitive, creative impulse will disappear on its own. As weapons become ever more horrifying—and, with the rise of biological weapons, increasingly insidious—it is no longer enough just to limit the use of one killing technology or another. We need to limit the conditions that lead to war in the first place.

It has become almost a cliché to note that we live in an increasingly complex and interdependent society. But this point is crucially important as we consider the future of war. Our cities once were fortresses, the walled sanctums where our ancestors sought refuge from marauders. The firebombing of the Second World War revealed a new urban vulnerability, but even that insecurity is nothing by today’s standards. We live in giant cities, supplied with piped water and electricity, with trains in tunnels and cars on elevated roadways, with fiber optics under the pavement and air-conditioning plants for buildings with windows that cannot be opened. Our new urban centers have the vulnerability to terrorism and attack built right into them. Any modern city can be held hostage by a single Unabomber, brought to a halt by nineteen fanatical men, or devastated by any small raiding party drawing on modern scientific knowledge, from malicious computer programming to radioactive “dirty bombs” to infectious bacteriology. To understand the dangerous future of these WMDs, we’ll first take a quick look at their history.

Poison Gas

On April 22, 1915, near the Belgian town of Ypres, the German army mounted the first poison gas attack in history. Fritz Haber, who would later receive the Nobel Prize for his work producing nitrogen fertilizer, labored day and night to develop chlorine gas into a weapon and supervised its first release in person. The 168 tons of gas deployed that day ripped a four-mile gap of gasping, suffocating men in the Allied lines. (The German commanders—as is so often the case when new weapons are used—had insufficient resources to exploit their opportunity.) In a revealing example of the difference between the attitudes of men and women toward war, Haber’s wife Clara, who was also a chemist, begged her husband to stop his work on poison gas. After a dinner held to celebrate her husband’s appointment as a general, Frau Haber shot herself in the garden—and Haber left the funeral arrangements to others while he traveled to the Eastern front to supervise the first gas attack on the Russians. Unprepared, the Russians suffered 25,000 casualties. In one of the grimmer ironies in the history of dehumanizing others, while Haber was dismissed from the directorship of the Kaiser Wilhelm Institut in Berlin in 1933 because he was a Jew (he later escaped Nazi Germany), his invention, Zyklon gas, was used in the gas chambers of Nazi concentration camps to kill other Jews.

Despite the obvious horrors of gas warfare, the British began their own chemical weapons research in 1916. They tested 150,000 compounds including dichlorethyl sulfide, which they rejected as insufficiently lethal. The Germans disagreed, and took up its development. On initial exposure, victims didn’t notice much except for an oily or “mustard” smell, and so the first men exposed to this “mustard gas” did not even don their gas masks. Only after a few hours did exposed skin begin to blister, as the vocal cords became raw and the lungs filled with liquid. Affected soldiers died or were rendered medically unfit for months, and often succumbed years or decades later to lung disease. At first the British were outraged at its use, but later they sent supplies of poison gas to their own troops in British India, for use against Afghan tribesmen in the North-West Frontier.

By 1918, one-third of all shells being used in World War I were filled with poison gas. In all, 125,000 British soldiers were gassed, along with 70,000 Americans. Three weeks before the end of the war, the British shelled the 16th Bavarian Reserve Infantry with mustard gas. A young corporal named Adolf Hitler was blinded in the attack—and would later claim that the recovery of his sight was a supernatural sign he should become a politician and save “Germany.”

Nuclear Weapons

Between the ages of eleven and seventeen, I was lucky to attend the Perse School in Cambridge, only a mile from the Cavendish Laboratory where much of the early work on atomic physics was conducted. Today, I teach at the University of California, Berkeley, an important site for early work on nuclear physics, and still the managing institution for Los Alamos National Laboratory in New Mexico, where the atomic bomb was developed. The knowledge to create the most destructive weapons in history was developed by clever men in pleasant surroundings, pushing the analytical power of their Stone Age brains to the limit. In that task, deep-seated human emotions and brilliant science clashed in complex ways.

The main motivating factor behind America’s Manhattan Project was fear—fear that Nazi Germany would develop the atomic bomb first. In the 1930s, a Hungarian theoretical physicist living in London, Leo Szilárd, foresaw that a nuclear chain reaction might be possible, and in December 1938, Otto Hahn in Germany conducted the crucial experiment confirming Szilárd's hypothesis. As a young German officer, Hahn had helped release the first chlorine gas at Ypres in 1915, but when the possibility of a nuclear weapon arose he had serious reservations, saying, “if my work should lead to a nuclear weapon I would kill myself.” (Lise Meitner, another physicist, was the first to understand the potential of nuclear fission. She worked with Hahn in Berlin before being expelled from Germany because she was Jewish, and she refused any part in the development of the American bomb.) But while virtually every physicist who saw the potential for nuclear weapons recoiled in horror, scientific genies which can be weaponized are always difficult to keep in their bottles, and impossible during wartime. By the time Hitler invaded Czechoslovakia in March of 1939, science had advanced to the point that the best physicists in both Europe and America could see how an atomic bomb was scientifically possible. Soon, many would come to consider it necessary as well.

A German effort to build the bomb was launched, and headed by Werner Heisenberg, famous for his “uncertainty principle” of quantum physics. Germany failed to make an atomic bomb by a wide margin, and there is some evidence, controversial to be sure, that Heisenberg and other German physicists had intentionally dragged their heels. Whether true or not, it hardly mattered—Szilárd was convinced the Nazis were making progress and that only the Americans could beat them to the nuclear finish line. He drafted a warning letter, and together with Albert Einstein sent it to President Roosevelt. The Manhattan Project soon followed.

The U.S. tested its first atomic weapon in the New Mexico desert at 5:29 A.M. on July 16, 1945, two months after the Allies had accepted Nazi Germany’s unconditional surrender. But the war with Japan raged on, and the new U.S. President, Harry Truman, struggled with the power he now controlled. “Even if the Japs are savages, ruthless, merciless and fanatic, we as the leader of the world...cannot drop this terrible bomb on the old capital [Kyoto],” he confided to his diary. “The target will be a purely military one and we will issue a warning statement asking the Japs to surrender.” In fact, Japan was on the verge of surrender and might well have capitulated had its leaders been told the Emperor could remain on his throne.* The Allies, however, insisted on unconditional surrender, and the Japanese refused. At 8:16 A.M. on August 6, a uranium-235 device called Little Boy was dropped on Hiroshima; a plutonium bomb, “Fat Man,” was dropped on Nagasaki three days later. On September 2, 1945, the Japanese formally surrendered. The genie was out of the bottle.

Within months of the end of the war, Edward Teller, a Hungarian who was part of the team that had developed the U.S. bomb, was working on the hydrogen bomb, an even more powerful weapon. In the Soviet Union, Stalin had authorized work on an atomic bomb as early as 1942, and the Russians were helped initially by lease-lend shipments of uranium and other material from the U.S., and by Manhattan Project secrets leaked by the left-wing physicist Klaus Fuchs. His betrayal is said to have advanced the Soviet work by perhaps eighteen months, and captured German scientists added an extra boost after the war. Russia exploded her first atomic bomb just four years after the Americans. The British had their atomic bomb by 1953, the French by 1960, and the Chinese in 1964. Israel has never confirmed its membership, but is thought to have joined the nuclear club by the late 1970s.

Germ Warfare

The Shoshone Indians of Nevada, before battle, killed a sheep, drained its blood into a length of intestine, buried the draught in the ground to ferment, and then smeared their war arrows with the microbial brew. This would have guaranteed severe infection and probably death following even a superficial arrow wound. A 3,400-year-old clay tablet found in modern Turkey carries a cuneiform inscription with the intriguing phrase, “The country that finds them shall take over this evil pestilence.” Molecular biologist Siro Trevisanato from Ontario, Canada, suggests that this may be a reference to a disease called tularemia which infects sheep, donkeys, rabbits, and human beings, and that it is the first instance of biological warfare in recorded history. Tularemia is a highly infectious disease leading to a painful death from fever, skin ulcers, and pneumonia. It was the cause of serious epidemics in early civilizations stretching from present-day Cyprus to Iraq, and the historical record suggests that infected sheep and donkeys were driven into enemy lines in order to spread infection. During the French and Indian Wars (1754–1763), the British very likely gave hostile Indian tribes blankets infected with smallpox, and certainly considered the idea. Once you have dehumanized your enemy, the evidence is that it matters little which way you kill him. But biological weapons represent a particularly insidious and dangerous form of WMDs. They may lack the immediate gruesome effects of chemical weapons or the sheer destructive power of the atomic bomb. But they are inherently stealthy, potentially lethal on a global scale, and when living infectious organisms are involved, all but uncontrollable.

Both Japan and the U.S. worked on biological weapons during World War II, and the Japanese used anthrax and plague bacteria against the Chinese. U.S. research continued after the war until 1969, when President Richard Nixon renounced “the use of lethal biological agents and weapons, and other methods of biological warfare.” The U.S. unilaterally destroyed its stockpiled biological weapons, a bold step which led to the 1972 Biological Weapons Convention. But although the convention was ratified by 140 nations, it lacked policing capacity and within one year of its passage, the Soviet Union began the largest biological weapons program in history. Vladimir Pasechnik, who defected to Britain in 1989, reported overseeing 400 research scientists working on the program in Leningrad, with another 6,000 professionals throughout the country involved in the manufacture of huge quantities of anthrax and smallpox. Iraq also ignored the 1972 convention and in 1990, just before the First Gulf War, a factory south of Baghdad manufactured 5,400 liters of botulinum toxin. The coalition forces had insufficient vaccines to protect their soldiers, and U.S. Secretary of State James Baker used diplomatic channels to let Saddam Hussein know that the U.S. would launch a nuclear response if attacked with biological weapons. By the time of the Second Gulf War, Hussein’s biological weapon program had disintegrated.

As a physician, I must say that I find germ warfare to be particularly loathsome. There are three possible levels on which it could be waged, each more distressing than the one before. First, a bacterium such as anthrax, which is very stable, could be sprayed or spread around a community. Anyone who inhaled it would come down with a non-specific fever and fatigue, which looks like the onset of flu but, left untreated, leads to fatal pneumonia. An anthrax victim, however, could not infect another person. Second, an infectious agent, such as smallpox, could be used to start an epidemic. Third, a new and terrible disease could be genetically engineered that not only infects, but also avoids detection and resists treatment with our current arsenal of vaccines and antibiotics. This final scenario is the most chilling of all.

If anything qualifies as a miracle of modern medicine, it is the World Health Organization’s use of vaccination to eradicate smallpox in the 1960s and 1970s. The last case of this ancient killer of millions was identified in October 1977 in Somalia. Yet the very fact of our medical triumph over smallpox makes it a particularly devastating weapon. The virus is highly infectious; causes severe, painful disease with a high rate of mortality; and unlike HIV, for example, is quite robust, and can persist in the environment for months or years. Unlike most viral diseases, it is possible to halt smallpox infection by vaccination after exposure. However, the smallpox vaccination must be given within the first forty-eight hours after exposure, and large-scale smallpox vaccination was stopped thirty years ago. A smallpox-based attack now could devastate a large population. But even if an outbreak were quickly contained, it would bring a nation to a halt and be exceedingly frightening and painful.

All smallpox samples were supposed to be destroyed following eradication, with the exception of two batches. One is stored at the U.S. Centers for Disease Control and Prevention in Atlanta, Georgia, and the other at the Russian State Research Institute of Virology and Biotechnology outside Novosibirsk, Siberia. It is possible, however, that clandestine stocks were kept by Russia, Iraq, Israel, or some other countries, and shortly after 9/11, the World Health Organization decided to postpone the destruction of the final Russian and U.S. samples in case they are needed to provide scientific information to counter a bioterrorism attack in the future.

Many other pox viruses and other infectious agents provided by nature could potentially be used as weapons. But the Frankenstein-like creation of novel germs is perhaps an even greater fear. A lethal virus might be assembled accidentally, as happened in Australia in 2000 when an experiment to sterilize rodent pests turned sour. The unintentionally lethal virus killed all the experimental animals, despite attempts at vaccination. And the deliberate quest to make germ warfare more effective by genetically modifying existing bacteria and viruses has already begun. Sergei Popov, a Russian molecular biologist who worked in the Soviet biological weapons program, developed a microbe with the potential to cause a slow death from multiple sclerosis. “We never doubted,” he said after defecting to Britain in 1992, “that we did the right thing. We tried to defend our country.” His words almost exactly echo those spoken by Werner Heisenberg and other German nuclear scientists after the Second World War.

Biological agents need not kill to be effective terror weapons. In the case of rodent pest control, thought has been given to using a modified virus that would cause infected female animals to make antibodies against the coat surrounding their own eggs. As a pest control strategy, it would produce a generation of sterile rats. If a similar virus were developed against human beings, it might be years before a slowly emerging epidemic of infertility was even recognized as a deliberate attack. As one scientist has remarked, “the main thing that stands between the human species and the creation of a supervirus is a sense of responsibility among individual biologists.” With an ever-growing population of scientists with the skill to manipulate the genes of bacteria and viruses, “individual responsibility” may prove a gossamer defense indeed.

Manufacturing Destruction

The nuclear arms race between the United States and the Soviet Union in many ways defined the mid-twentieth century. But in some ways we can learn even more from the nuclear confrontation that has played out on the Indian subcontinent. In 1948, Indian Prime Minister Jawaharlal Nehru, despite being an advocate of non-aggression and ending atomic tests, admitted that, if threatened, “no pious statements will stop the nation from using it that way.” Nehru was right and on May 18, 1974, India detonated a plutonium bomb the size of the Hiroshima weapon. As the Indian threat increased, Zulfikar Ali Bhutto, then Pakistan’s Foreign Minister, declared that his country would sacrifice everything to make an atomic bomb, “even if we have to eat grass or leaves or to remain hungry.” Many people in that impoverished nation did in fact remain hungry as Pakistan poured its meager resources into a weapons program, which finally resulted in a series of nuclear tests in May 1998.

The disturbing lesson is that the technical and economic barriers to WMD acquisition are steadily dropping. The Manhattan Project cost two billion dollars in the money of the time, and involved an industrial effort as large as the whole of the U.S. automobile industry. Pakistan managed the same feat as an unstable third-world country with a fraction of the resources. If Iran and North Korea soon join the nuclear club, it will be in part thanks to nuclear secrets purchased from A. Q. Khan, the “father” of the Pakistani bomb. Perhaps most disturbing of all, there are thousands of pounds of high-grade nuclear material still in the former Soviet Union, left over from the Cold War. Some is unaccounted for, and much of the rest is poorly secured, vulnerable to purchase or theft by terror groups.

In much the same way, Germany’s World War I chemical weapons were produced by the most advanced chemical industry in the world at the time. The sarin gas released into the Tokyo subway by the Aum religious sect in 1995, which killed twelve people and made 2,000 ill, was made by a single, poorly qualified biochemist, Seiichi Endo. Also in 1995, an American survivalist purchased plague bacteria on the open market from the American Type Culture Collection for just $300. Whether used by nations against their enemies, or by small bands of terrorists bent on causing ever greater fear, there is simply too little we can do to stop WMDs and their effects once they have been constructed. Our best hope of security is to encourage and enforce control, while also redoubling our efforts to understand and counteract the conditions that might lead to their use in the first place.

The Battle for Resources

We have already stated several times that all team aggression, all raiding, and all wars are ultimately about resources, even if the combatants aren’t consciously aware of it. All life, in fact, at its most fundamental level is about competition for resources. Evolution has been driven by this competition for billions of years, and today’s animals, plants, bacteria, protozoa, and fungi all exist because they competed successfully with their rivals in the past. If we are to have any chance of avoiding the wars of tomorrow, as the destructive power of today’s weapons tells us we must, then we have to address this most basic of biological problems: The fact that as the population of any species grows, the pressure on its natural resources increases and competition becomes more severe.

Biology has invented a million ways for plants and animals to compete with each other. A tree may compete for light by growing taller; early mammals competed with dinosaurs by only coming out at night; humans and chimpanzees—especially the males—compete for food, space, and reproductive opportunities by fighting with each other. Human wars may come wrapped in a veneer of religion or political philosophy, but the battle for resources is usually just below the surface. When Pope Urban II exhorted the nobles of Europe to join the First Crusade, he contrasted the lands where they lived, which had “scarcely enough food for their cultivators,” with Palestine, where the crusaders would be able to appropriate land from the Infidels. In World War II, the need for land and resources was expressed as Hitler’s concept of lebensraum, or “living space.” “The aim [of] the efforts and sacrifices of the German people in this war,” he wrote, “must be to win territory in the East for the German people.” The Japanese attacked Pearl Harbor because they knew they had to destroy the American Pacific fleet if they were to access the Indonesian oil they needed to supply their industries. As we saw earlier, while rapid population growth and massive unemployment in some settings, such as the Gaza Strip, do not cause wars or terrorist attacks by themselves, they certainly make them more likely.

The predisposition for team aggression may be an inherent part of chimpanzee and human makeup, but the degree of competition for resources varies with the situation. For example, it seems that team aggression among chimpanzees is less common in the Congo, where there are more forest resources, than in Tanzania, where human encroachment has driven the animals into a limited area of forest. The human migrants who crossed the Bering Strait into the Americas about 15,000 years ago found a continent filled with large, easy-to-hunt mammals, and among their limited human skeletal remains we find no evidence of violence. But by about 5000 B.C., as numbers and competition increased, some human skeletons from hunter-gatherer societies in North America show evidence of scalping, or have arrowheads embedded in them. A thousand years ago, in the American Southwest, the Anasazi and Fremont peoples were foragers who also grew maize. Some built elaborate cliff dwellings. The study of tree rings demonstrates that the area was subject to some decade-long droughts, and during these times the region seems to have been beset by raids and warfare. The population retreated to high pinnacles on the edges of deep canyons. They hid small caches of grain in hard-to-reach places and positioned boulders to roll down on enemy clans. Human skeletons show signs of malnutrition, decapitation, and cut marks on long bones suggesting cannibalism.

Some Rousseauean anthropologists protest that reports of cannibalism represent a racist desire to denigrate other cultures, but the scientific evidence suggests otherwise. Excavating an Anasazi site in the American Southwest dating from 1150 A.D., Brian Billman of the University of North Carolina at Chapel Hill found cooking vessels and the butchered remains of four adults and an adolescent. Sensitive immunological tests revealed evidence of human muscle protein in the pots; even more convincing, the same tests found evidence of human meat in preserved human feces found at the site. When food is scarce, competition becomes increasingly intense and cannibalism, like team aggression, aids survival.

Critics have argued that the archaeological evidence for endemic violence in drought-ridden areas is too scattered and circumstantial to draw strong conclusions. A recent study of environment and warfare in contemporary Africa helps put that criticism to rest. Edward Miguel of the University of California, Berkeley, and colleagues Shanker Satyanath and Ernest Sergenti of New York University compared rainfall levels and incidents of civil conflict across the African continent, and found that as rainfall declined, civil conflict became more likely, a relationship significant at the 95 percent confidence level. Interestingly, the effect was found across many different cultures and irrespective of whether the country was well or poorly governed.
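
To see the shape of such an analysis, here is a minimal sketch in Python using entirely synthetic data; it is not the study’s actual data or econometric model, and every number in it (country count, effect size, noise level) is invented for illustration. It simply regresses a yearly conflict indicator on rainfall and asks whether the negative relationship clears the conventional 95 percent confidence bar.

```python
# A sketch with synthetic data (not the study's data or model): does less
# rainfall go with more civil conflict, at 95 percent confidence?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_countries, n_years = 41, 20   # invented panel dimensions
rainfall = rng.normal(0.0, 0.15, size=(n_countries, n_years))

# Hypothetical data-generating assumption: drier years raise conflict risk.
latent_risk = 0.25 - 0.8 * rainfall + rng.normal(0.0, 0.3, size=rainfall.shape)
conflict = (latent_risk > 0.5).astype(float)   # 1 = civil conflict that year

# Pooled regression of conflict incidence on rainfall.
slope, intercept, r_value, p_value, stderr = stats.linregress(
    rainfall.ravel(), conflict.ravel()
)
print(f"slope = {slope:.3f} (negative: drier, more conflict)")
print(f"p-value = {p_value:.2e} (p < 0.05 matches '95 percent' confidence)")
```

A pooled regression like this glosses over the country effects and time trends the real study controls for; the point is only the logic of the test.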

Competition for resources has led to violence everywhere we look. When Polynesian seafarers reached Easter Island about 1,300 to 1,700 years ago, they landed on a forested island full of flightless birds. By about 500 years ago, the trees had been cut down, the animals had all been eaten, and the clans, who identified themselves with the curious stone statues that still dot the island, fell to fighting each other. The population plummeted from an estimated 20,000 to just 2,000 by the time Europeans arrived in the eighteenth century. Here too we find archaeological evidence of cannibalism, which lives on in the oral tradition of the islanders. A local insult used on Easter Island even today is, “The flesh of your mother sticks between my teeth.”

The thought that rapid population growth could increase conflict is hardly new, and certainly Thomas Malthus accepted this relationship in his 1798 Essay on the Principle of Population. As with so many efforts to interpret human behavior, however, the link between resource depletion and conflict has been obscured by extreme arguments. As Shridath Ramphal and Steven Sinding, then of the UN Commission on Global Governance and the Rockefeller Foundation respectively, write, “there has been considerably more heat than light in the international dialogue” and efforts have been made that “suit a political, as opposed to a scientific interest.” Those looking at the same landscape of facts but through different lenses end up sparring instead of seeking synthesis. Nancy Peluso and Michael Watts, colleagues of ours at Berkeley, castigate writers such as Robert Kaplan, author of The Coming Anarchy: How Scarcity, Crime, Overpopulation, and Disease Are Rapidly Destroying the Social Fabric of Our Planet, for making too direct a link between resource scarcity and conflict. They point out, citing Karl Marx (who did in fact get a few things right), that economic patterns also help determine who controls and who has access to resources. No doubt some conflicts could be avoided by a more equitable distribution of resources; there is nothing contradictory in arguing for greater social and economic equality while also recognizing that high birth rates can overwhelm the ability of a finite region to sustain its human population regardless of such equality.

John May, the World Bank’s demographer for Africa, has drawn attention to the demographic pressure that had built up in Rwanda by the time of the 1994 genocide. The population of Rwanda was two million people in 1950, and on average each woman had almost eight children. By 1994, average family size had fallen slightly to 6.2, but the population had quadrupled to almost eight million, resulting in a population density of 292 people per square kilometer, the highest in all of Africa. James Fairhead, an anthropologist from the School of Oriental and African Studies in London, adds an economic dimension to the analysis. Preceding the Rwanda genocide, Fairhead points out, agricultural land prices had reached an astronomical $4,000 per hectare in a country where many people lived on less than $500 a year. “Land,” Fairhead concludes, “is worth fighting for and defending.” Tragically, the fighting which took place in 1994 left between 500,000 and one million dead. It was cast as an ethnic conflict, and senseless. Once its roots in resource competition are laid bare, however, the violent extermination of an identifiable outgroup takes on the all-too familiar logic of team aggression.

Can all conflict be reduced beyond even team aggression and resource competition, down to the single factor of population growth? It’s not quite that simple, but a deeper investigation of the role of population increase shows quite clearly that growth rate and population demographics function as significant triggers for raiding, wars, and even terrorism. If we hope to reduce the number and severity of these violent incidents in our world, this is a relationship we need to understand. Peter Turchin of the University of Connecticut and his Russian colleague Andrey Korotayev provide important quantitative insight into the dynamic connections between population growth and conflict. In a careful study of English, Chinese, and Roman history, they showed a statistical correlation between an increase in population density and warfare, although not surprisingly the impact of population growth was not immediate but took some time to develop. It is not the infant playing at the hearth but the hungry landless peasant twenty years later who causes the conflict. Adjusting for this and other variables (such as the fact that wars themselves tend to reduce population), and using robust data on population growth from church records in England along with historical data on conflict, Turchin and Korotayev found that intervals of relative peace and rapid population growth were followed by periods of conflict and slower population growth. Their study suggests that population growth accounts for a powerful 80–90 percent* of the variation between periods of war and peace. Even if the influence of population is substantially less than that, it remains outstandingly important. But here is the crucial point: Rapid population growth is not just an important cause of violent conflicts. In the contemporary world, population growth is a cause that can be contained by purely voluntary means.
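
To make the lagged dynamic concrete, here is a minimal toy simulation in Python, a sketch of the general idea rather than Turchin and Korotayev’s actual equations: crowding feeds into warfare only after a twenty-year delay, and warfare in turn drags on population growth. All parameters below are invented for illustration.

```python
# A toy model (not Turchin and Korotayev's equations) of delayed feedback
# between population growth and warfare. All parameters are invented.
import numpy as np

T = 400        # years simulated
LAG = 20       # years before crowding translates into conflict
K = 1.0        # carrying capacity (normalized)
r = 0.03       # intrinsic population growth rate
a = 0.02       # how strongly lagged crowding feeds warfare
persist = 0.9  # warfare builds and decays slowly year to year
b = 0.08       # drag of warfare on population growth

pop = np.zeros(T)
war = np.zeros(T)
pop[0] = 0.2

for t in range(1, T):
    crowding = pop[max(0, t - LAG)] / K            # pressure felt LAG years later
    war[t] = persist * war[t - 1] + a * crowding   # conflict rises with a delay
    growth = r * pop[t - 1] * (1 - pop[t - 1] / K) - b * war[t] * pop[t - 1]
    pop[t] = max(pop[t - 1] + growth, 0.01)

for t in range(0, T, 50):                          # sample every 50 years
    print(f"year {t:3d}: population {pop[t]:.2f}, warfare index {war[t]:.2f}")
```

Run with these made-up parameters, population climbs quickly while the warfare index is still low, then growth stalls as the delayed conflict term builds, echoing the peace-then-war alternation the study describes: it is the hungry landless peasant twenty years on, not the infant at the hearth, who shows up in the conflict term.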

In the past fifty years the world has accommodated rapid population growth tolerably well, although as rising oil and food prices suggest, this may not be true in the future. The combination of the industrial revolution and science-based technology increased global wealth at an astonishing rate. We have been a little like those first people to cross into North America, or the Polynesians who first landed at Easter Island, in more ways than one, however. Presented with vast new supplies of food, energy, building materials, and luxury goods our forebears could never have imagined, we have gorged ourselves on consumption, and we have driven our global population from just one billion people in 1800 to six billion in 2000. We live in a globalized world now, and worldwide population is expected to increase to over eight billion by 2030. The evidence of that increase is now all around us, in our polluted environment, our warming climate, our disappearing rainforests, and our increasingly degraded farmland: We are, as a species, in the process of proving Malthus’s proposition that population will always outstrip resources.

Has the age of rapid resource expansion really come to an end? Human ingenuity continues as unchecked as our population growth, and we will no doubt find ways to squeeze more food, water, and energy out of the existing supplies. But there are natural limits on how far efficiency and invention can take us. Thomas Homer-Dixon, Director of Peace and Conflict Studies at the University of Toronto, and Ambassador Richard Benedick, who was the chief U.S. negotiator for the 1987 Montreal Protocol on atmospheric ozone levels, argue that resource wars will become increasingly common in many parts of the world in the twenty-first century.* Water, for example, is becoming a key constraint on development and quality of life in many places. Thanks to dwindling supplies and burgeoning populations, the Middle East and much of North Africa now have one-third as much water per capita as in 1960. Israel has already exploited 95 percent of the available water supply in the country, and uses it efficiently; there is no new supply to tap. In the Gaza Strip, seawater is contaminating groundwater supplies as fresh water is pumped out to supply the growing population.

Egypt has depended on the Nile for irrigation, drinking water, and flushing its waste for thousands of years. But even that vast stream of water is now reaching its limits. Martha and I have watched millions of gallons of clear water pour over the Blue Nile falls near Bahir Dar in Ethiopia, and we have sat beside the origin of the White Nile at Jinja on Lake Victoria in Uganda. The two branches join at Khartoum in the middle of the Sudanese desert to make a vast, life-giving flow that has sustained forests, wildlife, and human populations since time immemorial. But by the time the Nile reaches the Mediterranean Sea, it is a sadly depleted shadow of its former self. In the year 2000, there were 170 million people in Ethiopia, Sudan, and Egypt, all dependent on the waters of the Nile. There is significant demand for family planning in these countries, but for cultural and political reasons, that demand remains largely unmet. The populations of these three countries will continue to expand rapidly from 190 million today to a UN-estimated 337 million people by 2050. The population will nearly double, but there will be no new water supply—all 337 million will be dependent on a source that is already under strain. In a region with a volatile mix of cultures, religions, and ethnicities, the added stress of severe water shortages may well be the spark that sets the team aggression impulse ablaze on a vast and horrifying scale.

And yet our consumption continues to increase. In recent decades, a billion new consumers have arisen in China, India, South East Asia, Brazil, Mexico, and parts of the former Soviet bloc. When the incomes of these newly affluent people are adjusted to take into account local purchasing power, their potential to buy better quality food, more consumer goods, and more automobiles will equal that of the U.S. While we should welcome the improved living standards and decreased poverty in many parts of the world, finite resources also make it essential that everything possible is done in the West and among the newly affluent to prevent runaway population growth. Norman Myers of Oxford University has shown that if the newly wealthy Chinese were to eat fish at the Japanese per capita rate, they would empty the seas, and if they used cars at the U.S. rate, they alone would consume today’s total global output of oil. In fifteen years, Martha and I have seen Beijing’s and Shanghai’s roads go from two-lane streets filled with bicycles to six-lane super-highways bursting with cars. The price of oil around the world continues to rise with the increased demand, and it is not going to fall to the low levels that Americans expected almost as a natural right just a decade or two ago. As competition for oil and other resources increases, will nations solve their differences through diplomacy, or through war?

Optimists point out that some countries, such as the Netherlands, are densely populated but still maintain a high standard of living. The implication is that good government and modern technology can help prevent the worst problems of expanding populations. But such arguments overlook the fact that we all need space to grow the food we need, to collect the water we use, and to absorb the pollution we create. Calculated realistically, the Netherlands has an ecological footprint fourteen times its area on the map, because it imports food for people and fodder for cattle, consumes drinking water that fell as rain in Switzerland, and pumps carbon dioxide from its power stations into the global atmosphere.

For billions of years, evolution has been driven by competition caused by the simple fact that, left unchecked, all living things can reproduce faster than their environment can sustain. Our population growth today is largely unchecked by hunger, disease, or predators, and it is highly likely that our numbers and industrial demands have already exceeded the environment’s capacity to support them. Mathis Wackernagel in California, Norman Myers in England, and others calculate that we may have exceeded Earth’s carrying capacity as long ago as 1975. According to these calculations, we already need a planet 20 percent larger than the one we have. Such estimates are difficult to make and open to criticism. But it doesn’t take much more than an open set of eyes to realize that current human population growth and economic expansion are going to be impossible to sustain in the long term. Competition for resources is about to increase markedly.


Lessons

Human beings are animated by curiosity. The same impulse to investigate our surroundings that today drives the scientific enterprise originally adapted our ancestors to a harsh, competitive environment. But unfortunately, the mixture of curiosity, the tendency to overreact when threatened, and unquestioning loyalty to our ingroup has become a lethal combination in today’s world. We can expand the envelope of empathy to include greater numbers of people, but in times of war, or perceived threats to our safety, it too often collapses again.

Power, patriotism, and curiosity can drive even the most intelligent and informed men—and it is virtually always men—to turn new scientific discoveries into weapons of mass destruction. The witness of history seems to be that the predisposition to fight and to defend ourselves against attack is so powerful that human beings, once they perceive themselves to be in a life or death struggle of any kind, will always justify research and development of new weapons, however horrendous their effects. It is sobering to note how many winners of Nobel Prizes for science contributed directly or indirectly to the development of weapons of mass destruction—and how many achievements honored with a Nobel Peace Prize fell apart soon after they were awarded. If the Nobel Prize for physics is awarded for accomplishment, the Peace Prize seems very often to reward only effort. But this does not mean that true peace is impossible—so long as we understand the biology of war.

We live in very different evolutionary times than any of our ancestors. After 3.5 billion years of competition, life on Earth has reached its carrying capacity. More competition at this point means fighting harder over a constantly dwindling pool of available resources. As we seek ways to solve our environmental crises, address the warming climate, and combat emerging diseases and global poverty, our very survival as a species requires finding more ways to cooperate rather than compete. And thanks especially to WMDs, the survival of our species now also means bringing an end to war as we know it. It is time to leave our history of team aggression behind.

These are daunting challenges, to say the least. Each will require the commitment and individual efforts of literally billions of our fellow humans, as well as many careful, specific programs put into effect by entire populations. But there is one action that we must take, individually and as a world, if any of the others are to be successful. It directly contradicts some of our deepest evolutionary programming, but if we are to survive as a species, we must stabilize or even reduce population size. As we’ll see in the coming chapter, to a very large extent that means recognizing that the natural tendencies of men are not consistent with the survival and well-being of their sexual partners, their children, and future generations. The most aggressive and violent aspects of men’s inherited behaviors—summarized in the predisposition to team aggression—too often overshadow the more benign aims of women, especially the aim of having healthy, surviving children. Fortunately, women’s impulses and aims are also based on deep evolutionary programming. All we have to do is create the conditions that allow them to be expressed.

Image credit: 1. UNICEF photo/Pierre Holtz 2. Library of Congress: American soldiers in WWI protecting themselves from poison gas. 3. A nuclear test from archive.org. flickr/sandcastlematt 4. A chimpanzee at Lowry Park Zoo in Tampa. flickr/wordman1
