Tomgram

Barbara Ehrenreich, The Fog of (Robot) War

Last week, William Wan and Peter Finn of the Washington Post reported that at least 50 countries have now purchased or developed pilotless military drones. Recently, the Chinese displayed more than two dozen models, in various stages of development, at the Zhuhai Air Show, some of which they are evidently eager to sell to other countries.

So three cheers for a thoroughly drone-ified world. In my lifetime, I’ve repeatedly seen advanced weapons systems or mind-boggling technologies of war hailed as near-utopian paths to victory and future peace (just as the atomic bomb was soon after my birth). Include in that the Vietnam-era “electronic battlefield,” President Ronald Reagan’s Strategic Defense Initiative (aka “Star Wars”), the “smart bombs” and smart missiles of the first Gulf War, and, in the twenty-first century, “netcentric warfare,” that Rumsfeldian high-tech favorite.

You know the results of this sort of magical thinking about wonder weapons (or technologies) just as well as I do. The atomic bomb led to an almost half-century-long nuclear superpower standoff/nightmare, to nuclear proliferation, and so to the possibility that someday even terrorists might possess such weapons. The electronic battlefield was incapable of staving off defeat in Vietnam. Reagan’s “impermeable” anti-missile shield in space never came even faintly close to making it into the heavens. Those “smart bombs” of the Gulf War proved remarkably dumb, while the 50 “decapitation” strikes the Bush administration launched against Saddam Hussein’s regime on the first day of the 2003 invasion of Iraq took out not a single Iraqi leader, but dozens of civilians. And the history of the netcentric military in Iraq is well known. Its “success” sent Secretary of Defense Rumsfeld into retirement and ignominy.

In the same way, robot drones as assassination weapons will prove to be just another weapons system rather than a panacea for American warriors. None of these much-advertised wonder technologies ever turns out to perform as promised, but that fact never stops them, as with drones today, from embedding themselves in our world. From the atomic bomb came a whole nuclear landscape that included the Strategic Air Command, weapons labs, production plants, missile silos, corporate interests, and an enormous world-destroying arsenal (as well as proliferating versions of the same, large and small, across the planet). Nor did the electronic battlefield go away. Quite the opposite — it came home and entered our everyday world in the form of sensors, cameras, surveillance equipment, and the like, now implanted from our borders to our cities.

Rarely do wonder weapons or wonder technologies disappoint enough to disappear.  And those latest wonders, missile- and bomb-armed drones, are now multiplying like so many electronic rabbits.  And yet there is always hope.  Back in 1997, Barbara Ehrenreich went after the human attraction to violence in her book Blood Rites: Origins and History of the Passions of War.  In it, among other brilliant insights, she traced the beginnings of our modern blood rites not to Man, the Aggressor, but to human beings, the prey (in a dangerous early world of predators).  Now, in an updated, adapted version of an afterword she did for the British edition of that book, she turns from the origins of war to its end point, suggesting in her usual provocative way that drones and other warrior robotics may, in the end, do us one strange favor: they may finally bring home to us that war is not a human possession, that it is not what we are and must be. (To catch Timothy MacBain’s latest TomCast audio interview in which Ehrenreich discusses the nature of war and how to fight against it, click here, or download it to your iPod here.) Tom

War Without Humans

For a book about the all-too-human “passions of war,” my 1997 work Blood Rites ended on a strangely inhuman note: I suggested that, whatever distinctly human qualities war calls upon — honor, courage, solidarity, cruelty, and so forth — it might be useful to stop thinking of war in exclusively human terms.  After all, certain species of ants wage war and computers can simulate “wars” that play themselves out on-screen without any human involvement.

More generally, then, we should define war as a self-replicating pattern of activity that may or may not require human participation. In the human case, we know it is capable of spreading geographically and evolving rapidly over time — qualities that, as I suggested somewhat fancifully, make war a metaphorical successor to the predatory animals that shaped humans into fighters in the first place.
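To make that notion concrete, here is a minimal sketch, not from the essay itself, of the sort of “war” a computer can play out entirely on its own. It uses Lanchester’s classic attrition equations, a standard textbook model of combat; the force sizes and firepower coefficients below are illustrative assumptions, not data.

```python
# A self-playing "war": two forces attrit each other under Lanchester's
# square law until one side is wiped out. Once the (made-up) starting
# numbers are set, no human involvement is required.

def lanchester(red, blue, red_power=0.08, blue_power=0.10, dt=0.1):
    """Step the mutual-attrition equations; return the force-size history."""
    history = [(red, blue)]
    while red > 0 and blue > 0:
        # Each side's losses are proportional to the other side's
        # remaining strength times its per-unit firepower.
        red, blue = red - blue_power * blue * dt, blue - red_power * red * dt
        history.append((max(red, 0.0), max(blue, 0.0)))
    return history

if __name__ == "__main__":
    outcome = lanchester(red=1000, blue=900)
    red_left, blue_left = outcome[-1]
    print(f"war over after {len(outcome)} steps: "
          f"red={red_left:.0f}, blue={blue_left:.0f}")
```

In this toy run the smaller blue force’s higher per-unit firepower lets it outlast the larger red force; change either coefficient and the “war” resolves itself differently. That is all the model needs to show: a self-replicating pattern of activity unfolding with no human participant anywhere in the loop.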

A decade and a half later, these musings no longer seem quite so airy and abstract. At the close of the twentieth century, the trend still seemed to be one of ever more massive human involvement in war — from armies containing tens of thousands in the sixteenth century, to hundreds of thousands in the nineteenth, and eventually millions in the twentieth-century world wars.

It was the ascending scale of war that originally called forth the existence of the nation-state as an administrative unit capable of maintaining mass armies and the infrastructure — for taxation, weapons manufacture, transport, etc. — that they require. War has been, and we still expect it to be, the most massive collective project human beings undertake. But it has been evolving quickly in a very different direction, one in which human beings have a much smaller role to play.

One factor driving this change has been the emergence of a new kind of enemy, so-called “non-state actors,” meaning popular insurgencies and loose transnational networks of fighters, none of which are likely to field large numbers of troops or maintain expensive arsenals of their own. In the face of these new enemies, typified by al-Qaeda, the mass armies of nation-states are highly ineffective, cumbersome to deploy, difficult to maneuver, and, from a domestic point of view, overly dependent on a citizenry that is both willing and able to fight, or at least to have its children fight for it.

Yet just as U.S. military cadets continue, in defiance of military reality, to sport swords on their dress uniforms, our leaders, both military and political, tend to cling to an idea of war as a vast, labor-intensive effort on the order of World War II. Only slowly, and with a reluctance bordering on the phobic, have the leaders of major states begun to grasp the fact that this approach to warfare may soon be obsolete.

Consider the most recent U.S. war with Iraq. According to then-president George W. Bush, the casus belli was the 9/11 terror attacks. The causal link between that event and our chosen enemy, Iraq, was, however, imperceptible to all but the most dedicated inside-the-Beltway intellectuals. Nineteen men had hijacked four airplanes and flown three of them into the World Trade Center and the Pentagon — 15 of the hijackers Saudi Arabians, none of them Iraqis — and we went to war against… Iraq?

Military history offers no ready precedents for such wildly misaimed retaliation. The closest analogies come from anthropology, which provides plenty of cases of small-scale societies in which the death of any member, for any reason, needs to be “avenged” by an attack on a more or less randomly chosen other tribe or hamlet.

Why Iraq? Neoconservative imperial ambitions have been invoked in explanation, as well as the American thirst for oil, or even an Oedipal contest between George W. Bush and his father. There is no doubt some truth to all of these explanations, but the targeting of Iraq also represented a desperate and irrational response to what was, for Washington, an utterly confounding military situation.

We faced a stateless enemy — geographically diffuse, lacking uniforms and flags, invulnerable to invading infantries and saturation bombing, and apparently capable of regenerating itself at minimal expense. From the perspective of Secretary of Defense Donald Rumsfeld and his White House cronies, this would not do.

Since the U.S. was accustomed to fighting other nation-states — geopolitical entities containing such identifiable targets as capital cities, airports, military bases, and munitions plants — we would have to find a nation-state to fight, or as Rumsfeld put it, a “target-rich environment.” Iraq, pumped up by alleged stockpiles of “weapons of mass destruction,” became the designated surrogate for an enemy that refused to play our game.

The effects of this atavistic war are still being tallied: in Iraq, we would have to include civilian deaths estimated at possibly hundreds of thousands, the destruction of civilian infrastructure, and devastating outbreaks of sectarian violence of a kind that, as we should have learned from the dissolution of Yugoslavia, can readily follow the death or removal of a nationalist dictator.

But the effects of war on the U.S. and its allies may end up being almost as tragic. Instead of punishing the terrorists who had attacked the U.S., the war seems to have succeeded in recruiting more such irregular fighters, young men (and sometimes women) willing to die and ready to commit further acts of terror or revenge. By insisting on fighting a more or less randomly selected nation-state, the U.S. may only have multiplied the non-state threats it faces.

Unwieldy Armies

Whatever they may think of what the U.S. and its allies did in Iraq, many national leaders are beginning to acknowledge that conventional militaries are becoming, in a strictly military sense, almost ludicrously anachronistic. Not only are they unsuited to crushing insurgencies and small bands of terrorists or irregular fighters, but mass armies are simply too cumbersome to deploy on short notice.

In military lingo, they are weighed down by their “tooth-to-tail” ratio — a measure of the number of actual fighters in comparison to the support personnel and equipment the fighters require (a force of, say, 3,000 fighters backed by 12,000 support troops has a tooth-to-tail ratio of 1:4). Both hawks and liberal interventionists may hanker to airlift tens of thousands of soldiers to distant places virtually overnight, but those soldiers will need to be preceded or accompanied by tents, canteens, trucks, medical equipment, and so forth. “Flyover” rights will have to be granted by neighboring countries; air strips and eventually bases will have to be constructed; supply lines will have to be created and defended — all of which can take months to accomplish.

The sluggishness of the mass, labor-intensive military has become a constant source of frustration to civilian leaders. Irritated by the Pentagon’s hesitation to put “boots on the ground” in Bosnia, then-U.N. Ambassador Madeleine Albright famously demanded of General Colin Powell, chairman of the Joint Chiefs of Staff, “What good is this marvelous military force if we can never use it?” In 2009, the Obama administration unthinkingly proposed a troop surge in Afghanistan to be followed, within a year and a half, by a withdrawal that would have required some of the troops to start packing up almost as soon as they arrived. It took the U.S. military a full month to organize the transport of 20,000 soldiers to Haiti in the wake of the 2010 earthquake — and they were only traveling 700 miles to engage in a humanitarian relief mission, not a war.

Another thing hobbling mass militaries is the increasing unwillingness of nations, especially the more democratic ones, to risk large numbers of casualties. It is no longer acceptable to drive men into battle at gunpoint or to demand that they fend for themselves on foreign soil. Once thousands of soldiers have been plunked down in a “theater,” they must be defended from potentially hostile locals, a project that can easily come to supersede the original mission.

We may not be able to articulate clearly what American troops were supposed to accomplish in Iraq or Afghanistan, but without question one part of their job has been “force protection.” In what could be considered the inverse of “mission creep,” instead of expanding, the mission now has a tendency to contract to the task of self-defense.

Ultimately, the mass militaries of the modern era, augmented by ever-more expensive weapons systems, place an unacceptable economic burden on the nation-states that support them — a burden that eventually may undermine the militaries themselves. Consider what has been happening to the world’s sole military superpower, the United States. The latest estimate for the cost of the wars in Iraq and Afghanistan is, at this moment, at least $3.2 trillion, while total U.S. military spending equals that of the next 15 countries combined, and adds up to approximately 47% of all global military spending.

To this must be added the cost of caring for wounded and otherwise damaged veterans, which has been mounting precipitously as medical advances allow more of the injured to survive.  The U.S. military has been sheltered from the consequences of its own profligacy by a level of bipartisan political support that has kept it almost magically immune to budget cuts, even as the national debt balloons to levels widely judged to be unsustainable.

The hard right, in particular, has campaigned relentlessly against “big government,” apparently not noticing that the military is a sizable chunk of this behemoth.  In December 2010, for example, a Republican senator from Oklahoma railed against the national debt with this statement: “We’re really at war. We’re on three fronts now: Iraq, Afghanistan, and the financial tsunami  [arising from the debt] that is facing us.” Only in recent months have some Tea Party-affiliated legislators broken with tradition by declaring their willingness to cut military spending.

How the Warfare State Became the Welfare State

If military spending remains for the most part sacrosanct even as ever more spending cuts are required to shrink “big government,” what remains is the cutting of domestic spending, especially social programs for the poor, who lack the means to finance politicians and, all too often, the incentive to vote as well. From the Reagan years on, the U.S. government has chipped away at dozens of programs that had helped sustain people who are underpaid or unemployed, including housing subsidies, state-supplied health insurance, public transportation, welfare for single parents, college tuition aid, and inner-city economic development projects.

Even the physical infrastructure — bridges, airports, roads, and tunnels — used by people of all classes has been left at dangerous levels of disrepair. Antiwar protestors wistfully point out, year after year, what the cost of our high-tech weapon systems, our global network of more than 1,000 military bases, and our various “interventions” could buy if applied to meeting domestic human needs. But to no effect.  

This ongoing sacrifice of domestic welfare for military “readiness” represents the reversal of a historic trend. Ever since the introduction of mass armies in Europe in the seventeenth century, governments have generally understood that to underpay and underfeed one’s troops — and the class of people that supplies them — is to risk having the guns pointed in the opposite direction from the one the officers recommend.

In fact, modern welfare states, inadequate as they may be, are in no small part the product of war — that is, of governments’ attempts to appease soldiers and their families. In the U.S., for example, the Civil War led to the institution of widows’ benefits, which were the predecessor of welfare in its Aid to Families with Dependent Children form. It was the bellicose German leader Otto von Bismarck who first instituted national health insurance.

World War II spawned educational benefits and income support for American veterans and led, in the United Kingdom, to a comparatively generous welfare state, including free health care for all. Notions of social justice and fairness, or at least the fear of working-class insurrections, certainly played a part in the development of twentieth-century welfare states, but there was a pragmatic military motivation as well: if young people are to grow up to be effective troops, they need to be healthy, well-nourished, and reasonably well-educated.

In the U.S., the steady withering of social programs that might nurture future troops even serves, ironically, to justify increased military spending. In the absence of a federal jobs program, Congressional representatives become fierce advocates for weapons systems that the Pentagon itself has no use for, as long as the manufacture of those weapons can provide employment for some of their constituents.

With diminishing funds for higher education, military service becomes a less dismal alternative for young working-class people than the low-paid jobs that otherwise await them. The U.S. still has a civilian welfare state consisting largely of programs for the elderly (Medicare and Social Security). For many younger Americans, however, as well as for older combat veterans, the U.S. military is the welfare state — and a source, however temporarily, of jobs, housing, health care and education.

Eventually, however, the failure to invest in America’s human resources — through spending on health, education, and so forth — undercuts the military itself. In World War I, public health experts were shocked to find that one-third of conscripts were rejected as physically unfit for service; they were too weak and flabby or too damaged by work-related accidents.

Several generations later, in 2010, the U.S. Secretary of Education reported that “75 percent of young Americans, between the ages of 17 to 24, are unable to enlist in the military today because they have failed to graduate from high school, have a criminal record, or are physically unfit.” When a nation can no longer generate enough young people who are fit for military service, that nation has two choices: it can, as a number of prominent retired generals are currently advocating, reinvest in its “human capital,” especially the health and education of the poor, or it can seriously reevaluate its approach to war.

The Fog of (Robot) War

Since the rightward, anti-“big government” tilt of American politics more or less precludes the former, the U.S. has been scrambling to develop less labor-intensive forms of waging war. In fact, this may prove to be the ultimate military utility of the wars in Iraq and Afghanistan: if they have gained the U.S. no geopolitical advantage, they have certainly served as laboratories and testing grounds for forms of future warfare that involve less human, or at least less governmental, commitment.

One step in that direction has been the large-scale use of military contract workers supplied by private companies, which can be seen as a revival of the age-old use of mercenaries.  Although most of the functions that have been outsourced to private companies — including food services, laundry, truck driving, and construction — do not involve combat, they are dangerous, and some contract workers have even been assigned to the guarding of convoys and military bases.

Contractors are still men and women, capable of bleeding and dying — and surprising numbers of them have indeed died.  In the initial six months of 2010, corporate deaths exceeded military deaths in Iraq and Afghanistan for the first time. But the Pentagon has little or no responsibility for the training, feeding, or care of private contractors.  If wounded or psychologically damaged, American contract workers must turn, like any other injured civilian employees, to the Workers’ Compensation system, hence their sense of themselves as a “disposable army.”  By 2009, the trend toward privatization had gone so far that the number of private contractors in Afghanistan exceeded the number of American troops there.

An alternative approach is to eliminate or drastically reduce the military’s dependence on human beings of any kind.  This would have been an almost unthinkable proposition a few decades ago, but technologies employed in Iraq and Afghanistan have steadily stripped away the human role in war. Drones, directed from sites up to 7,500 miles away in the western United States, are replacing manned aircraft.

Video cameras, borne by drones, substitute for human scouts or information gathered by pilots. Robots disarm roadside bombs. When American forces invaded Iraq in 2003, no robots accompanied them; by 2008, there were 12,000 participating in the war.  Only a handful of drones were used in the initial invasion; today, the U.S. military has an inventory of more than 7,000, ranging from the familiar Predator to tiny Ravens and Wasps used to transmit video images of events on the ground.  Far stranger fighting machines are in the works, like swarms of lethal “cyborg insects” that could potentially replace human infantry.

These developments are by no means limited to the U.S. The global market for military robotics and unmanned military vehicles is growing fast, and includes Israel, a major pioneer in the field, Russia, the United Kingdom, Iran, South Korea, and China. Turkey is reportedly readying a robot force for strikes against Kurdish insurgents; Israel hopes to eventually patrol the Gaza border with “see-shoot” robots that will destroy people perceived as transgressors as soon as they are detected.

It is hard to predict how far the automation of war and the substitution of autonomous robots for human fighters will go. On the one hand, humans still have the advantage of superior visual discrimination.  Despite decades of research in artificial intelligence, computers cannot make the kind of simple distinctions — as in determining whether a cow standing in front of a barn is a separate entity or a part of the barn — that humans can make in a fraction of a second.

Thus, as long as there is any premium on avoiding civilian deaths, humans have to be involved in processing the visual information that leads, for example, to the selection of targets for drone attacks. If only as the equivalent of seeing-eye dogs, humans will continue to have a role in war, at least until computer vision improves.

On the other hand, the human brain lacks the bandwidth to process all the data flowing into it, especially as new technologies multiply that data. In the clash of traditional mass armies, under a hail of arrows or artillery shells, human warriors often found themselves confused and overwhelmed, a condition attributed to “the fog of war.” Well, that fog is growing a lot thicker. U.S. military officials, for instance, put the blame on “information overload” for the killing of 23 Afghan civilians in February 2010, and the New York Times reported that:

“Across the military, the data flow has surged; since the attacks of 9/11, the amount of intelligence gathered by remotely piloted drones and other surveillance technologies has risen 1,600 percent. On the ground, troops increasingly use hand-held devices to communicate, get directions and set bombing coordinates. And the screens in jets can be so packed with data that some pilots call them ‘drool buckets’ because, they say, they can get lost staring into them.”

When the sensory data coming at a soldier is augmented by a flood of instantaneously transmitted data from distant cameras and computer search engines, there may be no choice but to replace the sloppy “wet-ware” of the human brain with a robotic system for instant response.

War Without Humans

Once set in place, the cyber-automation of war is hard to stop.  Humans will cling to their place “in the loop” as long as they can, no doubt insisting that the highest level of decision-making — whether to go to war and with whom — be reserved for human leaders. But it is precisely at the highest levels that decision-making may most need automating. A head of state faces a blizzard of factors to consider, everything from historical analogies and satellite-derived intelligence to assessments of the readiness of potential allies. Furthermore, as the enemy automates its military, or in the case of a non-state actor, simply adapts to our level of automation, the window of time for effective responses will grow steadily narrower. Why not turn to a high-speed computer? It is certainly hard to imagine a piece of intelligent hardware deciding to respond to the 9/11 attacks by invading Iraq.

So, after at least 10,000 years of intra-species fighting — of scorched earth, burned villages, razed cities, and piled up corpses, as well, of course, as all the great epics of human literature — we have to face the possibility that the institution of war might no longer need us for its perpetuation. Human desires, especially for the Earth’s diminishing supply of resources, will still instigate wars for some time to come, but neither human courage nor human bloodlust will carry the day on the battlefield.

Computers will assess threats and calibrate responses; drones will pinpoint enemies; robots might roll into the streets of hostile cities. Beyond the individual battle or smaller-scale encounter, decisions as to whether to match attack with counterattack, or one lethal technological innovation with another, may also be eventually ceded to alien minds.

This should not come as a complete surprise. Just as war has shaped human social institutions for millennia, so has it discarded them as the evolving technology of war rendered them useless. When war was fought with blades by men on horseback, it favored the rule of aristocratic warrior elites. When the mode of fighting shifted to action-at-a-distance weapons like bows and guns, the old elites had to bow to the central authority of kings, who, in turn, were undone by the democratizing forces unleashed by new mass armies.

Even patriarchy cannot depend on war for its long-term survival, since the wars in Iraq and Afghanistan have, at least within U.S. forces, established women’s worth as warriors. Over the centuries, human qualities once deemed indispensable to war fighting — muscular power, manliness, intelligence, judgment — have one by one become obsolete or been ceded to machines.

What will happen then to the “passions of war”? Except for individual acts of martyrdom, war is likely to lose its glory and luster. Military analyst P.W. Singer quotes an Air Force captain musing about whether the new technologies will “mean that brave men and women will no longer face death in combat,” only to reassure himself that “there will always be a need for intrepid souls to fling their bodies across the sky.”

Perhaps, but in a 2010 address to Air Force Academy cadets, an under secretary of defense delivered the “bad news” that most of them would not be flying airplanes, which are increasingly unmanned. War will continue to be used against insurgencies as well as to “take out” the weapons facilities, command centers, and cities of designated rogue states. It may even continue to fascinate its aficionados, in the manner of computer games. But there will be no triumphal parades for killer nano-bugs, no epics about unmanned fighter planes, no monuments to fallen bots.

And in that may lie our last hope. With the decline of mass militaries and their possible replacement by machines, we may finally see that war is not just an extension of our needs and passions, however base or noble. Nor is it likely to be even a useful test of our courage, fitness, or national unity. War has its own dynamic or — in case that sounds too anthropomorphic — its own grim algorithms to work out. As it comes to need us less, maybe we will finally see that we don’t need it either. We can leave it to the ants.

Barbara Ehrenreich is the author of a number of books, including Nickel and Dimed: On (Not) Getting By in America and Bright-Sided: How the Relentless Promotion of Positive Thinking Has Undermined America. This essay is a revised and updated version of the afterword to the British edition of Blood Rites: Origins and History of the Passions of War (Granta, 2011).

Copyright 2011 Barbara Ehrenreich

Barbara Ehrenreich (1941-2022) was the author of 17 books, including the bestsellers Nickel and Dimed and Bait and Switch. A frequent contributor to Harper's and the Nation, she had also been a columnist at the New York Times and Time magazine.