by Heather Wilson
On July 20, 1969, a Purdue graduate from Wapakoneta, Ohio, stepped onto the surface of the moon. I watched it on a black-and-white Zenith television sitting on the floor in the den of our New England farmhouse with my two brothers. That den was not a big room, and the television was wedged between the fireplace and the family bookshelf. In the next room, under windows that looked out on a hedge of lilacs, was a stereo—a solid wooden cabinet the size of a dining-room sideboard. On its turntable we set a fragile arm with its embedded needle into grooves on black vinyl disks and listened to Broadway show tunes. On the wall in the kitchen in the next room was an avocado green telephone with a rotary dial and an extra long cord so that my mother could talk on the phone while doing dishes.
In the cellar, my stepfather had a dark room. There we turned the plastic cartridges of light-sensitive film from our Kodak Instamatic cameras into black-and-white images, moving the paper carefully with tongs from one acrid pan of chemicals to another in the dim red light.
It was 1969. And Neil Armstrong took one small step for man; one giant leap for mankind. Just half a lifetime ago. And today, nearly everyone reading this has a computer in their pocket that is a television, and a bookshelf, and a music player, and a telephone, and a camera, and that has about a million times more computing power than all the computing power NASA had available to put a man on the moon.
Computers have changed nearly every aspect of our lives, and we should be asking ourselves how they have changed our behaviors and our understanding as well. Our kitchens, our professions, our cars, our health care, our entertainment, the way we communicate with people we love, the way we get our news or decide where to go out to dinner: all of it is different because of the power of computing.
What has mankind done with computing power?
Much of the answer is good almost beyond imagining. We have pulled billions of people from subsistence to surplus through the economic growth enabled by technology. We have advanced health care, and energy use, and education, and food production, and navigation.
At the same time, we are facing the disadvantages the computer revolution has brought with it: intrusions on our privacy and a nearly addictive power over our brains.
In terms of computer-driven change, the military is little different from the rest of modern culture. If anything, the thinking we do about how computers have changed our practices and behaviors becomes even more pressing when we consider questions of war and defense. How do we, and how should we, think about computing and the use of force?
For most purposes, the consideration of just war is divided into two parts: the legitimacy of the resort to force (jus ad bellum), and the rules governing the conduct of hostilities (jus in bello). The growth of the power of computing challenges us in both of these areas in different ways.
With few exceptions, human societies have abhorred war even when they have found it to be sometimes justified. Modern international society accepts the use of force by some agents in world politics in particular circumstances. And a key moment in the development of this just-war analysis comes in the thirteenth century when St. Thomas Aquinas places the legitimate power to go to war (jus ad bellum) in the hands of a sovereign authority, writing, “In order that a war may be just three things are necessary. In the first place, the authority of the prince, by whose order the war is undertaken.” His second and third requirements for a just war, like those of his predecessor St. Augustine, were a just cause and right intent.
The medieval thinkers of Thomas’s time were living in a period of tremendous social and political change. By the middle of the thirteenth century the decentralized system of feudalism was losing ground to a new world of consolidated city-states and growing kingdoms.
A few centuries later, when Hugo Grotius was writing about international law in the early seventeenth century, sovereigns had gradually asserted a monopoly on the use of violence both within their borders and against other sovereigns. By the close of the nineteenth century, sovereignty and the exclusive right to wage war were characteristics of a State so strongly established that to suggest otherwise would have seemed preposterous.
Over the same stretch of time, an international consensus gradually developed that the methods and means of warfare (jus in bello) are not unlimited. In the late nineteenth and early twentieth centuries these customary principles began to be codified in conventions and treaties between states intended to mitigate the horrors of war.