Seventy years ago, in his short story “Runaround,” science fiction writer Isaac Asimov introduced the “Three Laws of Robotics”. They are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
In an op-ed today in the Wall Street Journal, Robert H. Latiff and Patrick J. McCloskey lay out the case for restraint in developing autonomous robots with the ability to kill:
> These machines will bring many benefits, greatly increasing battle reach and efficiency while eliminating the risk to human soldiers. If a drone gets shot down, there’s no grieving family to console back home. Politicians will appreciate the waning of antiwar protests, too.
>
> The problem is that robotic weapons eventually will make kill decisions on the battlefield with no more than a veneer of human control. Full lethal autonomy is no mere next step in military strategy: It will be the crossing of a moral Rubicon. Ceding godlike powers to robots reduces human beings to things with no more intrinsic value than any object.
>
> When robots rule warfare, utterly without empathy or compassion, humans retain less intrinsic worth than a toaster—which at least can be used for spare parts. In civilized societies, even our enemies possess inherent worth and are considered persons, a recognition that forms the basis of the Geneva Conventions and rules of military engagement.
I would go further than the authors of the op-ed do. I would contend that developing robotic weapons with “full lethal autonomy” is inherently a war crime. Gen. Latiff and Mr. McCloskey do not mention the “Three Laws” in their op-ed, but I think it’s time to bring them back into the discussion.
War is death and destruction. The trend of the last forty years in the United States has been to lower the transaction costs of war through air warfare, guided missiles, the volunteer army, and, now, unmanned drones. There should always be a presumption against war, but as the transaction costs of war grow lower, the internalized prohibitions in law, politics, and social conscience must become greater.
Human beings, unfortunately, do not come with the “Three Laws” built into them. We should strive to make our creations better than ourselves in that respect.