Human Rights Watch Wants to Stop Killer Robots
If it’s not already too late...
Photo by Flickr user Global Panorama.
It’s probably already too late to stop the inevitable domination of the robot race—because, let’s be real, our iPhones are about one iOS update away from becoming self-aware and subjugating all of humankind—but that won’t stop Human Rights Watch from trying. The human rights organization published a report called Mind the Gap on Thursday imploring the United Nations to adopt laws that would prohibit the development of “autonomous killer robots.” “Autonomous” is the key word here, because, as we all well know, killer robots already exist, and they can fly, too—they’re called drones.
Human Rights Watch, along with its co-author, Harvard Law School’s International Human Rights Clinic, recommends that the U.N. “prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument” and “adopt national laws and policies that prohibit the development, production, and use of fully autonomous weapons.”
The report concludes:
The weapons themselves could not be held accountable for their conduct because they could not act with criminal intent, would fall outside the jurisdiction of international tribunals, and could not be punished. Criminal liability would likely apply only in situations where humans specifically intended to use the robots to violate the law. In the United States at least, civil liability would be virtually impossible due to the immunity granted by law to the military and its contractors and the evidentiary obstacles to products liability suits.
NO KIDDING. Apparently, as the laws currently stand, it would be extremely difficult to hold any human who builds an autonomous killer robot accountable for the machine’s misdeeds. Human Rights Watch released the report just in time for the U.N. to convene and discuss amendments to its Convention on Certain Conventional Weapons, which bans or limits the use of weapons considered especially dangerous.