Asimov’s Laws of Robotics Do Not Apply to Terminator or Military Drones


(Newswire.net — November 14, 2014)  — “Fire and forget” has taken on a new meaning as autonomous drones, the latest technological marvel, become capable of executing long-range missions and returning home on their own. Some argue that they may violate human rights, as they sometimes mistake civilians for military targets.

Is a drone really a robot? According to the UN, and anyone with common sense, a machine that can operate without human assistance is a robot, and Isaac Asimov’s Three Laws of Robotics clearly state:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Too bad these laws are not mandatory for every machine capable of making decisions autonomously.

Since there are no real regulations governing the conduct of drones, UN member states warned on Thursday at the annual Convention on Certain Conventional Weapons (CCW) meeting in Geneva that autonomous weapons systems are increasing the number of violations of international humanitarian law.

Many civilians have already fallen victim to drone strikes, but who is to blame if a drone miscalculates and drops a tactical weapon on a civilian target?

In December 2013, for example, a US strike in Yemen killed 15 people on their way to a wedding. Pakistan, meanwhile, is estimated to have suffered more than 3,200 drone-strike fatalities since 2004, 175 of them children.

Military officials behind the drones do not feel responsible for an error in the algorithm. Nonetheless, UN delegates will share responsibility when they vote by consensus on Friday afternoon on whether to continue multilateral talks on “lethal autonomous weapons systems” next year.

The UN conference in Geneva was the second such international assembly this year to discuss the rise of autonomous killing machines.

“There is a sense of urgency about how we deal with killer robots. Technology is racing ahead,” Human Rights Watch said at Thursday’s meeting.

Another organization, the Campaign to Stop Killer Robots, called on leaders in a letter to “preemptively ban weapons that would select and attack targets without further human intervention.”

“We have many concerns with these fully autonomous weapons, but perhaps our most significant concern is with the notion of permitting a machine to take a human life on the battlefield or in law enforcement and other situations,” the letter said.

Meanwhile, Lockheed Martin is developing a so-called Long Range Anti-Ship Missile, capable of maneuvering on its own to avoid radar while out of radio contact with human controllers, a weapon “designed to destroy its target with a minimal amount of human control,” as a New York Times article noted.

According to the Bureau of Investigative Journalism, 746 people were killed in CIA drone strikes between January 2006 and October 2009. Of that number, nearly 20 percent of the fatalities were civilians, and 94 of the victims are said to have been children.