Killer robots sound like something out of a scary futuristic science fiction film. But, it seems, the future is here.
The use of a police robot loaded with explosives to kill Micah Xavier Johnson, who shot and killed five police officers during a protest march in Dallas last week, has brought an international ethics debate over the use of robotic weaponry to the realm of domestic law enforcement.
The discussion has been going on for decades, since the first drones – Unmanned Aerial Vehicles – took to the sky. In recent years, the debate has expanded to the use of robots on the ground, or Unmanned Ground Vehicles.
A group called “The Campaign Against Killer Robots” has galvanized opposition to such weapons and has been pushing for United Nations restrictions on them, warning that we are moving rapidly toward a situation in which they will be “fully autonomous” – making life-and-death decisions with no humans at the controls.
Israel, at the cutting edge of UAV and UGV development, has been featured in much of the criticism. The field is a perfect fit for a country whose high-tech start-up entrepreneurs are largely equipped with the necessary engineering experience and military knowledge.
For at least a decade there have been UGVs patrolling the southern and northern borders of Israel with sensors and communications, sending alerts and real-time pictures back to commanders, reducing the risk to troops who would otherwise be patrolling on foot.
One Israeli company, Roboteam, whose ground robots were used in the 2014 Gaza war to infiltrate the infamous tunnels and detect bombs and booby traps, manufactures its devices in the United States and sells them to the Pentagon and U.S. law enforcement.
Because of challenges like visual obstacles and rough terrain, the use of UGVs as weaponry is moving forward at a slower pace than aerial drones. But progress is still being made, with companies like Roboteam developing small robots that can climb stairs and navigate rugged outdoor terrain.
Companies developing robots for weaponized use in combat provide them to the Israel Defense Forces for experimental use, says Barbara Opall-Rome, a security analyst and Israel bureau chief for Defense News, who has been covering their development for years.
One small Israeli company, General Robotics, has developed a small UGV that incorporates a gun in its design, which she said was essentially “the first inherently armed tactical combat robot.”
Called the Dogo, after an Argentinian dog that fiercely protects its master, the device contains a standard Glock 26 9mm pistol inside its apparatus.
The Dallas sniper attack on Friday claimed the largest number of U.S. law enforcement lives in a single event since 9/11. The shooter later became the first American to be killed, not by the human hand of a police officer pulling a trigger, but by a robot. Loaded with C4 explosive, the robot was sent by police into the parking lot where Johnson was holed up in an hours-long standoff.
The robot they used was designed to destroy and neutralize suspicious objects and packages, in this case repurposed as a deadly weapon, put into use after negotiations with Johnson had failed and he repeatedly threatened to kill as many police as possible.
Instead of risking the lives of officers to neutralize him SWAT-team style, law enforcement authorities chose to send in the robot and detonate its explosives remotely, killing Johnson. Dallas police chief David Brown told a press conference later that the step was taken after a conclusion was reached that "other options would have exposed our officers to grave danger."
Opall-Rome said that if a robot like the Israeli Dogo, which does not necessarily deliver lethal force, had been operational in a situation like Dallas, it might not only have saved police lives but potentially Johnson’s as well.
“This would have been ideal for the Dallas case. It is fully commanded in the rear, and with the Glock pistol you could have taken out the shooter without necessarily killing him. It is an excellent new capability that gives more options to the commanders of the scene,” she said.
The Dogo can also be equipped with pepper spray, and can be deployed as a remote device for hostage negotiations, relaying voice commands in both directions.
Opall-Rome says that, contrary to the warnings of activists abroad, Israeli developers of such weaponry believe it can potentially make warfare less – not more – brutal.
In Israel, “I am not aware of any ethical debate on the use of these lifesaving tools for these commanders and fighters at the scene,” she said.
On the contrary, she says, “When the lives of their soldiers are not directly threatened, they can make more cool-headed decisions” that are more measured and informed thanks to tools like the multiple cameras incorporated in the robots.
Such an argument isn’t likely to quell fears of robot armies or police forces equipped with weaponry capable of pulling a trigger or detonating a bomb without stopping to consider the consequences and cost.
There is also the more immediate worry that, as with drones, when political and military leaders are not putting their own forces in harm's way, knowing there won't be body bags returning home from conflicts, they will be more likely to take risks that can cost lives on the other side.
The political consequences are enormous. From Gaza to Pakistan, the use of pilotless drones to attack terrorist targets located near civilian populations has fueled international anger and resentment toward both the Israeli and American militaries.
The controversy over their use has led, in Israel’s case, to deliberate obfuscation as to whether aerial attacks are carried out by warplanes or drones. One can only imagine what the reaction might be to unmanned robot vehicles entering cities and villages.
Finally, as in any discussion on weaponry – from guns to nuclear bombs – there is the issue of what happens when criminals, terrorists, or regimes with clearly malevolent intent get their hands on them – as they inevitably do.
And that isn’t the only risk. Last summer, more than 1,000 experts on robotics and artificial intelligence, including Stephen Hawking, Elon Musk and Steve Wozniak, signed an open letter warning that “autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”
The letter cautioned that autonomous offensive weapons pose a threat to human life, and warned against the unbridled development of weapons that can select targets and attack without human control.
“The key question for humanity today is whether to start a global AI [artificial intelligence] arms race or to prevent it from starting,” the worried experts wrote.