Elon Musk Leads 115 AI Founders & Experts in Calling for a Complete Ban on “Killer Robots”

Rafia Shaikh
Automated weapons can be "hacked to behave in undesirable ways," the group warns.

Elon Musk and 115 leading artificial intelligence and robotics experts from 26 countries have joined together to call on the United Nations to ban the development and use of autonomous weapons. These weapons, also called "killer robots," include drones, tanks, autonomous machine guns, and other forms of AI-controlled weaponry. Mustafa Suleyman, the co-founder of Google's DeepMind, has also signed the letter.

"Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend," the letter says. "These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways."

The UN recently voted to begin formal discussions on such weapons and was scheduled to meet on August 21. That meeting, however, has now been postponed to November.


"Greatest threat we face as a civilization" - Elon Musk

While Musk is famous for warning about the potential of AI to go wrong, he has now been joined by leaders of the AI community. The group has warned that this arms race could usher the world into the "third revolution in warfare," after gunpowder and nuclear arms. Musk has previously called AI "the greatest threat we face as a civilization," greater even than nuclear weapons.

The signatories believe that while AI is still in its infancy in many industries, automated weaponry is almost here. “Unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability," said Ryan Gariepy, the founder of Clearpath Robotics.

Considering how unpredictable AI can be, they fear these weapons could operate on their own or be controlled by a select few. An earlier report suggested that if nothing is done to prevent such an arms race, manufacturers and programmers would, under current laws, escape liability for deaths caused by these killer robots.

"We do not have long to act. Once this Pandora’s box is opened, it will be hard to close."

The experts have previously raised concerns over how much easier autonomous weapons are to build than nuclear weapons. That means these products will be cheaper to make and easier to buy, allowing anyone from a terrorist to a despot to acquire them in bulk.

The open letter was released today at the opening of the International Joint Conference on Artificial Intelligence (IJCAI) in Melbourne, after the UN had to delay today's meeting. While Musk's statements on AI have often been dismissed as overblown, he has been joined by the likes of Stephen Hawking, Google's Suleyman, Apple co-founder Steve Wozniak, and now experts in the fields of AI and robotics in warning the UN of the real dangers of automated weapons.

As Gariepy said, AI may still be in the realm of science fiction in other domains, but autonomous weapons are nearly here. With cyber-weaponry and cybersecurity another growing concern, terrorists and adversaries could very well hack these weapons to "behave in undesirable ways."
