Tech leaders warn against 'Pandora's box' of robotic weapons

People look at a mock killer robot in central London.

Australia is not among the countries that have already stated their position on a ban.

According to the UN's website, a group focused on these types of weapons was set to meet on Monday, but the session has been cancelled and rescheduled for November. In an open letter to the UN, technology leaders urge: "We therefore implore the High Contracting Parties to find a way to protect us all from these dangers".

Autonomous weapons already being worked on around the world include a fixed-place sentry gun being developed by the South Korean government; an unmanned combat aerial vehicle being developed in the United Kingdom by BAE Systems; robotic tanks under development in Russia and the US; and an autonomous warship launched by the US in 2016. The letter's authors added: "We do not have long to act".

"Once this Pandora's box is opened, it will be hard to close", the letter warns. Signed by several of the world's top AI minds, it was spearheaded by Toby Walsh, a professor of AI at the University of New South Wales, who told Xinhua that he was concerned by what he sees as an "arms race" occurring around the world.

Those in favour of killer robots argue that the current laws of war may be sufficient to address any problems that emerge if such weapons are ever deployed, and that a moratorium, rather than an outright ban, should be called if this proves not to be the case.

Experts are calling for what they describe as "morally wrong" technology to be added to the list of weapons banned under the UN Convention on Certain Conventional Weapons (CCW).

Autonomous weaponry is for the most part still at the prototype stage, although the technology is rapidly improving.

These future weapons, dubbed "killer robots", are machines programmed to attack people or other targets while operating autonomously on the battlefield. One of the biggest worries shared by these technology leaders is that a rogue state or tyrannical regime could use such weapons to suppress its populace, and Walsh outlined the path he believes would likely be taken in such an eventuality.

"We should not lose sight of the fact that, unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability", he said.

Speaking to the ABC on Monday, AI expert Professor Toby Walsh said that it was crucial that the United Nations demonstrated greater urgency in preventing the proliferation of lethal autonomous weapons.

AI researcher Yoshua Bengio explained why he signed, saying: "the use of AI in autonomous weapons hurts my sense of ethics". Though the letter itself is more circumspect, an accompanying press release says the group wants "a ban on their use internationally".
