The World Should Outlaw Lethal Autonomous Weapons (LAWs)

This technology is a major threat.

Two recent developments made me think of Lethal Autonomous Weapons, ironically abbreviated to LAWs. One was a report that China has deployed machine-gun-wielding autonomous robots near the India border (https://bit.ly/36a4XCc), since the Himalayan heights are too cold for human soldiers to patrol 24×7. The other was a fascinating series of BBC lectures, the Reith Lectures (https://bbc.in/3oInXOH), delivered by AI doyen Stuart Russell, in which he describes in urgent anguish the threat posed by LAWs, or AI-powered weapons.

Think of autonomous weapons and most people picture the giant war robots of Terminator fame. While we are not there yet, an arms race is on among the world powers to make something similar and just as sinister. In 2017, Russell and the Future of Life Institute created a Black Mirror-style fictional video called Slaughterbots, which went viral. It depicted small AI-powered quadcopters with explosive warheads that could attack cities and people in swarms, becoming fearsome, intelligent weapons of mass destruction. They could be algorithmically directed to choose a certain race, or gender, or even a particular face as their target profile, and selectively destroy those victims by firing at them or through kamikaze attacks. Professor Russell worryingly pointed out that all the technologies shown already existed but had not yet been put together. The Institute recently released a sequel to the video with even more dangerous ‘use cases’: LAWs in cars shooting voters at a polling booth, a bank robbery by dog-like robots carrying assault weapons, a nightclub massacre by autonomous quadcopters with bombs. These scenarios have become horrifyingly real: an autonomous armed drone attack by Houthi rebels on a Saudi Arabian airport injured eight people. Autonomous armed quadcopters have started appearing in arms fairs and exhibitions.

I have often written about Artificial Intelligence as perhaps the most powerful technology created by man. Like any other technology, it has great benefits, but also a dark side, and LAWs inhabit its darkest dungeons. One of the reasons Russell brought out the video and gave the Reith Lectures was to persuade world powers to formulate an international, legally binding prohibition on autonomous weapons which use artificial intelligence to identify, select, and kill people without human intervention. The Lectures were timed just before the Sixth Review Conference of the U.N.’s awkwardly named Convention on Certain Conventional Weapons (CCW), which took place in late 2021, with 125 member countries meeting to discuss the ban. There are clear precedents for this: the potentially catastrophic nuclear threat has so far been averted by global regulation. The CCW itself has succeeded in curtailing dangerous incendiary explosives and blinding lasers; bio and chemical weapons are regulated too.

Killer robots, on the other hand, do not seem to sway the Great Powers. The Sixth Review Conference ended with an ineffectual and non-binding code of conduct. The US, UK, Russia, and China all argued that it was ‘too early’ for a ban, something that Professor Russell and other experts vehemently disagree with and have demonstrated to be false. These countries have put forward the ludicrous argument that LAWs could actually lead to more ‘humane’ wars, with robots fighting robots and no humans involved, or that the killer bots will precisely target and kill only soldiers, leaving civilians unharmed. Secretly, they are scared that if they themselves stopped working on LAWs, other countries would take a lead over them and gain unassailable military power.

The above arguments do not hold water. First, the same could have been said of bio and chemical weapons, but the world managed to control them; though, admittedly, it was a less fractious world then, with more statesmen among politicians. Secondly, the technology behind these weapons is nowhere near as precise as the militaries would like us to believe – note the civilian casualties that accompany every ‘targeted’ drone or missile strike in Afghanistan or Iraq. Thirdly, and most ominously, the moment weapons like these are produced by the Raytheons and Lockheed Martins of the world, non-state actors and terrorist groups will get hold of them. Swarming, autonomous killing machines will make suicide-vest attacks seem medieval by comparison. Max Tegmark of the Future of Life Institute puts it succinctly: “It’s not in the national security interest of the U.S., the U.K. or China for W.M.D.’s to be so cheap that all our adversaries can afford them.”

The great inventor Thomas Edison was prescient when he said a century ago: “There will one day spring from the brain of science a machine or force so fearful in its potentialities, so absolutely terrifying, that even man . . . will be appalled, and so abandon war forever.” Even nuclear weapons, with the power to destroy our planet multiple times over, have not met Edison’s prophecy. Will it then be the ‘killer robots’ powered by AI – perhaps Man’s Final Invention?

FAQ

Artificial Intelligence is a leading technology that promises to change the face of warfare. AI can enable autonomous systems to conduct missions, achieve sensor fusion, automate tasks, and make better, quicker decisions than humans. For now, however, militaries use AI mostly for mundane, dull, and monotonous tasks in uncontested environments.

With AI being omnipresent, the dark side of artificial intelligence is no longer hidden from anyone. AI technologies can create opportunities for cyber threats, and raise concerns about data privacy, ethics and morality, non-transparency, and the “black-box problem”. AI technology can be manipulative, and it is up to market participants, lawyers and policymakers to work together on an effective regulatory framework to guide AI-related processes. Since these technologies directly affect us, we should also be part of AI decision-making.
