INTRODUCTION
What are killer robots? What are our major concerns? If you have seen the Netflix-produced documentary “Unknown: Killer Robots”, you might have a clue about what we will be discussing in this article. For those who have not, the documentary follows the terrifying behind-the-scenes race of military-funded scientists to build this technology as artificial intelligence infiltrates every level of the armed forces.[1] The scenario of robots taking over the world might seem far-fetched, but it remains a possibility: technology does not encompass just Siri or Alexa; every country-owned war machine also falls under its purview. This can be seen in the Ukraine and Gaza wars. The killer drones present in the air and at sea on these battlefields are worrying, but at least they are controlled by humans.
“By combining AI with advanced robotics, the US military and those of other advanced powers are already hard at work creating an array of self-guided “autonomous” weapons systems—combat drones that can employ lethal force independently of any human officers meant to command them. Called “killer robots” by critics, such devices include a variety of uncrewed or “unmanned” planes, tanks, ships, and submarines capable of autonomous operation. The US Air Force, for example, is developing its “collaborative combat aircraft”, an unmanned aerial vehicle (UAV) intended to join piloted aircraft on high-risk missions. The Army is similarly testing a variety of autonomous unmanned ground vehicles (UGVs), while the Navy is experimenting with both unmanned surface vessels (USVs) and unmanned undersea vessels (UUVs, or drone submarines). China, Russia, Australia, and Israel are also working on such weaponry for the battlefields of the future.”[2]
Killer robots are machines that do not need any human control and are perfectly capable of functioning on their own: they can select targets and attack! If you have seen the movie M3GAN, these weapons are a real-life glimpse of that possibility. This development in weaponry would fundamentally change the way wars are carried out; it is often described as the third revolution in warfare, after gunpowder and nuclear weapons. These machines are formally named Lethal Autonomous Weapon Systems, abbreviated, somewhat ironically, as LAWS. Needless to say, the term “autonomous” may mean different things in different fields of study, and in military weapon development the identification of a weapon as autonomous is not as clear as in other areas.[3] The official United States Department of Defense Policy on Autonomy in Weapon Systems defines an Autonomous Weapons System as “A weapon system that, once activated, can select and engage targets without further intervention by a human operator.”[4]
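To make that definition concrete, here is a minimal, purely illustrative sketch in Python contrasting a human-controlled engagement cycle with an autonomous one. Every object and name in it (sensor, operator, classifier, weapon and their methods) is hypothetical and does not describe any real system; it is only meant to show where the human drops out of the loop.

```python
# Illustrative sketch only: all objects and methods below are hypothetical
# and do not correspond to any actual weapon system.

def human_in_the_loop_engagement(sensor, operator, weapon):
    """Conventional remotely operated weapon: a human selects and approves targets."""
    for track in sensor.detect_objects():
        if operator.confirms_target(track):    # a person makes the lethal decision
            weapon.engage(track)

def autonomous_engagement(sensor, classifier, weapon):
    """An autonomous system in the sense of the DoD definition: once activated,
    it selects and engages targets without further human intervention."""
    for track in sensor.detect_objects():
        if classifier.is_valid_target(track):  # software makes the lethal decision
            weapon.engage(track)
```

The only line that changes is who, or what, approves the engagement; that single line is precisely where the legal and ethical debate over LAWS is located.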
Rest assured, science-fiction scenarios like The Terminator or I, Robot are not likely to become reality anytime soon, but autonomous weapons could still be deployed in the coming years, as several systems are already under development. Today’s arms still operate under human control, with a human responsible for selecting and attacking targets. Some examples of these arms are provided below:[5]
- SGR-A1: “Made by: Hanwha (South Korea). Sold to: South Korea. This stationary robot, armed with a machine gun and a grenade launcher, operates along the border between North and South Korea. It can detect human beings using infra-red sensors and pattern recognition software. The robot has both a supervised and unsupervised mode available. It can identify and track intruders, with the possibility of firing at them.”[6]
- SEA HUNTER: “Made by: the Pentagon’s DARPA (United States of America). Status: under development. This 40m long self-navigating warship is designed to hunt for enemy submarines and can operate without contact with a human operator for 2-3 months at a time. It is currently unarmed. US representatives have said the goal is to arm the Sea Hunters and to build unmanned flotillas within a few years. However, it has been said any decision to use offensive lethal force would be made by humans.”[7]
- HARPY: “Made by: Israel Aerospace Industries (Israel). Sold to: China, India, Israel, South Korea and Turkey. This 2.1m long ‘loitering’ missile is launched from a ground vehicle. It is armed with a 15 kg explosive warhead. The Harpy can loiter for up to 9 hours at a time, searching for enemy radar signals. It automatically detects, attacks and destroys enemy radar emitters by flying into the target and detonating.”[8]
- NEURON: “Made by: Dassault Aviation (France). Status: under development. This 10m long stealth unmanned combat aircraft can fly autonomously for over 3 hours and is designed for autonomous detection, localization, and reconnaissance of ground targets. The Neuron has fully automated attack capabilities, target adjustment, and communication between systems.”[9]
How close this future already is can be seen in a New York Times report from the war in Ukraine, describing a test in which a drone built by the start-up Vyriy autonomously tracked a man on a motorcycle:
“The motorcycle’s growling engine was no match for the silent drone as it stalked Mr. Babenko. “Push, push more. Pedal to the metal, man,” his colleagues called out over a walkie-talkie as the drone swooped toward him. “You’re screwed, screwed!”
If the drone had been armed with explosives, and if his colleagues hadn’t disengaged the autonomous tracking, Mr. Babenko would have been a goner.
Vyriy is just one of many Ukrainian companies working on a major leap forward in the weaponization of consumer technology, driven by the war with Russia. The pressure to outthink the enemy, along with huge flows of investment, donations and government contracts, has turned Ukraine into a Silicon Valley for autonomous drones and other weaponry.”[10]
CONCLUSION
It is necessary for any developed country to invest in defence and national security. Naturally, this has its own repercussions and involves legal, ethical, moral and security concerns. A technical system, that is, any machine in any form, should never be created in a manner that allows it to make decisions on its own, especially when those decisions concern matters of life and death. Such crucial judgments and assessments should not be left to a computer system. Legally, delegating them would attract many charges of human rights violations and is indeed contrary to the principles of the right to life and the right to live with dignity.
Let us assume we put some sort of data into a robot that makes it capable of understanding and valuing human life. Following this logic, let us also assume that such a robot has no “kill switch” and arrives at what it considers a rational decision. Whether such a decision is implicit, explicit or simply what the robot was “programmed” to do will vary from situation to situation. However, this logic also admits the possibility that, like a rottweiler, a robot that senses danger might make a “tough call” and take a life-threatening decision. Whom, then, do we prosecute: the robot or the creator of the robot? And why the creator, when it is the robot that makes the decision and would presumably know the consequences of that decision? Can we really punish a robot for the loss of a human life? Or a million human lives? Either way, this diminishes and desensitizes the value we place on human life.
[1] Unknown: Killer Robots (2023) - IMDb.
[2] Michael T. Klare, “The Killer Robots Are Here. It’s Time to Be Worried”, The Nation, Feb 23, 2024. Retrieved from: The Killer Robots Are Here. It’s Time to Be Worried. | The Nation.
[3] Crootof, Rebecca (2015). “The Killer Robots Are Here: Legal and Policy Implications”. Cardozo L. Rev. 36: 1837, via heinonline.org.
[4] Allen, Gregory (6 June 2022). “DOD Is Updating Its Decade-Old Autonomous Weapons Policy, but Confusion Remains Widespread”. Center for Strategic and International Studies.
[5] The examples of weapons systems cited in this article have been taken from a PAX publication titled “Killer Robots: What Are They and What Are the Concerns?”. For more information, retrieve from: pax-booklet-killer-robots-what-are-they-and-what-are-the-concerns.pdf (paxforpeace.nl).
[6] Ibid.
[7] Ibid.
[8] Ibid.
[9] Ibid.
[10] Paul Mozur and Adam Satariano, “A.I. Begins Ushering In an Age of Killer Robots”, The New York Times, July 2, 2024. Retrieved from: In Ukraine War, A.I. Begins Ushering In an Age of Killer Robots - The New York Times (nytimes.com).