By: Charlotte Sparling
November 25, 2024
What used to be a myth is now a reality. Lethal autonomous weapons (LAWs), at their core, are weapons that act and make lethal decisions without human intervention, effectively thinking on their own. They are similar to drones but lack the crucial element of human oversight. Yet even the international community has no clear definition of what they are, which compounds the challenge of addressing them adequately.
The development of these weapons poses several major concerns that must be addressed.
First, there are serious questions about whether LAWs violate international law. UN Secretary-General António Guterres argues these weapons are morally troubling and violate International Humanitarian Law (IHL). Two core elements of IHL are distinction and accountability. Because LAWs act without human oversight, there is no human verification of targets. Facial recognition is one method of identifying targets, but it opens the door to ethnic cleansing. Furthermore, like any other technology, LAWs will inevitably malfunction, raising the question of who is accountable. If a robot kills the wrong target, is the machine to blame?
Second, the question of regulation is both uncertain and pressing. There is no clear regulation of LAWs, yet their potential for sweeping impact makes regulation imperative. To prevent crises, standards must be established. The international community has recognized the importance of discussing this issue, but what concrete steps to take remains unclear.
Despite this pessimistic outlook, regulation is possible. Land mines, biological weapons, and nuclear weapons have all been regulated in some manner. There is no reason the same could not be done for LAWs, but any such regulation must be preceded by the establishment of a solid definition.
The truth is that LAWs are here. We must acknowledge this reality and act accordingly. In the Russia-Ukraine war, there has been talk of Russia deploying weapons with some autonomous capabilities. Turkey, Israel, Russia, and South Korea have reportedly deployed similar weapons. Both the US and China are investing heavily in this technology. No one wants to be the one without the newest and shiniest toys.
Part of what makes LAWs attractive is their life-preserving potential. By removing the need to send troops, who are often clouded by emotion, into risky situations, LAWs can be more accurate, mitigate the loss of life, and reduce damage to the surrounding area. It is important to recognize these benefits, as the march of innovation cannot be halted.
Ukraine, recognizing these benefits, has invested heavily in autonomous drones and similar technologies. In a war unlikely to end in a decisive Ukrainian victory, any method of minimizing the sacrifice of Ukrainian lives is imperative. While Ukraine's weapons still require human intervention, they are only one step removed from LAWs. How the war develops with these weapons will likely serve as a turning point for the future of autonomous warfare.
On the global diplomatic scale, opinions are mixed. The UN and many countries support a full ban on LAWs or regulation at a minimum. The US supports regulation over a complete ban. China and Russia, meanwhile, have yet to clearly indicate their positions.
Instead of an outright ban, the solution is to establish regulations. By embracing the reality that LAWs will likely play a crucial role in future warfare, guidelines can help shape the role this technology plays while simultaneously reaping its benefits. Ignoring these weapons could lead to their misuse and make humanitarian violations far more likely. The first step in regulation is to create a singular definition of what LAWs are. It is through these efforts that the international community can adequately address ethical concerns.