Notes on Emerging Technologies in Lethal Autonomous Weapons Systems
Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems
Introduction
- Technology significantly influences international relations, impacting wartime strategies and peacetime competition.
- Emerging technologies present challenges to peace, stability, and security, raising ethical, legal, political, and humanitarian questions.
- Rapid technological advancements necessitate "technology neutral" regulations.
- The focus should be on preserving meaningful human control in autonomous weapons to prevent the dehumanization of warfare.
Background
- The UN's intergovernmental debate on emerging technologies began in the Human Rights Council and was referred to the Convention on Certain Conventional Weapons (CCW).
- Professor Christof Heyns' report highlighted concerns about lethal autonomous robotics (LARs) that can select and engage targets without human intervention.
- His report raised the following concerns:
- Protection of life during war and peace.
- Compliance with international humanitarian law and human rights law.
- Legal accountability.
- Robots wielding power of life and death.
- He recommended national moratoria on LARs and a high-level panel to articulate international policy.
- The Secretary-General suggested addressing autonomous systems in an existing forum, such as the CCW.
- In 2013, the CCW decided to address the issue as lethal autonomous weapon systems (LAWS).
- Emerging technologies' implications have led to discussions in other forums, particularly in human rights.
- Examples include:
- General Comment No. 36 (2018) of the Human Rights Committee on Article 6 (right to life) of the International Covenant on Civil and Political Rights.
- Report of the Special Rapporteur on the rights of persons with disabilities (2021).
- UNESCO Recommendation on the Ethics of Artificial Intelligence (2021).
- These frameworks address crosscutting issues impacting the CCW, emphasizing that the CCW should consider developments in other forums to remain relevant.
Holistic Understanding
- A holistic, multidimensional understanding of autonomy in weapon systems is needed to grasp its effects on human agency.
- This understanding impacts ethics, international law, international humanitarian law, human rights law, and international security.
- The central question is not whether we "can" remove the user from applying force, but what the consequences of doing so would be.
- It remains questionable whether autonomous weapons can comply with international humanitarian law and human rights law due to wartime uncertainties.
- IHL implicitly requires meaningful human control, especially with regard to distinction, proportionality, precautions in attack, and military necessity.
- Similar requirements exist in international human rights law.
- Ethical considerations should guide the GGE's work on retaining human agency in decisions to use force, especially on matters of life and death.
- A Joint Special Session of the GGE LAWS with the Special Rapporteur on Extrajudicial, Summary and Arbitrary Executions was suggested.
- International security concerns related to autonomy in weapon systems have been on the periphery of CCW debates.
- These concerns include:
- Asymmetric warfare.
- Force multiplication.
- Lowering the threshold for nations to start wars.
- Potential of conflict escalation.
- Entanglement with other weapon capabilities.
- Ethical concerns must be considered beyond legal analysis, recognizing that legality and ethics do not fully coincide: not everything lawful is ethical, nor is everything unethical unlawful.
- The use of force, mediated through technology, must consider ethical and societal implications.
International Regulation in the Framework of the CCW
- The challenges posed by autonomy in weapon systems necessitate a legally binding instrument.
- The reasons for this necessity are:
- Clarifying, strengthening, and advancing IHL regarding autonomous weapons.
- Autonomous functionalities call for an approach broader than the traditional scope of IHL, focusing not only on use but also on the weapon's entire lifecycle.
- Avoiding a fragmented approach through purely national measures, which would risk divergent and inconsistent standards.
- The CCW serves as a normative framework for the codification and progressive development of international law applicable in armed conflict.
- The CCW addresses weapon systems with autonomous functionalities that may be excessively injurious or have indiscriminate effects.
Draft Legally Binding Instrument on Prohibitions and Regulations
- There is a clear need for specific rules to regulate weapons with autonomous functionalities at an international level.
- This need is derived from ethical, legal, societal, and international security concerns.
- The following draft of a legally binding instrument can be considered:
- Due to the challenges of autonomy in weapon systems, states shall:
- Prohibit the development and use of weapons with autonomous functionalities that cannot be controlled by humans.
- Prohibit the development and use of weapons which incorporate autonomous functionalities that cannot be used in compliance with IHL, including weapons that:
- Cannot be directed at a specific military objective.
- Cause superfluous injury or unnecessary suffering.
- Have effects that cannot be limited as required by IHL.
- Prohibit the development and use of weapons which incorporate autonomous functionalities whose effects cannot be sufficiently understood, predicted, and explained.
- Positive obligations, in the form of regulations, should be developed to ensure that humans exercise control. Notably:
- A human operator shall:
- Be certain that there are adequate environmental limits in place, including spatial and temporal limits.
- Be fully aware of, and approve, any decision determining the operational context, based on a sufficient level of situational awareness.
- Be certain of the reliability and predictability of the identification, selection, and engagement of targets.
- Take the necessary precautions during the conduct of operations to ensure that a weapons system is not able to change mission parameters without human validation.
- Allow for constant human supervision and intervention where necessary, so as to be able to:
- Interrupt and deactivate the weapon during its operation phase.
- Verify that auto-deactivation features operate as intended, in particular when required by the legal assessment of the user.
- States should ensure the effective investigation, prosecution, and punishment of violations committed during the use of autonomous weapons.
- Commanders and operators are responsible for complying with legal obligations.
Additional Recommendations
- Taking into account technological advancements, States may need to identify additional recommendations, guided by the principles of humanity and the dictates of public conscience.
- Such recommendations may include prohibitions, regulations, voluntary measures, and exchange of best practices.
- Any further recommendations shall preserve human control and avoid accountability gaps.
Legal Weapons Review
- Legal weapons reviews must assess the attributes and effects of autonomous weapons, as well as their conformity with international law, including international humanitarian law.
- This assessment should:
- Evaluate technical performance, including reliability and predictability, and whether effects can be limited to military objectives.
- Confirm intended or expected use.
- Confirm the placement of adequate limits on tasks and types of targets.
- Legal reviews of autonomous weapons should adopt a precautionary approach and deny authorization whenever there is less than full certainty about all of the weapon's characteristics and effects.
Conclusions
- The reflections above derive from the substantive discussions within the GGE LAWS over the past years.
- They provide the basis for a framework that, while ensuring the full applicability of international law, including IHL, highlights the need to develop additional legally binding norms, grounded in ethical standards, to give an adequate normative response to the challenges posed by autonomy in weapon systems.
- Some risks posed by autonomous functionalities in weapon systems, including risks to arms control and to confidence-building among States, may be inherent to the technology itself; further consideration is therefore needed of the viability of mitigation measures, particularly for categorizations that are context-dependent, as prescribed by IHL.
- Taking into account the irreversibility and magnitude of the risks at stake (particularly with regard to decisions on life and death), the most effective way to address them is through prohibitions, as risk-avoidance measures, and regulations, as risk-prevention and mitigation measures.
- Prohibitions and regulations, once established, should be operationalized through national implementation measures.
- Innovation and regulation need not be at odds: what matters is not innovation in and of itself, but how and why it is used.
- The main concern is how we embed our fundamental values in each and every step of the development and deployment of the systems.