“Regulating Lethal Autonomous
Weapon Systems (LAWS)”
INTERNATIONAL HUMANITARIAN
LAW IN THE AGE OF AI
PRESENTATION
by
Maj Gen Nilendra Kumar
Executive President,
Indian Society of
International Law
Lethal autonomous weapon systems have come up
as a new advancement in military technology.
WHAT ARE
AUTONOMOUS WEAPONS?
Weapons able to select and engage targets
without meaningful human control, i.e. without
human intervention.
AUTONOMOUS WEAPONS
Have autonomy to execute their functions
in the absence of any direction or input
from a human actor.
WHAT IS
ARTIFICIAL INTELLIGENCE?
It is the intelligence of machines or software, as
opposed to the intelligence of humans or animals.
This capacity can be put to military use too.
AUTONOMY IS TOTAL
except that
1. A human gives the final command to engage.
2. Some systems are purely defensive.
AUTONOMOUS WEAPON
SYSTEMS
(AWS)
provide
operational flexibility
in
the air, underwater, on land or in outer space
LAWS
(Lethal Autonomous Weapon Systems)
are pre-programmed to kill a specific target
profile. Once deployed, the system searches for
that profile using sensor data, such as facial
recognition, and thus identifies, selects and kills
human or other targets without human
intervention (a simplified sketch of this loop
follows below).
They are also known as
‘Killer Robots’.
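The decision loop just described can be made concrete. Below is a minimal, purely illustrative Python sketch, in which all names (`Detection`, `engage`, the 0.9 threshold) are hypothetical rather than drawn from any real system. It contrasts a fully autonomous engagement loop with a human-supervised one; the only structural difference is the approval step before force is applied.

```python
# Hypothetical sketch: the structural difference between an autonomous
# and a human-supervised engagement loop. All names and values are
# illustrative; no real weapon system interface is depicted.
from dataclasses import dataclass
from typing import Callable, Iterable, Tuple

@dataclass
class Detection:
    profile_score: float            # similarity to the pre-programmed target profile
    position: Tuple[float, float]

def engage(position: Tuple[float, float]) -> None:
    print(f"force applied at {position}")   # stand-in for weapon release

def autonomous_loop(detections: Iterable[Detection], threshold: float = 0.9) -> None:
    # Human-out-of-the-loop: a sensor match alone triggers engagement.
    for d in detections:
        if d.profile_score >= threshold:
            engage(d.position)

def supervised_loop(detections: Iterable[Detection],
                    approve: Callable[[Detection], bool],
                    threshold: float = 0.9) -> None:
    # Human-in-the-loop: a person gives the final command to engage.
    for d in detections:
        if d.profile_score >= threshold and approve(d):
            engage(d.position)

detections = [Detection(0.95, (3.0, 4.0)), Detection(0.40, (1.0, 2.0))]
autonomous_loop(detections)                           # engages unprompted
supervised_loop(detections, approve=lambda d: False)  # never engages without approval
```

Read this way, 'meaningful human control' reduces to whether an approval gate exists between sensor match and engagement.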
TO BE NOTED
Not all autonomous weapons incorporate AI to
execute particular tasks. The role of AI can be to
(a) Enable, or
(b) Assist
WHY THE PREFERENCE FOR AUTONOMOUS
WEAPONRY
1. Generally cheaper
2. Easier to produce
3. No need for extensive training or dedicated personnel
4. Can neutralize far more costly & advanced
weapons.
EXAMPLES OF LAWS
1. Unmanned aerial and submersible
drones
2. The US Phalanx CIWS
3. Russian Arena
4. Russian Status-6
5. Israeli Trophy
6. German AMAP-ADS
SEARCH
Carried out in a particular environment
(location) for a specific ‘target profile’ using
sensor data, such as ‘facial recognition’.
AI WEAPONISED
1. Experimental submarines
2. Tanks
3. Ships
4. ‘Swarm’ drones
5. Aviation support and
surveillance systems
INDIA
is known to have already started acquiring and
exploring the deployment of near-autonomous
weapon systems.
COMPELLING
REASONS FOR
INDIA TO ADOPT LAWS & AI
1. Diverse and tough terrain may favor employment of LAWS
for border patrolling and protection of its space assets.
2. Availability of technically savvy uniformed personnel.
CONTROL SET UP
1. Defence AI Project Agency (DAIPA)
2. Defence AI Council (DAIC)
under
Chairmanship of Defence Minister
ACQUISITION IS A NECESSITY FOR INDIA
1. Its adversaries are known to have access to such systems.
2. Emerging strategic partnerships and equations
call for technological readiness.
CHINESE AIDP
The New Generation AI Development Plan, issued
in 2017, indicates a new policy and road map for
the development of autonomous weapons.
US ARSENAL
‘Replicator’ Program
1. Self-piloting naval vehicles
2. Uncrewed aircraft
3. Hunter drones
4. Live rockets
RUSSIAN POSITION
1. Modified commercial quadcopters
2. Poseidon underwater vehicle
3. Aurora 1-MT FPV drone
GAZA OPERATIONS
The IDF has employed AI for
1. Proactive forecasting, threat alerts and
defensive systems.
2. Intelligence analysis, targeting and munitions.
ISRAELI POLICY
The policy should be premised upon the
concept of ‘Responsible Innovation’ which is
based on the need to support innovation
while simultaneously fostering accountability
and ethically aligned design and use.
– Benjamin Netanyahu
UKRAINE WAR
Moscow and Kyiv have both used small aerial
drones to target opposing troops and attack
vehicles. Larger, medium-altitude drones have
been used to attack radars and installations.
STAGES IN THE ATTACK KILL CHAIN
Find or search
Fix
Track
Target
Engage
Assess
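These stages, often called the F2T2EA model, form an ordered pipeline. The sketch below is a hypothetical Python illustration, not fielded doctrine: it models the chain as an enumeration and shows how a meaningful-human-control gate placed before Engage halts an unauthorised machine run.

```python
# Hypothetical sketch of the F2T2EA kill chain as an ordered pipeline,
# with a human-authorisation gate before the Engage stage.
from enum import Enum, auto

class KillChainStage(Enum):
    FIND = auto()
    FIX = auto()
    TRACK = auto()
    TARGET = auto()
    ENGAGE = auto()
    ASSESS = auto()

# Where meaningful human control (MHC) is enforced. Gating ENGAGE mirrors
# the "human gives the final command" model; this placement is an
# assumption for illustration, not a statement of any state's policy.
HUMAN_GATE = {KillChainStage.ENGAGE}

def run_chain(human_authorised: bool) -> None:
    for stage in KillChainStage:      # Enum iterates in definition order
        if stage in HUMAN_GATE and not human_authorised:
            print(f"halted before {stage.name}: no human authorisation")
            return
        print(f"completed {stage.name}")

run_chain(human_authorised=False)     # a gated system stops at ENGAGE
```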
CONCERNS
about use of such weapons relate to
humanitarian, legal, ethical, security and
technical issues.
HUMANITARIAN CONCERNS
1. Separating civilians from military personnel.
2. Accurately estimating the force to be employed.
3. Refining the mode of warning to secure surrender
by opponents.
LEGAL ISSUES
1. How would culpability be fixed in cases
of wrongful death or large-scale damage?
2. Would the actions be viewed as deliberate
acts or omissions?
ETHICAL DILEMMA
Human decisions about life and
death are substituted with sensor,
software and machine processes.
SECURITY ISSUES
1. Differentiation between civilians
and non-civilians.
2. Proportionality
OPERATIONAL CHALLENGES
1. Authority to use force in tri-service or joint
force situations
2. Regular and live training
TECHNICAL ISSUES
1. Scope for errors, failures and vulnerabilities.
2. Thus, reliability, safety and security may be
compromised.
3. Robustness, interpretability and adversarial
resilience of AI, sensors and algorithms
cannot be relied upon as foolproof (the toy
example below illustrates why).
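Point 3 can be illustrated with a toy example. The Python snippet below (hypothetical weights and inputs, not a real targeting model) shows the basic mechanism of an adversarial example: a linear classifier's output flips under a small, deliberately crafted perturbation of its input.

```python
# Toy adversarial example on a linear classifier. Weights, inputs and
# labels are fabricated for illustration only.
def classify(features, weights, bias=0.0):
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return "military" if score > 0 else "civilian"

def sign(v):
    return 1.0 if v >= 0 else -1.0

weights = [0.6, -0.4, 0.8]
x = [0.5, 0.9, 0.0]                  # score = -0.06 -> "civilian"
eps = 0.05                           # tiny nudge, plausibly below sensor noise
x_adv = [xi + eps * sign(wi) for xi, wi in zip(x, weights)]  # FGSM-style step

print(classify(x, weights))          # civilian
print(classify(x_adv, weights))      # military: decision flipped by the nudge
```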
IHL PROVISIONS LIKELY TO
BE INFRINGED
ADDITIONAL PROTOCOL
I of 1977
PART III
Section I
Methods and Means of Warfare
AP I
Article 36 – New weapons
In the study, development, acquisition or adoption of a
new weapon, means or method of warfare, a High
Contracting Party is under an obligation to determine
whether its employment would, in some or all
circumstances, be prohibited by this Protocol or by any
other rule of international law applicable to the High
Contracting Party.
AP I
Art 48. The Parties to the conflict shall at all
times distinguish between the civilian population
and combatants and between civilian objects
and military objectives.
Article 36 is rendered largely meaningless, as
there are no rules of international
law specifically on LAWS and AI.
DANGER & CONCERN
Autonomous weapons select and apply
force to targets based on sensor
processing rather than human inputs.
PART IV
CIVILIAN POPULATION
SECTION 1
General Protection against the effects of
hostilities.
CHAPTER I
Basic Rule and field of application
CHAPTER II
Civilians and civilian population
Article 51-Protection of the civilian
population.
51(1). The civilian population and individual civilians shall
enjoy general protection against dangers arising from
military operations.
(2) and (3). XXX
(4). Indiscriminate attacks are prohibited.
CHAPTER III
Civilian objects
Article 52 – General protection of civilian
objects
52(1). Civilian objects shall not be the object of
attack or of reprisals. Civilian objects are all objects
which are not military objectives as defined in
paragraph 2.
CHAPTER IV
Precautionary measures
Article 57 – Precautions in attack
57(1). In the conduct of military operations,
constant care shall be taken to spare the
civilian population, civilians and civilian objects.
Algorithms Based on Long-Standing Data
(instances of)
1. Air bases used for particular types of missions.
2. Railway stations used for troop or stores movements.
3. Particular weapons carried by reconnoitering patrols.
4. Usual deployments for a bridgehead or breakout
phase.
5. Reaction time taken for combat air patrols.
ALGORITHMS
1. What type of injuries would be caused by the use
of certain ammunition, for example to persons
of one particular age group, say under water
or at high altitudes?
2. A sudden increase in the procurement of certain
types of POL (petroleum, oil and lubricants) at
certain locations (a sketch of this kind of inference
follows below).
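As a concrete illustration of point 2, here is a minimal Python sketch of baseline-and-threshold anomaly flagging over weekly POL procurement figures. The data and the 3-sigma rule are illustrative assumptions, not a documented military system.

```python
# Hypothetical sketch: flag a procurement figure that sits far above its
# historical baseline. Figures and the 3-sigma rule are illustrative.
from statistics import mean, stdev

def flag_anomaly(history, latest, k=3.0):
    """Flag `latest` if it exceeds the historical mean by k standard deviations."""
    mu, sigma = mean(history), stdev(history)
    return latest > mu + k * sigma

weekly_pol_tonnes = [40, 42, 38, 41, 39, 43, 40]   # fabricated baseline
print(flag_anomaly(weekly_pol_tonnes, 95))          # True: sudden spike flagged
print(flag_anomaly(weekly_pol_tonnes, 44))          # False: within normal variation
```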
DANGEROUS IMPLICATIONS
Devastating force may be used, endangering
human lives, without any
#Warning
#Negotiations
#Clarifications
#Third-party intervention
#Statesmanship
FRIGHTFUL CONSEQUENCES
The machine, as opposed to the human
operator, would decide where, when, or
against what force is to be applied.
PRINCIPLES OF WAR WHICH
IMPEDE PREDICTABILITY
1. Surprise
2. Security
3. Flexibility
4. Cooperation
Humans are the ultimately
responsible entities.
Responsibility must be saddled on
1. The commander, or
2. The operator
MAJOR CHALLENGES
1. Absence of meaningful human control (MHC).
2. Hard to identify which particular technologies have the
potential for military use.
3. Rather than regulating the development of autonomous
weapons, the focus should be on regulating their use.
4. Increasing likelihood of use of such weapons in
offensive roles.
UN GA RESOLUTION NO 78/241
22 Dec 2023
It acknowledges the serious challenges and concerns
raised by new technological applications in the
military domain, including those which
relate to artificial intelligence and
autonomy in weapon systems.
ICRC POSITION
States should adopt new, legally binding
international rules to prohibit unpredictable
autonomous weapons and those designed or
used to apply force against persons, and to
place strict restrictions on all others.
AUTOMATION BIAS
The tendency of humans to not critically
challenge a system’s output or search for
contradictory information.
DECISION SUPPORT SYSTEMS
AI-DSS
These should rely on a human-centred approach
to the development and use of AI in armed conflict.
CONCLUSION
Existing IHL provisions require careful study to
assess their capacity to deal with emerging means
and methods of warfare.
Member states of the UN should promote
negotiations on a new international treaty to
ban and regulate these weapons.