Thrust Group 2 Presentation (February 5, 2010)




  1. Thrust Group on International Governance of Robots in National Security (CETMONS, February 5, 2010)
  2. Robotics in the Military: Technology and Applications (Ron Arkin)
  3. Robots for the Battlefield
     • A South Korean robot platform is intended to detect and identify targets in daylight within a 4 km radius, or at night within 2 km using infrared sensors, providing either an autonomous lethal or non-lethal response. The system has an automatic mode in which it is capable of making the firing decision on its own.
     • iRobot, the maker of Roomba, now provides versions of its PackBots capable of tasering enemy combatants; some versions are equipped with the highly lethal Metal Storm weapon system.
     • The SWORDS platform developed by Foster-Miller is already at work in Iraq and Afghanistan and is capable of carrying lethal weaponry (M240 or M249 machine guns, or a Barrett .50-caliber rifle). A new MAARS version is under development.
     • Israel is considering deploying stationary robotic gun-sensor platforms along its border with Gaza in automated kill zones, equipped with .50-caliber machine guns and armored folding shields.
     • The U.S. Air Force has created its first hunter-killer UAV, the MQ-9 Reaper, successor to the Predator and widely used in Afghanistan.
     • China is developing the "Invisible Sword," a deep-strike armed stealth UAV.
     • There are many other examples, both domestic and international.
  4. Current Motivators for Military Robotics
     • Force multiplication
       ◦ Reduce the number of soldiers needed
     • Expand the battlespace
       ◦ Conduct combat over larger areas
     • Extend the warfighter's reach
       ◦ Allow individual soldiers to strike farther
     • Reduce friendly casualties
     • The use of robotics to reduce ethical infractions in the military does not yet appear anywhere on this list
  5. Samsung Techwin Korean DMZ Surveillance and Guard Robot
  6. War Robots: Concerns & Risks (Patrick Lin, Cal Poly San Luis Obispo; Ed Barrett, US Naval Academy; Jason Borenstein, Georgia Tech)
  7. Overview
     • Legal challenges
     • Just-war challenges
     • Technical challenges
     • Robot-human challenges
     • Societal challenges
     • Other and future challenges
  8. 1. Legal Challenges
     • Unclear responsibility
       ◦ To whom would we assign blame, and punishment, for improper conduct and unauthorized harms caused by an autonomous robot (whether by error or by intent)?
       ◦ The designers, the robot manufacturer, the procurement officer, the robot controller/supervisor, the field commander, a nation's president or prime minister... or the robot itself?
     • Refusing an order
       ◦ If robots have better situational awareness, could they refuse legitimate orders (e.g., to attack a house in which they detect children)?
  9. 1. Legal Challenges (cont'd)
     • Consent by soldiers to risk
       ◦ In 2007, a semi-autonomous robotic cannon malfunctioned in the South African army, killing 9 friendly soldiers and injuring 14 others
     • Unclear designation of combatants
       ◦ Legal status of UAV operators in the U.S.: e.g., can they be attacked on their days off work?
       ◦ Legal status of civilians who work on robotic systems: e.g., are they combatants on the battlefront?
  10. 2. Just-War Challenges
     • Attack decisions
       ◦ The increasing tempo of warfare may require split-second decisions that only computing machines can make
       ◦ Having no "eyes on target" or "human in the loop" poses a risk of wrongful attack
     • Lower barriers to war
       ◦ Fewer U.S. deaths = lower political cost = more likely to go to war?
       ◦ But this could be said of any new offensive/defensive technology
       ◦ Do robots enable us to do morally/legally questionable things that we otherwise wouldn't do, e.g., strikes in Pakistan?
  11. 2. Just-War Challenges (cont'd)
     • Imprecision of the Laws of War and Rules of Engagement
       ◦ Using the LOW/ROE in programming is incomplete; e.g., the requirement to minimize civilian casualties doesn't specify hard numbers
       ◦ Similar to the unintended results of Asimov's Laws of Robotics?
     • Halting conflict
       ◦ Given the nature of modern warfare, which individuals or groups of combatants have the authority to end hostilities?
       ◦ Will (or can) combatants surrender to robots, or to their operators or maintenance crews? If so, what is the appropriate process for handling the situation?
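The imprecision point above can be made concrete: any attempt to encode the LOW/ROE as software forces the programmer to invent numeric thresholds that the legal texts never supply. A minimal sketch of the problem, in which every name and value is hypothetical (this is not drawn from any real targeting system):

```python
from dataclasses import dataclass

@dataclass
class Target:
    military_value: float        # assessed value of the objective (0..1)
    expected_civilian_harm: int  # estimated incidental civilian casualties

# The law requires that civilian harm not be "excessive" relative to the
# military advantage -- but it gives no number. The programmer must invent
# one; this constant is pure assumption, which is exactly the problem.
MAX_HARM_PER_UNIT_VALUE = 0.5  # hypothetical, arbitrary threshold

def attack_permitted(t: Target) -> bool:
    """Toy proportionality check. The legal text underdetermines this
    function; the threshold above is a choice the law does not make."""
    if t.military_value <= 0:
        return False  # no military necessity, no attack
    return t.expected_civilian_harm / t.military_value <= MAX_HARM_PER_UNIT_VALUE

print(attack_permitted(Target(0.9, 0)))  # True
print(attack_permitted(Target(0.2, 3)))  # False
```

Two engagements that a human commander might judge identically can fall on opposite sides of such a constant, which is why "minimize civilian casualties" resists direct translation into code.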
  12. 3. Technical Challenges
     • Discrimination among targets
       ◦ Too difficult? It requires contextual understanding
     • Robots gone wild
       ◦ Malfunction, hacking, capture
     • Unauthorized overrides
       ◦ How do we prevent a rogue officer from improperly taking control of a robot?
  13. 4. Human-Robot Challenges
     • Effect on squad cohesion
       ◦ The robot's unblinking eye may erode the "band of brothers"
     • Self-defense
       ◦ If robots have no such instinct, very expensive equipment may be captured or lost
     • Winning hearts and minds
       ◦ A lasting, true peace may be hindered by using robots to control populations and to fight wars (does it show a lack of respect?)
  14. 5. Societal Challenges
     • Counter-tactics in asymmetrical warfare
       ◦ More desperate enemies = increased terrorism and other unconventional tactics?
     • Proliferation
       ◦ Other nations will eventually have war robots, just as with other weapons
     • Space race
       ◦ Militarization of space increases space pollution, etc.
     • Civil security and privacy
       ◦ Military robots may turn into police/civilian security robots
  15. 6. Other/Future Challenges
     • The precautionary principle
       ◦ Slowing or halting work in order to address serious risks seems to make sense...
       ◦ ...but this is in tension with the pressure to use robots in the military
     • Co-opting of ethics work by the military
       ◦ The military can justify work in robotics by saying that ethics is already being attended to
     • Robot rights
       ◦ In the distant future, if robots attain animal- or human-level intelligence
  16. Current Governance Architecture (George R. Lucas, Jr.; Richard M. O'Meara)
  17. Conventions in International Law for Specific Technologies
     • The 1899 Hague Declaration concerning expanding bullets
     • Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on Their Destruction (1972)
     • Convention on the Prohibition of Military or Any Hostile Use of Environmental Modification Techniques (1976)
     • Resolution on Small-Calibre Weapon Systems (1979)
     • Protocol on Non-Detectable Fragments (Protocol I) (1980)
     • Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices (Protocol II) (1980)
     • Protocol on Prohibitions or Restrictions on the Use of Incendiary Weapons (Protocol III) (1980)
  18. Conventions in International Law for Specific Technologies, II
     • Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on Their Destruction (1993)
     • Protocol on Blinding Laser Weapons (Protocol IV to the 1980 Convention) (1995)
     • Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices as amended on 3 May 1996 (Protocol II to the 1980 Convention)
     • Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction (1997)
     • Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Amendment to Article 1 (2001)
     • Protocol I Additional to the 1949 Geneva Conventions; Convention on Cluster Munitions (2008). Source: ICRC, "International Humanitarian Law: Treaties and Documents"
  19. Five Core Principles: International Humanitarian Law & LOAC
     • Weapons prohibitions: no unnecessary suffering or superfluous injury; otherwise:
     • Military necessity
     • Proportionality
     • Discrimination
     • Command responsibility
  20. Weapons Prohibitions
     • Some weapons are patently inhumane
     • Others are design-dependent (their effects are reasonably foreseeable)
     • Thus, the ICRC/SIrUS criteria would ban weapons when:
  21. Use of the weapon would foreseeably cause:
     • A specific disease, a specific abnormal physiological state, a specific and permanent disability, or specific disfigurement; or
     • Field mortality of more than 25% or hospital mortality of more than 5%; or
     • Grade 3 wounds as measured by the Red Cross wound classification scale; or
     • Effects for which there is no well-recognized and proven treatment.
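The SIrUS criteria above are, in effect, a disjunctive decision rule: a weapon is banned if any single threshold is crossed. A minimal sketch of that logic follows; the field names and example values are illustrative only, not an official ICRC instrument:

```python
from dataclasses import dataclass

@dataclass
class WeaponEffects:
    """Illustrative summary of a weapon's foreseeable effects."""
    specific_disease_or_disability: bool  # specific disease, abnormal state,
                                          # permanent disability, disfigurement
    field_mortality: float    # fraction of those hit who die in the field
    hospital_mortality: float # fraction who die after reaching hospital care
    wound_grade: int          # Red Cross wound classification (1-3)
    treatment_exists: bool    # well-recognized and proven treatment?

def sirus_ban(e: WeaponEffects) -> bool:
    """Banned if ANY SIrUS criterion is met (note the 'or' between clauses)."""
    return (
        e.specific_disease_or_disability
        or e.field_mortality > 0.25      # field mortality > 25%
        or e.hospital_mortality > 0.05   # hospital mortality > 5%
        or e.wound_grade >= 3            # Grade 3 wounds
        or not e.treatment_exists        # no proven treatment
    )

# A hypothetical weapon with 30% field mortality crosses the 25% line:
print(sirus_ban(WeaponEffects(False, 0.30, 0.02, 2, True)))  # True
```

Because the clauses are disjunctive, a weapon that is unremarkable on four criteria is still banned if it fails the fifth; this "any one suffices" structure is what distinguishes the SIrUS approach from a balancing test like proportionality.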
  22. Military Necessity
     • Promotes a speedy end to hostilities
     • Requires a definition of victory
     • Requires assessment of the intent or capacity of the enemy
  23. Proportionality
     • Of considerable concern to the innovator or user of new technologies
     • Is the foreseeable destructive capacity "disproportionate" to the military objective?
     • (Old saw regarding new technologies: "necessity always trumps proportionality")
  24. Discrimination
     • Prohibits attacks not directed against a specific military objective
     • Prohibits means or methods which cannot be limited to a military objective
     • Prohibits attacks likely to strike civilian and military targets without distinction
     • (Ron Arkin's argument: autonomous robots are likely superior to humans in this respect)
  25. Command Responsibility
     • Liability for illegal actions (cf. the trial of Gen. Yamashita)
     • Constrains both the actions of soldiers
     • and the orders and jurisdiction of their commanding officers
     • (Rob Sparrow's objection to autonomous robots: no meaningful accountability is possible)
  26. Other General Governance Provisions or Principles
     • Article 36 of the 1977 Additional Protocol I to the Geneva Conventions of 1949
     • "Universal jurisdiction"
     • "Lawfare"
  27. Provisions for Good Governance (O'Meara)
     • Aims are clearly defined
     • Proposals or solutions are realistic
     • Holistic: involving all stakeholders in crafting legislation
     • Subject to assessment of effectiveness, and to amendment
  28. Goals of Technology Governance
     • Respect long-term effects
     • Consider the ramifications of actions
     • Promote consumer/user awareness of these ramifications
  29. Professional Codes
     • An alternative to conventional international law that satisfies these criteria
     • Promote (and require) sound professional judgment
     • Promote best practices
     • Define the boundaries of acceptable professional practice
  30. Berkeley Engineers' Code
     • "I promise to work for a BETTER WORLD where science and technology are used in socially responsible ways. I will not use my EDUCATION for any purpose intended to harm human beings or the environment. Throughout my career, I will consider the ETHICAL implications of my work before I take ACTION. While the demands placed upon me may be great, I sign this declaration because I recognize that INDIVIDUAL RESPONSIBILITY is the first step on the path to PEACE."
  31. Legally Binding International Agreements and Other Instruments That Provide Relevant Lessons or Precedent (Orde Kittrie)
  32. ICRAC
     • The International Committee for Robot Arms Control (ICRAC)
       ◦ Founded in September 2009
       ◦ Goal: campaign to limit lethal autonomous robots through an international agreement modeled on existing arms control agreements (e.g., those restricting nuclear and biological weapons)
     • ICRAC has called for military robots to be banned from space and holds that no robotic systems should carry nuclear weapons
  33. Arms Control Agreements: Types of Restrictions
     • Existing legally binding arms control agreements and other instruments include a wide variety of restrictions on targeted weapons, including prohibitions and limitations (restrictions that fall short of prohibition) on:
       ◦ acquisition
       ◦ research and development
       ◦ testing
       ◦ deployment
       ◦ transfer/proliferation
       ◦ use
  34. Arms Control Agreements: Form/Scope
     • Legally binding multilateral agreements (most common)
     • Legally binding bilateral agreements
     • Legally binding resolutions of the United Nations Security Council
  35. Relevant Precedents?
     • Nuclear Nonproliferation Treaty
     • Comprehensive Nuclear Test Ban Treaty
     • Limited Test Ban Treaty
     • United Nations Security Council Resolution 1540
     • Chemical Weapons Convention
     • Biological Weapons Convention
     • Mine Ban Treaty
     • Inter-American Convention on Transparency in Conventional Weapons Acquisitions
     • Strategic Offensive Reductions Treaty
     • Conventional Armed Forces in Europe Treaty
     • Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (the CCW)
  36. Soft Law Approaches (Gary Marchant, Lyn Gulley)
  37. Transitions in International Oversight of Technology
     • Regulation → Governance
       ◦ From top-down government imposition to public-private partnerships, collaborations, etc.
     • Hard Law → Soft Law
       ◦ From enforceable legal agreements to guidelines, codes of conduct, and principles
  38. Advantages of Soft Law
     • Voluntary; cooperative
     • Reflexive
     • Can be adopted or revised relatively quickly
     • Many different approaches can be tried simultaneously
     • Can be gradually "hardened" into more formal regulatory oversight
  39. Codes of Conduct
     • Synthetic biology
       ◦ U.S. Government
       ◦ International Association Synthetic Biology (IASB)
       ◦ International Gene Synthesis Consortium (IGSC)
     • Nanotechnology
       ◦ Foresight Institute Guidelines
       ◦ Responsible Nanocode
       ◦ EU Code of Conduct for Nanoresearchers
     • Biotechnology
       ◦ Asilomar Guidelines
       ◦ 2006 Review Conference of the Biological and Toxin Weapons Convention
  40. Transgovernmental Dialogue
     • Pharmaceuticals
       ◦ International Conference on Harmonization (ICH)
     • Nanotechnology
       ◦ OECD Working Group
     • Arms control
       ◦ Australia Group
  41. Framework Convention
     • An international agreement negotiated by states
     • Establishes institutions, processes, and procedures
     • Minimal (if any) substantive content at first
     • Encourages broad participation by as many states as possible
     • Builds trust
     • Gradually adds substance in the form of protocols
  42. Benefits of Framework Conventions
     • "In sum, the FC-protocol approach allows states to put in place activities and procedures designed to reduce scientific and technical uncertainty about a problem, and then to act incrementally to address that problem or particular aspects of it, as their knowledge and understanding grow. Politically, the substantive weakness of the original FC helps to attract the broadest possible participation, even if the commitment of some participants is weak or even insincere; as the process unfolds, the aim typically is to enmesh the participants in a process of social learning that will lead them to accept stronger commitments commensurate with the evolving understanding of the problem." (Abbott, Marchant, et al., 2006)
  43. Information Sharing