Ethics of Emerging Technologies: Soldier Enhancement

Ethics of Emerging Technologies
Soldier Enhancement
University of Notre Dame
Spring 2014


  1. Ethics of Emerging Technologies: Soldier Enhancement, University of Notre Dame, Spring 2014
  2. 1974-1978; based on the novel “Cyborg”
  3. Plan
     • What is enhancement?
     • Types of enhancement – general
     • Advantages
     • Issues
  4. What is it?
     • Improving mind, body, or ability
     • Natural versus artificial
     • Beyond the species-typical/statistical norm
       – Bear in mind “reversion to the mean”
     • Changes the structure or function of the human body
  5. Internal versus External
     • External – “always on” performance aids
     • Internal
       – Intimate connections
       – Changes the notion of personal identity
  6. Contexts
     • Extremes are uninteresting:
       – All enhancements morally objectionable
       – All are unproblematic
     • Strict equality not morally required?
       – Range of natural variation
       – John Rawls’ Theory of Justice: equality of opportunity
  7. Neuroscience
     • Biometrics
       – Biofeedback
       – fNIR
     • Neuro-modulation
       – External
       – Internal
     • Neuro-stimulation
     • Neural prosthetics
     • Neural control
  8. Cognitive Enhancement
     • Ritalin, beta-blockers, hallucinogens, O2 enhancers
     • TMS (transcranial magnetic stimulation)
     • Neural implants – DARPA
       – REMIND – Restorative Memory Integration Neural Device
       – REPAIR – Reorganization and Plasticity to Accelerate Injury Recovery
     • DARPA NIA – Neuroscience for Intelligence Analysis
     • DARPA CT2WS – Cognitive Technology Threat Warning System
  9. DARPA “REPAIR” Project
     • High-fidelity model of neural function
     • Programming an implantable computer to stimulate neuron function
     • From the request for proposals:
       – “Investigators should be able to demonstrate the ability to stimulate relevant regions of the brain in such a manner that will evoke a response in the primate similar to that evoked through natural interaction with their surrounding environment... Ideally, investigators will be able to demonstrate ability of a non-human-primate to complete the task outlined in technical area one without the use of traditional sensory inputs”
  10. Genetics/Neuroscience Issues
      • Manipulation/intervention
        – Devices (do they affect “human essence”?)
        – Human/non-human chimeras (gene transplantation)
      • Privacy of genetic and neuroscience data
      • Cultural effects
        – Danger of misuse of information
        – Neuro-essentialism
      • Research ethics
        – Consent, data, phenotypes, incidental findings
      • Enhancement
        – Genetics (example: prenatal selection) – long-term effects
        – Neuroscience (examples: pharmacological, implants) – near-term effects
  11. Neuroethics (Greely)
      • Unexpected incidental findings in neuroimaging
      • How human brains make ethical decisions
      • How discoveries are likely to affect society and law
        – Consequences of improved prediction
        – Competence determination
        – “Mind reading” (lying, bias, etc.)
        – Enhancement
  12. Neuroscience Issues (Loeb)
      • Cost-benefit analysis
      • Socioeconomic disruption
        – Cochlear implants and “deaf culture”
      • Visibility
      • Privacy
      • Autonomy
  13. Physical Enhancement
      • Steroids
      • Cosmetic surgery
      • Prosthetics
      • Cybernetics, bionic limbs
      • Exoskeletons
      • Lenses, visual augmentation
      • DARPA – Metabolically Dominant Soldier
  14. Subretinal video camera (Retina Implant AG)
      • Each element contains a photodiode, an amplifying circuit, and a stimulation electrode
      • Power is provided via a trans-dermal magnetic coupler placed behind the ear, as in a cochlear implant
      • Visual acuity: 0.3 cycles/degree, ~20/1200
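The acuity figure above can be sanity-checked with a quick conversion. The sketch below is illustrative only: the cycles/degree value taken to correspond to Snellen 20/20 varies across sources (figures near 30 cpd are common, and some use values closer to 18 cpd), and the function name and baselines are mine, not from the slides.

```python
# Rough conversion from spatial resolution in cycles/degree (cpd) to a
# Snellen 20/X denominator. The 20/20 baseline in cpd is an assumption;
# conventions differ between sources.

def snellen_denominator(cpd: float, baseline_cpd: float) -> float:
    """Snellen 20/X denominator for an acuity of `cpd` cycles/degree,
    given the cycles/degree assumed to correspond to 20/20."""
    return 20.0 * baseline_cpd / cpd

# With a baseline near 18 cpd, 0.3 cpd comes out at roughly 20/1200,
# consistent with the slide's figure; a 30 cpd baseline gives ~20/2000.
print(snellen_denominator(0.3, 18.0))  # roughly 1200
print(snellen_denominator(0.3, 30.0))  # roughly 2000
```

Either way, the implant's resolution sits two orders of magnitude below normal vision, which frames it as restorative rather than enhancing technology.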
  15. Issues
      • Freedom and Autonomy
      • Fairness and Equity
      • Social Disruption
      • Human Dignity
      • Rights and Obligations
      • Policy and Law
  16. Freedom and Autonomy
      • Autonomy – do we really have it?
        – Accepted restrictions
        – “Doesn’t hurt anyone but me”
      • Adequate informed consent?
      • Consequences for others and society
      • Free will – does a person under external influence really have it?
  17. Fairness and Equity
      • Enhancement (advantage) for one implies disadvantage for others
      • Access only for those who can afford it
      • Does the enhancement benefit society or just the individual?
  18. Societal Disruption
      • Longer lifetimes
        – Health care, resource demands, food, water, shelter, population disruptions
      • Adaptation to different lifestyles
      • Superhuman or extra-human abilities
      • Different species
        – Genetic experimentation
  19. Human Dignity
      • What does it mean to be human?
        – Mortality
        – Fallibility
        – Emotions of our own
      • Pursuit of the “good life”
        – What comprises it?
        – What would be the importance of personal effort?
  20. Rights and Obligations
      • A right to be enhanced?
      • An obligation to be enhanced?
        – Pilots
        – Soldiers
      • Children
        – A right to be protected
      • Guard against an “enhancement race”
  21. Policy and Law
      • Limits on the military
      • Intellectual property issues
        – Example: patents on gene sequences
      • A new formulation for ethics
        – Ethics historically lags technology development
  22. Specific Military Questions
      • How safe must an enhancement be prior to deployment?
      • Should soldiers be able to meaningfully object?
      • How to weigh the risk to the individual against the military advantage accrued?
  23. Important Considerations
      • Long-term consequences
        – Dependence, addiction
        – Societal pressure
      • Unintended social engineering, “a race to the bottom”
      • Unanticipated usage
        – Ambiguity and uncertainty in prediction
      • Altering the meaning of “human”
      • Inability to calculate or even anticipate risk
      • Environmental impacts
  24. From Allenby and Sarewitz, The Techno-Human Condition
      • Level I – the technology
      • Level II – the system
      • Level III – society
  25. It is only recently that we have begun to ask, as we adapt to each new instrument and device, and learn to interpret the world accordingly, whether we are losing touch with our traditional modes of understanding (as Camus surmised). Camus put the case very well when he wrote: “We work not on matter, but on machines, and we kill and are killed by proxy. The moment we abrogate our responsibilities to machines, we are in danger of distancing ourselves from the consequences of our own actions”
      – From Coker, Asimov’s Children
  26. Framework Considerations
      • Military enhancements cannot a priori be ruled out as illegal or unethical, but the following are required:
        – Legitimate purpose
        – Necessity
        – Benefits outweigh risks
        – Warfighter dignity maintained
        – Burdens minimized and reversible
        – Consent granted, with understanding of its limitations
        – Transparency and oversight
        – Superiors accountable
      – Mehlman, Lin, and Abney, “Enhanced Warfighters: A Policy Framework”
  27. Additional Questions
      • Would enhanced warfighters affect unit cohesion?
      • Do the enhancements affect service commitment?
      • Will the presence of enhanced soldiers negate or modify international conventions?
        – Torture?
      • What will be the effect on society after soldiers return?
  28. Additional Interesting Readings
      • Wondergenes, Maxwell Mehlman, Indiana University, 2003
      • The Case Against Perfection, Michael Sandel, Harvard, 2009
      • The Price of Perfection, Maxwell Mehlman, Johns Hopkins, 2009
      • You and Me: The Neuroscience of Identity, Susan Greenfield, Notting Hill, 2011
      • “Could Human Enhancement Turn Soldiers Into Weapons That Violate International Law?”, Patrick Lin, The Atlantic, Jan 2013
  29. A framework for initial analysis (2/24/2014)
      • By stakeholder:
        – Subjects of research
        – Military users of a technology or application
        – Nonmilitary users
        – Organizations
        – Noncombatants
        – Other nations
      • By cross-cutting themes:
        – Nature of harm
        – Humanity
        – Technological imperfections
        – Reaction of adversary
        – Unintended military uses
        – Opportunity cost
      • By sources of ELSI insight
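For an analyst working through a proposed enhancement, the three axes above are essentially a checklist. The sketch below encodes them as plain data with a helper that reports which items remain unaddressed; the structure and names are my illustration, not part of the framework itself.

```python
# Illustrative encoding of the three-axis ELSI analysis framework as a
# checklist: one list of prompts per axis. Names are illustrative only.

FRAMEWORK = {
    "stakeholders": [
        "subjects of research",
        "military users",
        "nonmilitary users",
        "organizations",
        "noncombatants",
        "other nations",
    ],
    "cross_cutting_themes": [
        "nature of harm",
        "humanity",
        "technological imperfections",
        "reaction of adversary",
        "unintended military uses",
        "opportunity cost",
    ],
    "sources_of_insight": [
        "philosophical and disciplinary ethics",
        "international law",
        "social and behavioral sciences",
        "scientific and technological framing",
        "precautionary principle / cost-benefit analysis",
        "risk science and communication",
    ],
}

def open_questions(answered: set) -> list:
    """Return every framework item not yet addressed for a proposal."""
    return [item for axis in FRAMEWORK.values() for item in axis
            if item not in answered]

# Example: an analysis that has so far considered only harm and consent.
remaining = open_questions({"nature of harm", "subjects of research"})
print(len(remaining))  # 16 of the 18 items still open
```

The point of the encoding is simply that the framework is exhaustive by construction: an assessment is incomplete until every cell across all three axes has at least been considered.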
  30. Some stakeholders and illustrative questions
      • Subjects of research
        – How far does the consent requirement extend to members of the all-volunteer armed forces?
      • Military users of a technology or application
        – Should senior commanders receive more protection from a defensive application than soldiers on the line?
      • Nonmilitary users
        – How will law enforcement uses of an application implicate ELSI issues?
      • Organizations
        – How might an application affect unit cohesion?
      • Noncombatants
        – How might the public at large perceive an application?
      • Other nations
        – How might an application affect the solidarity of allies with the United States?
        – How might U.S. restraint affect other nations’ activity?
  31. Some cross-cutting themes
      • Humanity
        – Does the research or application compromise something essentially human?
      • Technological imperfections
        – How will more scientific knowledge or better technology affect judgments about ELSI-related issues such as safety, fitness for human use, and precision of application?
      • Nature of harm involved
        – What is the nature of the harm involved, if any, with a new military application?
      • Reactions of adversary
        – How will adversaries respond if they are the targets of a new application?
        – How will we respond if we are targets of a new application whose development we sponsored?
      • Transfer to civil society
        – What is the impact of an application on civil liberties? On economic relationships? On social relationships?
      • Impact of scale
        – How and to what extent, if any, could a change in the scale of deployment or use of a technology or application change an ethical calculation?
  32. Sources of insight for a framework and some illustrative questions
      • Philosophical and disciplinary ethics
        – On what basis can the benefits and costs (in a broad sense) of any given research effort be determined and weighed against each other?
      • International law (related especially to the laws of armed conflict)
        – What might be the impact on policy makers regarding their willingness to resort to the use of force?
      • Social and behavioral sciences
        – How, if at all, do scenarios for use implicate values and norms held by users? By adversaries? By observers?
      • Scientific and technological framing
        – How and to what extent, if any, are known ethical, legal, and societal issues related to uncertainties in the underlying science or maturity of the technology?
      • The precautionary principle and cost-benefit analysis
        – How and to what extent, if any, can ELSI-related tensions between cost-benefit analysis and the precautionary principle be reconciled?
      • Risk science and communication
        – How can technology developers communicate with the public to reveal concerns early in the development process?
