7. Plan
• What is enhancement?
• Types of enhancement - general
• Advantages
• Issues
8. What is it?
• Improving mind, body, ability
• Natural versus artificial
• Beyond species-typical/statistical norm
– Bear in mind “reversion to the mean”
• Changes structure or function of the human
body
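The "reversion to the mean" caveat can be illustrated with a small simulation (a hypothetical sketch, not from the slides): individuals selected for an extreme score on one noisy measurement tend to score closer to the population mean on a second, independent measurement, so an apparent move "beyond the norm" may partly be measurement noise.

```python
import random

random.seed(0)  # reproducible sketch

n, noise = 100_000, 1.0
# Each individual has a stable "true" trait plus independent
# measurement noise on each of two occasions.
true_trait = [random.gauss(0, 1) for _ in range(n)]
first = [t + random.gauss(0, noise) for t in true_trait]
second = [t + random.gauss(0, noise) for t in true_trait]

# Select the individuals who looked exceptional on the first measurement.
top = [i for i in range(n) if first[i] > 2.0]
mean_first = sum(first[i] for i in top) / len(top)
mean_second = sum(second[i] for i in top) / len(top)
# On the independent second measurement, the same group's average falls
# back toward the population mean: regression (reversion) to the mean.
```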
11. Internal versus External
• External – “always on” performance aids
• Internal
– Intimate connections
– Changes notion of personal identity
14. Contexts
• Extremes are uninteresting
– All enhancements are morally objectionable
– All enhancements are unproblematic
• Strict equality not morally required?
– Range of natural variation
– John Rawls’s A Theory of Justice
• Equality of opportunity
26. DARPA “REPAIR” Project
• High-fidelity model of neural function
• Programming implantable computer to stimulate
neuron function
• From the request for proposals:
– “Investigators should be able to demonstrate the ability
to stimulate relevant regions of the brain in such a
manner that will evoke a response in the primate similar
to that evoked through natural interaction with their
surrounding environment... Ideally, investigators will be
able to demonstrate ability of a non-human-primate to
complete the task outlined in technical area one without
the use of traditional sensory inputs”
35. Genetics/Neuroscience Issues
• Manipulation/Intervention
– Devices (do they affect “human essence”?)
– Human/Non-human chimeras (gene transplantation)
• Privacy of genetic and neuroscience data
• Cultural Effects
– Danger of misuse of information
– Neuro-essentialism
• Research Ethics
– Consent, data, phenotypes, incidental findings
• Enhancement
– Genetics (example: prenatal selection) – long-term effects
– Neuroscience (examples: pharmacological agents, implants) – near-term effects
36. Neuroethics (Greely)
• Unexpected incidental findings in
neuroimaging
• How human brains make ethical decisions
• How discoveries are likely to affect society, law
– Consequences of improved prediction
– Competence determination
– “Mind reading” (lying, bias, etc.)
– Enhancement
39. Subretinal video camera (Retina Implant AG)
1500 pixels, 70 × 70 µm each, containing a
photodiode, an amplifying circuit, and a
stimulation electrode. Power is provided via a
transdermal magnetic coupler placed behind
the ear, as in a cochlear implant.
Visual acuity:
0.3 cycles/degree
~ 20/1200
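The quoted Snellen equivalent can be checked with a short conversion (a sketch; the 18 cycles/degree reference for 20/20 is one common grating-acuity convention, with 30 cpd sometimes used instead, which would give roughly 20/2000):

```python
def cpd_to_snellen_denominator(cpd, reference_cpd=18.0):
    """Approximate Snellen denominator for a grating acuity given in
    cycles/degree, assuming `reference_cpd` corresponds to 20/20."""
    return 20 * reference_cpd / cpd

# 0.3 cycles/degree with an 18 cpd reference maps to about 20/1200
denom = cpd_to_snellen_denominator(0.3)
```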
41. Issues
• Freedom and Autonomy
• Fairness and Equity
• Social Disruption
• Human Dignity
• Rights and Obligations
• Policy and Law
42. Freedom and Autonomy
• Autonomy – do we really have it?
– Accepted restrictions
– Doesn’t hurt anyone but me
• Adequate informed consent?
• Consequences for others, society
• Free Will
– Does a person under external influence really have
it?
43. Fairness and equity
• Enhancement (advantage) for one implies
disadvantage for others
• Access for only those who can afford
• Does the enhancement benefit the society or
just the individual?
44. Social Disruption
• Longer lifetimes
– Health care, resource demands, food, water,
shelter, population disruptions
• Adaptation to different lifestyles
• Superhuman or extra-human abilities
• Different species
– Genetic experimentation
45. Human dignity
• What does it mean to be human?
– Mortality
– Fallibility
– Emotions of our own
• Pursuit of the “good life”
– What comprises it?
– What would be the importance of personal effort?
46. Rights and Obligations
• A right to be enhanced?
• Obligation to be enhanced?
– Pilots
– Soldiers
• Children
– Right to be protected
• Guard against an “enhancement race”
47. Policy and Law
• Limits on the military
• Intellectual property issues
– Example of patents for gene sequences
• A new formulation for ethics
– Historically lags technology development
48. Specific Military Questions
• How safe must an enhancement be prior to
deployment?
• Should soldiers be able to meaningfully object?
• How to weigh the risk to the individual against
the military advantage accrued?
49. Important Considerations
• Long term consequences
– Dependence, addiction
– Societal pressure
• Unintended social engineering, “a race to the bottom”
• Unanticipated usage
– Ambiguity and uncertainty in prediction
• Altering the meaning of “human”
• Inability to calculate or even anticipate risk
• Environmental impacts
50. From Allenby and
Sarewitz, The
Techno-Human
Condition
Level I – the technology
Level II – the system
Level III – society
51. It is only recently that we have begun to ask, as we adapt to each new instrument
and device, and learn to interpret the world accordingly, whether we are losing
touch with our traditional modes of understanding (as Camus surmised)
Camus put the case very well when he wrote: “We work not on matter, but on
machines, and we kill and are killed by proxy. The moment we abrogate our
responsibilities to machines, we are in danger of distancing ourselves from the
consequences of our own actions”
From Coker: Asimov’s Children
52. Framework Considerations
• Military enhancements cannot a priori be ruled
out as illegal or unethical, but…
• Required
– Legitimate purpose
– Necessity
– Benefits outweigh risks
– Maintain warfighter dignity
– Burdens minimized – reversible
– Consent granted – understanding its limitations
– Transparency and oversight
– Superiors accountable
Mehlman, Lin, Abney, “Enhanced Warfighters: A Policy Framework”
53. Additional Questions
• Would enhanced warfighters affect unit
cohesion?
• Do the enhancements affect service
commitment?
• Will the presence of enhanced soldiers negate
or modify international conventions?
– Torture?
• What will be the effect on society when
soldiers return?
55. Additional Interesting Readings
• Wondergenes, Maxwell Mehlman, Indiana University, 2003
• The Case Against Perfection, Michael Sandel, Harvard, 2009
• The Price of Perfection, Maxwell Mehlman, Johns Hopkins,
2009
• You and Me: The Neuroscience of Identity, Susan Greenfield,
Notting Hill, 2011
• Could Human Enhancement Turn Soldiers Into Weapons
That Violate International Law?, Patrick Lin, The Atlantic, Jan
2013
56. A framework for initial analysis (2/24/2014)
• By stakeholder:
– Subjects of research
– military users of a technology or application
– nonmilitary users
– organizations
– noncombatants
– other nations
• By cross-cutting themes
– nature of harm
– humanity
– technological imperfections
– reaction of adversary
– unintended military uses
– opportunity cost
• By sources of ELSI insight
57. Some stakeholders and illustrative questions
• Subjects of research
– How far does the consent requirement extend to members of the all-
volunteer armed forces?
• Military users of a technology or application
– Should senior commanders receive more protection from a defensive
application than soldiers on the line?
• Nonmilitary users
– How will law enforcement uses of an application implicate ELSI issues?
• Organizations
– How might an application affect unit cohesion?
• Noncombatants
– How might the public at large perceive an application?
• Other nations
– How might an application affect the solidarity of allies with the United States?
– How might U.S. restraint affect other nations’ activity?
58. Some cross-cutting themes
• Humanity
– Does the research or application
compromise something essentially
human?
• Technological imperfections
– How will more scientific knowledge
or better technology affect
judgments about ELSI-related issues
such as safety, fitness for human use,
precision of application?
• Nature of harm involved
– What is the nature of harm involved,
if any, with a new military
application?
• Reactions of adversary
– How will adversaries respond if they
are the targets of a new application?
– How will we respond if we are targets
of a new application whose
development we sponsored?
• Transfer to civil society
– What is the impact of an application
on civil liberties? On economic
relationships? On social
relationships?
• Impact of scale
– How and to what extent, if any, could a
change in the scale of deployment or use
of a technology or application change an
ethical calculation?
59. Sources of insight for a framework and some illustrative questions
• Philosophical and disciplinary ethics;
– On what basis can the benefits and costs (in a broad sense) of any given
research effort be determined and weighed against each other?
• International law (related especially to the laws of armed conflict);
– What might be the impact on policy makers regarding their willingness to
resort to the use of force?
• Social and behavioral sciences;
– How if at all do scenarios for use implicate values and norms held by users?
By adversaries? By observers?
• Scientific and technological framing;
– How and to what extent, if any, are known ethical, legal, and societal issues
related to uncertainties in the underlying science or maturity of the
technology?
• The precautionary principle and cost-benefit analysis;
– How and to what extent, if any, can ELSI-related tensions between cost-benefit
analysis and the precautionary principle be reconciled?
• Risk science and communication.
– How can technology developers communicate with the public to reveal
concerns early in the development process?