On Genies and Bottles: Leiden

Transcript

  • 1. On Genies and Bottles: Scientists’ Moral Responsibility and Dangerous R&D. David Koepsell, TU Delft, TPM Faculty, Philosophy Section.
  • 2. The Ethical Context: The rapid rate of technological progress, and the increasing availability of cheaper tools for scientific and technological applications, make it harder to ensure public safety. It is becoming easier to create catastrophic technologies without detection.
  • 3. The Ethical Context: How can we help ensure a safer world? What roles do governments have, and what roles do scientists and technologists have? Who is morally responsible for dangerous research and development?
  • 4. Aims: to provide an argument for the individual moral responsibility of scientists, and to provide an argument for governmental responsibility for the moral education of scientists: a) component requirements (a basic principle); b) institutional requirements.
  • 5. Science and Ethics: Traditionally, assigning responsibility for the deployment of dangerous technology to others has divorced scientists from the consequences of their work. Precepts: a) science should inquire into everything; b) politicians, and perhaps engineers, are responsible for deployment.
  • 6. Science and Ethics: These precepts lead to a sort of “scientific firewall” against moral responsibility. Scientists cannot be morally responsible, on this view, because their duty is the unfettered exploration of everything, regardless of potential consequences.
  • 7. Science and Ethics: Q: Do scientists ever have a positive moral duty to refrain? Let’s consider a graphic example…
  • 8. Smallpox Science: Smallpox was eliminated from the wild in 1977. It could have been eliminated altogether, with all stores of the virus destroyed. But as late as 2001, scientists in the US decided to conduct experiments to create a monkey model of variola infection…
  • 9. The Australian Mousepox “Trick”. UPI: “CANBERRA, Australia, Jan. 11 (UPI) -- Scientists working for the Australian government have created a genetically engineered mousepox virus more deadly to mice than the original virus. Even when vaccinated with a normally effective vaccine, half the mice died after infection with the new virus. Biological warfare experts are worried that the current international Biological and Toxin Weapons Convention, abbreviated BTWC, may not be strong enough to cope with the misuse of the genetic engineering techniques. Governments from all over the world have been meeting in Geneva for six years to address the BTWC shortcomings, but have failed to reach final agreement. Dr. Ian Ramshaw, a viral engineer and the immunologist on the mousepox experiment, told United Press International that inserting genetic material has hazards. His team will publish their research in the February issue of the Journal of Virology. ‘It is a potentially vile weapon,’ Ramshaw said.”
  • 10. The Australian Mousepox “Trick”: The gene splice involved in the mousepox trick could easily be applied to smallpox, making a nearly unstoppable weapon. So why shouldn’t scientists now take the next step and see whether this is true?
  • 11. Smallpox Ethics: The dual-use argument is ultimately unhelpful; even a nuclear weapon has a dual use (consider Project Orion, above). Dual use was invoked to justify smallpox research (a catch-22 argument). Are there, or should there be, moral limits to some research? Is some research morally prohibited because of its nature? Is there a model for shaping researchers’ behavior?
  • 12. The Bioethics Example: Nazi crimes, Milgram, Tuskegee, and other historical ethical lapses slowly led to the development of modern bioethics. The Belmont Report guides the development of institutions and education meant to protect subjects in future human-subjects research.
  • 13. The Bioethics Example: The eradication of smallpox was itself based on initially unethical research: “Dr. Jenner decided it was time to test his vaccination, and he tested it on his gardener’s son, an eight-year-old boy named James Phipps. (He derived the term ‘vaccination’ from ‘vacca,’ the Latin word for ‘cow.’) The boy did contract cowpox, but he recovered from it within a few days. Dr. Jenner then waited eight weeks for the boy’s body to build an immunity. To complete his experiment, Dr. Jenner exposed James to smallpox. Amazingly, the boy did not contract the deadly disease, and the doctor claimed success.”
  • 14. The Bioethics Example: Jenner’s work would be unethical under the Nuremberg Code, which requires prior animal testing, and the Belmont principles, which require informed consent. Because physicians and researchers still did not always heed these principles even as late as the mid-20th century, ethics boards and IRBs were created by law to oversee human-subjects research.
  • 15. Belmont +: Can we re-fashion or re-apply standard bioethics principles beyond the protection of individual subjects? When scientific research has either a direct or a potential effect on humanity as a whole, ought we to apply the principles of dignity, respect, beneficence, and justice to basic science? Isn’t there a broader moral horizon at stake?
  • 16. Examples: Consider the fictional discovery of “ice-nine” in Cat’s Cradle… Ice-nine has a dual use (think: skating in summer), but does this justify its initial development, given the Belmont principles?
  • 17. Examples: Science doesn’t kill people; people with technologies kill people…
  • 18. Examples: But even the most ardent gun-rights proponent will not support free ownership of tactical nuclear weapons, and international law prohibits research and development of such weapons.
  • 19. Examples: I contend that when considering the ethics of scientists, we must not only look at the regulations, laws, and codes used to review or punish their actions; we should also consider intentions and motivations, with an eye toward education. Moral training of scientists, as with other professionals, presupposes not only that we wish to keep them from breaking laws or running afoul of professional codes of conduct, but also that we wish to help them develop moral insight that can guide their behavior.
  • 20. Morals Matter: So what of ice-nine, smallpox, and other potentially catastrophic science and technology? We might argue that beneficence favors investigating smallpox because we worry about terrorist uses of it and need to devise treatments. All of which is recursively self-satisfying: we would not have had to worry about this had scientists done the right thing to begin with and supported the virus’s ultimate destruction. In the world of Cat’s Cradle, we could similarly argue in favor of ethically pursuing ice-nine research only in a post-ice-nine-apocalypse environment.
  • 21. Morals Matter: An argument often used to justify these sorts of scientific inquiries is that “someone will devise the technologies, and employ them harmfully, eventually. Thus, we should investigate these things first (because we have good intentions).”
  • 22. Morals Matter: Of course, this reasoning justifies investigating any and all science and technology, no matter how potentially destructive or threatening to humanity or the environment. But it presupposes a) that the investigators doing the work have good intentions, b) that the technology or discovery would eventually be produced by others anyway, and c) that once discovered or applied, it can be contained.
  • 23. Morals Matter: The “eventual” fallacy justifies any investigation and inquiry, no matter the potential consequences. It fails once we broaden the moral horizon offered by the Belmont principles to include humanity as a whole… Implicit in bioethical principles is some utilitarian calculus.
  • 24. Morals Matter: Science proceeds not in a vacuum but as a socially devised institution. It is conducted by professionals, with funding from mostly public sources, and with relative freedom under the auspices of mostly academic environments. As members of a largely public institution, and as beneficiaries of the public’s trust and wealth, scientists must consider the consequences of their inquiries.
  • 25. Morals Matter: The “eventual” argument makes sense when the risk posed by investigating a deadly thing is outweighed by the likelihood of that deadly thing’s being discovered and used by others, combined with the potential that investigating it now yields plausible protection for the public at large. So, roughly: R = risk, L = likelihood of independent discovery and use, and P = potential benefit of scientific investigation now.
  • 26. Morals Matter: R = risk, L = likelihood of independent discovery and use, P = potential benefit of scientific investigation now. If L + P > R, then a scientist can make a moral case for pursuing an investigation into something posing a large, general risk. Otherwise, there is simply no moral justification for further inquiry.
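The slide’s inequality can be made concrete in a few lines of code. The sketch below is a toy formalization, not anything from the talk itself: the function name, parameter names, and the assumption that R, L, and P can be scored on a single comparable scale are all illustrative additions, and the slides leave open how such quantities would actually be estimated.

```python
# Toy formalization of the L + P > R calculus from slide 26.
# Assumption (not from the talk): R, L, and P are scored on one
# common scale, e.g. normalized values in [0, 1].

def inquiry_is_defensible(risk: float,
                          independent_discovery: float,
                          protective_benefit: float) -> bool:
    """Apply the slide's rough moral test: pursue only if L + P > R.

    risk                  -- R: risk posed by doing the research now
    independent_discovery -- L: likelihood others discover and use it anyway
    protective_benefit    -- P: potential protective payoff of investigating now
    """
    return independent_discovery + protective_benefit > risk

# Hypothetical smallpox-style case: catastrophic risk, low chance of
# independent rediscovery, modest protective payoff -> not defensible.
print(inquiry_is_defensible(risk=0.9,
                            independent_discovery=0.2,
                            protective_benefit=0.3))   # prints False
```

On this reading, the burden of the calculus falls on estimating L and P honestly; inflating either term is exactly the “eventual” fallacy criticized on slide 23.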
  • 27. Cultivating Moral Responsibility: Unlike the Belmont principles, which could be used to guide the development of regulatory institutions, the expanded ethical horizon I have argued for above requires individual responsibility on the part of scientists. The calculus proposed must be employed by scientists before they ever get to the point of disseminating their ideas. It is a personal, moral responsibility that must be cultivated.
  • 28. Cultivating Moral Responsibility: Nonetheless, the development and adoption of these principles, and of the notion of a broad horizon of scientific responsibility (encompassing not just individual human subjects but also humanity in general), can best be encouraged through new institutions. Legal and regulatory bodies ought to devise these institutions both within and among sovereigns. Professional organizations, too, ought to embrace and adopt ethical training of their members, understanding that scientists are citizens of broader groups whose funding and support they require. Education in principles not just of scientific integrity but also of social responsibility ought to be developed and embraced.
  • 29. Cultivating Moral Responsibility: Just as governments take it upon themselves to fund and advance research and development, both out of scientific curiosity and as a way to grow economically, so should they adopt the responsibility to educate scientists to be better citizens. As taxpayers provide for investigations into nature’s truths, sometimes with no potential for economic benefit, they must also be considered beneficiaries, or targets, of the fruits of scientific inquiry…
  • 30. We are all human subjects of certain inquiries.
  • 31. Thank you. References:
    Atlas, R. M. and Dando, M. (2006). The dual-use dilemma for the life sciences: perspectives, conundrums, and global solutions. Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science, Vol. 4, No. 3, pp. 276-286.
    Childress, J., Meslin, E., and Shapiro, H. (Eds.) (2005). Belmont Revisited: Ethical Principles for Research with Human Subjects. Washington, DC: Georgetown University Press.
    Cohen, H. W., Gould, R. M., and Sidel, V. W. (2004). The pitfalls of bioterrorism preparedness: the anthrax and smallpox experiences. American Journal of Public Health, Vol. 94, No. 10, pp. 1667-1671.
    Corneliussen, F. (2006). Adequate regulation, a stop-gap measure, or part of a package? EMBO Reports, Vol. 7, pp. S50-S54.
    Ehni, H.-J. (2008). Dual use and the ethical responsibility of scientists. Arch. Immunol. Ther. Exp., Vol. 56, pp. 147-152.
    Jones, N. L. (2007). A code of ethics for the life sciences. Science and Engineering Ethics, Vol. 13, pp. 25-43.
    Kelley, M. (2006). Infectious disease research and dual-use risk. Virtual Mentor: Ethics Journal of the American Medical Association, Vol. 8, No. 4, pp. 230-234.
    Miller, S. and Selgelid, M. J. (2008). The ethics of dual-use research. Chap. 3 in Miller (Ed.), Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological Sciences. Springer.
    Musil, R. K. (1980). There must be more to love than death: A conversation with Kurt Vonnegut. The Nation, Vol. 231, No. 4, pp. 128-132.
    Nixdorff, K. and Bender, W. (2002). Ethics of university research, biotechnology and potential military spin-off. Minerva, Vol. 40, pp. 15-35.
    Preston, R. (2003). The Demon in the Freezer. Fawcett.
    Somerville, M. A. and Atlas, R. M. (2005). Ethics: a weapon to counter bioterrorism. Science (Policy Forum), Mar. 25, p. 1881.