
Catherine M. Hammack, "Thought Leader Perspectives on Risks and Protections in Precision Medicine Research"


Part of the "2016 Annual Conference: Big Data, Health Law, and Bioethics" held at Harvard Law School on May 6, 2016.

This conference aimed to: (1) identify the various ways in which law and ethics intersect with the use of big data in health care and health research, particularly in the United States; (2) understand the way U.S. law (and potentially other legal systems) currently promotes or stands as an obstacle to these potential uses; (3) determine what might be learned from the legal and ethical treatment of uses of big data in other sectors and countries; and (4) examine potential solutions (industry best practices, common law, legislative, executive, domestic and international) for better use of big data in health care and health research in the U.S.

The 2016 annual conference of the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School was organized in collaboration with the Berkman Center for Internet & Society at Harvard University and the Health Ethics and Policy Lab, University of Zurich.

Learn more at http://petrieflom.law.harvard.edu/events/details/2016-annual-conference.



  1. Thought Leader Perspectives on Risks and Harms in Precision Medicine Research. Laura M. Beskow, MPH, PhD, Principal Investigator; Catherine M. Hammack, MA, JD; Kevin McKenna, MPH; Kathleen M. Brelsford, MPH, PhD. www.empiricalbioethics.duke.edu
  2. Beyond Data Security: Promoting Confidentiality and Advancing Science R01-HG-007733 • Laura M. Beskow, MPH, PhD – Principal Investigator • Kathleen M. Brelsford, MPH, PhD • Catherine M. Hammack, MA, JD • Ariel Hwang • Kevin McKenna, MPH • Erin C. Fuse Brown, MPH, JD • Leslie E. Wolf, MPH, JD
  3. Beyond Data Security: Promoting Confidentiality and Advancing Science R01-HG-007733 • Exploring thought leaders’ views of confidentiality-related topics at the forefront of genome research; • Analyzing the scope and limits of legal tools for protecting confidentiality in genome research, including in the context of evolving models of participant-centric data sharing; and • Developing flexible model language describing confidentiality risks and protections in genome research.
  4. Beyond Data Security: Promoting Confidentiality and Advancing Science R01-HG-007733 • Exploring thought leaders’ views of confidentiality-related topics at the forefront of genome research; • Analyzing the scope and limits of legal tools for protecting confidentiality in genome research, including in the context of evolving models of participant-centric data sharing; and • Developing flexible model language describing confidentiality risks and protections in genome research.
  5. Categories of Interviewees (current count): • Informatics – bioinformatics, clinical and medical informatics – 5 • ELSI Researcher – scholars who study ethical, legal, and social issues – 4 • Ethics – e.g., directors of centers for bioethics – 7 • Health Law – e.g., directors of centers for health law – 6 • Historically Disadvantaged Perspectives – scholars who study issues related to historically disadvantaged populations – 6 • Human Subjects – e.g., members of national committees related to human subjects protections – 7 • Federal Government – individuals in relevant positions in the federal government – 5 • Participant-centric Perspectives – recognized leaders in participant-centric approaches – 7 • Researcher – researchers in medical, bench genome sciences – 5 • Total: 52
  6. Interview topics • Risks and Harms – Risks and harms to tell family and friends – Instances of risks and harms actually occurring – Evolving risks over next 10 years • Protections – Technical data security, restricting access, and preventing misuse – Specific thoughts on Common Rule, HIPAA, and GINA • Consent – Initial reaction to Million American Study – Benefits – Risks and protections essential to convey in consent form • Non-Traditional Consent – Dynamic consent – Open consent • Risk Comparisons – Genomic analyses of biospecimens v. ongoing access to EHR v. streaming health data from mobile devices
  7. Interview topics • Risks and Harms – Risks and harms to tell family and friends – Instances of risks and harms actually occurring – Evolving risks over next 10 years • Protections – Technical data security, restricting access, and preventing misuse – Specific thoughts on Common Rule, HIPAA, and GINA • Consent – Initial reaction to Million American Study – Benefits – Risks and protections essential to convey in consent form • Non-Traditional Consent – Dynamic consent – Open consent • Risk Comparisons – Genomic analyses of biospecimens v. ongoing access to EHR v. streaming health data from mobile devices
  8. Imagine that your family members and close friends are all at a gathering together. The conversation turns to the “Million American Study” that has been in the news recently. Everyone is eager to hear your thoughts about whether they should consider signing up to be in this study. . . . 4. How would you describe to your family and friends the primary risks and harms of participating in the Million American Study? . . .
  9. Main risks and harms • Risks: – Re-identification – Objectionable use – Unknowns – Return of results • Harms: – Psychological, familial, decisional – Discrimination – Group harm – Legal implications
  10. Risk: Re-identification • Breach – Negligent – Hacking
  11. Risk: Re-identification • Breach – Negligent – Hacking
  12. Risk: Re-identification • Breach – Negligent – Hacking . . . the probability of unwanted people getting the information is close to 100 percent. . . . Because we know that every database in the world gets hacked eventually, right? 19 | Ethics
  13. Risk: Re-identification • Breach – Negligent – Hacking • Triangulation The more detailed the medical and phenotypic information, the easier it is to re-identify. The more detailed that information, the more useful it is for research. So what makes it useful for research makes re-identification easier. 14 | Health Law
  14. Risk: Objectionable use • Social, cultural, religious, individual
  15. Risk: Objectionable use • Social, cultural, religious, individual • Commercial
  16. Risk: Objectionable use • Social, cultural, religious, individual • Commercial • Healthcare systems, government Your healthcare providers and government are integrally involved in the project. It’s very likely that they would see good uses for this information. And it is furthering their own interests rather than your health interests. From looking for ways to cut costs in the healthcare system, looking for ways to improve national security, etcetera. So it’s really the internal leaks – the internal . . . sharing of the information that I’d be more worried about than external breaches of data security. 37 | ELSI Researcher
  17. Risk: Unknowns • Genetics • Social, political, cultural • Unknown unknowns As a geneticist, I know quite a lot about what are the limitations and possibilities, like what someone can infer from a genome, but I don't know what will happen 30 years from now—how this country will evolve . . . how the political climate will evolve. That . . . might create some harm. 07 | Researcher
  18. Risk: Unknowns • Genetics • Social, political, cultural • Unknown unknowns . . . I would mainly try to focus on the idea that the risks aren't known yet, and by participating in this, one of the biggest benefits to society is to help us understand those risks. But that means we'll be the first ones exposed to them. 23 | Participant-centric Perspective
  19. Risk: Return of Results • Making major medical decisions • Pursuing interventions I think the risks largely hinge on whether results are returned or not. . . . We know that a certain percentage of sequence data will have information about a significant future health risk for the individual. [If] determined to be clinically actionable . . . that might turn out to be a benefit for those individuals. On the other hand, we know that lots of folks don’t want to know predictive genetic information about themselves. Or there’s always a risk that inaccurate or uncertain information would be returned, in which case people might take actions that weren’t warranted. . . . So I think that the risks associated with this sort of thing . . . hinge really on whether results are returned. 01 | Human Subjects
  20. Harm: Psychological, familial, decisional • Return of results I would say one risk is that . . . the researchers will discover things about . . . each family member's genome that the researchers are unsure about, that doctors might not be able to use to help us, and that that could cause problems, whether it's more expensive future testing that we might want done because now all of a sudden we're worried about a particular genetic variation, or fear or anxiety, and the havoc that could cause to the family . . . if we start finding out that family members have these genes that we don't know much about or that could be worrisome. 10 | Historically Disadvantaged Perspective
  21. Harm: Discrimination • Insurance (other than health) • Employment
  22. Harm: Group Harm • Race, ethnicity • Condition • Other
  23. Harm: Group Harm • Further stigmatizing already marginalized groups
  24. Harm: Group Harm • Further stigmatizing already marginalized groups • Promoting ideas of superior, inferior groups
  25. Harm: Group Harm • Further stigmatizing already marginalized groups • Promoting ideas of superior, inferior groups • Exacerbating existing disparities
  26. Harm: Group Harm • Further stigmatizing already marginalized groups • Promoting ideas of superior, inferior groups • Exacerbating existing disparities • Non-health topics
  27. Harm: Group Harm • Further stigmatizing already marginalized groups • Promoting ideas of superior, inferior groups • Exacerbating existing disparities • Non-health topics There’s a lot of interest in using this information to try to do something about . . . the health disparities that show up between different groups in our population. But if a researcher took your information and used it to make the case that: “Well, the reason why our group has worse outcomes is because we’re genetically inferior to other groups,” then it only adds to whatever sort of social burden we’re already dealing with. And a lot of people would be upset to know that they contributed to a research project that ended up stigmatizing their community or their group. 37 | ELSI Researcher
  28. Harm: Legal implications • National Congress's relationship to NIH is a clear risk portal. Because they are relatively uneducated and erratic around science, the pursuit of research . . . and many social issues. Imagine Donald Trump . . . he's the President. He's got the Tea Party Congress, and they are making decisions that all of the DNA data that's in federal repositories should be made available to the FBI as a part of their, I don't know, bad ideas about immigration. And violence – how to solve violence is to identify people. 05 | Researcher
  29. Harm: Legal implications • Local Recently . . . a dad put his DNA on 23andMe. I don’t know how the police got it, they got it, they found a familial link and so they went after the son who lived in a different city and he had to give his DNA to prove he wasn't a suspect. 46 | Historically Disadvantaged Perspective
  30. Interview topics • Risks and Harms – Risks and harms to tell family and friends – Instances of risks and harms actually occurring – Evolving risks over next 10 years • Protections – Technical data security, restricting access, and preventing misuse – Specific thoughts on Common Rule, HIPAA, and GINA • Consent – Initial reaction to Million American Study – Benefits – Risks and protections essential to convey in consent form • Non-Traditional Consent – Dynamic consent – Open consent • Risk Comparisons – Genomic analyses of biospecimens v. ongoing access to EHR v. streaming health data from mobile devices
  31. Interview topics • Risks and Harms – Risks and harms to tell family and friends – Instances of risks and harms actually occurring – Evolving risks over next 10 years • Protections – Technical data security, restricting access, and preventing misuse – Specific thoughts on Common Rule, HIPAA, and GINA • Consent – Initial reaction to Million American Study – Benefits – Risks and protections essential to convey in consent form • Non-Traditional Consent – Dynamic consent – Open consent • Risk Comparisons – Genomic analyses of biospecimens v. ongoing access to EHR v. streaming health data from mobile devices
  32. This project was supported by a grant from the National Institutes of Health, R01-HG-007733. Beyond Data Security: Promoting Confidentiality and Advancing Science. Laura M. Beskow, MPH, PhD, Principal Investigator. The contents of this presentation are solely the responsibility of the authors and do not necessarily represent the views of NIH.
  33. Program for EMPIRICAL BIOETHICS at Duke University School of Medicine. www.empiricalbioethics.duke.edu
