This document provides guidance on how to accurately summarize and report on medical studies to avoid misrepresenting results. It emphasizes the importance of reading full studies, asking clarifying questions of authors, considering limitations and biases, disclosing conflicts of interest, and relying on outside experts rather than just study authors when evaluating results. The goal is to help readers make informed health decisions by providing coverage that reflects the evidence objectively and acknowledges uncertainty.
Role of the media in preventing or promoting overdiagnosis (Gary Schwitzer)
Seminar presentation at Preventing Overdiagnosis 2015 conference in Washington DC 9/1/15 by Gary Schwitzer, Publisher, HealthNewsReview.org & Adjunct Associate Professor, University of Minnesota School of Public Health
Advice to junior researchers: High or low road to success? (James Coyne)
A presentation from the International Psycho-Oncology Society Conference in Rotterdam invited by the IPOS Early Career Professionals Special Interest Group.
Talk at the University of Tokyo on history of Retraction Watch, our database, and current trends. Includes titles in Japanese, courtesy of Iekuni Ichikawa.
Peer review is the process used to judge the quality of articles submitted for publication in a scholarly journal. Peer-reviewed articles are considered the best sources to use when writing a research paper.
Publishers are caretakers of science. Part of that work is maintaining the integrity of scientific literature. Science builds directly upon past work, so we need to be sure that we are building upon a solid foundation and not faulty research. Publishers need to take an active role in monitoring and tracking faulty, retracted research and its influence. I'm asking publishers to (1) clearly mark retracted papers; (2) alert authors who have already cited a retracted paper; and (3) before publishing an article, check its bibliography for retracted papers.
Retracted papers should be clearly marked everywhere they appear, but today that is not the case. Publishers can also use the CrossRef CrossMark service, which lets readers check for article updates (such as retraction) from a little red ribbon at the top of an article. Checking for citations to retracted articles, and limiting future citations, can help science self-correct by shoring up its foundations.
Continued Citation of Bad Science and What We Can Do About It (Jodi Schneider)
Even papers that falsify data continue to be cited. I describe network and text analysis for studying how authors continue to cite bad science: articles retracted from the literature due to serious flaws or errors. I will present an in-depth case study of a human trial cited for over 10 years after it was retracted for falsifying data. Then, I will describe how the team scaled up to study a data set of 7000 retracted papers and hundreds of thousands of citations. Finally, I will discuss an ongoing Sloan-funded stakeholder consultation that is bringing editors, publishers, librarians, researchers, and research integrity experts together to address this problem.
Biography: Jodi Schneider is Assistant Professor at the School of Information Sciences, University of Illinois at Urbana-Champaign, where she runs the Information Quality Lab. She studies the science of science through the lens of arguments, evidence, and persuasion, with a special interest in controversies in science. Her recent work has focused on topics such as systematic review automation, semantic publication, and the citation of retracted papers. Interdisciplinarity (PhD in Informatics, MS Library & Information Science, MA Mathematics, BA Great Books/liberal arts) is a fundamental principle of her work. She has held research positions across the U.S. as well as in Ireland, England, France, and Chile. She leads the Alfred P. Sloan-funded project, Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda. With Aaron Cohen and Neil Smalheiser she is working on the NIH R01 "Text Mining Pipeline to Accelerate Systematic Reviews in Evidence-Based Medicine". Talk with her about scoping reviews and about citation-based methods for updating systematic reviews!
Tuesday, April 20th, 2021
Noon-1PM Eastern
GWU - CNHS Informatics Seminar
Introduction to the peer review workshop for the PhD students of the Wageningen Graduate Schools. The goal is to explain peer review, to entice PhD students to take part in the peer review process, and to give some tips on how to get started with peer review.
From the event "Specimen Science: Ethics and Policy Implications," held at Harvard Law School on November 16, 2015.
This event was a collaboration between The Center for Child Health and Policy at Case Western Reserve University and University Hospitals Rainbow Babies & Children’s Hospital; the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School; the Multi-Regional Clinical Trials Center of Harvard and Brigham and Women's Hospital; and Harvard Catalyst | The Harvard Clinical and Translational Science Center. It was supported by funding from the National Human Genome Research Institute and the Oswald DeN. Cammann Fund at Harvard University.
For more information, visit our website at http://petrieflom.law.harvard.edu/events/details/specimen-science-ethics-and-policy
MEJÍA G., Raúl. "Por las avenidas del modelo educativo: Modelo Educativo Siglo XXI", in the journal El Despertar Académico, Universidad del Valle de México – Campus Lomas Verdes, Year 4, No. 16, Mexico, October 2001.
This journal no longer exists.
Simplifying the Complex: Serving Data from Pipeline Data Models (Safe Software)
These tools allow non-GIS professionals to access data stored in industry-standard Pipeline Data Models. They open up data spread across the many disparate tables of these complex models, which would normally take weeks to develop as standard turnkey reports. The presentation also examines the process of moving these workbenches from the desktop to the server.
Leadless cardiac pacing
08/10/2015, 13:30-15:00, Casa del Corazón, Madrid
http://marcapasossincables.secardiologia.es
#marcapasossincables
Clinical case presentation
Dra. Marta Pachón Iglesias, Complejo Hospitalario de Toledo
Today's business environment demands that companies recognize that their image and growth are inexorably linked to the social role they play in their relationships with the board of directors, with their own employees, with their customers and suppliers, and even as shapers of public opinion.
The executive image, etiquette, and protocol course comprehensively addresses the set of rules and behaviors that govern human conduct in the company, prohibiting some and encouraging others according to what is socially accepted as courteous or discourteous.
Quito: December 9, 2016
Guayaquil: December 12, 2016
McGovern Award Lecture - American Medical Writers Association (Gary Schwitzer)
The McGovern Award is given by the American Medical Writers Association for "preeminent contributions to medical communications." It was presented at the AMWA annual conference, in Memphis, October 9, 2014.
The scandal of the £5m PACE chronic fatigue trial (James Coyne)
Talk delivered to patients with chronic fatigue/myalgic encephalomyelitis at Belfast Castle, February 7, 2016, about a trial of psychotherapy that failed to demonstrate effectiveness, despite claims to the contrary.
Bioethics lecture, UMDNJ-RWJ Medical School: "Addressing the Morass at the Int..." (Gary Schwitzer)
I began the talk by expressing my thanks and humility for being invited to speak in a lecture series that had previously hosted George Annas, Art Caplan, Robert Veatch, Linda and Zeke Emmanuel, Daniel Callahan and many others whose work I have followed and admired. I expressed my appreciation for being the first journalist to speak in the series and hoped that I would not be the last.
I noted that one previous speaker in the series had said, “In the last 30 years, our entire ethical sensitivity has increased substantially.” I began by wondering if the same could be said about increased ethical sensitivity in media messages about health care. And then I launched into my own 30-year retrospective.
I cited a few examples from the epiphany I had in 1984 as a reporter whose eyes were opened to the hype and misinformation disseminated on AIDS, the artificial heart, and Alzheimer's. Then I transitioned to a reflection on how the same or similar issues are covered today. I offered only a few examples; it would have been a 5-hour talk if I'd made the list more complete.

CNN, not coincidentally, is cited in many of the examples, some of them from my own first-hand experience: from the ‘80s, the network insisting on hourly live reports of artificial heart patient updates, and the hyping of a trial in 4 Alzheimer’s patients; in ’90, the hype of an AIDS patient (or was he?) claiming cure from a hyperthermia experiment; then, in the current era, CNN lending credence to cloning claims by a UFO-obsessed sect, and claiming an “exclusive” and “breakthrough” on a hospital news release claiming a cancer cure was within reach.

The talk emphasized shared responsibilities on the part of all who communicate about medical research and health care claims. It touched on the imbalance in many media messages about screening tests, with journalists sometimes crossing the line from independent vetting into non-evidence-based advocacy. I cited the Statement of Principles of the Association of Health Care Journalists (which I wrote). The talk also pointed to how medical journals can be complicit in the miscommunication of findings, and to how many articles are now being published in journals raising questions about “spin,” bias, interpretation, and word choice.
Knight Science Journalism Fellowships at MIT Medical Evidence Boot Camp 2013 (Gary Schwitzer)
This was, I believe, the fifth time I've been asked to speak at this event in Cambridge. Other speakers: Drs. Steven Woloshin and Lisa Schwartz of Dartmouth, Dr. Barry Kramer of NCI, and Dr. Marty Makary of Johns Hopkins.
My June 14, 2017 talk at the Friends of the National Library of Medicine conference, "Consequential Clinical Research Accelerating Continuous Improvement"
We Are All Gatekeepers Now: Scrutiny of Science Goes Public (Ivan Oransky)
Workshop presentation for Society for Scholarly Publishing's 2013 annual meeting in San Francisco, June 5. An update of a presentation I gave in October: http://www.slideshare.net/ivanoransky/scrutiny-of-science-goes-public
Covering Medical Studies: How Not to Get It Wrong
1. Covering Medical Studies: How Not to Get It Wrong
AHCJ
Boston, March 2013
Ivan Oransky, MD
Executive Editor, Reuters Health
Co-Founder, Retraction Watch
@ivanoransky
4. Put Down That Coffee!
STUDY LINKS COFFEE USE TO PANCREAS CANCER
New York Times, March 12, 1981
“Although the statistical association does not prove that coffee causes cancer, Dr. Brian MacMahon of Harvard, leader of the research group, said he stopped drinking coffee a few months ago when the results of the study became clear. In a telephone interview, he said that he would not presume to advise others.”
6. Or Get A Refill
CRITICS SAY COFFEE STUDY WAS FLAWED
New York Times, June 30, 1981
“This otherwise excellent paper may be flawed in one critical way,” said a letter from Dr. Steven Shedlofsky of the Veterans Administration Hospital in White River Junction, Vt. He questioned the comparison of pancreatic cancer patients with persons hospitalized for noncancerous diseases of the digestive system.
7. Or Get A Refill
CRITICS SAY COFFEE STUDY WAS FLAWED
New York Times, June 30, 1981
“Such patients, he noted, might be expected to give up coffee drinking because of their illness. This, he argued, would tilt the proportion of coffee drinkers away from the ‘control’ group who were being compared with the cancer patients. Amplifying the letter in an interview, Dr. Shedlofsky said many patients with digestive diseases give up coffee because they believe it aggravates their discomfort, and others do so because their doctors have advised them to.”
9. We Cured Cancer 15 Years Ago
The New York Times, May 3, 1998
“Within a year, if all goes well, the first cancer patient will be injected with two new drugs that can eradicate any type of cancer, with no obvious side effects and no drug resistance -- in mice.”
…
“Judah is going to cure cancer in two years,” said Dr. James D. Watson, a Nobel laureate who directs the Cold Spring Harbor Laboratory, a cancer research center on Long Island. Dr. Watson said Dr. Folkman would be remembered along with scientists like Charles Darwin as someone who permanently altered civilization.
10. Or Maybe We Didn’t. Here’s Why.
The New York Times, February 11, 2013
11. Or Maybe We Didn’t. Here’s Why.
The New York Times, February 11, 2013
“The study’s findings do not mean that mice are useless models for all human diseases. But, its authors said, they do raise troubling questions about diseases like the ones in the study that involve the immune system, including cancer and heart disease.”
12-14. Do You Like Being Wrong?
5,000 compounds started out for the market.
How many made it to clinical (human) trials? 5
How many of those made it to FDA approval? 1
Source: http://www.phrma.org/issues/intellectual-property (PhRMA)
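Those funnel numbers can be sanity-checked with a little arithmetic (a sketch using only the 5,000/5/1 figures from the PhRMA source above):

```python
# Drug-development funnel from the PhRMA figures cited above:
# 5,000 compounds screened -> 5 reach clinical trials -> 1 is approved.
screened, in_trials, approved = 5000, 5, 1

# Probability a screened compound ever reaches human trials.
p_trials = in_trials / screened                   # 0.001, i.e. 0.1%

# Probability a compound already in trials wins FDA approval.
p_approval_given_trials = approved / in_trials    # 0.2, i.e. 20%

# Overall odds for a compound at the starting line.
p_overall = approved / screened                   # 0.0002, i.e. 0.02%

print(f"{p_trials:.2%} reach trials; {p_approval_given_trials:.0%} of those "
      f"are approved; {p_overall:.3%} overall")
```

In other words, even a drug that has made it into trials still has odds far short of certainty, which is why writing about it as a done deal gets it wrong.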
15. How to Get It Wrong
• Write about compounds in pre-clinical trials as if they were about to be on pharmacy shelves
• Write about every drug in phase I and phase II trials as if it would definitely be approved
16. How Often Are Studies Wrong?
Ioannidis JPA. PLoS Med 2005; 2(8): e124
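The core of the Ioannidis argument can be reproduced with his positive-predictive-value formula, PPV = (1−β)R / ((1−β)R + α), where R is the pre-study odds that the tested relationship is real. A minimal sketch (the example odds are illustrative, not from the paper):

```python
def ppv(pre_study_odds, alpha=0.05, power=0.8):
    """Chance that a statistically significant finding is actually true.

    pre_study_odds (R): odds that the tested relationship is real.
    Formula from Ioannidis, PLoS Med 2005: PPV = (1-beta)R / ((1-beta)R + alpha).
    """
    r = pre_study_odds
    return (power * r) / (power * r + alpha)

# Well-motivated confirmatory hypothesis (1:1 pre-study odds):
print(f"R = 1:1   -> PPV = {ppv(1.0):.0%}")   # most positives are real
# Exploratory fishing expedition (1:100 pre-study odds):
print(f"R = 1:100 -> PPV = {ppv(0.01):.0%}")  # most positives are false
```

The point for reporters: a "significant" result from a long-shot hypothesis is still, more often than not, wrong.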
22. Always Read the Study
Get the full study and read it. “I think it’s journalistic malpractice to not have the full study in front of you when you’re reporting,” Oransky says.
23. How to Get Studies
• www.EurekAlert.org for embargoed material
• AHCJ membership includes access to Cochrane Library, Health Affairs, JAMA, and many other journals: www.healthjournalism.org
• ScienceDirect (Elsevier) gives reporters free access to hundreds of journals: www.sciencedirect.com
• Open access journals (e.g., Public Library of Science, www.plos.org)
• Ask press officers, or the authors
24. Ask “Dumb” Questions
If you lack experience dealing with scientific material, don’t be afraid to ask for definitions of jargon and scientific terms. This is no time to pretend you understand everything. Oransky says the science and medical industries are full of jargon that masks important details. “You’ll get off the phone and have a notebook full of gibberish and jargon,” he says. “You can’t be afraid of asking a dumb question.”
25. Ask Smart Questions
• Was it:
– Peer-reviewed?
– Published? Where? Not all journals are created equal.
“Dr. X said they published in Y rather than a clinical journal because the paper was too long for the word limits in the clinical journals. I'm not sure where a detail like that would go… but he was impressed with my question.”
26. Ask Smart Questions
• Was it in humans?
– It’s remarkable there are any mice left with cancer, depression, or restless leg syndrome
27. Ask Smart Questions
• Size matters
Look for the power calculation, and ask if you don’t see one
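For readers who want to see what a power calculation involves, here is a minimal sketch using the standard normal-approximation formula for comparing two proportions (the 20% and 15% event rates are hypothetical examples, not from any study in the deck):

```python
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Approximate patients per arm to detect event rates p1 vs p2
    (two-sided two-proportion test, normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_a + z_b) ** 2 * variance / (p1 - p2) ** 2
    return int(n) + 1  # round up to a whole patient

# Hypothetical example: detecting a drop in event rate from 20% to 15%
# takes roughly 900 patients per arm -- a small trial simply can't see it.
print(n_per_group(0.20, 0.15))
```

This is why "size matters": a trial with 50 patients per arm was never capable of detecting a modest effect, significant or not.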
28. Ask Smart Questions
• Was it well-designed?
From Covering Medical Research, Schwitzer/AHCJ
29. Ask Smart Questions
• “Were those your primary endpoints?”
• “Looks as though that endpoint reached statistical significance. Is that difference clinically significant?”
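The statistical/clinical distinction matters because, with a large enough sample, even a trivial difference clears the significance bar. A sketch of a pooled two-proportion z-test with hypothetical mega-trial numbers:

```python
from math import sqrt
from statistics import NormalDist

def two_prop_p_value(x1, n1, x2, n2):
    """Two-sided p-value for a difference in two proportions (pooled z-test)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical mega-trial: 10.0% vs 10.5% event rates, 100,000 per arm.
p = two_prop_p_value(10_000, 100_000, 10_500, 100_000)
print(f"p = {p:.4f}")  # comfortably "significant"...
# ...yet the absolute difference is only 0.5 percentage points --
# the clinical-significance question still has to be asked.
```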
30. Read the Discussion
Good journals will insist that authors include limitations.
Read accompanying editorials, too.
31. What’s Your Angle?
• Are you trying to help readers, listeners, and viewers make better health care decisions?
• Covering a study because it has a good business angle, or it’s about a local project, is perfectly OK, but it doesn’t mean readers deserve less evidence and skepticism
32. Who Could Benefit?
• How many people have the disease?
• Keep potential disease-mongering in mind
33. How Effective is the Treatment?
• Clinically significant endpoints, or surrogates – does this matter?
• Preventing complications? How many?
• Always remember to quantify results, not just “patients improved”
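One concrete way to quantify rather than just saying "patients improved" is to report the absolute risk reduction and number needed to treat alongside any relative figure. A sketch with hypothetical event rates:

```python
def quantify(control_rate, treated_rate):
    """Absolute risk reduction, relative risk reduction, number needed to treat."""
    arr = control_rate - treated_rate  # absolute risk reduction
    rrr = arr / control_rate           # relative risk reduction
    nnt = 1 / arr                      # number needed to treat
    return arr, rrr, nnt

# Hypothetical trial: events fall from 2% (control) to 1% (treatment).
arr, rrr, nnt = quantify(0.02, 0.01)
print(f"'Cuts risk by {rrr:.0%}' -- but the ARR is only {arr:.1%}, "
      f"so about {nnt:.0f} patients must be treated to prevent one event")
```

The relative figure ("cuts risk in half") and the absolute figure (one percentage point) describe the same trial; readers deserve both.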
34. What Are The Side Effects?
• Every treatment has them
• Where to look:
– Go beyond press releases and abstracts
– Look at tables, charts, and results sections
35. Who Dropped Out?
• Why did they leave the trial?
• Intention-to-treat analysis
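Dropouts matter because excluding them can flatter the treatment. A sketch of the difference between per-protocol and intention-to-treat analysis, with hypothetical trial data:

```python
# Hypothetical treatment arm: 100 randomized, 20 dropped out (none improved),
# and 48 of the 80 completers improved.
randomized, dropouts, improved = 100, 20, 48

# Per-protocol: counts only completers -- looks better.
per_protocol = improved / (randomized - dropouts)  # 0.60

# Intention-to-treat: everyone randomized stays in the denominator.
itt = improved / randomized                        # 0.48

print(f"per-protocol: {per_protocol:.0%}, intention-to-treat: {itt:.0%}")
```

Asking why people left the trial, and which denominator the authors used, is often where the real story is.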
36. How Much Does it Cost?
• If it’s ready to be the subject of a story,
someone has projected the likely cost and
market.
– At least ask.
37. Who Has an Interest?
• Disclose conflicts
• PharmedOut.org
• Dollars For Docs series
http://projects.propublica.org/docdollars/
38. Are There Alternatives?
• Did the study compare the new treatment to existing alternatives, or to placebo?
• What are the advantages and disadvantages (and costs) of those existing alternatives?
• Consider alternative explanations. Remember coffee and pancreatic cancer?
39. Don’t Rely Only on Study Authors
• Find outside sources. Here’s how:
40. Use Anecdotes Carefully
• Is the story representative?
• Does the source of the story have any conflicts?
41. Watch Your Language
• Lifestyle/diet – are they randomized controlled trials, or just observational?
• If observational, make the language fit the evidence:
– YES: “tied,” “linked”
– NO: “reduces,” “causes”
42. A Dirty Little Secret
Keep a biostatistician in your back pocket
Photo by Peyri Herrera, on Flickr