
Applying criteria



This is the presentation I made to the National Cancer Institute's Cancer Research in the Media workshop for Latin American journalists in Guadalajara on November 8, 2011. It offers step-by-step advice on each of the 10 criteria we apply when reviewing health care news stories about treatments, tests, products & procedures.



  1. Applying criteria. Gary Schwitzer, Publisher
  2. Availability
     • Sometimes stories seem to suggest that something is imminently available when, in fact, it may be years away. There may be a lot of marketing hype involved.
     • Whenever you hear a prediction about something “expected to be approved” in a certain time period, ask: what is the basis for that prediction? There needs to be some sense of the history of the thousands of exciting ideas that never panned out. Does the person making this prediction stand to benefit in some way?
     • With devices and procedures, did you consider the availability of trained personnel to deliver the approach? The learning curve? These are important issues that may severely limit availability and adoption, and you can address them in just a few more words.
     • Finally, when you hear about new approaches, such as new devices or new operations, ask how widespread the approach is. No one wants to be the first patient treated with a new device or a new operation.
  3. Costs
     • It’s not sufficient to write, "The cost is much lower.” What is that cost? How much lower?
     • It may be difficult to estimate the costs of an experimental approach that is very early in its development. But can you at least cite the costs of existing alternatives? Is the new approach comparable to other approaches whose costs you can cite? We believe: if it’s not too early to talk about how well something might work, then it’s not too early to start discussing what it may cost.
     • A recent study reported: “Perhaps as many as one in five people don’t take drugs a doctor has prescribed because they can’t pay for them.” This is why cost information is vital. Yet 70% of the 1,600 stories we’ve reviewed get unsatisfactory grades on the cost criterion.
  4. Disease-mongering
     • Question prevalence estimates. Scrutinize statistics. Where did they come from?
     • Are you using interviews with “worst-case” scenario patients, holding such patients up as examples as if they were representative of all with this condition?
     • Have you framed surrogate markers or intermediate endpoints (test scores, blood values, etc.) as if they were the outcomes that people should really care about: improved quality of life and longevity?
  5. Evaluating the evidence
     • Confusing causation with association; failing to explain the limitations of observational studies.
     • Failing to emphasize the limitations of small, short-term studies.
     • Failing to include the study dropout rate. Why did participants drop out?
     • Presenting findings from an animal or lab experiment without cautioning that their applicability to human health may be limited.
     • Not explaining that abstracts presented at conferences undergo only limited peer review and should be scrutinized very carefully.
     • Presenting anecdotes as evidence of a treatment’s harms or benefits rather than as a single illustration of its use.
  6. Harms
     • It’s not sufficient to report on “common side effects.” How common? 40-50% of patients? It is important to quantify.
     • Don’t accept promoters’ claims that “minimally invasive” automatically means safer. Demand the data. Compare with existing alternative options. Seek independent perspectives.
     • Don’t overlook “minor” side effects that could have a significant impact on a person’s life.
     • Screening tests cause harm, a fact journalists commonly overlook. Don’t minimize the anxiety caused by false positives. And though a blood test itself may carry little risk, its downstream consequences can carry real ones; for example, complications from a surgical biopsy following a prostate cancer blood test.
  7. True novelty of the idea
     • In health care, newer isn’t always better. In fact, sometimes it isn’t even really “newer.”
     • Many “new” treatments, tests, products or procedures are not really novel. A “new” drug may be just another member of a well-established class of drugs.
     • Even if a drug is the first in a new class, it may offer no more than drugs that are already widely available.
     • Drug companies are very good at promoting their new drugs as “novel” in order to increase initial sales.
     • The website is a good source of information about other studies underway for a specific treatment or a particular condition. It can help you judge whether something is truly innovative.
     • Another good resource for assessing novelty: you can enter a key word and soon establish whether something is unique and, if not, how long it has been around and studied.
  8. Benefits
     • Risk reduction benefits should usually be reported in absolute terms, not just relative terms.
     • It is insufficient to write “significantly increased.” What does that mean? How was it measured? Avoid vague terms.
     • Statistical significance may not equal clinical significance. What difference did it make in people’s lives?
     • The plural of anecdote is not data. Patient stories are not data, and personal stories may overwhelm readers’ critical thinking.
     • Reporting only surrogate markers or intermediate endpoints (for example, changes in blood test scores) may not reflect a true benefit and may not influence individual health outcomes. What difference did the intervention make in people’s lives?
  9. Did the story rely on a news release?
     • News releases can be valid sources of some information. But journalism is charged with independently vetting claims, so it is unacceptable to rely on a news release as the sole source of information.
     • There are many vested interests in health care trying to influence consumer choices. We expect journalism to use independent verification, not to rely on news releases or company spokesmen.
     • Again, industry isn’t the only guilty party. Dartmouth Medical School researchers concluded in a study: “Press releases from academic medical centers often promote research that has uncertain relevance to human health and do not provide key facts or acknowledge important limitations.”
  10. Story sources
      • Who’s promoting this? Do they have a conflict of interest?
      • We believe that you always need second opinions from independent experts who have no vested interest in the research.
      • Over-enthusiastic claims are not made solely by drug companies or medical device manufacturers. Academic medical centers are often just as guilty.
      • Frequent examples of potential conflicts of interest in people making claims about new treatments, tests, products or procedures include:
        o a trial paid for by the drug manufacturer;
        o researchers employed by, or receiving fees from, a drug company;
        o a spokesman for a device manufacturer;
        o doctors who are early adopters and true believers in a new idea;
        o an inventor.
      • All of these people want their product or their idea to look as good as it can.
  11. Alternative options
      • Did you discuss the harms and benefits of the new idea compared with the harms and benefits of existing approaches?
      • Did you provide some sense that the evidence base for existing approaches is inevitably larger than for the new approach?
      • A story should not focus on a surgical approach while never mentioning nonsurgical options or prevention.
      • Stories should always consider the option of doing nothing: “watchful waiting” or “active surveillance” alone.
      • Stories about screening tests should mention other screening options, including the option of not being screened.
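The absolute-versus-relative point in the Benefits criterion is worth working through with numbers. The sketch below uses hypothetical figures (invented for illustration, not from the presentation) to show how the same trial result sounds very different depending on how it is framed:

```python
# Hypothetical trial (numbers invented for illustration): 2% of control
# patients and 1% of treated patients suffer the bad outcome. The headline
# "risk cut in half!" is arithmetically true, but the absolute picture differs.
control_rate = 0.02    # 2 events per 100 untreated patients
treatment_rate = 0.01  # 1 event per 100 treated patients

# Relative risk reduction: the headline-friendly "50% lower risk"
relative_risk_reduction = (control_rate - treatment_rate) / control_rate

# Absolute risk reduction: just 1 fewer event per 100 patients
absolute_risk_reduction = control_rate - treatment_rate

# Number needed to treat: how many patients must be treated to prevent one event
number_needed_to_treat = 1 / absolute_risk_reduction

print(f"Relative risk reduction: {relative_risk_reduction:.0%}")   # 50%
print(f"Absolute risk reduction: {absolute_risk_reduction:.1%}")   # 1.0%
print(f"Number needed to treat:  {number_needed_to_treat:.0f}")    # 100
```

A story that reports only the 50% relative figure leaves readers unable to judge that 100 patients would need treatment to prevent a single event, which is exactly why the criterion asks for absolute terms.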