THE FILE DRAWER PROBLEM
...and how innovative publishing solutions can help
DEFINITION & HISTORY

First termed the “File Drawer Problem” by Rosenthal (1979), but recognised as a problem since at least 1959 (Sterling)

Despite repeated hand-wringing over the years, publication bias is still rampant

The current publishing system is a major culprit
IS IT EVEN A PROBLEM?
Is the file drawer problem really a problem in vision science?

Anecdotal evidence suggests yes

Some effects "known to be dodgy" but largely by hearsay

This disadvantages those without access to networks

Hands up if your file drawer is empty

Examples?
RECENT BUZZ
• Corollary 2: The smaller the effect sizes in a scientific field, the less likely the research findings are to be true.

• Corollary 4: The greater the flexibility in designs, definitions, outcomes, and analytical modes in a scientific field, the less likely the research findings are to be true. (The sketch after this list illustrates the mechanism.)

• Corollary 5: The greater the financial and other interests and prejudices in a scientific field, the less likely the research findings are to be true.

• Corollary 6: The hotter a scientific field (with more scientific teams involved), the less likely the research findings are to be true.
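These corollaries trace back to the same arithmetic that drives the file drawer problem: selective reporting of whatever reaches significance. As a rough illustration of Corollary 4 (a hedged sketch of my own in Python, not from the talk), the simulation below assumes every lab studies a genuinely null effect but tries several outcome measures, and only a significant one leaves the file drawer; the share of labs holding a "publishable" result then sits far above the nominal 5%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

n_labs = 10_000       # hypothetical labs, each studying a truly null effect
n_per_group = 20      # participants per group
n_outcomes = 5        # outcome measures tried per study (analytic flexibility)
alpha = 0.05

published = 0
for _ in range(n_labs):
    # No real effect: both groups come from the same distribution.
    # Each lab tests several independent outcomes and keeps the best p-value.
    p_values = [
        stats.ttest_ind(rng.normal(0, 1, n_per_group),
                        rng.normal(0, 1, n_per_group)).pvalue
        for _ in range(n_outcomes)
    ]
    if min(p_values) < alpha:      # only the "hit" leaves the file drawer
        published += 1

print(f"Nominal false-positive rate per test: {alpha:.0%}")
print(f"Share of labs with a publishable 'effect': {published / n_labs:.0%}")
# Expected to be roughly 1 - (1 - alpha)**n_outcomes, i.e. about 23%, not 5%.
```

With five independent tests per study, about 1 - 0.95^5 (roughly 23%) of these null studies yield something significant to report; the rest stay in the drawer.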
BARRIERS TO PUBLICATION OF NULL RESULTS & REPLICATIONS

No glory in publishing a replication

Few journals publish replications

Usually an uphill battle even with those that do

The wrath of the original researcher
BARGHED!
(a reference to John Bargh's public response to a failed replication of his priming work)
WHAT CAN WE DO?

PsychFileDrawer?

Post-publication peer review?

Submit only intro & methods, similar to clinical trials registration?

Reproducibility Project / PLoS ONE

More radical: arXiv, self-publishing

Favour publication of negative results
THE CURRENT SYSTEM IS WASTEFUL
Proposed solution: J-REPS
Journal of Registered Evidence in Psychological Science
(as proposed by Dan Simons and Alex Holcombe)

1. Authors plan a replication study

2. They submit an introduction and methods section

3. It is sent to reviewers, including the targeted (original) author

4. The editor decides whether to accept/reject, based on:

   1. Reviewer comments regarding the proposed protocol

   2. Importance of the study, judged by the argument in the introduction, the number of citations of the original, and reviewer comments

5. The intro, method and analysis plan, and reviewer comments are posted on the journal website

6. When the results come in, the authors write a conventional results and discussion section; that, together with the raw data, is posted, yielding the complete publication (a good incentive for researchers)
Comprehensive solution? Open Science

• Funders may eventually demand that:
  • Any data collected with their money be made available
  • Experiment software code be posted

• As data come in, post them on the web
  • e.g. an electronic lab notebook (a minimal sketch follows below)

• Papers written via open collaborative documents on the web
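To make "as data come in, post them on the web" concrete, below is a minimal sketch of writing each session to a plain CSV with a small metadata sidecar, ready to push to a public repository or an electronic lab notebook. The directory layout, field names, and helper function are my own assumptions for illustration, not any established standard.

```python
import csv
import datetime
import json
import pathlib

def save_session(subject_id, trials, out_dir="open_data"):
    """Write one session's trial data plus minimal metadata for public posting."""
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%dT%H%M%S")

    # Trial-level data as plain CSV: one row per trial, human- and machine-readable.
    data_file = out / f"{subject_id}_{stamp}.csv"
    with open(data_file, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["trial", "condition", "response", "rt_ms"])
        writer.writeheader()
        writer.writerows(trials)

    # Sidecar metadata so the file is interpretable without the experiment code.
    meta = {"subject": subject_id, "saved": stamp,
            "task": "example contrast-detection task", "software": "custom script"}
    (out / f"{subject_id}_{stamp}.json").write_text(json.dumps(meta, indent=2))
    return data_file

# Example usage with a made-up trial:
save_session("S01", [{"trial": 1, "condition": "high", "response": "yes", "rt_ms": 412}])
```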
DISCUSSION POINTS

What are the problems with existing and proposed models?

What are possible solutions to these problems?

How can we change the prevailing culture?

What are the exciting possibilities for vision science?
Thanks

• Thanks to Alex Holcombe for much of the material in this talk

• Thanks to all my co-presenters

• Thank you all for coming and contributing!

• Slides will be on SlideShare very shortly and are open source, so feel free to use them
How publication practices distort science

Economic term: Winner's curse
Meaning: The winner in an auction tends on average to have overpaid, especially when no participant is sure exactly how valuable the item is.
Analogy in scientific publication: Scientific studies try to find true relationships, but none are certain of what these relationships are exactly. Published articles, especially in very competitive journals, have on average exaggerated results. (A simulation sketch follows after this table.)

Economic term: Oligopoly
Meaning: A market where a few traders have the major share and each oligopolist has significant power to influence the market.
Analogy in scientific publication: Very few journals with limited publication slots (compared with geometrically increasing scientific data that seek publication) determine highly visible science.

Economic term: Herding
Meaning: Follow-the-leader behaviour: the actions of the first or dominant player supersede the individual information and actions of all the players in a market.
Analogy in scientific publication: Scientists may uncritically follow paths of investigation that are popularised in prestigious publications, neglecting novel ideas and truly independent investigative paths.
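The winner's-curse row above has a direct statistical reading: when only significant results get published, the published effect sizes are, on average, exaggerated. Here is a minimal sketch of my own (made-up numbers, assuming a small true effect and under-powered studies), not anything from the slides:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

true_effect = 0.2        # small true effect, in standard-deviation units
n_per_group = 25         # a typically under-powered study
n_studies = 20_000

all_estimates, published_estimates = [], []
for _ in range(n_studies):
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(true_effect, 1.0, n_per_group)
    estimate = treated.mean() - control.mean()             # observed effect
    all_estimates.append(estimate)
    if stats.ttest_ind(treated, control).pvalue < 0.05:    # "significant"
        published_estimates.append(estimate)                # escapes the file drawer

print(f"True effect:                     {true_effect:.2f}")
print(f"Mean estimate, all studies:      {np.mean(all_estimates):.2f}")
print(f"Mean estimate, 'published' only: {np.mean(published_estimates):.2f}")
# The published mean is typically several times the true effect at this sample size.
```

Selection on significance alone is enough to produce the inflation; no individual study needs to be biased.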
How publication practices distort science (continued)

Economic term: Artificial scarcity
Meaning: Restrictions on the provision of a commodity above that expected from its production cost.
Analogy in scientific publication: Print page limits are an obvious excuse for failure to accept articles, and the small number of major high-impact journals have limited slots; extremely low acceptance rates provide status signals to successful publications and their authors.

Economic term: Uncertainty
Meaning: A situation where the real long-term value of a commodity is largely unpredictable.
Analogy in scientific publication: For much (most?) scientific work, it is difficult or impossible to immediately predict future value, extensions, and practical applications.

Economic term: Branding
Meaning: Marking a product as valuable; of key importance when it is difficult to determine a product's value prior to consuming it.
Analogy in scientific publication: Publishing in selective journals provides evidence of the value of a research result and its authors, independent of the manuscript's content.

Editor's Notes

  • #5 Posner cueing paradigm; figure/ground sf effect, Wong & Weisstein (Naomi); attention and alpha; Burr's spatiotopic fMRI stuff (don't actually mention this)
  • #12 I didn't have room for all of these, so I left out the ones I didn't like (like the one about small sample sizes).
    Alex's notes:
    - I hope this disturbs you! It certainly was very disturbing to me!
    - If you haven't read stuff by John Ioannidis before, I urge you to do so, because he's got a pretty good argument. He makes a series of calculations; the answer depends on how many tests people are doing of possible effects, what the probability is of any particular thing tested being true, the average size of studies in the field, etc.
    - I'm not going to go through it, but to give you a taste: Corollaries 4 and 6, both of which are related to the file-drawer problem.
    NEXT:
    - Corollary 4: Since everybody is trying to get a positive result so they can publish it, people tend to fish around doing lots of analyses and ...
    - Corollary 6: Related to the file-drawer problem. This happens in areas where lots of people are testing related things, but understandably only publish the positive findings. So occasionally a team will get a significant result, which often might be a Type I error (that is, it happened due to chance rather than being a real effect). Then some other teams in the area will jump on that and try to build on it. Probably they won't be able to replicate it, and hopefully they'll manage to publish that non-replication somehow, in which case the field will swing to some other area when somebody else finds a different significant result...
    - So publishing of replications is sorely needed. One of the replies to the Ioannidis article relates to this, and I find it kinda funny:
    - "I'm in an absurd system of science that pushes me to publish my own Type I errors and not correct those of others." #publicationBias
    - It shouldn't be this way; science should be a model human endeavour.
    ------------------
    Bem, D. J. (2011). Feeling the future: Experimental evidence for anomalous retroactive influences on cognition and affect. Journal of Personality and Social Psychology, 100, 407-425.
    Eliot Smith of Indiana University in Bloomington, the Journal of Personality and Social Psychology editor who handled the submitted paper, declined to send it out to review. "This journal does not publish replication studies, whether successful or unsuccessful," he wrote. http://www.newscientist.com/article/dn20447-journal-rejects-studies-contradicting-precognition.html
  • #25 & #26 (same note for both slides): Supply and demand is a problem. Supply now far outstrips demand, so publishing is getting harder. What criteria are used to judge value? Positive results are still massively favoured, and trials that look as though they'll turn out negative are abandoned as wasteful.
    Constriction on the demand side: prominence of very few journals. Being selective boosts circulation (like Birkin bags?). Impact factors are a selling point.
    Preferred publication of negative over positive results has been suggested, with print publication favoured for all negative data (as more likely to be true) and for only a minority of the positive results that have demonstrated consistency and reproducibility [54].
    To exorcise the winner's curse, the quality of experiments rather than the seemingly dramatic results in a minority of them would be the focus of review, but is this feasible in the current reality?