Ciaran O'Neill & Amye Kenall: Peering into review - Innovation, credit & reproducibility

Talk 1 in the "What Bioinformaticians need to know about digital publishing beyond the PDF2" workshop at ISMB 2014, Boston, 16th July 2014.


  1. Peering into review: Innovation, credit & reproducibility. Ciaran O'Neill & Amye Kenall
  2. www.biomedcentral.com/biome
  3. Pitfalls of traditional peer review: inconsistency, bias, favouritism, abuse, burden on researchers, slowness
  4. Open peer review (medical journals)
  5. "our goal is unapologetically ambitious: to establish a new system of peer review to bolster productive scientific debate and to provide scientists with useful guides to the literature" (Launch editorial: Eugene Koonin, David Lipman, Laura Landweber)
  6. (no text on slide)
  7. ~50% of reviewers disclose their name; ~80% of authors make the reports public
  8. (no text on slide)
  9. Decoupling peer review from the journal
  10. Post-publication peer review
  11. Community review: post-publication commenting, open to authors already in PubMed
  12. (no text on slide)
  13. (no text on slide)
  14. I thought these were peer reviewed? Problems in reproducibility
  15. Out of 18 microarray papers, results from 10 could not be reproduced. References: 1. Ioannidis et al. (2009). Repeatability of published microarray gene expression analyses. Nature Genetics 41: 14. 2. Ioannidis JPA (2005). Why Most Published Research Findings Are False. PLoS Med 2(8).
  16. #overlyhonestmethods
  17. How to combat this? (... from the journal side)
  18. Dynamic Document Technology
  19. (no text on slide)
  20. Journal + database + computational tools
  21. Reproducibility Starts with Peer Review
  22. • Repository of standardised and annotated multielectrode array data from mice and ferrets
      • 366 recordings from 12 studies
      • Authors submitted in knitr (a minimal sketch of such a document follows the transcript)
      • Aided the review process, allowing reviewers to rerun analyses
      • Authors reported it saved time, having a "natural record" of what you did
      • Automatic updating of text you might overlook (e.g. figure legends)
  23. Some testimonials for knitr.
      Authors (Wolfgang Huber): “I do all my projects in Knitr. Having the textual explanation, the associated code and the results all in one place really increases productivity, and helps explaining my analyses to colleagues, or even just to my future self.”
      Reviewers (Christophe Pouzat): “It took me a couple of hours to get the data, the few custom developed routines, the “vignette” and to REPRODUCE EXACTLY the analysis presented in the manuscript. With few more hours, I was able to modify the authors’ code to change their Fig. 4. In addition to making the presented research trustworthy, the reproducible research paradigm definitely makes the reviewer’s job much more fun!”
  24. How to Scale?
  25. Back to #overlyhonestmethods
  26. Let’s delegate!
  27. (no text on slide)
  28. Why stop at publication? More commenting? Bring debate back to the journal? DOIs for comments?
  29. Questions? Amye Kenall, Journal Development Manager (Open Data), BioMed Central, @AmyeKenall, amye.kenall@biomedcentral.com. Ciaran O’Neill, Associate Publisher, BioMed Central, @cjmoneill, ciaran.o’neill@biomedcentral.com
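As noted on slide 22, authors submitted their analyses as knitr documents, which keep the narrative text, the analysis code, and the generated figures in a single source file that reviewers can re-run. The block below is only a rough sketch of what such a source file can look like, not the authors' actual submission; the file name mea_recordings.csv and the column firing_rate are invented for illustration.

    ---
    title: "Multielectrode array analysis (illustrative sketch)"
    output: pdf_document
    ---

    ```{r load-data}
    # Hypothetical data file and column name, for illustration only
    recordings <- read.csv("mea_recordings.csv")
    ```

    This report covers `r nrow(recordings)` recordings. Because that number
    is computed from the data, the sentence updates automatically whenever
    the data change.

    ```{r firing-rate-figure, fig.cap=paste("Firing rates across", nrow(recordings), "recordings.")}
    # The figure caption above is also built from the data, so it cannot
    # drift out of sync with the analysis
    hist(recordings$firing_rate, xlab = "Firing rate (Hz)", main = "")
    ```

Knitting such a file re-runs the code against the deposited data and regenerates every number, figure, and caption, which is what allowed the reviewer quoted on slide 23 to reproduce and then modify the analysis.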
