Slides for the Feb 8, 2017 lab meeting of Roshan Cools' Motivation & Cognitive Control group (Donders Institute), discussing the following paper:
McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., … Yarkoni, T. (2016). How open science helps researchers succeed. eLife, 5, e16800. https://doi.org/10.7554/eLife.16800.
2. POINT OF VIEW
How open science helps
researchers succeed
Abstract
Open access, open data, open source and other open scholarship practices are growing in popularity and necessity. However, widespread adoption of these practices has not yet been achieved. One reason is that researchers are uncertain about how sharing their work will affect their careers. We review literature demonstrating that open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities. These findings are evidence that open research practices bring significant benefits to researchers relative to more traditional closed practices.
DOI: 10.7554/eLife.16800.001
ERIN C MCKIERNAN*, PHILIP E BOURNE, C TITUS BROWN, STUART BUCK, AMYE KENALL, JENNIFER LIN, DAMON MCDOUGALL, BRIAN A NOSEK, KARTHIK RAM, COURTNEY K SODERBERG, JEFFREY R SPIES, KAITLIN THANEY, ANDREW UPDEGROVE, KARA H WOO AND TAL YARKONI
Introduction

Recognition and adoption of open research practices is growing, including new policies that increase public access to the academic literature (open access; Björk et al., 2014; Swan et al., 2015) and encourage sharing of data (open data; Heimstädt et al., 2014; Michener, 2015; Stodden et al., 2013), and code (open source; Stodden et al., 2013; Shamir et al., 2013). Such policies are often motivated by ethical, moral or utilitarian arguments (Suber, 2012; Willinsky, 2006), such as the right of taxpayers to access literature arising from publicly-funded research (Suber, 2003), or the importance of public software and data deposition for reproducibility (Poline et al., 2012; Stodden, 2011; Ince et al., 2012). Meritorious as such arguments may be, however, they do not address the practical barriers involved in changing researchers' behavior, such as the common perception that open practices could present a risk to career advancement. In the present article, we address such concerns and suggest that the benefits of open practices outweigh the potential costs.

We take a researcher-centric approach in outlining the benefits of open research practices. Researchers can use open practices to their advantage to gain more citations, media attention, potential collaborators, job opportunities and funding opportunities. We address common myths about open research, such as concerns about the rigor of peer review at open access journals, risks to funding and career advancement, and forfeiture of author rights. We recognize the current pressures on researchers, and offer advice on how to practice open science within the existing framework of academic evaluations and incentives. We discuss these issues with regard to four areas – publishing, funding, resource management and sharing, and career advancement – and conclude with a discussion of open questions.
Publishing
Open publications get more citations
There is evidence that publishing openly is associated with higher citation rates (Hitchcock, 2016). For example, Eysenbach reported that articles published in the Proceedings of the National Academy of Sciences (PNAS) under their open access (OA) option were twice as likely to be cited within 4–10 months and nearly three times as likely to be cited 10–16 months after publication than non-OA articles published […]
Bram Zandbelt, 2017
One paper
Overview lab meeting
1. Engaging in open science will boost my career
2. Engaging in open science will accelerate my research
3. Engaging in open science will improve the quality of my research
Three debates, each argued Proponent vs. Opponent
5. Take home message:
Open science has many benefits for individual researchers
Bram Zandbelt, 2017
Open publishing
… comes with greater visibility
… lets you retain author rights and control reuse of materials
… enables you to benefit from transparent peer review
… can be achieved in many ways
… can be low-cost and even free of charge
Practicing open science
… provides you with new funding opportunities
… enables you to comply with funder mandates
Code and data sharing
… facilitates reuse and responding to requests
… promotes good research practices and reduces errors
… is beginning to be acknowledged
… comes with greater visibility
… helps you increase your research output
Practicing open science
… provides new opportunities for scientific collaboration
… is increasingly valued and mandated by institutions
7. Open publishing
comes with greater visibility
more citations
more page views
more social media posts
Sources: McKiernan et al., eLife, 2016; Wang et al., Scientometrics, 2015. Bram Zandbelt, 2017
10. Bram Zandbelt, 2017
Open publishing
can be achieved in multiple ways
Source: http://www.sherpa.ac.uk/romeo/; data retrieved on Jan 20, 2017
1. Open Access journals
11. Bram Zandbelt, 2017
AIMS Neurosci
BMC Neurosci
BMC Psychol
BMC Psychiatry
eLife
Front Hum Neurosci
Front Neurosci
Front Psychol
Scientific Data
Scientific Reports
Translational Psychiatry
Nat Commun
Nat Hum Behav
PeerJ
PLoS Biol
PLoS Comp Biol
PLoS ONE
R Soc Open Sci
eNeuro
Open publishing
can be achieved in multiple ways
1. Open Access journals
Source: http://www.sherpa.ac.uk/romeo/; data retrieved on Jan 20, 2017
12. Bram Zandbelt, 2017
Open publishing
can be achieved in multiple ways
2. Hybrid journals
Source: http://www.sherpa.ac.uk/romeo/; data retrieved on Jan 20, 2017
13. Bram Zandbelt, 2017
JEP General
JEP HPP
JEP LMC
Psych Rev
Psychol Bull
Current Biology
Neuron*
Trends Cogn Sci*
Trends Neurosci*
Biol Psychiatry
Cortex
Curr Opin Behav Sci
Curr Opin Neurosci
Curr Opin Psychol
Eur Neuropsychopharmacol
Neuroimage
Neuropsychologia
J Cogn Neurosci
Neuropsychopharmacol
Mol Psychiatry
Proc Natl Acad Sci
Brain
Cereb Cortex
Soc Cogn Affect Neurosci
Phil Trans R Soc B
J Neurosci
Att Percept Psychophys
Brain Struct Func
Cogn Affect Behav Neurosci
Exp Brain Res
Psychopharmacol
Eur J Neurosci
Hum Brain Mapp
Open publishing
can be achieved in multiple ways
2. Hybrid journals
Source: http://www.sherpa.ac.uk/romeo/; data retrieved on Jan 20, 2017
14. Bram Zandbelt, 2017
Open publishing
can be achieved in multiple ways
3. Self-archiving
Source: http://www.sherpa.ac.uk/romeo/; data retrieved on Jan 20, 2017
15. Bram Zandbelt, 2017
Science
AIMS Neurosci
JAMA Psychiatry
Ann Rev Neurosci
Ann Rev Psychol
J Neurophysiol
JEP General
JEP HPP
JEP LMC
Psych Rev
Psychol Bull
Am J Psychiatry
BMC Neurosci
BMC Psychol
BMC Psychiatry
Current Biology
Neuron
Trends Cogn Sci
Trends Neurosci
eLife
Biol Psychiatry
Cortex
Curr Opin Behav Sci
Curr Opin Neurosci
Curr Opin Psychol
Eur Neuropsychopharmacol
Neuroimage
Neuropsychologia
Front Hum Neurosci
Front Neurosci
Front Psychol
N Engl J Med
J Cogn Neurosci
Nat Rev Neurosci
Neuropsychopharmacol
Scientific Data
Scientific Reports
Translational Psychiatry
Mol Psychiatry
Nature
Nat Commun
Nat Hum Behav
Nat Neurosci
Proc Natl Acad Sci
Brain
Cereb Cortex
Soc Cogn Affect Neurosci
PeerJ
PLoS Biol
PLoS Comp Biol
PLoS ONE
Phil Trans R Soc B
R Soc Open Sci
eNeuro
J Neurosci
Eur J Neurosci
Hum Brain Mapp
Att Percept Psychophys
Brain Struct Func
Cogn Affect Behav Neurosci
Exp Brain Res
Psychopharmacol
Open publishing
can be achieved in multiple ways
3. Self-archiving

             none allowed   preprint only   pre- & postprint   all allowed
pub PDF           ✘               ✘                 ✘                ✔
postprint         ✘               ✘                 ✔                ✔
preprint          ✘               ✔                 ✔                ✔

Source: http://www.sherpa.ac.uk/romeo/; data retrieved on Jan 20, 2017
16. Open publishing
can be low-cost and even free of charge
1. Free of charge
R Soc Open Sci
Judgm Decis Mak
2. Free of charge in NL
Att Percept Psychophys
Brain Struct Func
Cogn Affect Behav Neurosci
Psychon Bull Rev
Psychopharmacol
Cogn Sci
Eur J Neurosci
Hum Brain Mapp
Psychophysiol
Topics Cogn Sci
Bram Zandbelt, 2017
17. Open publishing
can be low-cost and even free of charge
3. Low-cost
PeerJ ($399-499, lifetime)
RIO Journal (€50-650)
Glossa (£300)
J Open Psychol Data (£100)
J Open Res Software (£100)
Cogent Biology (pay what you can)
Cogent Psychology (pay what you can)
Collabra: Psychology ($875)
F1000 Research ($1000)
SAGE Open ($395)
Bram Zandbelt, 2017
19. Practicing open science
provides you with new funding opportunities
Individual fellowships
Open science awards
Replication
awards and grants
Preregistration awards
Travel grant
Sources: McKiernan et al., eLife, 2016. Bram Zandbelt, 2017
20. Practicing open science
enables you to comply with funder mandates

Publication policies:
- NWO (NL): Gold OA required (OA/hybrid journals); Green OA encouraged (pub. PDF or postprint, a.s.a.p., any repository)
- Wellcome Trust (UK): Gold OA required (OA/hybrid journals); Green OA required (pub. PDF or postprint, < 6 mo. after publication, Europe PMC or PMC)
- EC Horizon 2020 (EU): Gold OA encouraged (OA/hybrid journals); Green OA required (pub. PDF or postprint, < 12 mo. after publication, any repository)
- ERC (EU): Gold OA encouraged; Green OA encouraged (pub. PDF or postprint, a.s.a.p., any repository)
- NIH (US): Green OA required (pub. PDF or postprint)

Data archiving policies:
- NWO (NL): encouraged
- Wellcome Trust (UK): required (data, at publication)
- EC Horizon 2020 (EU): encouraged (data and code, a.s.a.p., any repository)
- ERC (EU): encouraged (data and code, < 6 mo. after publication, any repository)
- NIH (US): required (data, a.s.a.p., any repository)

Sources: http://www.sherpa.ac.uk/romeo/; data retrieved on Jan 20, 2017. Bram Zandbelt, 2017
21. Practicing open science
enables you to comply with funder mandates
Sources: McKiernan et al., eLife, 2016. Bram Zandbelt, 2017
23. Code and data sharing
facilitates reuse and responding to requests
"Can I have a copy of your data?"
"I am not sure where my data is."
Sources: Hanson, Surkis, Yacobucci, NYU Health Sciences Libraries, 2012; https://bit.ly/data_management_snafu. Bram Zandbelt, 2017
24. Code and data sharing
promotes good research practices and reduces errors
Version control Documentation
Code review
unit testing
File organization
Fewer reporting errors in papers with shared data
Sources: Wicherts et al., PLoS ONE, 2011. Bram Zandbelt, 2017
Good research practices
25. Code and data sharing
comes with increased visibility
Papers with open code
get more citations
Papers with open data
get more citations
Sources: Vandewalle, Comput Sci Eng, 2012; Piwowar & Vision, PeerJ, 2013. Bram Zandbelt, 2017
26. Code and data sharing
is beginning to be acknowledged
Sources: https://osf.io/tvyxzl; Kidwell et al., PLoS Biol, 2016. Bram Zandbelt, 2017
27. Code and data sharing
helps you increase your research output
Standalone code/data Data papers
Bram Zandbelt, 2017
Software papers
30. RESEARCH ARTICLE SUMMARY
PSYCHOLOGY
Estimating the reproducibility of
psychological science
Open Science Collaboration*
INTRODUCTION: Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research of exemplary quality may have irreproducible empirical findings because of random or systematic error.

RATIONALE: There is concern about the rate and predictors of reproducibility, but limited evidence. Potentially problematic practices include selective reporting, selective analysis, and insufficient specification of the conditions necessary or sufficient to obtain the results. Direct replication is the attempt to recreate the conditions believed sufficient for obtaining a previously observed finding and is the means of establishing reproducibility of a finding with new data. We conducted a large-scale, collaborative effort to obtain an initial estimate of the reproducibility of psychological science.

RESULTS: We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. There is no single standard for evaluating replication success. Here, we evaluated reproducibility using significance and P values, effect sizes, subjective assessments of replication teams, and meta-analysis of effect sizes. The mean effect size (r) of the replication effects (Mr = 0.197, SD = 0.257) was half the magnitude of the mean effect size of the original effects (Mr = 0.403, SD = 0.188), representing a substantial decline. Ninety-seven percent of original studies had significant results (P < .05). Thirty-six percent of replications had significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

CONCLUSION: No single indicator sufficiently describes replication success, and the five indicators examined here are not the only ways to evaluate reproducibility. Nonetheless, collectively these results offer a clear conclusion: A large portion of replications produced weaker evidence for the original findings despite using materials provided by the original authors, review in advance for methodological fidelity, and high statistical power to detect the original effect sizes. Moreover, correlational evidence is consistent with the conclusion that variation in the strength of initial evidence (such as original P value) was more predictive of replication success than variation in the characteristics of the teams conducting the research (such as experience and expertise). The latter factors certainly can influence replication success, but they did not appear to do so here.

Reproducibility is not well understood because the incentives for individual scientists prioritize novelty over replication. Innovation is the engine of discovery and is vital for a productive, effective scientific enterprise. However, innovative ideas become old news fast. Journal reviewers and editors may dismiss a new test of a published idea as unoriginal. The claim that "we already know this" belies the uncertainty of scientific evidence. Innovation points out paths that are possible; replication points out paths that are likely; progress relies on both. Replication can increase certainty when findings are reproduced and promote innovation when they are not. This project provides accumulating evidence for many findings in psychological research and suggests that there is still more work to do to verify whether we know what we think we know.
Science, 28 August 2015, Vol 349, Issue 6251, p. 943.
The list of author affiliations is available in the full article online. *Corresponding author. E-mail: nosek@virginia.edu
Cite this article as Open Science Collaboration, Science 349, aac4716 (2015). DOI: 10.1126/science.aac4716
Figure: Original study effect size versus replication effect size (correlation coefficients). Diagonal line represents replication effect size equal to original effect size. Dotted line represents replication effect size of 0. Points below the dotted line were effects in the opposite direction of the original. Density plots are separated by significant (blue) and nonsignificant (red) effects.
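A quick sanity check on the headline numbers in the summary above (a minimal sketch; every value is taken directly from the reported abstract, nothing here is new data):

```python
# Headline statistics reported by the Open Science Collaboration (2015).
mean_r_original = 0.403     # mean effect size (r) of the original studies
mean_r_replication = 0.197  # mean effect size (r) of the replications

# The replication effects were roughly half the original magnitude.
ratio = mean_r_replication / mean_r_original
print(f"replication/original effect-size ratio: {ratio:.2f}")  # ~0.49

# Significance rates: 97 of 100 originals vs. 36 of 100 replications
# had statistically significant results (P < .05).
drop = 97 - 36
print(f"drop in significant results: {drop} percentage points")
```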
Bram Zandbelt, 2017
Practicing open science
provides opportunities for scientific collaboration
32. Practicing open science
enables you to contribute to open source projects
Bram Zandbelt, 2017
[Logos of open source projects shown, with 49 to 776 contributors each]
33. Practicing open science
is increasingly valued and mandated by institutions
Source: https://www.academictransfer.com; http://www.nicebread.de/open-science-hiring-practices/ Bram Zandbelt, 2017
34. Take home message:
Open science has many benefits for individual researchers
Bram Zandbelt, 2017
Open publishing
… comes with greater visibility
… lets you retain author rights and control reuse of materials
… enables you to benefit from transparent peer review
… can be achieved in many ways
… can be low-cost and even free of charge
Practicing open science
… provides you with new funding opportunities
… enables you to comply with funder mandates
Practicing open science
… provides new opportunities for scientific collaboration
… is increasingly valued and mandated by institutions
Code and data sharing
… facilitates reuse and responding to requests
… promotes good research practices and reduces errors
… is beginning to be acknowledged
… comes with greater visibility
… helps you increase your research output
35. … and we haven’t even talked about
the benefits of preregistration :-)
Sources: Wagenmakers & Dutilh (2016). APS Observer. Bram Zandbelt, 2017
37. Bram Zandbelt, 2017
Now, let’s learn about your views
1. Engaging in open science will boost my career
2. Engaging in open science will accelerate my research
3. Engaging in open science will improve the quality of my research
Three debates, each argued Proponent vs. Opponent