1) The document discusses methods for enhancing reproducibility in science, including the use of provenance tracking and electronic lab notebooks to record experimental methods and changes over time.
2) It provides the example of the NeuronUnit software toolkit and how it can be registered in SciCrunch with a unique ID to allow it to be identified and tracked as it changes and is integrated into workflows.
3) Standards for describing experiments, samples, and results in a "machine actionable" way are recommended to facilitate pooling and comparing results across studies through meta-analysis.
1.
Reproducibility
FAIR principles in Open Science and Data Stewardship
Findable Accessible Intelligible Reproducible (FAIR)
• PhD program in Neuroscience: Russell Jarvis; Mentor: Professor Sharon Crook
2. In the first section
• Dr Ellison discussed and clarified some of the problems stemming from failure to reproduce and replicate in science.
• In this section I will talk about some methods for enhancing reproducibility in a digitized world.
3. Definition of prov·e·nance
ˈprävənəns/
noun
1: origin, source
2: the history of ownership of a valued object or work of art or literature
https://www.merriam-webster.com/dictionary/provenance
Provenance
4. Findable Accessible Interoperable Reproducible (FAIR)
FAIR Science: recommendations addressing the failure to report experimental methods
(labels from the original diagram:)
• Paywalls? Licence administration is a bottleneck
• Method history is visible via staging and track changes
• Machine readable and human readable
• Mirrored servers
• Team-developed research and software, decentralized
• Reproducible
https://www.slideshare.net/carolegoble/open-sciencemcrgoble2015
and Freire 2013, fair-dom.org
5. The Findable in FAIR
● To repeat an experiment, you will likely need to use the same materials; establishing the provenance of both materials and ideas is challenging.
● An existing tool, SciCrunch, enables people to register research objects with a unique and universal RRID before or without publication, meaning that changes to a product can be tracked and differences between very similar items can be resolved.
6. • https://github.com/elabftw/elabftw
• Provides track changes and time stamping of changes to an experimental workflow.
• RRIDs plug into workflows.
• The software environment is designed to facilitate tracking of data and the persistence of research objects and analysis tools via RRIDs.
• http://www.researchobject.org
Electronic Lab Books:
Gathering Scattered Research Components
Fighting Entropy with a Smart Organizing Framework that's Part of an International Effort
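The "track changes and time stamping" idea above can be sketched with plain git, which eLabFTW-style notebooks resemble: each revision of a method file becomes a timestamped, attributed entry. This is an illustrative analogy, not eLabFTW itself; the file name and commit messages are hypothetical.

```shell
# Hypothetical provenance demo: each change to the written protocol
# becomes a timestamped commit, so the full method history stays visible.
mkdir -p labbook
git init -q labbook
git -C labbook config user.email "researcher@example.org"
git -C labbook config user.name "Researcher"
echo "patch-clamp protocol, v1" > labbook/method.txt
git -C labbook add method.txt
git -C labbook commit -qm "Record initial protocol"
echo "patch-clamp protocol, v2: holding potential -70 mV" > labbook/method.txt
git -C labbook commit -qam "Change holding potential"
# Print the timestamped method history
git -C labbook log --date=short --format="%ad  %s"
```

An electronic lab notebook layers a web interface and metadata (RRIDs, sample links) over exactly this kind of append-only, timestamped record.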
7. https://scicrunch.org/resources/Any/search?q=neuronunit&l=neuronunit
NeuronUnit
Cite this (NeuronUnit, RRID:SCR_015634)
URL: https://github.com/scidash/neuronunit
Resource Type: Resource, software resource, source code, software toolkit
Software toolkit for data-driven validation of neuron and ion channel models using SciUnit.
NeuronUnit implements an interface to several simulators and model description languages,
handles test calculations according to domain standards, and enables automated construction
of tests based on data from several major public data repositories.
• Importantly, at https://scicrunch.org/ unpublished tools can be registered and uniquely identified as they evolve.
• It is not even my tool, but I was allowed to register it.
• Since this tool is designed to be mashed up with other tools, it becomes a module in a workflow.
• Designed for experimental as well as digital tools and products.
A Practical Example of RRID
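The citation form on this slide, "RRID:SCR_015634", follows a simple machine-checkable pattern: the `RRID:` prefix, an authority code, an underscore, and an identifier. A minimal sketch of validating that syntax is below; the three authority codes listed (SCR for the SciCrunch registry, AB for antibodies, CVCL for cell lines) are illustrative examples, not an exhaustive list.

```python
import re

# Illustrative RRID syntax check: "RRID:<Authority>_<ID>".
# The authority codes here are a non-exhaustive sample.
RRID_PATTERN = re.compile(r"^RRID:(SCR|AB|CVCL)_\w+$")

def is_well_formed_rrid(citation: str) -> bool:
    """Return True if the string matches the basic RRID citation syntax."""
    return bool(RRID_PATTERN.match(citation))

print(is_well_formed_rrid("RRID:SCR_015634"))  # NeuronUnit's identifier -> True
print(is_well_formed_rrid("SCR_015634"))       # missing RRID: prefix    -> False
```

Because the syntax is this regular, journals and text-mining pipelines can automatically detect which registered resources a paper used, which is what makes RRIDs trackable across the literature.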
8. ● Greater standardization of scientific language, coupled with simple expression of ideas.
● Unambiguous identification and description of results, experimental design, samples, code and models.
• These make scientific publications more 'machine actionable', i.e. meta-analysis requires less manual intervention such as database querying and mindless busy work on the user's end.
• Machine actionable: the easier it is to pool and compare results from overlapping research studies, the more it will be done.
• 'Requiem for the Spike'
Intelligible in FAIR
9. Recommendations from the Stanford
Center for Reproducible Neuroscience:
The OHBM Replication Award
● Direct Quote: “One of the major avenues to enhance
the reproducibility of a field is the replication of
previous studies, but the incentives to perform and publish
replications are much weaker compared to the incentives to publish
novel research.” http://reproducibility.stanford.edu/award/
● Publishing: Journals could share peer-review data
● Poldrack, R. A. et al. Scanning the horizon: towards transparent and reproducible neuroimaging research. Nat. Rev. Neurosci. 18, 115–126 (2017).
10. Meta-analysis and Similar Enterprises
● NeuroElectro machine-reads journal articles; it extracts variable names and then computes mean values of reported variables, for example:
● Resting membrane potential, spike width, and spike amplitude.
● This supports getting greater mileage from existing data, as opposed to testing a novel hypothesis.
https://neuroelectro.org/
https://f1000research.com/gateways/PRR
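The payoff of machine-actionable reporting described above can be sketched in a few lines: once each article's measurements are captured as structured records rather than free text, NeuroElectro-style pooling across studies becomes trivial. The records and values below are made up for illustration, not real NeuroElectro data.

```python
from statistics import mean

# Toy structured records, one per extracted measurement.
records = [
    {"paper": "A", "property": "resting_membrane_potential_mV", "value": -68.0},
    {"paper": "B", "property": "resting_membrane_potential_mV", "value": -71.5},
    {"paper": "C", "property": "resting_membrane_potential_mV", "value": -65.2},
    {"paper": "C", "property": "spike_width_ms", "value": 1.1},
]

# Pooling one property across studies is a one-line filter plus a mean.
vm = [r["value"] for r in records
      if r["property"] == "resting_membrane_potential_mV"]
print(f"pooled mean Vm: {mean(vm):.2f} mV (n={len(vm)})")
# → pooled mean Vm: -68.23 mV (n=3)
```

The hard part NeuroElectro automates is the extraction step, turning prose and tables into records like these; the meta-analysis itself then "requires less manual intervention", as slide 8 puts it.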
11. What Established Scientists Think Would
Help
Baker, M. 1,500 scientists lift the lid on reproducibility. Nat. News 533, 452
(2016).