Being Reproducible:
Models, Research Objects and R* Brouhaha
Professor Carole Goble, carole.goble@manchester.ac.uk
The University of Manchester, UK
The FAIRDOM Association Coordinator
ELIXIR-UK Head of Node
Co-lead ELIXIR Interoperability Platform
SSBSS 2017, July 17 2017, Cambridge, UK
4th International Synthetic & Systems Biology Summer School
Reproducibility
Rampancy
47/53 “landmark” publications
could not be replicated
[Begley, Ellis Nature, 483, 2012]
Retraction
http://www.nature.com/news/misconduct-is-the-main-cause-of-life-sciences-retractions-1.11507
Misconduct is the main cause
of life-sciences retractions
Zoë Corbyn
01 October 2012
Vahan Simonyan,
Center for Biologics
Evaluation and Research
Food and Drug Administration
USA
NIH Rigor and
Reproducibility
https://www.nih.gov/research-training/rigor-reproducibility
cos.io/top
http://www.acmedsci.ac.uk/policy/policy-projects/reproducibility-and-reliability-of-biomedical-research/
John P. A. Ioannidis, How to Make More Published Research True, October 21, 2014, DOI: 10.1371/journal.pmed.1001747
Reproducibility of biological experiments is hard,
both for in vivo/in vitro and for in silico analyses
• OS version
• Revision of scripts
• Data analysis software versions
• Version of data files
• Command line parameters written on
a napkin
• “Black magic” only a grad student
knows
Fix with latest technologies, best
practices and willingness
[Keiichiro Ono, Scripps Institute]
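A minimal sketch (illustrative, not from the talk) of capturing the run details in the bullet list above instead of losing them; it assumes the analysis lives in a git repository, and the output file name is hypothetical.

```python
import json
import platform
import subprocess
import sys

def capture_environment(params):
    """Write OS, interpreter, script revision and parameters to a sidecar file."""
    revision = subprocess.run(
        ["git", "rev-parse", "HEAD"],          # revision of the scripts
        capture_output=True, text=True).stdout.strip()
    record = {
        "os": platform.platform(),             # OS version
        "python": sys.version,                 # analysis software version
        "script_revision": revision,
        "parameters": params,                  # no napkin required
    }
    with open("run_environment.json", "w") as f:
        json.dump(record, f, indent=2)

capture_environment({"threshold": 0.05, "normalisation": "quantile"})
```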
The first step is to be FAIR
See the whole of
the previous talk…
Record All
Automate All
Contain All
Access All
Findable (Citable)
Accessible (Trackable)
Interoperable (Intelligible)
Reusable (Reproducible)
design
cherry-picking data, random-seed reporting, non-independent bias,
poor positive and negative controls, dodgy normalisation,
arbitrary cut-offs, premature data triage, un-validated materials,
improper statistical analysis, poor statistical power, stopping
when you "get to the right answer", software misconfigurations,
misapplied black-box software
reporting
incomplete reporting of software configurations, parameters & resource
versions, missed steps, missing data, vague methods, missing software
Empirical Statistical Computational
V. Stodden, IMS Bulletin (2013)
Reproducibility and reliability of biomedical
research: improving research practice
https://www.sciencenews.org/article/12-reasons-research-goes-wrong
“When I use a word," Humpty Dumpty
said in rather a scornful tone, "it means
just what I choose it to mean - neither
more nor less.”
Carroll, Through the Looking Glass
re-compute
replicate
rerun
repeat
re-examine
repurpose
recreate
reuse
restore
reconstruct review
regenerate
revise
recycle
redo
robustness
tolerance
verification compliance validation assurance
remix
Scientific publications have two goals:
(i) announce a result
(ii) convince readers it is correct.
Papers in experimental science
should describe the results and
provide a clear enough protocol to
allow successful repetition and
extension.
Papers in computational science
should describe the results and
provide the complete software
development environment, data
and set of instructions which
generated the figures.
Virtual Witnessing*
*Leviathan and the Air-Pump: Hobbes, Boyle, and the
Experimental Life (1985) Shapin and Schaffer.
Jill Mesirov
David Donoho
“Micro” Reproducibility
“Macro” Reproducibility
Fixivity
Validate
Verify
Trust
Repeatability: "Sameness"
Same result, 1 lab, 1 experiment
Reproducibility: "Similarity"
Similar result, >1 lab, >1 experiment
why the differences?
https://2016-oslo-repeatability.readthedocs.org/en/latest/repeatability-discussion.html
Validate
Verify
Method Reproducibility
the provision of enough detail about
study procedures and data so, in
theory or in actuality, the same
procedures could be exactly
repeated.
Result Reproducibility
(aka replicability)
obtaining the same results from the
conduct of an independent study
whose procedures are as closely
matched to the original experiment
as possible
Goodman et al., Science Translational Medicine 8(341), 2016
Validate
Verify
What are you reproducing?
Don't conflate an algorithm with its script
Methods
techniques, algorithms,
spec. of the steps, models
Materials
datasets, parameters,
algorithm seeds
Instruments
codes, services, scripts,
underlying libraries,
workflows, reference datasets
Laboratory
software and hardware infrastructure,
systems software, integrative platforms,
computational environment
Productivity
Track differences
Validate
Verify
Validate
Verify
Recompute By Degrees
Fixivity - Liveness
• New/updated/deprecated methods,
datasets, services, codes, h/w
• Snapshots
Dependency – Containment
• Streams, non-portable data/software,
• 3rd party services, supercomputer access,
licensing restrictions….
• Locally contained and maintained
• External dependencies
Transparency
• Black boxes, proprietary software,
manual steps
Robustness
• Bounds of use
• Stochastic and non-deterministic behaviour,
context dependence (seed pinning sketched below)
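A minimal, illustrative sketch of that point: pin and report every random seed, then rerun across several seeds to see whether the result is seed-stable.

```python
import random
import numpy as np

SEED = 42                    # report this alongside the results
random.seed(SEED)
np.random.seed(SEED)

results = []
for seed in range(10):       # many runs, many seeds
    rng = np.random.default_rng(seed)
    results.append(rng.normal(size=1000).mean())
print(f"mean over seeds: {np.mean(results):+.4f}")
```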
https://xkcd.com/797/
Components and Dependencies
Software is typically a
compound work.
Libraries. Plug-ins.
Code fragments.
We are encouraged to
reuse and not reinvent
Combining licenses.
License compatibilities
Black boxes
• closed codes
• closed external or cloud
services
• method obscurity
• manual steps
[Thanks to Jason Scott]
The Reproducibility Window
all experiments become less reproducible over
time….
• Can’t contain everything
– Pesky Internet in a Box
• Can’t automate everything
– Pesky people intervening
• Can’t fix and fossilise everything
– Pesky science keeps changing
Results may vary
Bonus slide
At SSBSS Theodor Gescher came up with REALSCI
Robust – many runs
Environment – describe the equipment/OS
Another – done by another lab, not yours
Limits – parameters
Standards – well understood/comprehensible methods
Complete – no cherry-picking
Immortal – community-supported commodity systems
Mixed Central and Distributed stores:
Containment and Dependencies. Upload vs Referencing
In House Stores
External Databases
Publishing services
Model Resources
Migrations into FAIRDOMHub
For long term reproducibility
Shades of Reproducibility
Running an active instrument
Reading an archived record
Are you using
hard-wired
localhost ids?
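A minimal, illustrative answer to that question: refer to resources by persistent, resolvable identifiers, not local URLs. The DOI here is the one cited later in this talk; the localhost URL is a hypothetical bad example.

```python
import urllib.request

BAD = "http://localhost:3000/investigations/56"            # dies with the laptop
GOOD = "https://doi.org/10.15490/seek.1.investigation.56"  # resolves anywhere

with urllib.request.urlopen(GOOD) as resp:  # the DOI resolver redirects...
    print(resp.geturl())                    # ...to the current landing page
```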
Workflows
SOPs
Containers, cloud services, common services
Markup languages,
reporting guidelines and
checklists, ontologies,
catalogues
Sounds hard….
what can I do?
Catalogue
Protocol specs and sharing…
A language for specifying
experimental protocols for
biological research in a way that is
precise, unambiguous, and
understandable by both humans
and computers.
Validation Data
https://fairdomhub.org/sops/203
https://fairdomhub.org/investigations/56
Standard Operating Procedures
Quality Control
in situ reproducible models in FAIRDOM
metadata annotation against standards
validation, comparison and simulation
SBML Model simulation
Model comparison
Model versioning
Reproducing simulations
[Jacky Snoep, Dagmar Waltemath, Martin Peters, Martin Scharm]
JWS Online
Tracking versions
Tracking model versions smartly
Scharm, M., Wolkenhauer, O., & Waltemath, D. (2015). An algorithm to detect and
communicate the differences in computational models describing biological
systems. Bioinformatics, btv484
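A toy illustration only (the cited algorithm is far more sophisticated): compare two versions of an SBML model element-wise rather than as text, so the difference report is in model terms. The file names are hypothetical.

```python
import xml.etree.ElementTree as ET

SBML = "{http://www.sbml.org/sbml/level3/version1/core}"

def parameters(path):
    """Map parameter id -> value for an SBML Level 3 model file."""
    return {p.get("id"): p.get("value")
            for p in ET.parse(path).iter(f"{SBML}parameter")}

def diff_models(old_path, new_path):
    old, new = parameters(old_path), parameters(new_path)
    for pid in sorted(old.keys() | new.keys()):
        if old.get(pid) != new.get(pid):
            print(f"{pid}: {old.get(pid)} -> {new.get(pid)}")

# diff_models("glycolysis_v1.xml", "glycolysis_v2.xml")
```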
Model simulation in FAIRDOMHub
using JWS Online
A simulation database allows one-click, live
figure reproduction in FAIRDOM-SEEK
JWS model + Excel data file
Dagmar Waltemath, Uni Rostock
Jacky Snoep, Uni Stellenbosch
Simulation Experiment Description Markup
Language: XML-based format for encoding
simulation setups, to ensure exchangeability and
reproducibility of simulation experiments
• which models to use in an experiment,
• modifications to apply on the models before using them,
• which simulation procedures to run on each model,
• what analysis results to output,
• and how the results should be presented.
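A schematic (not spec-complete) of those five ingredients as they appear in a SED-ML file, built as a plain string for illustration; the file names and attribute values are hypothetical.

```python
sedml = """<sedML xmlns="http://sed-ml.org/">
  <listOfModels>                               <!-- which model to use -->
    <model id="m1" language="urn:sedml:language:sbml" source="model.xml"/>
  </listOfModels>
  <listOfSimulations>                          <!-- which procedure to run -->
    <uniformTimeCourse id="sim1" initialTime="0" outputStartTime="0"
                       outputEndTime="100" numberOfPoints="1000"/>
  </listOfSimulations>
  <listOfTasks>                                <!-- bind model to simulation -->
    <task id="t1" modelReference="m1" simulationReference="sim1"/>
  </listOfTasks>
  <listOfOutputs>                              <!-- how to present results -->
    <plot2D id="fig1"/>
  </listOfOutputs>
</sedML>"""

with open("experiment.sedml", "w") as f:
    f.write(sedml)
```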
FAIRDOMHub Journal Programme
Molecular Systems Biology
Model technical curation for journals
[Jacky Snoep (Stellenbosch), Dagmar Waltemath, Martin Peters, Martin Scharm (Rostock)]
* store DOI citable supplementary files on FAIRDOMHub
** model and data curation
*** reproducible clickable figures in papers using SED-ML
Cataloguing
Packaging
Penkler, G., du Toit, F., Adams,
W., Rautenbach, M., Palm, D. C.,
van Niekerk, D. D. and Snoep, J.
L. (2015), Construction and
validation of a detailed kinetic
model of glycolysis in
Plasmodium falciparum. FEBS J,
282: 1481–1511.
doi:10.1111/febs.13237
https://fairdomhub.org/investigations/56
DOI: 10.15490/seek.1.investigation.56
Snapshot
preservation
active
An “evolving manuscript” would begin with a pre-publication,
pre-peer review “beta 0.9” version of an article, followed by
the approved published article itself, […] “version 1.0”.
Subsequently, scientists would update this paper with
details of further work as the area of research develops.
Versions 2.0 and 3.0 might allow for the “accretion of
confirmation [and] reputation”.
Ottoline Leyser […] assessment criteria in science revolve
around the individual. “People have stopped thinking
about the scientific enterprise”.
http://www.timeshighereducation.co.uk/news/evolving-manuscripts-the-future-of-scientific-communication/2020200.article
Packaging: CombineArchive
https://sems.uni-rostock.de/projects/combinearchive/
Scharm M, Wendland F, Peters M, Wolfien M, Theile T, Waltemath D
SEMS, University of Rostock
zip-like file with a manifest & metadata
- Bundling files
- Keeping provenance
- Exchanging data
- Shipping results
Bergmann, F.T., Adams, R., Moodie, S., Cooper, J., Glont, M., Golebiewski, M., ... & Olivier, B. G. (2014). COMBINE archive and OMEX format:
one file to share all information to reproduce a modeling project. BMC Bioinformatics, 15(1), 1.
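A minimal sketch of the "one file" idea: a zip whose manifest.xml declares the format of every entry. It assumes model.xml and experiment.sedml already exist alongside the script; the names are hypothetical.

```python
import zipfile

MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="." format="http://identifiers.org/combine.specifications/omex"/>
  <content location="./model.xml"
           format="http://identifiers.org/combine.specifications/sbml"/>
  <content location="./experiment.sedml"
           format="http://identifiers.org/combine.specifications/sed-ml"/>
</omexManifest>"""

with zipfile.ZipFile("project.omex", "w") as omex:
    omex.writestr("manifest.xml", MANIFEST)
    omex.write("model.xml")          # the model (e.g. SBML)
    omex.write("experiment.sedml")   # the simulation setup (SED-ML)
```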
Standards-based metadata framework for
bundling (scattered) resources with context and citation
Packaging:
Research Objects
http://researchobject.org
Packaging:
Research Objects
Publishing
Archive
Institutional
Archive
1. Export
2. Exchange
http://researchobject.org
Manifest
Construction
Container
Manifest
Description
Packaging Platforms:
Zip files, BagIt,
Docker, Conda, Singularity
Repositories
FAIRDOMHub
Packaging:
Research Objects in a nutshell
Different manifest description profiles for different kinds of objects
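A minimal sketch of one such manifest, assuming the JSON-LD conventions described at researchobject.org; the aggregated entries and author are hypothetical.

```python
import json
import os

manifest = {
    "@context": "https://w3id.org/bundle/context",
    "createdBy": {"name": "A. Researcher"},
    "aggregates": [
        {"uri": "/model.xml", "mediatype": "application/xml"},
        {"uri": "/experiment.sedml", "mediatype": "application/xml"},
        {"uri": "https://fairdomhub.org/investigations/56"},  # external reference
    ],
}

os.makedirs(".ro", exist_ok=True)        # the manifest lives under .ro/
with open(".ro/manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```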
From Virtual Machines to Executable Containers
for portable execution
• Containers: everything required to make a piece of
software run is packaged into an isolated container.
• Unlike VMs, containers do not bundle a full operating
system - only the libraries and settings required to make
the software work.
• Efficient, lightweight, self-contained systems
• Guarantees that software will always run the same,
regardless of where it’s deployed.
https://www.software.ac.uk/c4rr/ https://biocontainers.pro/
Biocontainers
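A minimal sketch of "runs the same wherever it's deployed", assuming the docker CLI is installed; the BioContainers image tag and data files are illustrative. The key move is pinning an exact image version, never "latest".

```python
import os
import subprocess

subprocess.run(
    ["docker", "run", "--rm",
     "-v", f"{os.getcwd()}/data:/data",   # mount the input data
     "biocontainers/blast:v2.2.31_cv2",   # pinned image version
     "blastp", "-query", "/data/query.fa", "-subject", "/data/subject.fa"],
    check=True)
```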
Use commodity and community systems
Sustained platforms
Communities to drive them
Tooling and training
Spreadsheets are the Cockroaches of Science
EU FAIR Data Expert Group Consultation
https://github.com/FAIR-Data-EG/consultation/issues
Want to know more?
Go on a Software or Data Carpentry Course
https://tess.elixir-europe.org
Make software open and reusable
Software Sustainability Institute,
http://www.software.ac.uk
Goble, Better Software Better Research,
IEEE Internet Computing 18(5), 2014
DOI: 10.1109/MIC.2014.88
Jiménez RC, Kuzak M, Alhamdoosh M et al.
Four simple recommendations to
encourage best practices in research
software [version 1; referees: 3 approved].
F1000Research 2017, 6:876 (doi:
10.12688/f1000research.11407.1)
Use Common
Platforms
Get the licensing
right…
MATLAB
Mathematica….
Proprietary
software
Cloud Centralised Service
in situ reproducibility…
Galaxy
FAIRDOMHub + JWS Online
Blackbox vs
Whitebox
https://view.commonwl.org/workflows/github.com/ProteinsWebTeam/ebi-metagenomics-cwl/tree/fa86fce/workflows/rna-selector.cwl
Use and document workflows
preferably with a workflow management system – Living Research Objects!
http://commonwl.org/
Workflow repository
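A minimal sketch of launching the workflow viewed above with cwltool, the CWL reference runner; the job file name is hypothetical, and the --provenance option (available in recent cwltool releases) writes a Research Object describing the run.

```python
import subprocess

subprocess.run(
    ["cwltool",
     "--provenance", "run-ro/",   # capture a Research Object of the run
     "rna-selector.cwl",          # the workflow definition
     "job-inputs.yml"],           # input parameter bindings
    check=True)
```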
Use a workflow – the vision!
preferably with a workflow management system,
preferably described using the Common Workflow Language
Experimental
workflows
Event BUS Business Process Management
Taverna Knime Galaxy
Workflow
BPM layer
Workflow
Computation
Application
layer
Computing resources Databases
Effector
layer
Front-end
Web interface / Monitoring interface
Pipeline
Pilot
FAIRDOM SEEK
Workflow repository
Workflow portal
repository
launch, results
FAIRDOM
[Jean Loup Fallon, Carole Goble]
https://hive.biochemistry.gwu.edu/htscsrs/workshop_2017
Reproducible Pipelines for Robust Regulation
BioCompute Objects
Emphasis on fixing the
pipeline so it can be
replicated, and on
reporting the
parameter space
Use an Electronic Lab Notebook
What can you do?
• Follow the 10 RACA Principles
• Take action, be imperfect
• Demand reproducibility in reviews.
• Educate your PIs and supervisors.
[Norman Morrison]
Technological Debt: Appropriate Effort
Retrospective Reusability
What are the incentives?
[Garza] [Malone] [Resnik]
Acknowledgements
• David De Roure
• Tim Clark
• Sean Bechhofer
• Robert Stevens
• Christine Borgman
• Victoria Stodden
• Marco Roos
• Jose Enrique Ruiz del Mazo
• Oscar Corcho
• Ian Cottam
• Steve Pettifer
• Magnus Rattray
• Chris Evelo
• Katy Wolstencroft
• Robin Williams
• Pinar Alper
• C. Titus Brown
• Greg Wilson
• Kristian Garza
• Juliana Freire
• Jill Mesirov
• Simon Cockell
• Paolo Missier
• Paul Watson
• Gerhard Klimeck
• Matthias Obst
• Jun Zhao
• Daniel Garijo
• Yolanda Gil
• James Taylor
• Alex Pico
• Sean Eddy
• Cameron Neylon
• Barend Mons
• Kristina Hettne
• Stian Soiland-Reyes
• Rebecca Lawrence
• Michael Crusoe
Jon Olav Vik,
Norwegian University of Life Sciences
Maksim Zakhartsev
University of Hohenheim, Stuttgart,
Germany
Alexey Kolodkin
Siberian Branch
Russian Academy of Sciences
Tomasz Zieliński,
SynthSys Centre
University of Edinburgh, UK
Martin Peters, Martin Scharm
Systems Biology Bioinformatics
University of Rostock, Germany
Web sites
• Force11 http://www.force11.org
• TeSS https://tess.elixir-europe.org
• FAIRDOM http://www.fair-dom.org
• FAIRDOMHub http://www.fairdomhub.org
• Software Carpentry http://software-carpentry.org
• Data Carpentry http://datacarpentry.org
• Software Sustainability Institute http://www.software.ac.uk
• Rightfield http://www.rightfield.org.uk
• FAIRSharing http://www.fairsharing.org
• Common Workflow Language http://commonwl.org/
Reading List (refs also throughout)
• John P. A. Ioannidis, How to Make More Published Research True, October 21, 2014, DOI: 10.1371/journal.pmed.1001747
• Ioannidis JPA (2005) Why Most Published Research Findings Are False. PLoS Med 2(8): e124. doi:10.1371/journal.pmed.0020124
• Steven N. Goodman, Daniele Fanelli and John P. A. Ioannidis, What does research reproducibility mean? Science Translational Medicine 01 Jun 2016: Vol. 8, Issue 341, pp. 341ps12, DOI: 10.1126/scitranslmed.aaf5027
• Sandve GK, Nekrutenko A, Taylor J, Hovig E (2013) Ten Simple Rules for Reproducible Computational Research. PLoS Comput Biol 9(10): e1003285. doi:10.1371/journal.pcbi.1003285
• Massimiliano Assante, Leonardo Candela, Donatella Castelli, Paolo Manghi and Pasquale Pagano, Science 2.0 Repositories: Time for a Change in Scholarly Communication, D-Lib Magazine January/February 2015, Volume 21, Number 1/2, DOI: 10.1045/january2015-assante
• Waltemath, D., Henkel, R., Hälke, R., Scharm, M., & Wolkenhauer, O. (2013). Improving the reuse of computational models through version control. Bioinformatics, 29(6), 742-748.
• Bergmann, F.T., Adams, R., Moodie, S., Cooper, J., Glont, M., Golebiewski, M., ... & Olivier, B. G. (2014). COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project. BMC Bioinformatics, 15(1), 1.
• Scharm, M., Wolkenhauer, O., & Waltemath, D. (2015). An algorithm to detect and communicate the differences in computational models describing biological systems. Bioinformatics, btv484
• http://www.reuters.com/article/2012/03/28/us-science-cancer-idUSBRE82R12P20120328
• http://www.acmedsci.ac.uk/policy/policy-projects/reproducibility-and-reliability-of-biomedical-research/