This document provides a summary of the state of software curation based on an environmental scan of research software repositories and related practices. The summary finds:
1) There are no comprehensive indices of software archives, and there are orders of magnitude fewer software archives than data archives. Institutional repositories offer little functionality for software archiving.
2) Very few funders have policies addressing software curation. There is little available advice for researchers who wish to curate, cite, and preserve software.
3) Substantial reproducibility failures continue to be reported due to a lack of software preservation. In summary, software curation looks a lot like data curation did a decade ago, with no universal standards for citing and archiving software.
Software Repositories for Research -- An Environmental Scan
1. Software Repositories for
Research
-- An environmental scan
Micah Altman
MIT Libraries
Prepared for
Digital Preservation 2016
Milwaukee
November 2016
2. Disclaimer
These opinions are my own, they are not the opinions of MIT, any of
the project funders, nor (with the exception of co-authored
previously published work) my collaborators
Secondary disclaimer:
“It’s tough to make predictions, especially about the future!”
-- Attributed to Woody Allen, Yogi Berra, Niels Bohr, Vint Cerf, Winston Churchill, Confucius, Disreali [sic], Freeman Dyson,
Cecil B. Demille, Albert Einstein, Enrico Fermi, Edgar R. Fiedler, Bob Fourer, Sam Goldwyn, Allan Lamport, Groucho Marx,
Dan Quayle, George Bernard Shaw, Casey Stengel, Will Rogers, M. Taub, Mark Twain, Kerr L. White, etc.
3. Related Publications
• Altman, M., and S. Jackman. “Nineteen Ways of Looking at Statistical
Software.” Journal of Statistical Software 42 (2011).
• Altman, Micah, and Gary King. “A Proposed Standard for the Scholarly
Citation of Quantitative Data.” D-Lib Magazine 13, no. 3 (2007).
• Altman, M., J. Gill, and M.P. McDonald. Numerical Issues in Statistical
Computing for the Social Scientist. John Wiley & Sons, 2004.
Reprints available from:
informatics.mit.edu
7. What is Software?
Working definition:
“Part of a computer system that consists of encoded information or computer
instructions” (Wikipedia) that is directly executable within a system.
Corollaries
Software is generally composed of instantiations of algorithms, heuristics,
and fixed information (internal data).
The behavior and output of software generally depends on the execution
context: execution environment (software, hardware, network, networked
resources), configuration parameters, and dynamic inputs.
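This context dependence is easy to demonstrate. The sketch below (an illustrative example added here, not part of the original scan) shows how a program's output can vary with its execution environment, configuration parameters, and dynamic inputs; the field names and sample inputs are assumptions for illustration:

```python
# Illustrative only: the same code can yield different results in
# different execution contexts (interpreter version, platform,
# environment configuration, and the inputs supplied at run time).
import os
import platform
import sys


def describe_context():
    """Report facets of the execution context that can change output."""
    return {
        "python_version": sys.version_info[:2],   # execution environment
        "platform": platform.system(),            # hardware/OS layer
        "locale_config": os.environ.get("LANG"),  # configuration parameter
    }


def sort_names(names):
    """Sort names in default (code-point) order.

    Results depend on the dynamic input and on conventions that differ
    across systems, e.g. how accented characters are normalized upstream.
    """
    return sorted(names)


if __name__ == "__main__":
    print(describe_context())
    print(sort_names(["Zoe", "abel", "Émile"]))
```

Running this on two machines will typically print different context dictionaries even though the code is identical, which is exactly why preserving the code alone is often insufficient for reproducibility.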
8. Some Caution About Definitions
Software is often tightly coupled to data
Boundaries among software objects and systems are fuzzy & permeable
The usefulness of software depends strongly on the intent of the user, the
knowledge and capabilities of the user (documentation matters), and the
execution context.
"... if they [philosophers] do ask and they want a definition, they do not want the most natural definition, e.g. of 'chair' they
do not want the definition 'something to sit on'. Why are they not satisfied with the normal definition of chair, or, to put the
question in another way, why do they wish to ask for the definition of a physical object?"
Source: "From the Minutes of the Moral Science Club, 23.2.1939" in Wittgenstein in Cambridge (2008)
9. Research Questions
Characterizing Research Software Repositories and Related Practices
How is software related to research formally disseminated?
Which “repositories” (points for mid-/long-term publishing and access of
software) are recognized at the discipline level?
What are the relative prevalence and affordances of “repositories” for
software as compared to other established disciplinary repositories?
What practices, requirements, or standards for software curation and
preservation are recognized at the disciplinary level?
11. Literature Review
Data Curation, Publication and Citation
Software significant properties, use cases
Software repositories
Software & scientific reproducibility
Software Engineering Methodology
12. Web Research - Practice
Review of research repositories
Sources: OpenDOAR, re3data, SHERPA/JULIET
Goals: Estimate prevalence of repositories that accept research software; identify exemplar
repositories, characterize feature sets by repository category
Methods: term-based queries; descriptive statistics; stratified content case studies
Review of Software Directories
Sources: OpenHub, OSDir, DMOZ
Goals: Identify additional software repositories used in research
Methods: qualitative text analysis; descriptive statistics
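The “term-based queries; descriptive statistics” method above can be made concrete with a small sketch. This is a hypothetical illustration, not the project's actual tooling: the record fields (`name`, `content_types`) and the term list are assumptions about how exported directory records (e.g., from re3data or OpenDOAR) might be screened:

```python
# Hypothetical sketch: estimate the prevalence of repositories that
# accept research software by matching terms in directory records.
import re

# Illustrative term list; a real study would validate terms against
# a labeled sample of records.
SOFTWARE_TERMS = re.compile(r"\b(software|source code|script)\b", re.IGNORECASE)


def accepts_software(record):
    """Heuristically flag a directory record that mentions software content."""
    text = " ".join([record.get("name", ""), record.get("content_types", "")])
    return bool(SOFTWARE_TERMS.search(text))


def prevalence(records):
    """Fraction of records that appear to accept research software."""
    if not records:
        return 0.0
    hits = sum(1 for r in records if accepts_software(r))
    return hits / len(records)


# Toy records standing in for an exported directory listing.
records = [
    {"name": "Example Data Archive", "content_types": "datasets"},
    {"name": "Example Code Library", "content_types": "software, source code"},
    {"name": "Example IR", "content_types": "articles, theses"},
]
# prevalence(records) -> 1 of 3 records matches
```

Term matching of this kind gives only an upper or lower bound on true prevalence, which is why the study pairs it with stratified content case studies of exemplar repositories.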
13. Web Research - Policies
Review of funder policies
Sources: ROARMAP; US federal agency websites
Goals: Estimate prevalence of funder policies on software curation; identify exemplar
policies; identify recommended repositories
Methods: qualitative text analysis; descriptive statistics
Review of Journal Policies
Sources: Google Scholar, Web of Science, DOAJ, Software Sustainability Institute Index
Goals: Estimate prevalence of journals that publish research software; estimate prevalence of
software policies at journals; identify exemplar policies; identify recommended repositories
Methods: qualitative text analysis; descriptive statistics
23. Use Cases and Motivating Value
Historic / cultural
- historical scholarship
- “intrinsic value”
Replication and reproducibility
- check claims made in research
- reduced deliberate research fraud
- check reliability (robustness) of results
- check validity (accuracy)
Reuse
- efficiency
- increase speed of development
- standards compliance
- apply methodology to a different corpus
- increased quality and dependability
Render other digital objects
- renders other objects meaningful
- see digital preservation use cases
Legal
- record of licensing, ownership, copyright
- manage legal risks/accountability
- compliance with laws/funding mandates
- reduce barriers to long-term access for other historic use, replication, reuse, rendering
Citation and attribution
- track individual academic career
- track software development/history
- track institutional outputs
- track funder outputs
26. Preliminary findings: State of Software Curation
1. No comprehensive indices of software archives
2. Orders of magnitude fewer software archives than data archives.
( Corollary: Institutional repositories offer little functionality for software
archiving, even when nominally supported )
3. Very small proportion of funders have policies addressing software curation
4. There is little available advice for researchers who wish to curate, cite, &
preserve software
5. Substantial reproducibility failures related to software continue to be
reported
“Nothing Exists” - Gorgias (ca. 450 BCE)
27. Contrast with Data Curation -- Lack of Progress
• Compliance
– Funder: data management plans, open data
– Publishers: data access/archiving/citation
• Norms & practices
– Joint data citation principles
– Recognition of data in funder biosketches
– Increased recognition of reproducibility gaps
– Increased recognition of open data/open science
• Technical infrastructure
– Data repository directories
– Data citation indices
– ORCID researcher identifier and registry
• Recognition
– Data citation indices
– Virtual branded archives
– High-profile data publications
28. Summing it all up…
Software curation looks a lot like data curation a decade ago…
“How much slower would scientific progress be if the near universal standards for scholarly citation of articles and books had
never been developed? Suppose shortly after publication only some printed works could be reliably found by other scholars; or
if researchers were only permitted to read an article if they first committed not to criticize it, or were required to coauthor with
the original author any work that built on the original. How many discoveries would never have been made if the titles of books
and articles in libraries changed unpredictably, with no link back to the old title; if printed works existed in different libraries
under different titles; if researchers routinely redistributed modified versions of other authors' works without changing the title
or author listed; or if publishing new editions of books meant that earlier editions were destroyed? …
“Unfortunately, no such universal standards exist for citing software, and so all the problems listed above
exist now. Practices vary from field to field, archive to archive, and often from article to article.
The software cited may no longer exist, may not be available publicly, or may have never been held by anyone but the
investigator. Software listed as available from the author is unlikely to be available for long and will not be available after
the author retires or dies. … Software is sometimes listed in the bibliography, sometimes in the text, sometimes not at
all, and rarely with enough information to guarantee future access to the identical data set. Replicating published tables and
figures, even without having to rerun the original experiment, is often difficult or impossible.”
-- Altman & King 2007 (with “data” replaced by “software”)