
Executable papers


Presentation for HCLS Sci Discourse on executable papers


  1. Executable Papers: publishing science that works. Anita de Waard, Elsevier Labs. HCLS Scientific Discourse Group, June 20, 2011
  2. Elsevier Challenges
     Goals:
     - Invite and survey ideas in innovative science publishing
     - Create a community of people working on similar issues, from different backgrounds/viewpoints
     Rules:
     - Open submission; very interdisciplinary panel of judges; open publication of submissions
     - IPR stays with the author; in case of commercial development, Elsevier has right of first refusal
     Challenges so far:
     - 2008/9: Elsevier Grand Challenge for knowledge enhancement in the life sciences
     - 2010/11: ISMB Killer App award, rewarding bioinformatics apps that work for biologists
     - 2011: Elsevier Executable Paper Challenge
  3. Executable Paper Challenge
     Driven by issues in publishing computational science:
     - How can we develop a model for executable files that is compatible with the user's operating system and architecture, and adaptable to future systems?
     - How do we manage very large file sizes?
     - How do we validate data and code, and decrease the reviewer's workload?
     - How do we support registering and tracking of actions taken on the 'executable paper'?
     Co-organised with the International Conference on Computational Science, for high-performance and (geo/eco/bio/chem)-informatics fields
     - In fact, the challenge participants turned out to be a different community!
  4. The Finalists:
     1. SHARE: a web portal for creating and sharing executable research papers
     2. A data and code model for reproducible research and executable papers
     3. A-R-E: The Author-Review-Execute environment
     4. Web 3.0 and Active Documents
     5. Paper Mache: Creating Dynamic Reproducible Science
     6. A Provenance Based Infrastructure for Creating Executable Papers
     7. Universal Identifier for Computation Results (http://vcr.stanford.edu)
     8. R2 Platform for Reproducible Research
     9. Collage Authoring Environment
  5. SHARE: a web portal for creating and sharing executable research papers
     - Built to house the submissions to the Transformation Tool Contest (TTC)
     - An environment where all software and data related to the paper is optimally installed and ready for (temporary and secure) evaluation
     - A specific virtual machine image can be instantiated from within the paper
     - SHARE supports multiple operating systems, both at the level of the remote virtual machines and at the level of the connecting clients running on the user's machine
     - More than 100 heterogeneous images have been contributed by different research communities so far
  6. A-R-E: The Author-Review-Execute environment
     A data-driven, loosely coupled, and distributed approach to support the life cycle of an (executable) paper: authoring, reviewing, publication and study:
     - Find out which paragraph provides the information pertinent to a reference
     - Navigate from data points in a plot to the data items in the raw experimental data that led to those points (e.g. point to an Excel sheet column with experimental data)
     - Navigate into the program code that led to a specific data set
     - Based on a semantic wiki
  7. A Provenance Based Infrastructure for Creating Executable Papers
     - VisTrails provides a mechanism to store provenance for workflows
     - Code and plug-ins for LaTeX, Wiki, Microsoft Word, and PowerPoint
     - CrowdLabs allows papers to point to results that can be executed on a remote server and interactively explored from a Web browser
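     The workflow-provenance idea can be sketched in a few lines of Python (a toy recorder for illustration only, not the VisTrails API; `ProvenanceLog` and its fields are invented names): every executed step logs its inputs and outputs, so a published result can later be traced back to the exact computation that produced it.

     ```python
     import datetime

     class ProvenanceLog:
         """Toy provenance recorder: each executed step is stored with its
         inputs and output so a result can be traced to its computation."""

         def __init__(self):
             self.records = []

         def run(self, step_name, func, **inputs):
             # Execute the step, then append a provenance record for it.
             output = func(**inputs)
             self.records.append({
                 "step": step_name,
                 "inputs": inputs,
                 "output": output,
                 "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
             })
             return output

     log = ProvenanceLog()
     total = log.run("sum", lambda xs: sum(xs), xs=[1, 2, 3])
     scaled = log.run("scale", lambda x, k: x * k, x=total, k=10)
     # log.records now chronicles both steps, inputs and outputs included
     ```

     A real system would persist these records and attach them to the figures and tables they generated; the point here is only the shape of a chronicle.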
  8. Universal Identifier for Computation Results
     - Verifiable Computational Result (VCR): a computational result (e.g. table, figure, chart, dataset), together with the metadata describing in detail the computations that created it. Every computation automatically generates a detailed chronicle of its inputs and outputs as part of the process execution; the chronicle is automatically stored in a standard format on a VCR repository for later access
     - Verifiable Result Repository (Repository): a web-services provider that archives VCRs and later serves up views of specific computational results
     - Verifiable Result Identifier (VRI): a URL (web address) that universally and permanently identifies a repository and causes it to serve up views of a specific VCR; a DOI-like string that permanently and uniquely identifies the chronicle associated with that result and the repository that can serve views of that chronicle
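     One way such a DOI-like identifier could work is a content hash over the chronicle, appended to the repository's address (a sketch under assumptions: the actual VCR scheme at vcr.stanford.edu may differ, and `make_vri`, the repository URL, and the chronicle fields below are all hypothetical):

     ```python
     import hashlib
     import json

     def make_vri(repository: str, chronicle: dict) -> str:
         """Build a VRI-style URL: repository address plus a digest that
         uniquely identifies the chronicle of a computational result."""
         # Canonical serialisation so the same chronicle always hashes identically
         canonical = json.dumps(chronicle, sort_keys=True).encode("utf-8")
         digest = hashlib.sha256(canonical).hexdigest()[:16]
         return f"{repository}/vcr/{digest}"

     chronicle = {
         "inputs": {"dataset": "trials.csv", "seed": 42},
         "code": "fit_model.py@rev3",
         "outputs": {"figure": "fig2.png"},
     }
     vri = make_vri("https://example.org", chronicle)
     # Dereferencing the VRI would cause the repository to serve views
     # of exactly this chronicle and its result.
     ```

     A hash-based identifier makes the result verifiable: anyone holding the chronicle can recompute the digest and check it against the published VRI.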
  9. The Collage Authoring Environment
     An environment which enables authors to seamlessly embed chunks of executable code (called assets) into scientific publications:
     - Input forms: used by the user to feed input data into the running experiment
     - Visualizations: render an experiment result which can be directly visualized in the research paper
     - Code snippets: embed an editable view of the code which enacts a specific computation and may be used to generate additional assets
     - Allows repeated execution of these assets on underlying computing and data storage resources
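     A minimal sketch of what an embedded, re-runnable asset might look like (a hypothetical model for illustration, not the Collage API; the `Asset` class and the `result` convention are invented here):

     ```python
     from dataclasses import dataclass, field

     @dataclass
     class Asset:
         """An executable chunk embedded in a paper: a code snippet plus
         named inputs, re-executable on demand by the reader."""
         name: str
         code: str                          # source of the computation
         inputs: dict = field(default_factory=dict)

         def execute(self):
             # Bind the asset's inputs as local variables and run the snippet;
             # by convention the snippet stores its output in `result`.
             scope = dict(self.inputs)
             exec(self.code, {}, scope)
             return scope["result"]

     mean_asset = Asset(
         name="sample-mean",
         code="result = sum(values) / len(values)",
         inputs={"values": [2.0, 4.0, 6.0]},
     )
     value = mean_asset.execute()  # re-run each time the paper is read
     ```

     An input-form asset would overwrite `inputs` before execution, and a visualization asset would render the returned value in place in the paper.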
  10. Next step: The Executable Journal?
      - Ideally, we'd like all these tools to work together
      - In fact, we'd like that to be how we communicate informatics/computer science!
      - Submit a paper with a piece of working code
      - The code works on the platform
      - The code stays on the platform, and is available for other papers to run on, too!
      - Advantages:
        - Clearer communication of software
        - Less reinvention of the wheel
        - More collaboration
  11. In other words:
      "I like the idea of [...] a research object corresponding to a PhD thesis sitting on the (digital) library shelf and then being re-executed as new data comes along. So the thesis sits there and new results (or papers, or research objects) pop out. I like this example because it involves tying down the method and letting the data flow, instead of the widely held view that the data sits there and methods are applied to it. [...] These papers then become a way of distributing data and methods in a highly usable and user-centric way [...]. So scientists don't need to download and install tools and learn user interfaces. They just interact with the published executable papers..." Dave De Roure, email to Wf4ever group
  12. What does this have to do with HCLS?
      - Might be a good area to explore this in?
      - E.g. the interchange of annotations that we are exploring with Tim Clark's group...
      - Next steps:
        - Funding?
        - Format?
        - Platform?
      - Thoughts??