
PTU: Using Provenance for Repeatability

Using Provenance for Repeatability
Quan Pham¹, Tanu Malik², Ian Foster¹,²
¹Department of Computer Science, University of Chicago
²Computation Institute, University of Chicago and Argonne National Laboratory
TaPP 2013
Publication Process
• Traditional academic publication process
  • Submit paper → Review ideas & experiments → Learn novel methods
• Emerging academic publication process
  • Submit paper → Review ideas & experiments, validate software → Are we reading something that is repeatable and reproducible?
Repeatability Testing

• Scientific progress relies on novel claims and verifiable results
• Scientific paper reviewers
  • Validate announced results
  • Validate for different data and parameters
  • Validate under different conditions and environments
• Challenge: work under time & budget constraints

Image from http://catsandtheirmews.blogspot.com/2012/05/update-on-computer-crash.html
Repeatability Testing: Challenges & Constraints
• Repeatability requirements
  • Hardware: single machine or cluster?
  • Software
    • Operating system: which operating system was used? (Ubuntu/RedHat/Debian/Gentoo)
    • Environment: how to capture all environment variables?
    • Tools & libraries: how to precisely know all the dependencies?

• Knowledge constraints
  • Experiment setup: how to set up the experiment?
  • Experiment usage: how is the experiment run?

• Resource constraints: the experiment may
  • Require massive processing power
  • Operate on large amounts of data
  • Perform significant network communication
  • Be long-running
An Approach to Repeatability Testing

Challenges & Constraints → Possible Solution
• Repeatability requirements (hardware, software) → Provide a virtual machine or portable software
• Knowledge constraints (experiment setup and usage) → Provide a reference execution
• Resource constraints → Provide selective replay
PTU – Provenance-To-Use
• PTU
  • Minimizes computation time during repeatability testing
  • Guarantees that events are processed in the same order
    using the same data

• Authors build a package that includes:
  • Software program
  • Input data
  • Provenance trace

• Testers may select a subset of the package’s
  processes for a partial deterministic replay
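The subset selection above can be sketched as a walk over the provenance graph: starting from a tester-chosen process, collect the downstream processes to re-run and the recorded input files the package must supply. This is an illustrative sketch under assumed data structures, not PTU's implementation; the process and file names are hypothetical.

```python
from collections import deque

# Hypothetical provenance records: process -> child processes,
# and process -> files it read during the reference run.
children = {
    "/bin/get": ["/bin/reclassify"],
    "/bin/reclassify": ["/bin/calculate"],
    "/bin/calculate": [],
}
reads = {
    "/bin/get": ["urls.txt"],
    "/bin/reclassify": ["raw.tif"],
    "/bin/calculate": ["reclassified.tif"],
}

def replay_subgraph(root):
    """Return the processes to re-run and the recorded input files
    needed so the sub-graph rooted at `root` replays deterministically."""
    procs, files = [], set()
    queue = deque([root])
    while queue:
        p = queue.popleft()
        procs.append(p)
        files.update(reads[p])
        queue.extend(children[p])
    return procs, sorted(files)

procs, files = replay_subgraph("/bin/reclassify")
print(procs)   # ['/bin/reclassify', '/bin/calculate']
print(files)   # ['raw.tif', 'reclassified.tif']
```

Because only the chosen sub-graph is re-executed on its recorded inputs, the tester skips the upstream (possibly long-running) processes entirely.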

PTU Functionalities
• ptu-audit tool
  • Builds a package of the authors' source code, data, and environment variables
  • Records process- and file-level details about a reference execution
    % ptu-audit java TextAnalyzer news.txt
• PTU package
  • Displays the provenance graph and accompanying run-time details
• ptu-exec tool
  • Re-executes a specified part of the provenance graph
    % ptu-exec java TextAnalyzer news.txt
ptu-audit
• Uses ptrace to monitor system calls
  • execve, sys_fork
  • read, write, sys_io
  • bind, connect, socket
• Collects provenance
• Collects runtime information
• Makes the package
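PTU stores the provenance that ptu-audit collects in SQLite. As a rough sketch of how ptrace-observed events might map onto database rows — the schema, helper names, and event values here are invented for illustration, not PTU's actual schema:

```python
import sqlite3

# Minimal illustrative schema: one row per process, one row per file access.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE process (pid INTEGER PRIMARY KEY, parent INTEGER, cmd TEXT);
CREATE TABLE file_access (pid INTEGER, path TEXT, mode TEXT);
""")

def on_execve(pid, parent, cmd):
    # Called when the tracer observes a process start (execve/sys_fork).
    conn.execute("INSERT INTO process VALUES (?, ?, ?)", (pid, parent, cmd))

def on_file(pid, path, mode):
    # Called when the tracer observes a file system call; mode is 'read' or 'write'.
    conn.execute("INSERT INTO file_access VALUES (?, ?, ?)", (pid, path, mode))

# Events as a tracer might report them for a reference run:
on_execve(100, 1, "java TextAnalyzer news.txt")
on_file(100, "news.txt", "read")
on_file(100, "entities.out", "write")

# The input files to bundle into the package are the recorded reads.
inputs = [row[0] for row in conn.execute(
    "SELECT path FROM file_access WHERE mode = 'read'")]
print(inputs)  # ['news.txt']
```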
PTU Package
[Figure 2. The PTU package. The tester chooses to run the sub-graph rooted at /bin/calculate.]
ptu-exec
[Figure 3. ptu-exec re-runs part of the application from /bin/calculate. It uses CDE to re-route file dependencies.]
Current PTU Components
• Uses the CDE (Code, Data, Environment) tool to create a package
• Uses ptrace to create a provenance graph representing a reference run-time execution
• Uses SQLite to store the provenance graph
• Uses Graphviz for graph presentation
• Enhances CDE to run the package
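The speaker notes explain that when the reference run finishes, PTU builds a reference execution file as a topological sort of the provenance graph. That ordering step can be sketched with Kahn's algorithm; the graph below is hypothetical, and this is a sketch of the idea rather than PTU's code.

```python
from collections import deque

# Hypothetical provenance-graph edges: node -> nodes that depend on it.
edges = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

def topo_order(edges):
    """Kahn's algorithm: repeatedly emit a node with no remaining
    unprocessed predecessors, so every node follows its dependencies."""
    indeg = {n: 0 for n in edges}
    for outs in edges.values():
        for n in outs:
            indeg[n] += 1
    queue = deque(sorted(n for n, d in indeg.items() if d == 0))
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for m in edges[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                queue.append(m)
    return order

print(topo_order(edges))  # ['A', 'B', 'C', 'D']
```

Listing processes in this order means a replay can always start a process after everything it depends on has already run.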
PEEL0
• Best, N., et al., Synthesis of a Complete Land Use/Land Cover Dataset for the Conterminous United States. RDCEP Working Paper, 2012, 12(08).
• [Workflow figure: wget; R scripts (raster, rgdal) implementing a reclassify algorithm; bash geo script]
PEEL0
[Figure 4. Time reduction in testing PEEL0 using PTU]
TextAnalyzer
• Murphy, J., et al., Textual Hydraulics: Mining Online Newspapers to Detect Physical, Social, and Institutional Water Management Infrastructure. Technical Report, Argonne National Laboratory, 2013.
• Runs a named-entity recognition analysis program using several data dictionaries
• Splits the input file into multiple input files on which it runs a parallel analysis
TextAnalyzer
[Figure 5. Time reduction in testing TextAnalyzer using PTU]
Conclusion
• PTU is a step toward repeatability testing of software programs submitted to conference proceedings and journals
• Easy and attractive for authors
• Fine-grained control and efficiency for testers
Future Work
• Other workflow types
  • Distributed workflows
• Improve performance
  • Decide how to store provenance compactly in a package
• Presentation
  • Improve the graphical user interface and presentation
Acknowledgements
• Neil Best
• Jonathan Ozik
• Center for Robust Decision Making on Climate and Energy Policy (NSF grant number 0951576)
• Contractors of the US Government under contract number DE-AC02-06CH11357

Editor's Notes

  1. Hi everyone, my name is QP. In this presentation, I'd like to introduce a system that uses provenance for repeatability. The work was done with T. Malik at the Computation Institute, University of Chicago, and I. Foster at Argonne National Laboratory.
  2. What is the problem with repeatability in the scientific community? The process of publication: I submit; reviewers find interesting claims and want to verify them by re-running, with different data/parameters and under different conditions/environments. So many things to validate, so little time and budget (hardware). This presentation gives a high-level overview of the steps, and of the concepts of author, tester, and repeatability in the "new" publication process (authors → tester → readers).
  3. ptu-audit uses ptrace to monitor ~50 system calls, including process system calls, such as execve() and sys_fork(), for collecting process provenance; file system calls, such as read(), write(), and sys_io(), for collecting file provenance; and network calls, such as bind(), connect(), socket(), and send(), for auditing network activity. It obtains the process name, owner, group, parent, host, creation time, command line, and environment variables; and the file name, path, host, size, and modification time. It obtains memory footprint, CPU consumption, and I/O activity data for each process from /proc/$pid/stat, and copies each accessed file into a package directory that consists of all subdirectories and symbolic links to the original file's location.
  5. Reason for choosing /bin/calculate: it is memory-intensive and long-running. When the entire reference run finishes, PTU builds a reference execution file consisting of the topological sort of the provenance graph. The nodes of the graph enumerate run-time details, such as process memory consumption and file sizes. The tester, as described next, can use the database and the graph for efficient re-execution. Testers can examine the provenance graph contained in a package to study the accompanying reference execution, and request a re-execution, either by specifying nodes in the provenance graph or by modifying a run configuration file that is included in the package.
  6. ptu-exec obtains the run configuration and environment variables for each process from the SQLite database, monitors the process via ptrace, and re-executes the CDE functionality of replacing path argument(s) to refer to the corresponding path within the package cde-package/cde-root/. It provides fast audit and re-execution independent of the application. The profiling of processes enables testers to choose the processes to run.
  7. PEEL0: a three-step workflow process, implemented as an R program; classification is memory-intensive. [Picture: PEEL0 workflow, get → reclassify → calculate → final result; testers look at (calculate) with question marks.]
  8. Five process nodes, 10,000 exclusive file reads based on the number of files in the dataset, and 422 exclusive file writes for the aggregated dataset. Slowdown when using PTU: ~35% for PEEL0.
  9. Eight process nodes that in aggregate conduct 616 exclusive file reads, 124 exclusive file writes, and 50 file nodes that are read and written again. Slowdown when using PTU: ~15% for TextAnalyzer. TextAnalyzer shows a particularly large improvement (>98%) since the entire process is re-run on a much smaller file.
  10. PTU is a step toward conducting repeatability tests on software programs that are submitted to conference proceedings and journals. Peer reviewers often must review these programs in a short period of time. By providing one utility for packaging the software programs and their reference execution without modifying the application, we have made PTU easy and attractive for authors to adopt, and a fine-grained, efficient way for testers to work.