Framework to develop core
FAIR metrics
FAIR Metrics Working Group
Presentation to the NIH Commons Framework
Working Group on FAIR Metrics on July 11, 2017
http://fairmetrics.org
Members
Luiz Olavo Bonino, VU/DTL
Peter Doorn, DANS
Michel Dumontier, Maastricht University
Susanna Sansone, University of Oxford
Erik Schultes, DTL
Mark Wilkinson, Universidad Politécnica de Madrid
Motivation
We believe that increasing the FAIRness of digital resources
will maximize their discovery and reuse.
An assessment can provide the feedback needed to
characterize and improve the FAIRness of a digital resource.
To evaluate the FAIRness of a digital object we need metrics.
Objective
We seek to develop a set of core metrics that can be
utilized in a computational infrastructure to
automatically assess the FAIRness of any digital
resource.
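To make "automatically assess the FAIRness" concrete, here is a minimal sketch of what one machine-actionable metric test might look like: checking whether a resource's identifier uses a recognized globally unique, persistent scheme (DOI, Handle, or HTTPS URL). The metric name, the scheme list, and the verdict format are illustrative assumptions, not metrics the Working Group has adopted.

```python
import re

# Illustrative persistent-identifier scheme patterns.
# These are assumptions for the sketch, not an adopted FAIR metric.
PID_PATTERNS = {
    "doi": re.compile(r"^(doi:|https?://(dx\.)?doi\.org/)10\.\d{4,9}/\S+$", re.I),
    "handle": re.compile(r"^(hdl:|https?://hdl\.handle\.net/)\d+(\.\d+)*/\S+$", re.I),
    "url": re.compile(r"^https?://\S+$", re.I),
}

def assess_identifier(identifier: str) -> dict:
    """Return a machine-readable verdict for one hypothetical metric:
    does the identifier use a recognized persistent-identifier scheme?"""
    for scheme, pattern in PID_PATTERNS.items():
        if pattern.match(identifier):
            return {"metric": "uses_recognized_pid_scheme",
                    "identifier": identifier, "pass": True, "scheme": scheme}
    return {"metric": "uses_recognized_pid_scheme",
            "identifier": identifier, "pass": False, "scheme": None}
```

A computational infrastructure could run many such tests per resource and aggregate the structured verdicts, which keeps the assessment objective, machine-interpretable, and reproducible in the sense the guidelines below describe.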
Milestones
● Establish charter (done)
● Establish guidelines for the development of the FAIR metrics (done)
● Establish a metric proposal form (done)
● Develop a set of core FAIR metrics (ongoing)
● Prepare and discuss a Release Candidate for proposed FAIR
metrics and a prototype implementation (Sept 2017)
● Community-wide implementation and review (Oct-Nov, 2017)
● Revision of FAIR metrics and documentation (Jan-Feb 2018)
● Prepare and discuss a Final Recommendation and Reference
Implementation of the core set of FAIR metrics (March 2018)
Guidelines
We focus on the development of metrics to assess
compliance with each and every one of the FAIR principles.
Guidelines to drive the development of
a set of core FAIR metrics
Clear: so that anybody can understand what is meant
Realistic: so that anybody can report on what is being asked of them
Discriminating: so that we can distinguish the degree to which a
resource meets the FAIR principles, while providing guidance on how to
maximize its value
Measurable: The assessment can be made in an objective, quantitative,
machine-interpretable, scalable and reproducible manner →
transparency of what is being measured, and how.
Guidelines
We focus on the development of metrics to assess
compliance with each and every one of the FAIR principles.
However, compliance is distinct from impact. We do not believe
that all digital resources are of equal quality; nonetheless, when
any resource is created and published, it should be maximally
discoverable and reusable, as per the FAIR principles. Abiding by
the FAIR principles makes resources of unequal quality discoverable,
and this will aid in the assessment of their quality. However, any
metric that assesses the popularity of a digital resource is not a
measure of its FAIRness.
Process to establish metrics
For each FAIR principle:
○ Propose one or more possible metrics
○ Fill out the information in the metric proposal form
○ Discuss the merits of the proposal
■ does it conform to the FAIR metric guidelines? (clear,
realistic, discriminating, measurable, universal)
■ is it atomic or does it comprise a number of different or
complementary aspects?
○ Iteratively refine and test the metric proposal until consensus
is achieved
~3.5 hrs per metric × 15 principles = 52.5 h × 6 people = 315 person-hours
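The effort estimate above can be checked directly (note that 3.5 h × 15 principles is 52.5 h, which is consistent with the 315 person-hours total at 6 people):

```python
# Back-of-the-envelope effort estimate from the process slide.
hours_per_metric = 3.5
principles = 15
people = 6

total_hours = hours_per_metric * principles  # group-discussion hours: 52.5
person_hours = total_hours * people          # total effort: 315 person-hours

print(total_hours, person_hours)
```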
FAIR Metric Proposal Form https://goo.gl/FiQQSc
Milestones
● Establish charter (done)
● Establish guidelines for the development of the FAIR metrics (done)
● Establish a metric proposal form (done)
● Develop a set of core FAIR metrics (ongoing)
● Prepare and discuss a Release Candidate for proposed FAIR
metrics and a prototype implementation (Sept 2017)
● Community-wide implementation and review (Oct-Nov, 2017)
● Revision of FAIR metrics and documentation (Jan-Feb 2018)
● Prepare and discuss a Final Recommendation and Reference
Implementation of the core set of FAIR metrics (March 2018)
Discussion