Collaboration Through Interoperability: FundRef and Other Metadata
 


Carol Anne Meyer's presentation at the Council of Science Editors 2014 annual meeting May 5 2014
Session Description: There are several organizations, such as CrossRef, the National Library of Medicine, ORCID and Ringgold, which are putting forth ideas to standardize data and data exchange throughout scholarly publishing. This session will discuss new initiatives that address such challenges as easily identifying funding sources, managing author disambiguation, managing institution disambiguation, and standardization of information exchange.
Who Should Attend: Managing Editor/Publisher, Copy Editor/Production Editor, Editor-in-Chief
Standardizing Data and Data Exchange in Scholarly Publishing

Presentation Transcript

  • Carol Anne Meyer CrossRef @meyercarol ORCID: 0000-0003-2443-2804 Collaboration Through Interoperability: FundRef and Other Metadata. Council of Science Editors, Standardizing Data and Data Exchange in Scholarly Publishing, 5 May 2014. http://xkcd.com/927/
  • A not-for-profit trade association of global scholarly publishers
  • CrossRef Has 1,950 Members, Representing 4,627 Publishers
  • Members Come from 81 Countries
  • CrossRef Metadata Services (slide diagram): Linking (Reference Linking, Cited-by Linking); Discovery and Delivery (CrossRef Metadata Search, Bibliographic Management, Document Delivery, Multiple Resolution, Link Resolvers, CrossRef APIs); Evaluating (CrossCheck powered by iThenticate, Article Level Metrics, CrossMark, PreScore); Collaborating (FundRef, Linked Data, Text and Data Mining, NISO OA Indicator, Threaded Publications, Journal Article Tag Set (JATS))
  • Our Community Includes Affiliates and Libraries: we now have 90 affiliates and 2,045 libraries
  • We generate more than a billion annual “clicks” to our member publishers’ sites (chart: annual click volume, 2002–2013)
  • We have two offices: Lynnfield, MA and Oxford, UK
  • CrossRef has 25 Employees
  • The Long Tail of Members (chart: number of members by annual revenue tier, from 1,612 members under $1 million up to 6 members over $500 million)
  • Mission To be a trusted collaborative organization with broad community connections; authoritative and innovative in support of a persistent, sustainable infrastructure for scholarly communication.
  • Improving scholarly communication through community collaboration 6-Word Mission
  • More than 1 million data items/figures/components have CrossRef DOIs: Protein Data Bank, Standards in Genomic Science, Organisation for Economic Co-operation and Development (OECD), Public Library of Science, International Union of Crystallography (IUCr)
  • http://xkcd.com/285/
  • Best Editorial Practices to Increase Data Transparency 1. Ask Authors to store and cite data 2. Assign CrossRef DOIs to supplementary data 3. Encourage authors to assign DataCite DOIs to their data, and link to articles published using that data via CrossRef DOIs 4. Include journal article (or other publication) bibliographic metadata with data deposits 5. Cite data in publication reference sections using DOIs when available 6. CrossMark participants, link to data in the Publication Record tab.
  • Sample CrossRef Bibliographic Metadata: author(s), journal title, article title, volume, issue, publication date, ISSN, page numbers, article IDs, internal identifiers, URL, DOI
  • Additional CrossRef Metadata: ORCID; CrossMark (updates, i.e. related CrossRef DOIs, and publication record information); Text and Data Mining data; NISO Open Access Indicator
  • (slide: PDF and HTML examples)
  • NISO OA Metadata & Indicators: 2 simple tags (“free_to_read” and “license_ref”); embargo periods supported
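The two NISO tags above can be sketched as a small XML fragment. This is a minimal illustration, not a schema-validated deposit: the container element, the `start_date` attribute modeling an embargo, and the license URL are all assumptions to be checked against the NISO Access and License Indicators specification.

```python
import xml.etree.ElementTree as ET

def build_oa_indicators(free_from: str, license_url: str) -> ET.Element:
    """Sketch a fragment carrying the two NISO OA indicator tags.

    The <program> container is an assumption; only the tag names
    free_to_read and license_ref come from the slide.
    """
    root = ET.Element("program")
    # free_to_read with a start_date models an embargo period
    ET.SubElement(root, "free_to_read", {"start_date": free_from})
    # license_ref points at the machine-readable license terms
    lic = ET.SubElement(root, "license_ref", {"start_date": free_from})
    lic.text = license_url
    return root

fragment = build_oa_indicators("2014-11-05",
                               "http://creativecommons.org/licenses/by/4.0/")
print(ET.tostring(fragment).decode())
```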
  • Additional Metadata for FundRef: 3 simple tags (funder_name, funder_identifier, award_number)
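A rough sketch of how the three FundRef tags might be assembled into a deposit fragment. The namespace URI, the nesting of funder_identifier under funder_name, and the NIH identifier used in the example are assumptions approximating the CrossRef deposit schema; verify against the published schema before depositing.

```python
import xml.etree.ElementTree as ET

FR_NS = "http://www.crossref.org/fundref.xsd"  # assumed FundRef namespace

def build_fundref_fragment(funder_name, funder_id, award_numbers):
    """Sketch a FundRef program fragment carrying the three tags.

    Nesting and attribute names approximate the CrossRef deposit
    schema and should be checked before real use.
    """
    ET.register_namespace("fr", FR_NS)
    program = ET.Element(f"{{{FR_NS}}}program", {"name": "fundref"})
    name_el = ET.SubElement(program, f"{{{FR_NS}}}assertion",
                            {"name": "funder_name"})
    name_el.text = funder_name
    # the funder_identifier (a FundRef Registry ID) nests under funder_name
    ident = ET.SubElement(name_el, f"{{{FR_NS}}}assertion",
                          {"name": "funder_identifier"})
    ident.text = funder_id
    for award in award_numbers:
        a = ET.SubElement(program, f"{{{FR_NS}}}assertion",
                          {"name": "award_number"})
        a.text = award
    return program

frag = build_fundref_fragment("National Institutes of Health",
                              "http://dx.doi.org/10.13039/100000002",
                              ["R01 GM094800B"])
print(ET.tostring(frag).decode())
```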
  • A standard way of reporting funding sources for published scholarly research Launched May 2013
  • For Further Reading http://fundref.crossref.org/docs/funder_kpi_metadata_best_practice.html
  • The Funding Attribution Problem
  • <fn fn-type="financial-disclosure">
      <p>This work was supported in part by NIH grant R01 GM094800B to G.J.J., a gift to Caltech from the Gordon and Betty Moore Foundation, and a stipend from the Bayerische Forschungsstiftung to M.P. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.</p>
    </fn>
    </fn-group> </back> </article>
  • <body> ... <sec>
      <title>Funding</title>
      <p>This work was supported by the <grant-sponsor xlink:href="http://www.grf.org" id="GS1">Generic Research Foundation</grant-sponsor>, the <grant-sponsor xlink:href="http://www.energy.gov" id="GS2">Department of Energy</grant-sponsor> Office of Science grant number <grant-num rid="GS2">DE-FG02-04ER63803</grant-num>, and the <grant-sponsor xlink:href="http://www.nih.gov" id="GS3">National Institutes of Health</grant-sponsor>.</p>
    </sec> </body>
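The id/rid pairing in the markup above is what makes the funding machine-readable. A minimal sketch of consuming it, assuming a standalone fragment (the xlink namespace is declared locally here; in a real JATS article it is declared on the root element, and the sponsor/grant text is illustrative):

```python
import xml.etree.ElementTree as ET

# A standalone fragment modeled on the slide's funding section.
SNIPPET = """
<sec xmlns:xlink="http://www.w3.org/1999/xlink">
  <title>Funding</title>
  <p>This work was supported by the
    <grant-sponsor xlink:href="http://www.energy.gov" id="GS2">Department of Energy</grant-sponsor>
    Office of Science grant number
    <grant-num rid="GS2">DE-FG02-04ER63803</grant-num>.</p>
</sec>
"""

def extract_funding(xml_text):
    """Pair each grant number with its sponsor via the id/rid attributes."""
    root = ET.fromstring(xml_text)
    sponsors = {el.get("id"): el.text for el in root.iter("grant-sponsor")}
    return [(sponsors.get(el.get("rid")), el.text)
            for el in root.iter("grant-num")]

print(extract_funding(SNIPPET))  # → [('Department of Energy', 'DE-FG02-04ER63803')]
```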
  • Why Does This Matter? Funding bodies cannot easily track the published output of funding; publishers cannot easily report which articles result from research supported by specific funders or grants; institutions cannot easily link funding received to published output; and the lack of standard metadata for funding sources makes it difficult to analyze or mine the data
  • National Institutes of Health: NIH? N.I.H.? National Institute of Health? Abbreviations, misspellings, translations...
  • The Solution
  • Publishers: relationship with authors submitting manuscripts; established publishing and peer-review systems. Funders: relationship with researchers funded by agencies; established award systems and research management processes. Institutions: funder compliance education; track funding received. The Public: want accountability for how contributions/taxes are spent. Authors: have funding information at submission
  • FundRef Pilot Brought Together Publishers and Funders
  • The FundRef Registry is a Taxonomy of 6100 Funder Names www.crossref.org/fundref/fundref_registry.html
  • More on the FundRef Registry: 6,100 funder names and ID numbers from the curated Elsevier SciVal registry, donated to FundRef; hosted by CrossRef, available under CC0; updated and extended monthly; publishers use this list to ensure consistency. www.crossref.org/fundref/fundref_registry.html
  • (diagram: FundRef data flow; funder names and grant numbers move from publisher submission systems through production systems into the CrossRef database and query APIs, serving funders, researchers, institutions, publishers, and SHARE)
  • (diagram: each DOI is linked to its funding source and award number)
  • Submission Workflow 1. Collect funding data from authors on submission using FundRef Registry taxonomy
  • Workflow Issues 1. Collect funding data from authors on submission using FundRef Registry taxonomy. http://www.crossref.org/fundref
  • Implementation Widget - http://labs.crossref.org
  • Implementation 2. Pass funding data (funder, grant number) from the submission system to production systems, with a publisher editorial check
  • http://labs.crossref.org
  • Workflow 3. Deposit FundRef data with CrossRef. CrossMark participants should deposit FundRef data within CrossMark deposits; CrossMark participation is recommended for standard display of funding information
  • Workflow 3. Deposit funding data with CrossRef
  • Look up Funding Data http://search.crossref.org/fundref
  • Search for Funder
  • Results
  • Search by Other Metadata
  • Search by Grant Number
  • Search by CrossRef DOI
  • Search by ORCID
  • How to Participate 1. Encourage researchers to submit FundRef info at manuscript submission. (Hint: Ask for ORCIDs too!) 2. Use FundRef Search, CrossRef Metadata Search & CrossRef APIs to retrieve funding information 3. Provide feedback on the tools 4. Use FundRef Registry for funding analysis 5. Always use CrossRef DOIs and ORCIDs when citing research output
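Step 2 above points at the CrossRef APIs for retrieving funding information. A minimal sketch of building query URLs against the public CrossRef REST API: the `api.crossref.org` base and the `/funders` routes are assumptions based on the publicly documented API, and no request is made here; check the current API documentation for exact parameters.

```python
from urllib.parse import quote, urlencode

API_BASE = "http://api.crossref.org"  # assumed public CrossRef REST API base

def funder_search_url(name_fragment):
    """URL that searches the FundRef Registry for matching funder names."""
    return f"{API_BASE}/funders?{urlencode({'query': name_fragment})}"

def works_by_funder_url(funder_id, rows=20):
    """URL that lists works carrying FundRef data for one funder ID."""
    return f"{API_BASE}/funders/{quote(funder_id)}/works?{urlencode({'rows': rows})}"

print(funder_search_url("National Institutes of Health"))
print(works_by_funder_url("100000002"))  # 100000002: illustrative registry ID
```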
  • So, How Are We Doing? 71,000+ unique documents with FundRef records; 75% of the funder names from these relationships are in the FundRef Registry
  • These Deposits Come from 11 Publishers of 38 Signed Up: AAAS, ACSESS, American Chemical Society, American Diabetes Association, American Institute of Physics, American Psychiatric Publishing, American Psychological Association, American Physical Society, American Society of Neuroradiology, Association for Computing Machinery, BioMed Central, Bioscientifica, Copernicus GmbH, eLife Sciences Publications, Elsevier, FapUNIFESP (SciELO), Grupo Comunicar, Hindawi Publishing Corporation, Institute of Electrical & Electronics Engineers, International Union of Crystallography, Internet Medical Publishing, IOP Publishing, Journal of Humanity, Journal of Rehabilitation Research & Development, Just Medical Media, Ltd., KAMJE, Kowsar Medical Institute, Landes Bioscience, National Library of Serbia, Optical Society of America, Oxford University Press, Royal Society of Chemistry, ScienceOpen, Spandidos Publications, Taylor & Francis, The Royal Society, Wiley-Blackwell. http://www.crossref.org/fundref/fundref_agreement.html
  • Publishers: sign up now! FundRef Terms & Conditions: www.crossref.org/fundref No fees for FundRef deposits
  • PS: What Does FundRef Have to Do With CHORUS and SHARE? CrossRef (with FundRef) provides the social and technology standards and practices that make CHORUS and SHARE possible. CrossRef DOIs direct interested parties to the correct documents. CrossRef’s existing metadata database will hold data about ORCID, FundRef, the Open Access Indicator, and text and data mining. CrossRef’s Application Programming Interfaces (APIs) and search interfaces will serve these new types of data.
  • FundRef
  • Full Disclosure: CrossRef Plays the Field. CrossRef staff participate on the Technical Working Groups of CHORUS and SHARE. CrossRef has also expressed an openness to making its infrastructure available for other public access initiatives. CrossRef does not do custom development that is specific to a single project and not generalizable to the industry.
  • www.crossref.org/fundref cmeyer@crossref.org Thank you!