RDAP13 Elizabeth Moss: The impact of data reuse

Kathleen Fear, ICPSR, University of Michigan

“The impact of data reuse: a pilot study of 5 measures”

Panel: Data citation and altmetrics
Research Data Access & Preservation Summit 2013
Baltimore, MD April 4, 2013 #rdap13

Speaker Notes

  • An environment with no standard way of citing research data and no established publishing infrastructure to support discovery and attribution
  • One of the reasons we were founded was to share data that not everyone could collect themselves: big, costly longitudinal studies; international studies; federally funded studies. All the more reason to make them available to everyone.
  • Will also create an API so that scripts can track alternative metrics, such as download statistics by user type (a hypothetical sketch of such access follows these notes).
  • Click on the Find Publications link.
  • We provide study-level and citation-level metadata in an XML feed. We are happy to provide this to anyone to improve the landscape of data citation, discovery, and recognition.
  • DataPASS partners successfully lobbied ASA to include guidelines for data citation.
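As a rough illustration of the scripted access mentioned above, here is a minimal sketch of polling a download-statistics service from Python. The endpoint URL, the response shape, and the user_type field are all hypothetical placeholders, not a published ICPSR API (which did not yet exist at the time of this talk).

    # Sketch only: the endpoint URL and response fields are hypothetical
    # placeholders, not a published ICPSR API.
    import requests
    from collections import Counter

    BASE_URL = "https://api.example.org/icpsr/downloads"  # hypothetical endpoint

    def downloads_by_user_type(study_no):
        """Tally downloads of one study, grouped by an assumed user_type field."""
        resp = requests.get(BASE_URL, params={"study": study_no}, timeout=30)
        resp.raise_for_status()
        events = resp.json()  # assumed shape: a list of download-event records
        return Counter(event.get("user_type", "unknown") for event in events)

    # e.g. downloads_by_user_type("4549") might return Counter({'student': 412, ...})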

Presentation Transcript

  • Viable Data Citation: Expanding the Impact of Social Science Research. RDAP13 Panel on Data Citation and Altmetrics, April 5, 2013. Elizabeth Moss, ICPSR, eammoss@umich.edu
  • At ICPSR: • Providing opportunities for tracking and measuring impact • Linking data to the literature, and the challenges involved • Aiding the cultural shift to viable citing practice (impact can be better measured if data use is readily discernible)
  • Top 10 Data Downloads in the Previous Six Months (non-anonymous, distinct users downloading one or more files):
    ICPSR Study Title - # Downloads
    National Longitudinal Study of Adolescent Health (Add Health), 1994-2008 - 1817
    National Survey on Drug Use and Health, 2010 - 1109
    Chinese Household Income Project, 2002 - 648
    General Social Survey, 1972-2010 [Cumulative File] - 643
    National Survey on Drug Use and Health, 2011 - 603
    Collaborative Psychiatric Epidemiology Surveys (CPES), 2001-2003 [United States] - 527
    Health Behavior in School-Aged Children (HBSC), 2005-2006 - 509
    American National Election Study, 2008: Pre- and Post-Election Survey - 427
    India Human Development Survey (IHDS), 2005 - 395
    School Survey on Crime and Safety (SSOCS), 2006 - 339
  • Who uses these shared data? With what impact?
  • Obtaining ICPSR Metadata: ICPSR metadata are available in two formats, DDI Codebook XML and MARC21, and can be harvested via OAI-PMH (a harvesting sketch follows).
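A minimal harvesting sketch in Python. The ListRecords verb, its parameters, and the oai: namespace are standard OAI-PMH 2.0; the endpoint URL and the oai_ddi metadata prefix are assumptions about ICPSR's service and may differ in practice.

    # Minimal OAI-PMH harvest sketch. The verb, parameters, and namespace are
    # standard OAI-PMH 2.0; the endpoint URL and "oai_ddi" prefix are assumed.
    import xml.etree.ElementTree as ET
    import requests

    OAI_ENDPOINT = "https://www.example.org/icpsr/oai"  # placeholder URL
    OAI_NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}

    def list_record_identifiers(metadata_prefix="oai_ddi"):
        """Yield record identifiers from the first page of a ListRecords response."""
        resp = requests.get(
            OAI_ENDPOINT,
            params={"verb": "ListRecords", "metadataPrefix": metadata_prefix},
            timeout=60,
        )
        resp.raise_for_status()
        root = ET.fromstring(resp.content)
        for header in root.iterfind(".//oai:header", OAI_NS):
            yield header.findtext("oai:identifier", namespaces=OAI_NS)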
  • The ICPSR Bibliography of Data-related Literature: • Link research data to scholarly literature about it • Increase likelihood of discovery and re-use • Aid students, instructors, researchers, and funders
  • It’s really a searchable database . . . containing 65,000 citations of known published and unpublished works resulting from analyses of data archived at ICPSR . . . that resides in Oracle, with an internal UI for database management . . . that can generate study bibliographies linking each study with the literature about it, and out to the full text (a toy model of this linking follows).
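To make the study-to-literature linking concrete, here is a toy relational model using Python's sqlite3. All table and column names are invented for illustration; ICPSR's actual Oracle schema is internal and certainly differs.

    # Toy model of study-to-literature linking; all names are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE study (study_no TEXT PRIMARY KEY, title TEXT);
        CREATE TABLE citation (cite_id INTEGER PRIMARY KEY,
                               reference TEXT, full_text_url TEXT);
        -- many-to-many: a study has many publications, and vice versa
        CREATE TABLE study_citation (study_no TEXT, cite_id INTEGER);
    """)
    conn.execute("INSERT INTO study VALUES ('04549', 'Example Study')")
    conn.execute("INSERT INTO citation VALUES (1, "
                 "'Author, A. (2012). An analysis. Journal, 1(1).', "
                 "'https://example.org/paper')")
    conn.execute("INSERT INTO study_citation VALUES ('04549', 1)")

    # A per-study bibliography is then a simple join out to the full text:
    for ref, url in conn.execute("""
            SELECT c.reference, c.full_text_url
            FROM citation c
            JOIN study_citation sc ON c.cite_id = sc.cite_id
            WHERE sc.study_no = '04549'"""):
        print(ref, url)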
  • It’s useful to all stakeholders: Instructors direct students to begin data-related research projects by reading some of the major works based on the data. Advanced researchers also use it to conduct a focused literature review before deciding to use a dataset. Reporters and policymakers looking for processed statistics look for reports explaining studies. Principal investigators and funding agencies want to track how data are used after they are deposited.
  • But challenging to provide
  • The state of data citation in the social science literature
  • Data “sighting” (implicit) vs. citing (explicit): where does data use appear? Sample? Abstract? Methods? Acknowledgements? Discussion? Charts and tables? Footnotes? Appendices? References!
  • Typical “sightings”: • Sample described, not named, no author information, no access information, only a publication cited • Data named in text, with some attribution, but no access information • Cited in the reference section, but with no permanent, unique identifier, so difficult for indexing scripts to find and track automatically
  • ICPSR advocates the use of DOIs: • ICPSR has been providing citations for its data since 1990 and started assigning DOIs in 2008 • DOIs apply at the study or collection level (a study can have multiple datasets) and resolve to the study home page with the richest metadata • DOIs are of the form doi:10.3886/ICPSR04549 (a scanning sketch follows)
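Because ICPSR DOIs follow the predictable form above, a script can pull them out of full text far more reliably than it can match free-text study names. A minimal sketch, assuming only the doi:10.3886/ICPSRnnnnn pattern from this slide; real-world text may also carry https://doi.org/... variants.

    # Extract ICPSR DOIs of the form 10.3886/ICPSRnnnnn from article text.
    import re

    ICPSR_DOI = re.compile(r"10\.3886/ICPSR\d+", re.IGNORECASE)

    def find_icpsr_dois(text):
        """Return every ICPSR DOI mentioned in the text, deduplicated, in order."""
        seen, found = set(), []
        for doi in ICPSR_DOI.findall(text):
            key = doi.upper()
            if key not in seen:
                seen.add(key)
                found.append(doi)
        return found

    print(find_icpsr_dois("... archived as doi:10.3886/ICPSR21240 ..."))
    # -> ['10.3886/ICPSR21240']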
  • Atypical “citing”: in the references, with the DOI doi:10.3886/ICPSR21240
  • Challenges in database search infrastructure: • Journal databases fielded for journal-article discovery are not ideal for finding data “sightations” • No field searching on methods sections • Full-text search brings back too many bad hits • Limiting to abstracts misses too many good hits
  • Challenges in tracking many studies: • Tension between highly curating a manageable collection and minimally maintaining a broad collection • Too many publications for efficient collection by humans, so we must make it easy for scripts to do it reliably
  • Challenges of completeness: • Data use that is too difficult or costly to find cannot be counted • A selective sample makes it difficult to draw accurate conclusions in broad analyses of re-use
  • Challenges in publishing practice and lack of data management planning: • The publishing sequence prevents citation creation before publication • Potential for change by educating the PI/mentor • Consciousness-raising is starting to occur due to funders’ requirements
  • Poorly described and cited data + excessive human search effort = too costly, too questionable for a confident measure of impact
  • Citing data with a DOI + minimal human search effort = high hit accuracy for the cost, and better confidence in impact measures
  • Finding data with simple search fields. Integration with Web of Knowledge All Databases: research data is equal to research literature
  • Articles linked to underlying data. Increased data discovery. Reward for data citation. Potential for automated tracking. Converting journal search infrastructure to meet the needs of data, but synching metadata is still a work in progress.
  • Building a culture of viable data citation to improve measures of impact
  • Provide PIs and users with citations and DOIs for all study-level data
  • Join groups advocating viable data citing practice
  • Work with partner repositories to change publishing practice
  • Three meetings: journal editors, domain repositories, and funders • Establish consistent data citation in social science journals • Encourage transparency in research • Optimize editorial workflows: sequencing • Develop common standards for repositories • Find long-term funding models for repository sustainability
  • Thank you. Elizabeth Moss, eammoss@umich.edu