Crowd-sourcing the creation of "articles" within the Biodiversity Heritage Library
An analysis of crowd-sourced "article" creation and user-generated metadata for a digital repository of biodiversity literature
License: CC Attribution-ShareAlike
Presentation Transcript

  • Crowd-sourcing the creation of “articles” within the Biodiversity Heritage Library
    Bianca Crowley
    crowleyb@si.edu
    Trish Rose-Sandler
    trish.rose-sandler@mobot.org
  • The BHL is…
    A consortium of 13 natural history and botanical libraries and research institutions
    An open-access digital library for legacy biodiversity literature
    An open-data repository of taxonomic names and bibliographic information
    An increasingly global effort
  • Problem: Books vs. Articles
    Librarians manage books
    Users need articles
  • Solution: “Article-ization”
    Creating articles manually, with the help of our users: the BHL PDF Generator (a sketch of the page-selection idea follows below)
    Creating articles through automated means: BioStor, http://biostor.org/issn/0006-324X
    Page, R. (2011). Extracting scientific articles from a large digital archive: BioStor and the Biodiversity Heritage Library. BMC Bioinformatics, 12(187). Retrieved from http://www.biomedcentral.com/1471-2105/12/187
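In essence, the PDF Generator lets a reader pick the scanned pages that make up an article and download them as a single PDF carrying the reader's own metadata. A minimal sketch of that operation using the pypdf library follows; the file names, page range, and metadata values are illustrative assumptions, not BHL's actual implementation.

```python
from pypdf import PdfReader, PdfWriter

def build_article_pdf(scan_path, page_indices, title, authors, out_path):
    """Assemble user-selected pages into one PDF and embed the
    user-contributed metadata (illustrative sketch, not BHL code)."""
    reader = PdfReader(scan_path)
    writer = PdfWriter()
    for i in page_indices:  # the pages the user marked as the article
        writer.add_page(reader.pages[i])
    writer.add_metadata({"/Title": title, "/Author": "; ".join(authors)})
    with open(out_path, "wb") as f:
        writer.write(f)

# Hypothetical usage: the article spans pages 12-18 of a scanned volume
build_article_pdf("volume_scan.pdf", range(11, 18),
                  "Example article title", ["Example author"], "article.pdf")
```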
  • Create-your-own PDF
  • CiteBank today: http://citebank.org
  • What is an “article” anyway?
  • The Good, the Bad, the Ugly
  • Questions for Data Analysis
    What is the quality, or accuracy, of user-provided metadata?
    What kinds of content are users creating?
    How can we improve the PDF Generator interface?
  • Stats
    Jan 2010-Apr 2011
    Approx. 60,000 PDFs created with the PDF Generator
    40% of those (approx. 24,000) were ingested into CiteBank (PDFs without user-contributed metadata were excluded)
    5 reviewers analyzed 945 PDFs (approx. 3.9% of the 24,000+ articles going into CiteBank)
    **Thanks to reviewers Gilbert Borrego, Grace Costantino, and Sue Graves from the Smithsonian Institution
  • Methodological approach
    Quantitative: numerical rating system
    Rated titles, authors, and beginning/ending pages
    An article’s “findability” in CiteBank search often determined how it was rated
  • Ratings System
    Title
    1 = has all characters in the title, letter for letter
    2 = does not have all characters in the title letter for letter, but is still findable in CiteBank search
    3 = does not have all characters in the title letter for letter and is NOT findable via CiteBank search
  • Ratings System
    Author
    1 = has all characters in the author(s)’ last names, letter for letter
    2 = has at least one author’s last name spelled correctly
    3 = has no authors, or none of the authors’ last names are spelled correctly
  • Ratings System
    Article beginning & ending pages
    1 = has all text pages for the article, from start to end
    2 = a subset of pages from a larger article
    3 = a set of pages where the intellectual content has been compromised
    (A code sketch of the title and author rubrics follows below.)
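As a minimal sketch of how the title and author rubrics above could be expressed in code: the review itself was manual, so the `findable_in_citebank` flag below stands in for a reviewer running a CiteBank search, and all names are hypothetical.

```python
def rate_title(user_title: str, canonical_title: str,
               findable_in_citebank: bool) -> int:
    """Title rubric: 1 = letter-for-letter match; 2 = inexact but still
    findable via CiteBank search; 3 = inexact and not findable."""
    if user_title == canonical_title:
        return 1
    return 2 if findable_in_citebank else 3

def rate_authors(user_last_names: list[str],
                 canonical_last_names: list[str]) -> int:
    """Author rubric: 1 = every author's last name correct; 2 = at least
    one correct; 3 = no authors, or none spelled correctly."""
    correct = set(user_last_names) & set(canonical_last_names)
    if canonical_last_names and correct == set(canonical_last_names):
        return 1
    return 2 if correct else 3

# Hypothetical example: one of two last names is misspelled -> rating 2
print(rate_authors(["Page", "Rse-Sandler"], ["Page", "Rose-Sandler"]))
```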
  • Analysis steps
  • Results
  • What did we learn?
    Ratings were better than we expected
    Many users took the time to create decent metadata
    “Good enough” is not great, but it is still “findable”
  • But of course…
    there’s always room for improvement
    Other factors: BHL-Australia’s new portal, http://bhl.ala.org.au/
  • Changes we’ve made to the UI so far
    • Asking users whether they want to contribute their article to CiteBank
    • Making the article title a required field and validating that it is at least 2 characters long (a sketch follows below)
    • A Review button for users to check their page selections and metadata (inspired by BHL-AUS)
    • Reduced text and more intuitive graphics (inspired by BHL-AUS)
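A minimal sketch of the new required-title rule, assuming a simple server-side check (the slides do not describe how BHL implemented the validation):

```python
def validate_article_title(title: str) -> str:
    """New rule: the article title is required and must be at least
    2 characters long (hypothetical implementation)."""
    cleaned = (title or "").strip()
    if len(cleaned) < 2:
        raise ValueError("Article title is required (2 or more characters).")
    return cleaned
```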
  • Brief survey of proposed changes
    Overwhelmingly positive response to the proposed changes
    But of course… there’s always room for improvement
  • Success Factors
    Monitor metadata creation to observe user behavior and patterns
    Engage with your users
    Incentivize your users
  • http://biodiversitylibrary.org
    @BioDivLibrary
    /pages/Biodiversity-Heritage-Library/63547246565
    /photos/biodivlibrary/sets/
    /group/biodiversity-heritage-library
    Bianca Crowley
    crowleyb@si.edu
    Trish Rose-Sandler
    trish.rose-sandler@mobot.org