
MW18 Demonstration: Wisdom Of The Crowd(Sourced Content) – Library And Archives Canada’s New Crowdsourcing Platform

By Michael Smith, Library and Archives Canada, Canada

At long last, Library and Archives Canada (LAC) is following in the footsteps of other institutions such as the National Archives and Records Administration (NARA) and the National Archives (UK) in developing a platform for the public to transcribe, tag and translate manuscripts and photos from our diverse collection.

This talk will discuss the reasons behind our decision to build a crowdsourcing platform and the early success of our two crowdsourcing pilot projects. It will also detail the ways in which crowdsourcing platforms from other organizations influenced our design and how we are ensuring that the content is accessible and discoverable via LAC’s new collection search platform. Challenges such as addressing the Government of Canada’s Official Languages policy and ensuring the clear segregation of authoritative data vs. crowdsourced data will also be discussed along with a brief overview of user testing results and lessons learned.



  1. Wisdom of the Crowd(sourced content): Library and Archives Canada’s New Crowdsourcing Platform. April 20, 2018. Michael Smith
  2. Co-Lab – Backgrounder
     • Need for a crowdsourcing application explored in 2015–2016
     • Success of two transcription pilot projects: the Coltman Report and Lady Macdonald’s Diary
     • Supports our legislated mandate
  3. Why Now? Impact of Not Proceeding
     • Falling further behind our like-minded institutions
     • Lack of relevance and resonance with users
     • Ever-increasing backlog of description
     • Ever-increasing lack of discoverability and usability of our collections
     • Lost opportunity to engage clients to perform work to improve our holdings
  4. What Others Are Doing
     • https://anno.tate.org.uk/#!/
     • https://www.archives.gov/citizen-archivist
     • https://www.operationwardiary.org/
  5. Language and Accessibility
     • Crowdsourced content is collected in the language of the contributor
     • Balance between English and French ‘challenges’
     • Meets all current Government of Canada accessibility standards
  6. User Testing / Bug Fixes
  7. Search Integration / Distinguishing Data
  8. Concerns / Mitigation
     • What if someone adds a tag or transcription that is offensive? (Examples of Canadian graffiti)
  9. Lessons Learned
     • A high-level business plan is key
     • Have a single business lead/contact
     • If possible, follow an agile business model/phased approach
     • TEST, TEST, TEST
     • Make sure you have approvals for name/branding
  10. What are you waiting for? Get involved & start contributing today! Thank you!
  11. Questions: bac.co-lab.lac@canada.ca
      Michael D. Smith, A/Manager, Online Content and Copyright, Public Services Branch, Library and Archives Canada, Government of Canada
      michael.smith3@canada.ca – Tel: 613-790-2415

Editor's Notes

  • 7 MIN

    Thank you all for being here!
    I’m ________________
    And it’s my pleasure to be here today to discuss LAC’s new crowdsourcing tool, Co-Lab.
    Some of you may have been to my demo yesterday but if you missed me, I’d be happy to answer any questions after the presentations.
  • As some of you may know, Library and Archives Canada is the custodian of a huge amount of archival and bibliographic material that it needs to make available to Canadians.
    Given the sheer volume of holdings LAC manages, it is impossible for the organization to describe and transcribe its entire collection.

    We have already had early successes with the transcription of Lady Macdonald’s Diary and, more recently, the Coltman Report, both of which were hosted by Our Digital World, a non-profit organization that supports digital stewardship of community cultural heritage.

    For this reason, LAC has developed Co-Lab, our very own crowdsourcing application, inspired by similar platforms such as the US National Archives and Records Administration’s (NARA) Citizen Archivist model. It also supports our legislated mandate to make our holdings available to the Canadian public.

    I am happy to say that THIS TUESDAY, LAC launched Co-Lab: an easy-to-use tool that allows everyone to enrich Canada’s history by tagging keywords, transcribing content and describing digitized images found in our collection.


  • The risks of not proceeding with this project greatly outweighed the risks of proceeding.
    Perhaps most importantly, we risked losing a unique opportunity to engage our clients and inspire them to perform work to improve LAC’s holdings,
    which will go on to benefit all Canadians.
  • Here is what other institutions are doing.

    Tate
    NARA
    UK Nat Archives

    By looking at what had already been done and speaking with our colleagues from other institutions, we were able to better define our goals and also gain insight into what worked well.
  • Being the Government, language and accessibility were key concerns.
    To mitigate this, we will ensure that:

    Crowdsourced content is collected in the language of the contributor and users are free to choose what language they use.
    We will look to find a balance between English and French ‘challenges’ to ensure that official languages are represented equally.
    As we have developed the tool, we have made sure that the application meets all current Government of Canada accessibility standards.
  • Extensive internal and external user testing was done at each important phase of the project.
    The feedback was extremely valuable and informed improvements to the user experience, design and functionality.
    Most importantly, colleagues spent many hours testing the tool, reporting and prioritizing bugs to be fixed prior to launch.

  • One of the features of this tool is that crowdsourced content is stored in a separate database, which is regularly indexed and made searchable via LAC’s new Collection Search (beta).

    Data appearing in search results is displayed in a new tab, separate from the authoritative LAC records,
    ensuring the clear segregation of authoritative and crowdsourced data.

    Additionally, we have introduced functionality that allows users to search our collection and select digitized material (not associated with a specific challenge)
    to tag, transcribe or even translate. This essentially opens up the entire collection to user-contributed data.
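The separation described in this note can be sketched in miniature (a hypothetical illustration; the item IDs, structures and function here are not Co-Lab’s actual schema): authoritative records and crowdsourced contributions live in separate stores, are indexed separately, and a search returns results grouped by origin so the interface can render them under separate tabs.

```python
# Minimal sketch of segregated search: authoritative and crowdsourced
# data are never mixed into a single result set.

AUTHORITATIVE = {
    "item-001": "Coltman Report, 1818 - official archival description",
}

CROWDSOURCED = {
    "item-001": "crowd transcription of page 1 of the Coltman Report",
}

def search(term):
    """Return matches grouped by origin, so a UI can show two tabs."""
    term = term.lower()
    return {
        "authoritative": {k: v for k, v in AUTHORITATIVE.items() if term in v.lower()},
        "crowdsourced": {k: v for k, v in CROWDSOURCED.items() if term in v.lower()},
    }

results = search("coltman")
```

Keeping the two stores physically separate means the authoritative records can never be overwritten by a contribution; the grouping is only merged at display time.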

  • The number one concern that came up was: what if someone posts something offensive?

    Given that there is no delay on tags or transcriptions, and no requirement for review by LAC before content becomes viewable, this was a legitimate concern.
    However, anyone at all – any member of “the crowd” – can delete or edit any transcription or tagging content, and the spirit of crowdsourcing is that the crowd keeps the crowd in check.

    Our Communications approach surrounding the launch of the tool encourages all users to be active in both contribution and review of other contributions.
    This is in line with what other institutions, like NARA, have done for many years.
    It is also clear in the display of the crowd contributions that this material is not LAC-reviewed.

    Finally, as a failsafe, we have an admin feature on the tool that allows us to “lock” any item against further contribution, giving us time to address any issue.
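The moderation model described above – open crowd editing with an admin lock as a failsafe – can be sketched as follows (a toy model; the class and method names are hypothetical, not Co-Lab’s implementation):

```python
class ContributionStore:
    """Toy model: crowd-editable contributions with an admin lock."""

    def __init__(self):
        self.contributions = {}   # item_id -> contributed text
        self.locked = set()       # item_ids frozen by an admin

    def contribute(self, item_id, text):
        # Any member of the crowd may add or overwrite content,
        # unless an admin has locked the item.
        if item_id in self.locked:
            raise PermissionError(f"{item_id} is locked for review")
        self.contributions[item_id] = text

    def delete(self, item_id):
        # The crowd keeps the crowd in check: anyone can remove content.
        if item_id in self.locked:
            raise PermissionError(f"{item_id} is locked for review")
        self.contributions.pop(item_id, None)

    def lock(self, item_id):
        # Admin failsafe: freeze an item while an issue is addressed.
        self.locked.add(item_id)
```

In this model there is no pre-publication review step; a lock only stops further changes, which mirrors the "publish first, moderate by the crowd, freeze as a last resort" approach described in the note.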



  • So please, try it out and get involved!
    And finally, in the spirit of Open Data and Open Government, we are looking at ways to share our code for this tool and help other institutions
    engage with users and enrich their collections.
