Smithsonian Crowdsourcing
Statistics: 844 total views (479 on SlideShare, 365 embed views via http://www.scoop.it); 0 likes, 2 downloads, 0 comments.

Upload Details

Uploaded as Microsoft PowerPoint

Usage Rights

© All Rights Reserved

  • First off, I want to play a short video “mash-up,” if you will, of composer Eric Whitacre’s crowdsourced virtual choir and Wired editor-in-chief Chris Anderson speaking at the Smithsonian 2.0 conference, as a way to set the stage for this talk.
  • The Smithsonian’s mission is the increase and diffusion of knowledge, and Smithsonian Mobile interprets that mission as “Recruit the World”: using mobile platforms to enlist collaborators globally to increase and diffuse knowledge.
  • It’s about thinking outside the audio-tour box… “From we do the talking to we help you do the talking.” – Chris Anderson, Wired, Smithsonian 2.0 Conference, 24 Jan 2009. http://smithsonian20.si.edu/schedule_webcast2.html
  • To tell how well something or someone is doing, you need a standard or benchmark to compare against. Quality is, as Chris Anderson said, largely in the eye of the beholder and relative to its contemporary context. But against what scale do you measure “recruiting the world”? There’s one benchmark we can use to set the bar: Wikipedia. You’ve probably all seen some version of this pyramid, or an “engagement ladder” like it. It tells us that in fact the majority of the work is done by a tiny number of people at the top of the engagement pyramid: the specialists and enthusiasts in niche subjects.
  • To date we have launched more than 30 mobile apps and websites, and more than that number again of podcasts and other downloadable audio, video and text content that people are using every day on their mobile devices. Today I want to focus on three in particular: Smithsonian Mobile, Stories from Main Street, and Access American Stories.
  • Practical problem: there is no budget for creating content or maintaining the app.
  • So clearly not all crowdsourcing or community-sourcing projects are created equal. They will not all have the same ratios of participants at the different levels of the engagement pyramid. But I’m starting to track this data for the Smithsonian’s mobile projects so we can measure and report our success in “recruiting the world.” The Smithsonian Mobile app, launched in August 2011, is a modest project by comparison: 75 unique individuals have contributed content to the application, averaging 3.12 contributions per user. 4 users have contributed 6 or more times; 17 users have contributed 2–5 times; 54 users have contributed only once. 234 total contributions; 135 deleted contributions.
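The contribution figures above can be sanity-checked with a short script. The numbers come straight from these notes; the tier labels are just illustrative groupings, not official categories.

```python
# Engagement-pyramid breakdown for the Smithsonian Mobile app,
# using the contributor counts quoted in the notes above.
contributors = {"6+ contributions": 4, "2-5 contributions": 17, "1 contribution": 54}

total_users = sum(contributors.values())      # 75 unique contributors
total_contributions = 234
average = total_contributions / total_users   # 3.12 contributions per user

print(f"{total_users} users, {average:.2f} contributions each on average")
for tier, users in contributors.items():
    share = users / total_users
    print(f"  {tier}: {users} users ({share:.0%} of contributors)")
```

Run as-is, this confirms that the reported average (3.12) is exactly 234 contributions divided by 75 users, and shows the classic pyramid shape: 72% of contributors posted only once, while a handful of enthusiasts account for the repeat activity.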
  • Lifetime downloads: 16,593, over 9,000 of them on the newest version. Highest rank: #24 in the Education category. Average review: 3 stars. Countries: 87.2% United States, 6.0% Canada, 3.2% Brazil, 1.0% Mexico, 0.8% South Africa, 0.4% Qatar.
  • 228 stories are available for playback through the app. Tennessee and West Virginia, where the exhibition is on tour, have been our most active states. We had a single contributor submit over ten entries about the town where she grew up in upstate New York!
  • Here’s another mobile crowdsourcing project: Stories from Main Street. I was corresponding with David Anderson, a crowdsourcing expert from Berkeley, about these metrics and how to read them. He had an interesting comment: “…downloading Stories from Main Street (I'm guessing) implies an interest in supplying a story, whereas downloading the Smithsonian Mobile App (I'm guessing) doesn't imply an interest in contributing comments. So of the two, it's possible that 500/16000 is worse (i.e. reflects a worse user interface or wording) than 70/35000.”
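Anderson’s point is about conversion rates rather than raw counts. A minimal sketch, using the rough figures from his comment (they are his guesses, not official statistics):

```python
# Contribution ("conversion") rates implied by David Anderson's rough
# figures: contributors per download for each app.
apps = {
    "Stories from Main Street": (500, 16_000),  # (contributors, downloads)
    "Smithsonian Mobile": (70, 35_000),
}

rates = {name: c / d for name, (c, d) in apps.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.2%} of downloaders contribute")

# Stories from Main Street converts a far larger share of its downloaders
# (about 3.1% vs 0.2%), yet Anderson suggests it may still under-perform,
# because downloading that app already signals an intent to contribute.
```

The design lesson is that a raw rate only means something against a baseline expectation: the same 3% conversion is impressive for a general-purpose app and possibly disappointing for one whose whole pitch is “share your story.”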
  • None of these apps has had a dedicated marketing budget, but we are actively trying to organize events to solicit contributions to Access American Stories.
  • Others seem to satisfy our basic informational needs with a bit more finesse. [control-click to play] http://66.147.244.104/~amerifl5/americanstories/2012/04/10/alexander-graham-bells-big-box-telephone/
  • And then you get the zingers that really make you sit up and realize the potential of Joy’s Law: the crowd can help the museum do a better job. http://66.147.244.104/~amerifl5/americanstories/2012/04/10/1845-reproduction-of-eli-whitneys-cotton-gin-patent-model/
  • Or let you know what’s missing from the exhibition.
  • Each contributor leaves 5.2 messages on average.
  • Amy Sample Ward usefully identifies two different kinds of engagement of mass audiences: “Crowdsourcing invites diversity by encouraging anyone with an idea or interest to participate. Crowdsourcing levels the playing field so it isn’t just your ‘favorites’ or those you already know that get to play.” http://amysampleward.org/2011/05/18/crowdsourcing-vs-community-sourcing-whats-the-difference-and-the-opportunity/
  • In the Wikipedia example, the base of the engagement pyramid is very broad: 400 million visitors per month, compared to the 85,000 people contributing articles nearer the top of the pyramid.
  • Here the community base can be much narrower and still achieve the project’s desired results. The community has special skills and interests as well as a very well-developed network, so a smaller number of individuals in the eco-system get the job done.
  • In addition to understanding the metrics of success in crowdsourcing, our challenge now is to learn how to set goals for the number of “watchers” we need in order to have an engagement eco-system with a healthy number of contributors, and even “curators,” at the top of the pyramid. Further audience research will tell us how best to engage users at all the different levels of the pyramid. A healthy crowdsourcing eco-system needs content for the watchers, compelling activities for the producers, and everything in between. It is starting to look like, in addition to figuring out how to serve both on-site audiences and remote “visitors” who might download our apps but never visit the museum, we need to learn how to combine an appeal to both the “mass market” and our niche audiences – the communities who identify most closely with museums’ niche collections, content, and subject-matter expertise – in the same mobile experience and product.

Smithsonian Crowdsourcing: Presentation Transcript

  • Thinking outside the audio-tour box. “From we do the talking to we help you do the talking.” – Chris Anderson, Wired, Smithsonian 2.0 Conference, 24 Jan 2009
  • 30+ SI Mobile Projects to Date http://si.edu/mobile
  • Alexander Graham Bell’s Big Box Telephone (11/27/2012, Nancy Proctor, proctorn@si.edu, slide 14)
  • Eli Whitney’s Cotton Gin Patent Model
  • What’s missing