Museum Games and UGC: Improving Collections Through Play


Presentation for UGC4GLAM (user-generated content for galleries, libraries, archives and museums) in Vienna, May 16-17.

  • If I lose you in some of the detail of best practice for crowdsourcing games, the three most important things to take away are: crowdsourcing games can help digitise your collections and make them more accessible; people have fun and deeper interactions with the objects while helping out; and a well-designed crowdsourcing game can be more fun and more productive than a general crowdsourcing interface. Mia Ridge @mia_out Games: Blog:
  • Image source: Mia, somewhere over Sweden
  • I worked for the Science Museum while designing this project, and I love finding out about ordinary-looking objects that turn out to have interesting stories, so I really wanted to explore games for technical and social history objects. Objects from the history of science and technology aren't pretty and exciting like an art museum's, but they are still part of that history and should be available to the public. So my project asked whether metadata games could help people have fun while creating useful content about difficult objects. How many versions of almost identical telescopes could people bear to see? Image source: Mia Ridge, game design workshop
  • Social history collections can contain tens or hundreds of similar objects, including technical items, reference collections, objects whose purpose may not be immediately evident from their appearance, and objects whose meaning may be obscure to the general visitor. The difficult objects I worked with were technical, near-duplicate, poorly catalogued or scantily digitised. Image sources: Powerhouse Museum
  • Design reward and feedback systems so that it's in a player's best interest to add high-quality data. Games are super-scaffolded experiences. Play testers surprised themselves with how interesting they found the objects, and I saw some of them trying out new things they had learned in other conversations. New forms of engagement: the benefit isn't just in the content created, it's in the new relationships people have with your content - active viewing, curiosity, learning. Leaving a trace is powerful. Tagging and other forms of content can help create multilingual content. Image source:
  • An ecosystem of linked games lets you build for different types of participant skills, knowledge and experience, and for different levels of participation, from liking to tagging to finding facts and links. Use data from one game as input into other games, e.g. use stats from tagging games to reduce the number of repetitive objects in higher-level games. It could help resolve some of the difficulties around validating specialist tags or long-form, more subjective content by circulating content between games for validation and ranking for correctness and 'interestingness' by other players. In this model, content is created about objects in the game; the content is validated; a game-dependent value (score) is assigned to the content; and the player is rewarded. The value of a piece of content may also be validated (e.g. for 'interestingness') when other players show preferences for it. At this point, the object and the new content about it can be used in a new game or presented on a collections page. For some content types, the content may be validated by players in another game after a default value has been calculated, but this introduces tricky design issues around delayed responses to actions. The evaluation for Donald suggested that future prototypes with more clearly defined tasks would increase participation rates - matching specific tasks to appropriate objects is a perfect job for crowdsourcing within an ecosystem of games. Image source:
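The ecosystem model described in the notes above (content is created, validated, scored, and the player rewarded, before the content circulates to other games or a collections page) can be sketched roughly as follows. This is a minimal illustration only; every class and function name here is a hypothetical assumption, not part of the Museum Metadata Games code:

```python
from dataclasses import dataclass

@dataclass
class Content:
    """A piece of player-created content about a museum object (e.g. a tag or fact)."""
    object_id: str
    text: str
    votes: int = 0          # 'interestingness' preferences from other players
    validated: bool = False

@dataclass
class Player:
    name: str
    score: int = 0

def submit(player: Player, content: Content, base_value: int) -> None:
    """A player creates content; a game-dependent default value is
    assigned and the player is rewarded immediately."""
    player.score += base_value

def validate_in_other_game(content: Content, agreeing_players: int,
                           threshold: int = 2) -> None:
    """Content circulates to another game, where other players confirm or
    rank it; once enough players agree, it is marked as validated and can
    feed a higher-level game or a collections page."""
    content.votes += agreeing_players
    if content.votes >= threshold:
        content.validated = True

# Example: a tag created in a tagging game, then validated in a second game.
alice = Player("alice")
tag = Content(object_id="telescope-042", text="brass refractor")
submit(alice, tag, base_value=10)
validate_in_other_game(tag, agreeing_players=2)
print(alice.score, tag.validated)  # 10 True
```

The delayed-validation variant mentioned in the notes would simply defer `validate_in_other_game` until other players encounter the content, which is where the tricky delayed-feedback design issues come in.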

    1. Museum Games and UGC: Improving Collections Through Play
       Mia Ridge, Open University, @mia_out
    2. Overview
       The magic circle (and other definitions)
       About MMG (Museum Metadata Games)
       Benefits of museum crowdsourcing games
       Best practice in crowdsourcing game design
    3. The magic circle (and other definitions...)
    4. ‘flow’
    5. Gamification?
       “taking the thing that is least essential to games and representing it as the core of the experience”
       “a short-term sugar rush of engagement followed by a crash”
       “emphasizes the shallow, dumb, non-interesting tasks, and it decreases motivation for interesting tasks that might be intrinsically motivated.”
    6. About ‘Museum Metadata Games’
    7. 'difficult' objects: technical, near-duplicate, poorly catalogued or scantily digitised
       'toy' model steam engines, Powerhouse Museum
    8.
    9.
    10. In the evaluation period: 6039 tags (2232 unique tags), average 18 tags per object
    11. Average time on site over 7 minutes, 6.5 pages per visit
    12. But visitors via Facebook averaged 10 minutes and over 8 pages per visit
        One Facebook status update asking for players: 180 turns (176 tagging turns, 4 fact turns), 1179 tags and 4 facts about 145 objects from 26 players in c. 6 hours
    13. Crowdsourcing games work
        e.g. correcting OCR for libraries with DigitalKoot, Finland, one month after launch: 'over 2 million individual tasks, totalling 100,000 minutes, or 1,700 hours, of work'
        GWAP, 2008: 50 million verified tags
    14. Games are 'participation engines'
        Games demolish barriers to participation
        Games drive on-going participation
        Games encourage super-taggers
        Games provide lavish feedback and rewards for effort
    15. Benefits of museum crowdsourcing games
        The magic circle works
        You make the rules - design for the data you need
        New forms of engagement with collections
        Games encourage informal content that bridges the ‘semantic gap’
    16. Museum crowdsourcing games
        Good at: mental challenge; mystery, curiosity, discovery; novelty (sorta); instant gameplay; epic meaning, blissful productivity
        Not-so-good at: infinite gameplay; mastery (how to teach skills, scaffold the learning experience, provide meaningful feedback?); flow (needs variable difficulty and a balance between boredom and anxiety)
    17. Help win the competition for eyeballs (AKA competing for 'participation bandwidth')
        Design for instant action, gratification
        Build instructions and requirements into gameplay
        Reward on-going play
        Don't require registration
        Validate procrastination – help people feel good about playing
        Polish is vital – 'worthy' isn't good enough
    18. More lessons learned
        Design for flow, e.g. variable levels of difficulty
        Fun is personal - design for a specific player persona, test with real audiences
        Quality of feedback and scoring systems counts
        Help players acquire, test and master new skills
    19. Ecosystem of games
        Engage a wider range of players
        Simple games help clean and test data for use in other games
        Validate and rate specialist content from complex tasks
        Be creative - e.g. crowdsource the matching of activities to objects
    20. Potential game 'atoms'
        Tagging
        Debunking
        Recording a personal story
        Linking
        Stating preferences
        Categorising
        Creative responses
    21. Dealing with problem data?
    22. Thank you! Questions?
        Mia Ridge
        @mia_out
        Games:
        Blog:
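On the 'verified tags' figure cited in the transcript: in GWAP-style games a tag generally counts as verified once several players enter it independently for the same object. A minimal sketch of that agreement rule follows; the function and variable names are illustrative assumptions, not GWAP's actual code:

```python
from collections import Counter

def verified_tags(tag_submissions, min_agreement=2):
    """Return (object, tag) pairs entered independently by at least
    `min_agreement` distinct players - agreement-based validation."""
    # Deduplicate so each player counts once per (object, tag) pair,
    # then count how many distinct players supplied each pair.
    counts = Counter(
        (obj, tag) for player, obj, tag in set(tag_submissions)
    )
    return {pair for pair, n in counts.items() if n >= min_agreement}

submissions = [
    ("p1", "engine-07", "steam"),
    ("p2", "engine-07", "steam"),   # independent agreement -> verified
    ("p2", "engine-07", "brass"),   # only one player -> not verified
    ("p3", "telescope-042", "lens"),
]
print(verified_tags(submissions))  # {('engine-07', 'steam')}
```

The same agreement rule could serve as the validation step between games in the ecosystem model described in the speaker notes.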