UAB 2011 - Games

  • Dear audience, I'm happy to welcome you to my talk about semantic games. I've divided the talk into three parts; first I will introduce the topic.
  • The Semantic Web needs data: machine-processable metadata is the backbone of the Semantic Web. Producing it is not a one-time effort but a continuous process, since the web is a highly dynamic entity. Some tasks require human input; they are difficult for machines but easy for human "computation". Examples: CAPTCHAs, distorted images used to distinguish humans from machines; high-level semantic annotation of audio, video, and images, and conceptualization; ontology alignment, evaluation, learning, building hierarchies, reuse, and properties. So, if we need humans, how do we motivate them? Sketch of our idea: take a task that is difficult for machines but easy for humans, and hide it behind a game. Lots of people play with it; we collect their input, process it, and generate data.
  • This is the backend technology: a J2EE project, a modular toolkit. It selects a suitable partner to play with, determines which answer out of a set of answers is right, and enables export. It supports different ways to compare answers, e.g. matching with a partner, and workflows for agreement negotiation, with skip functionality included. It keeps track of players' performance and optimizes record selection.
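The toolkit's answer matching and reliability evaluation can be pictured as a simple agreement rule: an answer only counts once enough players have seen the record and a clear majority agrees. This is a minimal sketch under assumed thresholds; the function name and parameters are hypothetical, not the toolkit's actual API.

```python
from collections import Counter

def reliable_answer(answers, min_players=5, majority=0.6):
    """Return the agreed answer if enough players gave input and a
    sufficient share of them chose the same option, else None."""
    if len(answers) < min_players:
        return None  # not enough human input collected yet
    best, count = Counter(answers).most_common(1)[0]
    return best if count / len(answers) >= majority else None

# A tag is only accepted once enough players agree on it:
print(reliable_answer(["cat", "cat", "cat", "dog", "cat"]))  # cat
print(reliable_answer(["cat", "dog"]))                       # None (too few players)
```

Records that fail the majority check can simply be re-queued for more players, which is one way the "record selecting" optimization mentioned above could be driven.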
  • Very briefly introduce the old games: SpotTheLink, a game for ontology alignment, implemented for the DBpedia ontology and the PROTON upper ontology; OntoTube, a game for annotating YouTube videos; OntoPronto, a game for building a huge domain ontology from Wikipedia articles.
  • Problem: image annotation with concepts. We produce Linked Data, i.e. we link images with concepts from DBpedia.
  • So how does the game look from a user's perspective? First, the images are loaded on the right side of the screen and a depicted concept is loaded on the left. The game starts; a player has 2 minutes to sort the images into either the match basket or the discard basket. The game stops after all images have been sorted or the time runs out.
  • In the end the player gets an overview of the earned scores and can review changes. We reward accuracy over mere activity; with the scoring function we want to control the users' behaviour. We introduced the goldfish reward, which is basically a trophy: the first player who makes 25,000 points gets it, and all the counters start again. Players can post this on Facebook. Where do we get our data from? As mentioned, we take concepts, labels, and depictions from DBpedia by querying its SPARQL endpoint. The images we first took from Google image search; the results were bad, so we switched to flickr wrappr. How is the data generated? Reliability is ensured by majority agreement and a minimum number of players. The generated data consists of RDF triples: foaf:depiction / depicts.
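The data flow described above, fetching labels and depictions from DBpedia's SPARQL endpoint and emitting foaf:depiction triples for agreed annotations, could be sketched as follows. The query shape and helper names are illustrative assumptions, not the game's actual implementation.

```python
def depiction_query(concept_uri):
    """Build a SPARQL query for a concept's English label and depiction,
    as one might send to DBpedia's public endpoint."""
    return f"""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    SELECT ?label ?img WHERE {{
      <{concept_uri}> rdfs:label ?label ;
                      foaf:depiction ?img .
      FILTER (lang(?label) = "en")
    }}"""

def depiction_triple(concept_uri, image_uri):
    """Emit one N-Triples line linking a concept to an image the
    players agreed on (foaf:depiction points from thing to image)."""
    return f"<{concept_uri}> <http://xmlns.com/foaf/0.1/depiction> <{image_uri}> ."
```

For each game round that passes the majority check, one such triple would be appended to the exported Linked Data.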
  • Explain how the evaluation was performed
  • Explain quality (was the annotation correct, i.e. is the image really a depiction of the concept?). Explain the difference between the time periods: different scoring functions were used. Correct: image and concept matched; for example, the image showed a blackberry, and both the concept and the image were a blackberry. Concept: only the concept matched; for example, the depiction of the ginkgo concept showed an illustration of a ginkgo, while the candidate image showed a real ginkgo. Unclear: not decidable, for example a needle tree seen from very far away.
  • Still, one of the most important things in a game is the user experience. We ran a survey for that and asked how the game was perceived: 96% of the players understood the game's goal, and more than two thirds of the people stated that it was sometimes difficult to sort the images.
  • Traceability: users must understand why something happens in the game, why they obtain scores, why they get a level up, and what the benefits of their actions are (e.g. publishing on Facebook). Data sources are important because they have a severe impact on the fun of the game and also on the quality of the resulting output: boring or overly complex images, images that are too easy, unrelated images, images that were not pre-filtered.
  • The next game: TubeLink, a game for video fragment interlinking. It is the successor of OntoTube, with an improved interface and simpler game mechanisms. Which problem does it solve? Conceptual video processing: our goal is to produce Linked Open Data triples of the form video fragment annotatedBy concept.
  • So how does the game work? A video is loaded into the glass bulb, and tags representing the concepts are loaded from DBpedia. The core mechanisms are tagging, matching, and burning.
  • The game ends when the video ends. The user gets an overview, sees why he got points, what the high scores are, as well as his personal high score. Where do we get our data from? Labels are taken from DBpedia, bootstrapped by analysing keywords, and refined in the gaming process: wrong ones are sorted out, related ones are found. Videos are taken from YouTube. How is the data generated? Reliability is ensured by majority agreement and a minimum number of players: is the tag right for the video, and for which video fragment is it valid? The produced data consists of triples: video fragment, annotatedBy, concept.
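Determining "for which video fragment the tag is valid" could, for instance, aggregate the moments at which different players attached the same tag into one fragment, addressable with the W3C Media Fragments `#t=start,end` syntax. The median-plus-fixed-window aggregation below is an assumed strategy for illustration, not TubeLink's documented algorithm.

```python
from statistics import median

def fragment_uri(video_url, tag_times, window=10.0):
    """Collapse the seconds at which players applied a tag into a single
    Media Fragments URI (#t=start,end) centred on their median."""
    mid = median(tag_times)
    start = max(0.0, mid - window / 2)
    return f"{video_url}#t={start:.1f},{start + window:.1f}"

# Three players tagged the same concept around second 42 of the clip:
fragment_uri("http://example.org/v1", [40.0, 42.0, 44.0])
# -> "http://example.org/v1#t=37.0,47.0"
```

The resulting fragment URI can then serve as the subject of the video-fragment annotatedBy concept triple mentioned above.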
  • For TubeLink there are still some open issues: a use case with Red Bull, using videos from the Red Bull Content Pool together with a mapped set of tags.
  • Introduction: a strategy quiz game, made by our use case partner PGP. We produce Linked Data via the quiz part of the game, annotating images of objects that occur in space with DBpedia concepts.
  • Explain the game principles. Step 1: the player chooses his spaceship; ships have different attributes in terms of probes, radar range, and number of bombs. Step 2: deploy your supply depots strategically on the map; they have to be protected from the enemies. Step 3: the game starts. The goals are to protect one's supply depots, destroy as many enemies as possible (red spots), and discover as many hidden resources as possible (green dots). So how does this work? Each turn a player may use his radar once to locate enemies and resources. He then moves his spaceship around, drops bombs on places where he believes an enemy is located, and launches probes on places where he thinks a resource is located. But he has to move strategically, since every movement costs fuel. At the beginning of each round his ship is refueled by the amount of oxygen he collected; the oxygen is indicated by the blueness of the squares, the lighter the more. After each game round, the enemy ships move around and try to attack your supply depots. The game ends when you run out of fuel or your supply depots are destroyed.
  • So, when a probe is dropped on a hidden resource, a quiz pops up. Where do we get our data from? Two categorization questions over a simple taxonomy: first the broad categorization, then the more specific one. The concepts we use are mapped to DBpedia; the images are taken from PGP's database. What data do we generate? Linked Open Data triples: foaf:depiction / depicts.
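The two-step quiz can be modelled as a walk down a two-level taxonomy whose leaves are mapped to DBpedia concepts. The taxonomy contents and function name below are invented placeholders to show the shape of the lookup, not PGP's actual data.

```python
# Hypothetical two-level taxonomy: broad category -> specific concept -> DBpedia URI.
TAXONOMY = {
    "Furniture": {"Chair": "http://dbpedia.org/resource/Chair",
                  "Table": "http://dbpedia.org/resource/Table"},
    "Plant":     {"Tree":  "http://dbpedia.org/resource/Tree"},
}

def resolve_quiz(broad, specific):
    """Resolve the player's two quiz answers to a DBpedia concept URI,
    or None if that path does not exist in the taxonomy."""
    return TAXONOMY.get(broad, {}).get(specific)
```

The resolved URI then becomes the object of the foaf:depiction / depicts triple for the image shown in the quiz.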
  • We've launched a website called semanticgames.org, where we try to collect games that deal with semantic content creation. Introduce Kongregate, a gaming portal where we published the Universe Game: it has built a community around gaming and game developers, has a large user base with more than 40,000 games on it, and offers feedback, ratings, and comments. Introduce our efforts for Facebook integration; we are also working on Facebook ads.
  • Introduce semanticgames.org, an effort to advertise semantic games and group them by the task they tackle, such as ontology building, multimedia annotation, and others.
  • Describe the semantic games in detail.

    1. Semantic Games. User Advisory Board, May 2011. Stefan Thaler, UIBK. www.insemtives.eu, http://blog.insemtives.eu
    2. Problem and approach (5/2/2012, www.insemtives.eu)
    3. Generic Gaming Toolkit: • Game partner matching • Data extraction • Answer matching • Agreement negotiation • Reliability evaluation • Record selecting
    4. http://www.ontogame.org, http://apps.facebook.com/ontogame
    5. SeaFish
    6. SeaFish game play
    7. SeaFish game play
    8. SeaFish evaluation settings: • 3 different data sets • An online survey • 14,456 answers • 931 game rounds • 548 generated annotations
    9. SeaFish quality evaluation
    10. SeaFish fun evaluation
    11. SeaFish lessons learnt: • Traceability • Select data sources carefully
    12. TubeLink
    13. TubeLink
    14. TubeLink
    15. TubeLink outlook: • Scoring • Data extraction • Evaluation • Use case: SNML scenario
    16. The Universe Game
    17. The Universe Game
    18. The Universe Game
    19. Advertising: • SemanticGames.org • Kongregate • Facebook
    20. www.insemtives.eu
