MediaEval 2012 SED Opening
Opening presentation of the Social Event Detection (SED) task at MediaEval 2012, October 2012, Pisa, Italy.

Transcript

  • 1. Social Event Detection (SED): Challenges, Dataset and Evaluation
    Raphaël Troncy <raphael.troncy@eurecom.fr>, Vasileios Mezaris <bmezaris@iti.gr>, Symeon Papadopoulos <papadop@iti.gr>, Emmanouil Schinas <manosetro@iti.gr>, Ioannis Kompatsiaris <ikom@iti.gr>
  • 2. What are Events?
    Events are observable occurrences grouping people, places, time and experiences, documented by media.
    04/10/2012 - Social Event Detection (SED) Task - MediaEval 2012, Pisa, Italy
  • 3. SED: bigger, longer, harder
    In 2011:
    - 2 challenges
    - 73k photos (2.43 GB)
    - No training dataset
    - 18 teams interested, 7 teams submitted runs
    - F-measure = 85% (challenge 1): considered easy
    - F-measure = 69% (challenge 2): much harder!
    In 2012:
    - 3 challenges, 1 from SED 2011
    - 167k photos (5.5 GB), CC licence check
    - Training dataset = SED 2011
    - 21 teams interested, from 15 countries; 5 teams submitted runs
  • 4. Three challenges (type and venue)
    1. Find all technical events that took place in Germany in the test collection.
    2. Find all soccer events taking place in Hamburg (Germany) and Madrid (Spain) in the collection.
    3. Find all demonstration and protest events of the Indignados movement occurring in public places in Madrid in the collection.
    For each event, we provided relevant and non-relevant example photos.
    Task = detect events and provide all illustrating photos.
  • 5. Dataset Construction
    - Collected 167,332 Flickr photos (Jan 2009 - Dec 2011)
    - 4,422 unique Flickr users, all under CC licences
    - All geo-tagged, in 5 cities: Barcelona (72,255), Cologne (15,850), Hannover (2,823), Hamburg (16,958), Madrid (59,043), plus 0.22% (403) from EventMedia
    - Altered metadata: geo-tags removed for a random 80% of the photos; 33,466 photos still geo-tagged
    - Only metadata was provided, but the actual media (5.5 GB) were available to participants on request
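The metadata-alteration step above (hiding geo-tags for a random 80% of the photos while keeping 20% tagged) can be sketched as follows. This is an illustration, not the organizers' actual script; the `lat`/`lon` field names are assumed, not the real Flickr metadata schema.

```python
import random

def anonymize_geotags(photos, keep_fraction=0.2, seed=0):
    """Strip geo-tags from a random (1 - keep_fraction) share of photos.

    `photos` is a list of metadata dicts; `lat`/`lon` are illustrative
    field names, not the actual Flickr metadata schema.
    """
    photos = [dict(p) for p in photos]                  # work on copies
    tagged = [i for i, p in enumerate(photos) if "lat" in p]
    n_keep = round(len(tagged) * keep_fraction)
    keep = set(random.Random(seed).sample(tagged, n_keep))
    for i in tagged:
        if i not in keep:                               # the other ~80% lose their geo-tag
            photos[i].pop("lat", None)
            photos[i].pop("lon", None)
    return photos
```

Applied to a fully geo-tagged collection of 167,332 photos, `keep_fraction=0.2` leaves about 33,466 geo-tagged photos, matching the figure above.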
  • 6. Ground Truth and Evaluation Measures
    CrEve annotation tool: http://www.clusttour.gr/creve/
    - For each of the 6 collections, review all photos and associate them to events (which have to be created)
    - Search by text, geo-coordinates, date and user
    - Review annotations made by others
    - Use EventMedia and machine tags (upcoming:event=xxx)
    Evaluation measures:
    - F-score: harmonic mean of Precision and Recall
    - Normalized Mutual Information (NMI): jointly considers the goodness of the photos retrieved and their correct assignment to different events
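The two measures can be sketched in a few lines of Python. One caveat: NMI has several normalization variants (geometric mean, arithmetic mean, max); the slide does not say which one the task used, so the geometric-mean version below is an assumption.

```python
import math
from collections import Counter

def f_score(precision, recall):
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def nmi(true_labels, pred_labels):
    """Normalized Mutual Information between two labelings of the same
    photos. Geometric-mean normalization (an assumption; the task
    description does not fix the variant)."""
    n = len(true_labels)
    t_counts = Counter(true_labels)
    p_counts = Counter(pred_labels)
    joint = Counter(zip(true_labels, pred_labels))
    # Mutual information between the two partitions
    mi = sum((c / n) * math.log(c * n / (t_counts[t] * p_counts[p]))
             for (t, p), c in joint.items())
    def entropy(counts):
        return -sum((c / n) * math.log(c / n) for c in counts.values())
    h_t, h_p = entropy(t_counts), entropy(p_counts)
    if h_t == 0.0 or h_p == 0.0:
        return 1.0 if h_t == h_p else 0.0
    return mi / math.sqrt(h_t * h_p)
```

Unlike the F-score, NMI is invariant to cluster ids: a prediction that recovers the ground-truth events exactly, but under different event ids, still scores 1.0.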
  • 7. What ideally should be found
    - Challenge 1: 19 events, 2,234 photos (avg = 117); baseline precision (random): 0.01%
    - Challenge 2: 79 events, 1,684 photos (avg = 21); baseline precision (random): 0.01%
    - Challenge 3: 52 events, 3,992 photos (avg = 77); baseline precision (random): 0.02%
  • 8. Who Participated?
    - 21 teams registered (18 in 2011)
    - 5 teams crossed the finish line and submitted runs (7 in 2011; 2 teams participated in both years)
    - One participant missing at the workshop!
  • 9. Quick Summary of Approaches
    2011: all but one participant used background knowledge
    - Last.fm (all), Fbleague (EURECOM), PlayerHistory (QMUL)
    - DBpedia, Freebase, Geonames, WordNet
    2012: all but two participants used a generic approach
    - IR approach, matching queries against clusters (metadata, temporal, spatial): MISIMS
    - Classification approaches:
      - Topic detection with LDA, city classification with TF-IDF, event detection using peaks in the timeline of the query topics: AUTH-ISSEL
      - Learning model using the training data and an SVM: CERTH-ITI
    - Background knowledge: QMUL, DISI
    2012: no approach was fully automatic
    - Manual selection of some parameters (e.g. topics)
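To make the "city classification with TF-IDF" idea concrete, here is a minimal sketch (my own illustration, not AUTH-ISSEL's actual code): build one token profile per city from training-photo metadata, weight terms by TF-IDF, and assign a new photo to the city whose profile has the highest cosine similarity.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """TF-IDF vectors for a list of token lists (one list per document)."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))          # document frequency
    idf = {t: math.log(n / c) for t, c in df.items()}
    vecs = [{t: tf * idf[t] for t, tf in Counter(d).items()} for d in docs]
    return vecs, idf

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify_city(photo_tokens, city_profiles):
    """Assign a photo (list of metadata tokens) to the best-matching city.

    `city_profiles` maps city -> token list pooled from training photos.
    Hypothetical helper for illustration only.
    """
    cities = list(city_profiles)
    vecs, idf = tfidf_vectors([city_profiles[c] for c in cities])
    q = {t: tf * idf.get(t, 0.0) for t, tf in Counter(photo_tokens).items()}
    scores = {c: cosine(q, v) for c, v in zip(cities, vecs)}
    return max(scores, key=scores.get)   # ties / no overlap: first city wins
```

Because IDF is computed over the city profiles, tokens shared by every city (e.g. "photo", "2011") get zero weight, and only city-distinctive tags drive the decision.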
  • 10. Results – Challenge 1 (Technical Events)

    Run           Precision   Recall   F-score   NMI
    AUTHISSEL_4   76.29       94.9     84.58     0.7238
    CERTH_1       43.11       11.91    18.66     0.1877
    DISI_1        86.23       59.13    70.15     0.6011
    MISIMS_2      2.52        1.88     2.15      0.0236
    QMUL_4        3.86        12.85    5.93      0.0475
  • 11. Results – Challenge 2 (Soccer Events)

    Run           Precision   Recall   F-score   NMI
    AUTHISSEL_4   88.18       93.49    90.76     0.8499
    CERTH_1       85.57       66.19    74.64     0.6745
    DISI_1        (no values reported)
    MISIMS_2      34.49       17.25    22.99     0.1993
    QMUL_4        79.04       67.12    72.59     0.6493
  • 12. Results – Challenge 3 (Indignados Events)

    Run           Precision   Recall   F-score   NMI
    AUTHISSEL_4   88.91       90.78    89.83     0.738
    CERTH_1       86.24       54.61    66.87     0.4654
    DISI_1        86.15       47.17    60.96     0.4465
    MISIMS_2      48.3        46.87    47.58     0.3088
    QMUL_4        22.88       33.48    27.19     0.1988
  • 13. Conclusion
    Lessons learned:
    - Clear winner on all challenges: a generic approach, but with manual selection of the topics
    - Background knowledge is still useful when well used
    Looking at next year's SED:
    - Organizers: Shlomo Geva (Queensland University of Technology) + Philipp Cimiano (University of Bielefeld)
    - Dataset: bigger, more diverse
    - Media: photos and videos? (at least 10% videos?)
    - Metadata: include some social network relationships, participation in events
    - Evaluation measures: event granularity? Time/CPU?
