2. Outline
• The search and hyperlinking task
• Dataset (videos + user input)
• Ground truth creation
• Evaluation procedure
• Results
5/13/13
LIME workshop - WWW2013
3. ME 2013 Search & Hyperlinking
• ME 2012 S&HL “brave new” task:
– Search: retrieve a known-item video segment given a natural language description
– Linking: link the known-item video segment to similar segments in the collection (blip.tv)
• ME 2013 S&HL “regular” task:
– Search: retrieve a known video segment (known-item) given a textual query and visual cues
– Linking: link a user-defined anchor within the known-item to relevant target video segments
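The linking sub-task above can be sketched in a few lines: given an anchor's transcript, rank candidate target segments by textual similarity. This is only an illustrative baseline (bag-of-words cosine similarity), not the official task method, and the segment texts below are invented placeholders.

```python
# Minimal sketch of the linking sub-task: rank candidate target
# segments by textual similarity to an anchor's transcript.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_targets(anchor_text: str, candidates: dict) -> list:
    """Return candidate segment ids ranked by similarity to the anchor."""
    anchor_vec = Counter(anchor_text.lower().split())
    scored = {sid: cosine(anchor_vec, Counter(txt.lower().split()))
              for sid, txt in candidates.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Invented placeholder segments, not MediaEval data:
candidates = {
    "seg1": "cooking pasta with fresh tomatoes",
    "seg2": "football match highlights",
    "seg3": "recipe for tomato pasta sauce",
}
print(rank_targets("pasta with tomato sauce", candidates))
# → ['seg3', 'seg1', 'seg2']
```

Real systems in the task used richer features (speech transcripts, metadata, visual cues), but the ranking structure is the same.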
7/2/13
DGA workshop - July 2013, Paris
4. Terminology
• Video (e.g., 2 hours)
• Interesting segment (e.g., 10 min)
• Anchor: segment for which a user requests a link (e.g., 1 min): “I want to know more about this”
• Hyperlink
• Target: relevant segment for a given anchor (e.g., 5 min)
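The terminology above maps naturally onto a small data model. A minimal sketch follows; the field names are illustrative assumptions, not the official task schema.

```python
# Minimal data model for the task terminology (illustrative only).
from dataclasses import dataclass

@dataclass
class Segment:
    video_id: str
    start_s: float  # offset from video start, in seconds
    end_s: float

    @property
    def duration_s(self) -> float:
        return self.end_s - self.start_s

@dataclass
class Anchor(Segment):
    """Segment for which a user requests a link ("I want to know more")."""

@dataclass
class Hyperlink:
    anchor: Anchor
    target: Segment  # relevant segment for the given anchor

anchor = Anchor("v42", start_s=600.0, end_s=660.0)  # a 1-minute anchor
target = Segment("v97", start_s=0.0, end_s=300.0)   # a 5-minute target
link = Hyperlink(anchor, target)
print(link.target.duration_s)  # → 300.0
```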
5. ME 2013 Search & Hyperlinking
[Figure: an anchor segment and its jump-in point within a video]
6. Dataset: Video collection
• Copyright-cleared broadcasts from the period 01.04.2008 – 11.05.2008
• 1667 hours, 2323 videos
• ~200 videos were rebroadcasts
14. Ground truth creation
• Search sub-task: done (known-items)
• Judgments for the linking sub-task:
– top-10 of one run per participant (7) judged by users locally at the BBC
– top-10 of most runs judged using Amazon's Mechanical Turk
– ~70% agreement between the two groups of judges
– similar measurements from both judgment sources
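A figure like the ~70% agreement above could be computed as simple percent agreement between two sets of binary relevance judgments over the same (anchor, target) pairs. The sketch below uses invented placeholder judgments, not the actual BBC/MTurk data, and the slide does not specify which agreement measure was used.

```python
# Sketch: percent agreement between two assessor groups' binary
# relevance judgments over shared (anchor, target) pairs.
def percent_agreement(judgments_a: dict, judgments_b: dict) -> float:
    """Fraction of shared pairs on which both assessors agree."""
    shared = judgments_a.keys() & judgments_b.keys()
    if not shared:
        return 0.0
    agree = sum(judgments_a[p] == judgments_b[p] for p in shared)
    return agree / len(shared)

# Invented placeholder judgments (1 = relevant, 0 = not relevant):
bbc = {("a1", "t1"): 1, ("a1", "t2"): 0, ("a2", "t1"): 1, ("a2", "t3"): 0}
mturk = {("a1", "t1"): 1, ("a1", "t2"): 1, ("a2", "t1"): 1, ("a2", "t3"): 0}
print(percent_agreement(bbc, mturk))  # → 0.75
```

Chance-corrected measures such as Cohen's kappa are often preferred over raw percent agreement, since they discount agreement expected by chance.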
28. Conclusions
• Task defined by real users
• Innovative elements perceived as interesting
• Crowdsourcing is a valid evaluation strategy
• Experimental setup works
• Provides a promising starting point for further research
29. The Search and Hyperlinking task was funded by
We are grateful to Jana Eggink and Andy O'Dwyer from the BBC for preparing the collection and hosting the user trials.
... and of course Martha for advice & crowdsourcing access.