How Do We Deep-Link? Leveraging User-Contributed Time-Links for Non-Linear Video Access

Providing non-linear access to videos by capturing how users deep-link them. A CUbRIK short research paper presented at ACM Multimedia 2013.

How Do We Deep-Link? Leveraging User-Contributed Time-Links for Non-Linear Video Access
Raynor Vliegendhart, Babak Loni, Martha Larson, and Alan Hanjalic
Multimedia Information Retrieval Lab, Delft University of Technology

Introduction
● Problem: How do users deep-link? (i.e., refer to points in a video by explicitly mentioning time-codes)
● Motivation: Leverage the time-codes within deep-link comments to enable non-linear video access (see the extraction sketch below)
● Dataset: MSRA-MM 2.0 / YouTube comments

Contributions
● Notion of Viewer Expressive Reaction (VER): reflects viewers' perceptions of noteworthiness (but extends beyond depicted content and induced affect)
● Viewer Expressive Reaction Variety taxonomy (VERV): captures how users deep-link; shown to be appropriate for automatic filtering

Envisioned Future Retrieval System
[Figure: mockup of a video search interface in which each result (e.g., "Funny cats in water", "Wrestling kittens", "Stalking cat") is augmented with user-contributed deep-link comments such as "epic failure at 0:23" and "The song at 3:33 is called 'The eye of the tiger'".]
(These deep-link comments occur unprompted on social video sharing platforms.)

Approach
● Taxonomy elicitation via crowdsourcing (Amazon Mechanical Turk):
  ● Given: 3 deep-link comments per video
  ● Task: describe why a comment was posted (2–4 sentences)
  ● Post-processing: card-sorting technique
● Annotation crowdsourcing task, for:
  ● Validating the elicited VERV taxonomy
  ● Annotating 3,359 deep-link comments:
    ● Whether the comment contains a true deep-link (VER/non-VER)
    ● VERV class (if and only if it is a VER comment)
● Linear SVM comment classification experiment with unigram features (see the classification sketch below)

Results
● Annotation agreement:
  ● VER/non-VER: 2,842 of the 3,359 comments (84.6%)
  ● VERV: 2,140 of the 3,359 comments (63.7%)
● Automatic classification results: [figure]
● Misclassification challenges:
  ● "Funny" comments often labeled as "here" by humans, but classified as "love" by the classifier
  ● Comments with multiple interpretations
  ● Comments with multiple sentences

Future Work
● Improve automatic classification by adding content features
● Develop the envisioned deep-link retrieval system

Contact: R.Vliegendhart@tudelft.nl / @ShinNoNoir
21st ACM International Conference on Multimedia, Barcelona, Spain, 2013
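To make the notion of a deep-link comment concrete, the sketch below shows one way time-codes such as "0:23" or "3:33" could be extracted from comment text and converted to offsets in seconds. This is an illustration only, assuming a Python setting; the regular expression and the function name extract_timecodes are not taken from the paper.

    import re

    # Matches time-codes such as "0:23", "2:14", or "1:02:45" in comment text.
    # The pattern is an assumption for this sketch, not the one used in the paper.
    TIMECODE_RE = re.compile(r"\b(?:(\d{1,2}):)?(\d{1,2}):(\d{2})\b")

    def extract_timecodes(comment: str) -> list[int]:
        """Return the time-codes mentioned in a comment as offsets in seconds."""
        offsets = []
        for hours, minutes, seconds in TIMECODE_RE.findall(comment):
            total = int(minutes) * 60 + int(seconds)
            if hours:
                total += int(hours) * 3600
            offsets.append(total)
        return offsets

    print(extract_timecodes("The song at 3:33 is called 'The eye of the tiger'"))  # [213]
    print(extract_timecodes("epic failure at 0:23"))                               # [23]

Comments without a recognizable time-code would simply yield an empty list; whether a comment that does mention a time-code is a true deep-link (VER) is a separate question, which the poster addresses through annotation and classification.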

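The poster reports a linear SVM classification experiment on the annotated comments using unigram features. Below is a minimal sketch of such a setup with scikit-learn; the example comments and labels are placeholders, and the paper's exact preprocessing, feature weighting, and hyperparameters may differ.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Placeholder training data; the study used 3,359 annotated deep-link comments.
    comments = [
        "epic failure at 0:23",
        "0:44 omg so cute",
        "The song at 3:33 is called 'The eye of the tiger'",
        "what's the breed of the cat at 2:14?",
    ]
    labels = ["VER", "VER", "non-VER", "non-VER"]  # hypothetical labels for illustration

    # Unigram bag-of-words features feeding a linear SVM, mirroring the
    # experimental setup described on the poster at a high level.
    classifier = make_pipeline(
        CountVectorizer(ngram_range=(1, 1)),
        LinearSVC(),
    )
    classifier.fit(comments, labels)
    print(classifier.predict(["That move at 6:12 was dumb"]))

The same pipeline shape could be reused for the finer-grained VERV classes by swapping in the corresponding class labels for the comments annotated as VER.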