The profusion of digital content now available to the average consumer, viewer or listener is overwhelming, potentially forcing consumers into increasingly narrow bands of media experience as they retreat to limiting choices as a coping strategy. Professional recommenders, such as newspaper film and TV reviewers, are similarly overwhelmed and, paradoxically, both more relied upon by consumers and considered less relevant as alternative automated and semi-automated recommendation systems emerge: for movies (e.g. the Netflix competitions), music (e.g. last.fm) or books (e.g. Amazon).
Examples:
1.) Taking a photo
After taking a photo, one might want to annotate it with the place where it was taken. One approach is to take the picture first and then query a massive database of millions of geotagged images to estimate the location. Another is to put an inexpensive GPS sensor into the camera and slightly extend the production workflow by storing the current GPS location along with the image data. Which approach do you think would be more successful? => Such cases are what we are interested in: how can the workflow be extended to make things much easier?
2.) Annotating music recordings
Extracting lyrics from a final mixed recording is very hard, but in the studio, with access to the multitrack session, it becomes much easier and can even be automated.
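The photo example can be sketched in a few lines. This is a minimal illustration, not a real camera API: the function name, the sidecar-record layout and the coordinates are all made up. The point is only that storing the sensor reading at capture time makes the annotation free, whereas recovering it afterwards would require a large-scale estimation system.

```python
import json

def capture_photo(image_bytes, gps_fix=None):
    """Hypothetical capture step: if a GPS fix is available at the moment
    the photo is taken, store it alongside the image data."""
    record = {"image": image_bytes.hex(), "metadata": {}}
    if gps_fix is not None:
        lat, lon = gps_fix
        record["metadata"]["gps"] = {"lat": lat, "lon": lon}
    return record

# Capture with the sensor reading available: the location travels with the image.
photo = capture_photo(b"\x89PNG...", gps_fix=(51.5200, -0.0937))
print(json.dumps(photo["metadata"]))
```

Without the capture-time extension, `photo["metadata"]` would be empty and the location would have to be reconstructed downstream, with far less robustness.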
Figure: subfields in the ICT sector
Semantic Media Project Introduction - Mark Sandler (Barbican Arts Centre, Oct 2012)
Semantic Media – Problem Area
TV Productions · Music / Radio Productions · Film Productions · Photo Productions
Consumer: How to find relevant content in large media collections?
Producer: How to monetize? How to counter piracy?
Source of images: Google
Navigation in Content Collections: Previous Approaches
Automatic annotations are often not as detailed and robust as needed. Reason: automatic methods have no access to knowledge that is only available during production.
User interfaces are not as rich as needed. Reason: metadata does not incorporate relevant external information.
Semantic Media - Concept 1: Annotation as Part of the Production Workflow
Employing knowledge of the production process leads to simpler and hence more robust (automatic) metadata generation procedures.
Integrating additional information that is usually discarded after production allows for richer annotations.
The resulting novel workflow systems facilitate automation and assist content producers as well as consumers throughout the content life-cycle.
Semantic Media - Concept 1: Example Metadata: Where was this picture taken? What is in it? What’s the weather like? Source of image: Wikipedia
Semantic Media - Concept 1: Example Metadata: Who are the actors (in this episode)? What are the story lines?
Semantic Media - Concept 2: Incorporating Global Knowledge Using Linked Data Technology
Managing and exposing enhanced metadata using semantic web and linked data technology allows various sources of information to be united, improving the user experience through richer interfaces.
Semantic Media - Concept 2: Example BBC Music website + Structured Wikipedia Data = Improved User Experience More about this later…
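The idea behind the BBC Music example can be sketched as merging metadata records that share a common linked-data identifier. The URI, the two source dictionaries and all values below are invented for illustration; a real system would dereference actual web resources rather than in-memory dictionaries.

```python
# Two hypothetical metadata sources keyed by a shared linked-data URI.
bbc_music = {
    "http://dbpedia.org/resource/Example_Artist": {
        "plays": ["Track A", "Track B"],
    },
}
wikipedia = {
    "http://dbpedia.org/resource/Example_Artist": {
        "abstract": "An example artist biography from structured Wikipedia data.",
    },
}

def merge_by_uri(*sources):
    """Unite metadata records that share the same linked-data URI."""
    merged = {}
    for source in sources:
        for uri, record in source.items():
            merged.setdefault(uri, {}).update(record)
    return merged

page = merge_by_uri(bbc_music, wikipedia)
```

Because both sources agree on the identifier, the merged record can drive a single, richer artist page without either source needing to know about the other.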
Goals of the Semantic Media Project
Creating a forum for researchers and developers
Encouraging interdisciplinary research, bringing together specialists from across the entire ICT sector
Sparking new collaborations between researchers (including industry partners) by funding mini-projects, student exchanges and internships
Encouraging leading researchers to develop roadmaps guiding the direction of future research efforts and grant applications
Encouraging substantial grant applications: UK & EU
Semantic Media – The Network
Future challenges and opportunities require researchers from across the ICT landscape to contribute and collaborate (EPSRC priority: "Working Together").
The Semantic Media Network will serve as a hub for future collaborative grant applications.
Funding - Opportunities and Examples
Exchange of students across working groups; internships / placements
Construction of ontologies appropriate for 3D+t content description (sound, video, objects)
Capturing motion information on a film/TV set to derive scene-descriptive metadata to associate with the primary media stream (i.e. video)
Fusion of metadata from disparate sources to build a composite metadata stream associated with a single media stream, propagating through the value chain from producer to consumer, e.g.:
  - Metadata from several musical instruments to create a composite harmony stream
  - Motion metadata streams from several actors in a scene to create a composite action stream
  - Combining rights-related metadata (e.g. using the MPEG Value Chain Ontology), user-generated and other tags downstream from creation
Application of temporal logic to (time-structured) media metadata streams
Use of capture-at-source metadata to enhance the production workflow
Ethnographic studies of metadata-enhanced production tools to assess their fitness for purpose
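The metadata-fusion idea above can be sketched as merging several time-ordered streams into one composite stream. The stream names, timestamps and event labels are invented for illustration; real streams would carry harmony, motion or rights metadata as described in the list.

```python
import heapq

# Each hypothetical stream is a time-ordered list of
# (timestamp_seconds, source, event) tuples.
guitar  = [(0.0, "guitar", "E minor"), (2.0, "guitar", "C major")]
actor_1 = [(0.5, "actor1", "enters stage left"), (1.5, "actor1", "sits down")]

def fuse(*streams):
    """Merge several time-ordered metadata streams into one composite
    stream, ordered by timestamp, for association with the media stream."""
    return list(heapq.merge(*streams, key=lambda item: item[0]))

composite = fuse(guitar, actor_1)
```

Using `heapq.merge` keeps the fusion incremental: each source stream is consumed lazily in time order, which matters when streams are long or arrive live from the set.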
Funding - Application Process
1.) Short semi-formal application
2.) Steering committee selects the best ideas
3.) Start working
Funding available for 10-20 projects, £10k-£50k each
More details on the website soon…
Getting Involved
This is just the beginning of forming a network. Participate by:
Joining our mailing list for announcements and discussions (coming soon; you will be informed about it)
Putting your idea for a feasibility study on our idea-wiki
Becoming an active member of the steering committee
Helping to organize meetings (perhaps focused on a specific subfield)
Helping to document the research landscape by contributing to the landscape-wiki
Participating in future meetings, sandpits and tutorials, as well as collaborative grants and paper submissions
Identifying people who might be interested in this network and inviting them (or telling us)
Checking our website: semanticmedia.org.uk
Programme For Today
09:00 Registration and Welcome Coffee
09:30 Project Introduction: Mark Sandler (Queen Mary, University of London)
09:45 Invited Talk: Karlheinz Brandenburg (Director of the Fraunhofer Institute for Digital Media Technology, Ilmenau, Germany): Extraction of metadata from media data: from music recommendation to recognition of big apes
10:45 Networking Event: 60-second quick-fire summaries of the participants' backgrounds and interests
12:00 Finger-food Buffet with Poster Presentations and Demos
13:30 Invited Talk: David De Roure (Director of the Oxford e-Research Centre, University of Oxford)
14:30 Coffee Break
15:00 Invited Talk: Yves Raimond and David Rogers (BBC)
16:00 Networking Event: "Speeddating"
17:00 End