
New Media Services from a Mobile Chipset Vendor and Standardization Perspective


The media landscape has changed significantly over the last few years through new content formats, new service offerings, additional consumption devices and new monetization models. Think of Netflix, DAZN, Mediatheks, mobile devices, interactive content, smart TVs, Virtual and Augmented Reality, and so on. Many of these efforts have been realized with only limited use of standards, but are standards irrelevant? Secondly, more and more services are enabled by the latest mobile compute platforms, enabling new services and experiences. This presentation provides an overview of some of these trends and motivates the development of global interoperability standards. Specific aspects include the move of linear TV services to the Internet (both mobile and fixed) as well as recent advances on Extended Reality and immersive media trends.



  1. New Media Services from a Mobile Chipset Vendor and Standardization Perspective. Qualcomm Technologies, Inc., @qualcomm_tech, November 2018. Thomas Stockhammer, Director Technical Standards, @haudiobe
  2. Disclaimer • Roles ◦ DASH-IF: IOP chair, Live, Ad Insertion, ATSC TF chair, Guidelines editor ◦ MPEG: DASH editor, CMAF co-editor, MPEG-I architecture co-chair ◦ CTA WAVE: SC member, DPCTF chair ◦ DVB: SB member, DVB CM-I chair (now vice chair) ◦ 3GPP: Rel-12: TV video profiles, DASH extensions; Rel-13: MBMS APIs; Rel-14: rapporteur for enhanced TV (3GPP award); Rel-15: HDR for TV video, VR streaming, SAND4M; Rel-16: 5G Media, XR over 5G, TyTraC ◦ Also on the VR-IF Board • The presentation will focus on research challenges in the context of the above work; it is centric to my work.
  3. Outline • New Media Services • Qualcomm's perspective • Technologies for new media services • Standardization Efforts and Research Challenges
  4. Media 20 years ago (diagram: TV broadcaster, radio broadcaster, news publisher)
  5. Media Today (diagram: broadcaster and content provider offers a Service; delivery via Internet transmission, broadcast e.g. DTT, and mobile delivery; Presentation on TV devices and non-TV devices)
  6. New Media: Some Examples
  7. (image slide)
  8. (image slide)
  9. Emotional Streaming (https://www.cnet.com/news/with-5g-you-wont-just-be-watching-video-itll-be-watching-you-too/) • Bob is watching a horror movie using an HMD. He is fascinated; his body reactions, eye rolling, and other attributes are collected and used to create a personalized story line. Movie effects are adjusted for personal preferences while reactions are collected during the movie. Bob's emotional reactions determine the story line. • Remember the last time you felt terrified during a horror movie? Take that moment, and all the suspense leading up to it, and imagine it individually calibrated for you. It's a terror plot morphing in real time, adjusting the story to your level of attention to lull you into a comfort zone before unleashing a personally timed jump scare. • Or maybe being scared witless isn't your idea of fun. Think of a rom-com that stops itself from going off the rails when it sees you rolling your eyes. Or maybe it tweaks the eye color of that character finally finding true love so it's closer to your own, a personalized subtlety to make the love-struck protagonist more relatable.
  10. New Media Attributes (a random collection of thoughts) • Better quality • Personalized • Interactive, lean-back and reactive • New production formats • Accessible anywhere on different devices • Gaming-like • Different story lines • Multimedia and informative • Intuitive • New business models (subscription, ad, etc.)
  11. Qualcomm
  12. New technologies for future Media/XR requirements: providing an always-on experience that intelligently enhances our lives • Immersive: the visuals, sounds, and interactions are so realistic that they are true to life • Cognitive: it understands the real world, learns personal preferences, and provides security & privacy • Connected: an always-on, low-power wearable with fast wireless cloud connectivity anywhere
  13. Qualcomm: we want to immerse you. Immersion is enabled by different components that work together • Visual quality: extreme pixel quantity and quality (the screen is very close to the eyes); stereoscopic display (humans see in 3D); spherical view (look anywhere with a full 360° spherical view) • Sound quality: high-resolution audio (up to human hearing capabilities); 3D audio (realistic 3D, positional, surround audio that is accurate to the real world); crystal-clear voice (clear voice enhanced with noise-cancellation technology) • Intuitive interactions: precise motion tracking (accurate on-device motion tracking); minimal latency (minimized system latency to remove perceptible lag); natural user interfaces (seamlessly interact with VR using natural movements, free from wires) • Learn more about our vision for the future of VR: www.qualcomm.com/VR
  14. AR introduces additional requirements for immersion: seamlessly integrating virtual objects with the real world • Keeping the world stable: seamlessly anchor virtual objects to the real world • Common illumination: lighting virtual objects realistically and dynamically • Occlusion: showing and hiding virtual objects appropriately • Realistic virtual sounds: modifying virtual sounds based on the real-world environment • Natural user interfaces: seamlessly interact with AR using natural movements, free from wires
  15. Presence • Presence is the most crucial aspect of a true virtual reality experience. For effective VR, the user has to feel that he or she is within the artificially created world. This is the only way to elicit emotions and involuntary, reflex-like reactions from the user. • Tracking ◦ 6 degrees of freedom tracking: ability to track the user's head in rotational and translational movements. ◦ 360 degrees tracking: track the user's head independent of the direction the user is facing. ◦ Sub-millimeter accuracy: tracking accuracy of less than a millimeter. ◦ No jitter: no shaking; the image on the display has to stay perfectly still. ◦ Comfortable tracking volume: large enough space to move around and still be tracked.[1] • Latency ◦ Less than 20 ms motion-to-photon latency: less than 20 milliseconds of overall latency (from the time you move your head to when you see the display change). ◦ Fuse optical tracking and IMU data. ◦ Minimize the loop: tracker → CPU → GPU → display → photons.[1] • Persistence ◦ Low persistence: turn pixels on and off every 2-3 ms to avoid smearing/motion blur. ◦ 90 Hz+ refresh rate to eliminate visible flicker.[1] • Resolution ◦ Correct stereoscopic 3D rendering. ◦ At least 1k by 1k pixels per eye. ◦ No visible pixel structure: you cannot see the pixels.[1] • Optics ◦ Wide FOV: greater than 90 degrees field of view. ◦ Comfortable eyebox: the minimum and maximum eye-lens distance wherein a comfortable image can be viewed through the lenses. ◦ High-quality calibration and correction: correction for distortion and chromatic aberration that exactly matches the lens characteristics.[1] [1] Oculus Connect 2014: Brendan Iribe Keynote
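The sub-20 ms motion-to-photon figure above is a budget the whole pipeline shares. A minimal TypeScript sketch with purely illustrative stage values (assumptions for illustration, not measurements of any device):

```typescript
// Illustrative motion-to-photon budget for the <20 ms target cited above.
// Every number here is an assumption chosen for illustration.
const budgetMs = {
  trackingAndFusion: 2, // IMU at ~1000 Hz plus optical/inertial fusion
  render: 1000 / 90,    // one frame at a 90 Hz refresh rate ≈ 11.1 ms
  timeWarp: 2,          // reprojection of the frame to the latest pose
  displayScanout: 4,    // low-persistence panel lights pixels only briefly
};

const total = Object.values(budgetMs).reduce((a, b) => a + b, 0);
console.log(`motion-to-photon ≈ ${total.toFixed(1)} ms (target: < 20 ms)`);
```

At 90 Hz the render interval alone consumes over half the budget, which is why later slides lean on time warp and late latching rather than on faster rendering alone.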
  16. Snapdragon 845: how to get access • Snapdragon X20 LTE modem: peak download speed 1.2 Gbps, peak upload speed 150 Mbps • Ultra HD Premium video playback and encoding @ 4K (3840x2160) 60 fps, 10-bit HDR, Rec 2020 color gamut • eXtended Reality (XR) • Sensors • Qualcomm Snapdragon Neural Processing Engine (NPE) SDK • Adreno 630 Visual Processing Subsystem • Wi-Fi • Qualcomm Hexagon 685 DSP • Qualcomm Spectra 280 ISP • Qualcomm Aqstic Audio • Qualcomm Kryo 385 CPU • System memory • Qualcomm Mobile Security • Multimedia/XR/AR: computer vision, image processing, sensor processing, graphics, video processing, location, and cloud interaction • Benefits: integrated and optimized; enhanced battery life; thermal efficiency; standardized implementation; mass-market cost; variety of use cases and industry support • The entire SoC is used! ... and Snapdragon 855 comes with 5G/X50 and 7 nm (*compared to Snapdragon 835)
  17. We’re developing foundational technology for AR: Qualcomm Technologies’ investments and the confluence of mobile technologies • Computer Vision: 6-DOF VIO; SLAM & 3DR; object detection & recognition • Cognitive & Security: AI for advanced cognitive processing; local and cloud machine learning; security and privacy • Heterogeneous Computing: lower-power, higher-performance AR visual processing; advancements in always-on sensor fusion; next-gen AR audio • Next-Gen Connectivity: Gigabit LTE and Wi-Fi; pioneering 5G technologies; connectivity convergence
  18. Mobile XR is advancing in every generation: Qualcomm Snapdragon 820: 3DoF, 1K x 1K @ 60 fps → Snapdragon 835: 6DoF, 1.5K x 1.5K @ 60 fps → Snapdragon 845: room-scale 6DoF SLAM, 2K x 2K @ 120 fps
  19. Intuitive Interactions: Room-Scale 6DoF SLAM, Smart Computing
  20. Intuitive Interactions: Advanced Security, Iris Authentication
  21. Visual Processing (Qualcomm Adreno GPU): Adreno tile rendering, eye tracking, multiview rendering, fine-grain preemption, foveation
  23. (image slide)
  24. Natural Interactions: Hand Tracking
  25. Sounds: Audio Intelligence • Always-on • Voice commands • 3D audio • Echo cancellation • Noise reduction
  26. Conventional 6-DoF: “outside-in” tracking. External sensors determine the user’s position and orientation.
  27. Mobile 6-DoF: “inside-out” tracking. Visual inertial odometry (VIO) for rapid and accurate 6-DoF pose.
  28. Mobile 6-DoF: “inside-out” tracking. Visual inertial odometry (VIO) for rapid and accurate 6-DoF pose • Inputs: mono or stereo camera data, captured from the tracking camera image sensor at ~30 fps (camera feature processing), and accelerometer & gyroscope data, sampled from external sensors at 800/1000 Hz (inertial data processing) • The “VIO” subsystem on the Qualcomm Snapdragon Mobile Platform runs Qualcomm Hexagon DSP algorithms: camera and inertial sensor data fusion, continuous localization, and accurate, high-rate “pose” generation & prediction • Output: 6-DoF position & orientation (aka “6-DoF pose”), so each new frame is accurately displayed. (Qualcomm Hexagon is a product of Qualcomm Technologies, Inc.)
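The “pose generation & prediction” step exists because the compositor needs the pose at display time, a few milliseconds in the future. A heavily simplified TypeScript sketch of just the prediction, assuming a constant-velocity model and invented types (a real VIO filter fuses camera features and IMU data in an EKF-style estimator on the DSP, which this does not attempt):

```typescript
// Predict the 6-DoF pose a small future window ahead, as the slide describes.
// Constant-velocity extrapolation only; purely an illustrative sketch.
interface Pose {
  position: [number, number, number];        // metres
  orientationRpy: [number, number, number];  // roll/pitch/yaw in radians
}

function predictPose(
  last: Pose,
  linearVel: [number, number, number],  // m/s, from the filter state
  angularVel: [number, number, number], // rad/s, from the gyroscope
  horizonMs: number                     // e.g. time until display scanout
): Pose {
  const dt = horizonMs / 1000;
  return {
    position: last.position.map((p, i) => p + linearVel[i] * dt) as Pose["position"],
    orientationRpy: last.orientationRpy.map(
      (a, i) => a + angularVel[i] * dt
    ) as Pose["orientationRpy"],
  };
}
```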
  29. Device Architecture / Technology Enablers for XR: optimizations needed across the SoC and system SW • Direct access to VR features: bypass Android latency • Global time stamps: maintain synchronization across various processing engines • Late latching: using the latest pose; asynchronous threads for consistent frame rate • Foveated rendering: reduce pixel-generation workload while maintaining high image quality • HW streaming interfaces: bypass DRAM with engine-to-engine communication (ISP to DSP, sensors to DSP) • Multiview stereoscopic rendering: single-pass render of left and right eye • High frame rate: 90 FPS for reduced frame latency • Accurate motion tracking: fast and accurate 6-DOF; accurate predictive 6-DOF for a small future window • Fast connectivity: low-latency connectivity; high bandwidth • Single buffer
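Of the enablers above, late latching is the easiest to show in code. A minimal sketch with invented names: the tracking thread keeps overwriting one shared pose at high rate, and the frame-submission path reads it at the last possible moment instead of reusing the pose sampled when rendering began.

```typescript
// Late latching, sketched: read the freshest pose right before warp/scanout.
type Pose = { yaw: number; pitch: number; roll: number }; // illustrative

let latestPose: Pose = { yaw: 0, pitch: 0, roll: 0 };

// Called by the high-rate tracking loop (e.g. ~1000 Hz sensor-fusion output).
function onTrackerSample(pose: Pose): void {
  latestPose = pose;
}

// Called once per display refresh; warp is a stand-in for ATW/reprojection.
function submitFrame(
  eyeBuffers: ArrayBuffer[],
  warp: (buffers: ArrayBuffer[], pose: Pose) => void
): void {
  warp(eyeBuffers, latestPose); // latched as late as possible
}
```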
  30. Standardization Efforts: Global, Optimized HW Implementation, Interoperability
  31. Creating immersive XR experiences through standards-based technologies. We need alignment in several key areas: professional production & user generation; video and audio compression; apps; access network; transport (content delivery); display and rendering.
  32. (architecture diagram) A Container, received via a transport/network/file interface with cloud processing, feeds a Presentation Engine (rendering control engine) that drives a graphics rendering engine (e.g. based on OpenGL or WebGL) and a 2D/3D audio rendering engine. Media paths: 2D video (MPEG-compatible) via an MPEG video decoder; audio (MPEG-compatible) via an MPEG audio decoder; audio (non-MPEG) via an audio parser/decoder; static and timed 3D resources (point cloud, mesh, texture) via media-specific parsers/decoders; timed scene data/scene graph/shader/script via a parser. Inputs: sensor input, local camera input, local user input (keyboard, mouse, haptics), local microphone input, head/eye tracking input; timeline control. Outputs: XR display and XR audio output.
  33. Khronos/OpenXR
  34. The Problem: OpenXR™ is creating an open standard for VR and AR applications and devices
  35. OpenXR Philosophies • Enable both VR and AR applications: the OpenXR standard unifies common VR and AR functionality to streamline software and hardware development for a wide variety of products and platforms • Be future-proof: while OpenXR 1.0 is focused on enabling the current state of the art, the standard is built around a flexible architecture and extensibility to support rapid innovation in the software and hardware spaces for years to come • Do not try to predict the future of XR technology: while trying to predict the future details of XR would be foolhardy, OpenXR uses forward-looking API design techniques to enable designers to easily harness new and emerging technologies • Unify performance-critical concepts in XR application development: developers can optimize to a single, predictable, universal target rather than add application complexity to handle a variety of target platforms
  36. MPEG-I
  37. MPEG-I • In October 2016, MPEG initiated a new project on “Coded Representation of Immersive Media”, referred to as MPEG-I, based on a survey. • After the launch of the project, several phases, activities, and sub-projects have been started that enable the services considered in MPEG-I. • Core technologies as well as additional enablers are specified in the parts of the MPEG-I standard. • Currently 8 parts are under development: Part 1, Immersive Media Architectures; Part 2, Omnidirectional MediA Format; Part 3, Versatile Video Coding; Part 4, Immersive Audio Coding; Part 5, Point Cloud Coding; Part 6, Immersive Media Metrics; Part 7, Immersive Media Metadata; Part 8, Network-Based Media Processing
  38. Different Track Activities • 3DoF (V1, 12/2017): System: OMAF (file format, DASH, metadata); Video: HEVC or AVC + SEI; Audio: MPEG-H Audio or simple AAC • 3DoF+ (12/2019): System: OMAF (file format, DASH, metadata); Video: HEVC or AVC + metadata; Audio: no updates • 6DoF: System: rendering-centric, scene description; Video: PCC, dense light field, others; Audio: rendering-centric audio architecture • Complementary: network-based media processing; metrics
  39. Immersive Cloud Media: MPEG is currently investigating storage and streaming formats for immersive media (diagram: a media retrieval engine with protocol and format plugins issues cloud media requests and uses manifest/index data and local storage; media resource references, timing information, spatial information and media-consumption information are exchanged with the cloud; decoders fill texture buffers #1..#n, vertex buffers #1..#n and a shader buffer for the presentation engine; an audio decoder feeds rendering, kept aligned via sync information and shader information)
  40. Current frame-based approach (decoder diagram)
  41. Desired object-based approach (decoder diagram)
  42. Use case #1: decoding of 360° videos with a point-cloud (PC) object. MPEG is currently investigating advanced decoder interfaces for immersive media.
  43. The XR Challenge • Constrained mobile wearable environment: thermally efficient for sleek, ultra-light designs; long battery life for all-day use • XR workloads: compute-intensive; complex concurrencies; always-on; real-time; latency-sensitive • Approach: split rendering
  44. Dumb Device: Pixel Streaming Overview • VR graphics workload split into a rendering workload on a powerful XR server and ATW on the device; low motion-to-photon latency preserved via on-device Asynchronous Time Warping (ATW) • Example implementation: Snapdragon HMD: chipset SD 835; monocular 6-DOF head tracking; display rate 70 Hz; display resolution 2560x1440; downlink 802.11ad (60 GHz), 32-element antenna; uplink 802.11n (5 GHz), 2x2 MIMO. Rendering server: content Unity Adam Demo*; eye buffer 1440x1440; frame rate 60 fps; codec H.264; avg/peak encoded rate 100/120 Mbps • System architecture: the HMD sends its pose uplink (mmWave); the XR edge server runs 6-DOF processing → game engine → rendered frames → video encoder → compressed rendered frame → low-latency transport; the HMD runs low-latency transport → video decoder → GPU (ATW plus error concealment)
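A hedged TypeScript sketch of the client side of this loop, with transport, decoder and warp passed in as stubs (all names are illustrative, not the actual Snapdragon HMD software interfaces): the HMD streams its pose uplink, receives an encoded frame rendered for roughly that pose, and warps it to the current pose before display.

```typescript
// Pixel-streaming client loop: pose up, compressed frame down, ATW on device.
type Pose6Dof = {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion
};

interface EncodedFrame { data: ArrayBuffer; renderPose: Pose6Dof }

async function pixelStreamingLoop(
  samplePose: () => Pose6Dof,                       // on-device 6-DOF tracking
  sendPose: (p: Pose6Dof) => void,                  // uplink to the XR edge server
  nextFrame: () => Promise<EncodedFrame>,           // downlink video stream
  decode: (f: ArrayBuffer) => Promise<ImageBitmap>, // HW video decoder
  warpAndDisplay: (
    img: ImageBitmap, renderPose: Pose6Dof, displayPose: Pose6Dof
  ) => void                                         // ATW + error concealment
): Promise<void> {
  for (;;) {
    sendPose(samplePose());
    const { data, renderPose } = await nextFrame();
    const img = await decode(data);
    // ATW hides network + encode latency: reproject from the pose the server
    // rendered at to the pose the head is in now.
    warpAndDisplay(img, renderPose, samplePose());
  }
}
```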
  45. Phase 2: Vector Streaming • Split-rendering framework based on generating textures and meshes (geometry) for even XR graphics quality • Seamless support for enhanced XR rendering optimizations (e.g. foveated rendering, asynchronous space warp) • System architecture: the HMD sends its pose uplink over 5G NR; the XR edge server runs 6-DOF processing → game engine → texture + mesh → video encoder → compressed texture + mesh → low-latency transport; the HMD runs low-latency transport → video decoder → GPU (rendering + warping)
  46. Wireless Split Rendering Research: leveraging 5G to deliver next-gen VR/AR experiences • XR content partially rendered on powerful compute resources (mobile edge compute) • Compressed content delivered via a multi-Gbps, sub-ms latency wireless link (5G wireless) • XR platform: power-efficient, latency-sensitive rendering and 6-DOF processing
  47. Back to earth: challenges in classical media distribution
  48. Media Today (diagram as on slide 5: broadcaster and content provider offers a Service; delivery via Internet transmission, broadcast e.g. DTT, and mobile delivery; Presentation on TV devices and non-TV devices)
  49. Some Objectives and Approaches • Objectives: enabling broadcast-grade linear TV service on the Internet; making media services more personalized, interactive and immersive; enabling monetization of media services; making services accessible on many different devices and platforms; ensuring an end-to-end workflow with all enablers in place; encode and package once, distribute to all • Approaches: interoperability programs; identifying commercial demand; global standards and ecosystems; end-to-end workflows and ecosystems; supporting implementations by test, open-source, conformance and reference tools
  50. Media Distribution: Status Check • UEs are very different from how they looked when 3GPP SA4 services were initially designed. Typically, a client for a vertical service was integrated, as considered for PSS and MBMS. • Mobile video traffic accounted for 60 percent of total mobile data traffic in 2016; mobile video now accounts for more than half of all mobile data traffic. • More than three-fourths of the world's mobile data traffic will be video by 2021. Mobile video will increase 9-fold between 2016 and 2021, accounting for 78 percent of total mobile data traffic by the end of the forecast period. • Media frameworks: Android Stagefright; browser-based media playback; iOS media framework
  51. Possible 5G Media Architecture (where is the split?)
  52. Abstracted Architecture
  53. 5G Media Streaming Architecture (Rel-16 work item agreed last week in Busan: Ericsson, QC, Samsung, Orange, ...) • Stage-2 work for the 5G Media Streaming Architecture was created in S4-181524. • The objective is to create a new 5G Media Streaming (5GMSA) architecture specification which supports: ◦ MNO and 3rd-party media downlink streaming services, with relevant functions and interfaces to support: different collaboration scenarios between third-party providers and mobile network operators for media distribution over 5G; appropriate service and session definitions in the context of 5G media distribution, especially for third-party media services, and corresponding network interfaces to establish, announce and discover those; a distribution-independent service establishment and content ingest interface; relevant functions for operators and third-party service providers in different collaboration scenarios, including but not limited to aspects such as session management, QoS framework, network assistance, QoE reporting, accessibility, content replacement, notification, content rights management, etc.; the delivery of 3GPP-defined media formats and profiles as well as third-party formats based on commonly defined packaging formats (note: potential evolutions of 5GS, e.g. MBMS, will be considered when specified) ◦ MNO and 3rd-party media uplink streaming services based on the non-IMS FLUS architecture: specify the non-IMS FLUS entities and interfaces as part of the 5GMSA where the FLUS sink is not in the UE; enable different collaboration scenarios between third-party providers and mobile network operators for media over 5G ◦ Corresponding UE functions and APIs ◦ Compatible deployments with EPS and MBMS ◦ Usage of 5G-specific features such as network slicing and edge computing
  54. FS_TyTraC: Rel-16 study item to collect typical traffic characteristics of existing media services • TR 26.925 v0.2.0 is developed; latest version in S4-181414 • Includes the 5G QoS model and 5QIs • Media service questionnaire S4-181521 ◦ Already collected information in S4-181267: Comcast, Hulu, IRT, Amazon and Bitmovin ◦ More to come based on the updated questionnaire: https://goo.gl/forms/3GeCpv2J6tIWn3O12
  55. Other Industry Efforts: MPEG, DASH-IF, DVB, CTA
  56. Large-Scale Internet Media Distribution: Standards and Ecosystems • MPEG (Comcast, Netflix, Apple, Microsoft, Dolby, Hulu, etc.) ◦ CMAF: converged format between HLS and DASH, promising low latency, encryption and so on; great per se, but technical issues remain ◦ DASH: extensions to the DASH MPD to enable new service features ◦ ISO BMFF: core group to address extensions to packaging formats • DASH-IF (Hulu, Netflix, Comcast, Amazon, Microsoft, Akamai, Google, etc.) ◦ LL-DASH: addresses the ability to deliver low latency at scale and quality ◦ Targeted ad insertion: ability to replace content downstream based on clear rules ◦ Content ingest: ability to ingest content into a distribution system with all necessary metadata ◦ Event APIs: ability to deliver interactive metadata ◦ Simplification of the specification and clarification of many corner cases (example to come) • DVB (TV manufacturers, European broadcasters, European network operators, etc.) ◦ Low-latency DASH together with DASH-IF ◦ DVB-I: Internet-based service layer ◦ ABR multicast: addressing IP-multicast-based delivery • W3C/CTA WAVE (Comcast, Netflix, Apple, Microsoft, Dolby, Hulu, BBC, Google, etc.) ◦ Bridging the MPEG world with the web world ◦ Browser and HTML-5 media playback, interactive media ◦ Encryption and DRM • How do we get this all back to 3GPP, and where is 3GPP's role in this?
  57. Splicing: what works and what does not? (diagram: multiple splice configurations of Video 1 into Video 2, including cases with a point-cloud object PC 1) Questions: 1. Did we miss any important cases? 2. Can we exclude/forbid any of the cases? 3. Do we have consistent MPD signaling in place for all cases? 4. Do we have a clear playback understanding in place for all cases? 5. Do we have a playback implementation, e.g. on dash.js? Answers: • Working with content providers and player developers to create a consistent set of content rules and playback requirements
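One concrete piece of the “clear playback understanding” asked for above can be expressed in code. A sketch, with illustrative types, of what a player or validator might check at a Video 1 → Video 2 splice across DASH Periods: timeline continuity, and whether the decoder can be reused or must be re-initialised.

```typescript
// Splice-point analysis across two consecutive DASH Periods (illustrative).
interface PeriodInfo {
  start: number;    // Period start in seconds on the presentation timeline
  duration: number; // Period duration in seconds
  codecs: string;   // e.g. "avc1.640028" or "hvc1.1.6.L93.B0"
}

function analyzeSplice(a: PeriodInfo, b: PeriodInfo) {
  const gap = b.start - (a.start + a.duration);
  return {
    continuous: Math.abs(gap) < 0.001,   // neither gap nor overlap at the splice
    gapSeconds: gap,                     // > 0: gap; < 0: overlapping media
    decoderReuse: a.codecs === b.codecs, // same codec/profile: seamless switch
  };
}
```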
  58. Low-Latency DASH and Ad Insertion
  59. Key Functions and Performance Indicators • Random access and start-up delay • End-to-end latency • Compression efficiency • Network efficiency and scalability (number of requests; number of invalid requests) • Robustness to errors • Note: not all apply to all services, but may be relevant
  60. Low-Latency Streaming: Chunking (many technical details need to be addressed) • Encoder → DASH packager produces CMAF chunks: CH = CMAF header, CIC = CMAF initial chunk, CNC = CMAF non-initial chunk; chunks are delivered as HTTP chunks, and a DASH segment is a sequence of chunks described by the MPD • The CDN stores segments; a regular DASH client fetches whole segments (e.g. 10 s behind live), while a low-latency DASH client consumes chunks as they arrive (e.g. 3 s behind live)
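On the client side, the gain from chunking is that a player can append each CMAF chunk as it arrives rather than waiting for the full segment. A minimal sketch using standard browser APIs (streaming fetch plus Media Source Extensions); the URL is a placeholder:

```typescript
// Append a DASH segment chunk-by-chunk as HTTP chunks arrive.
async function appendSegmentChunked(sb: SourceBuffer, url: string): Promise<void> {
  const response = await fetch(url);
  const reader = response.body!.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Each read delivers one or more CMAF chunks, playable once appended,
    // so playback can start seconds before the 10 s segment completes.
    await new Promise<void>((resolve) => {
      sb.addEventListener("updateend", () => resolve(), { once: true });
      sb.appendBuffer(value);
    });
  }
}
```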
  61. Service Property Description and the DASH Client • Primary issues ◦ An application-based solution is not reliable; some environments work without an app ◦ The DASH client needs to make complex decisions based on information from the service offering, the device capabilities, user interaction and network status, which is very hard at low latency ◦ The service provider wants to express the desired service capabilities, supporting/forcing the DASH client to appropriately execute the rate adaptation • Architecture: encoder and DASH packager produce the MPD and DASH segments as HTTP chunks; the CDN caches segments and chunks; the low-latency DASH client is driven by the application, DASH headend settings and DASH client settings • Example DASH client settings: player = dashjs.MediaPlayer().create(); player.initialize(video, url, true); player.setLowLatencyEnabled(true); player.setLiveDelay(1); results in a 2.2 s delay
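The dash.js calls on the slide, assembled into one runnable snippet. The method names follow the dash.js interface of that era as shown on the slide (current releases configure the same behaviour via updateSettings()); the MPD URL is a placeholder.

```typescript
// Low-latency dash.js setup per the slide; the slide reports ~2.2 s delay.
declare const dashjs: any; // assumes the dash.js dist script is loaded

const video = document.querySelector("video") as HTMLVideoElement;
const url = "https://example.com/ll-live/manifest.mpd"; // placeholder

const player = dashjs.MediaPlayer().create();
player.initialize(video, url, /* autoPlay */ true);
player.setLowLatencyEnabled(true); // consume CMAF chunks as they arrive
player.setLiveDelay(1);            // target ~1 s behind the live edge
```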
  62. MPEG DASH supporting work (as part of Amd.5 of ISO/IEC 23009-1) • Producer reference time in the MPD and segments for latency measurements • Initialization Set provides a common set of media properties across the Media Presentation ◦ An Initialization Set may be selected at the start of a Media Presentation in order to establish the relevant decryption, decoding and rendering environment ◦ Relevant for multi-period and ad insertion • Service description ◦ Addresses the service provider's influence on DASH client operation ◦ Specification work: service description reference model; the semantics of the service description; mapping of the semantics to the DASH MPD; example client implementation usage guidelines • (diagram: a media streaming application with selection logic and playback control; the DASH access engine passes MPEG-format media + timing to the media engine and events + timing to event processing; selection metadata, selected Adaptation Sets, the MPD, segment data and the service description flow between the components)
  63. Targeted Ad Insertion (many technical details need to be addressed) • Origin: a DASH packager with an SCTE-35 interpreter prepares an MPD with ad slots plus segments; SCTE-35 is carried in the MPD (or emsg) • An MPD manipulator (proxy) with its own SCTE-35 interpreter produces the MPD with ads, obtaining ad parameters from the ad server (light and extended variants; live sim) • The DASH client plays the manipulated MPD and the segments
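A hedged sketch of the MPD manipulator (proxy) role from the figure: on each manifest request, ask the ad-decision side whether a break applies to this session and, if so, splice an ad Period into the MPD before returning it. XML handling is reduced to a string splice and every name is illustrative; a real implementation would split the content Period at the SCTE-35 splice time and fix up Period timing.

```typescript
// Per-session MPD manipulation for targeted ad insertion (illustrative only).
interface AdDecision { adPeriodXml: string } // returned by the ad server

async function manipulateMpd(
  fetchOriginMpd: () => Promise<string>,                        // ad-prepared MPD
  decideForSession: (id: string) => Promise<AdDecision | null>, // SCTE-35 + ad server
  sessionId: string
): Promise<string> {
  const mpd = await fetchOriginMpd();
  const decision = await decideForSession(sessionId);
  if (!decision) return mpd; // no break signalled: pass the MPD through
  // Insert the ad Period before the closing tag (real code would place it at
  // the splice point and adjust subsequent Period start times).
  return mpd.replace("</MPD>", `${decision.adPeriodXml}\n</MPD>`);
}
```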
  64. Splicing: what works and what does not? (repeat of slide 57: the splice-configuration diagram, the five questions on cases, MPD signaling, playback understanding and a dash.js implementation, and the answer of working with content providers and player developers on consistent content rules and playback requirements)
  65. CMAF and CTA WAVE
  66. Different Players: Single Encoding and Common Delivery • Content offering: CMAF content with both a DASH MPD and an HLS M3U8 referencing it • Delivery: CDN, broadcast, multicast • Platforms and players: stand-alone HLS; HLS as HTML-5 video tag; stand-alone DASH; DASH as HTML-5 video tag; HTML-5 MSE-based Type-3 player; application
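“Encode and package once, distribute to all” comes down to referencing the same CMAF segments from both manifest formats. A sketch that generates a minimal HLS media playlist for segments a DASH MPD SegmentTemplate would address equally; the tags are standard HLS, while file names and durations are illustrative.

```typescript
// One CMAF encoding, two manifests: generate the HLS side for shared segments.
function hlsPlaylistFor(segmentUrls: string[], segDurationSec: number): string {
  return [
    "#EXTM3U",
    "#EXT-X-VERSION:7", // fMP4 (CMAF) segments require playlist version 7
    `#EXT-X-TARGETDURATION:${Math.ceil(segDurationSec)}`,
    '#EXT-X-MAP:URI="init.cmfv"', // CMAF header, shared with the DASH init segment
    ...segmentUrls.flatMap((u) => [`#EXTINF:${segDurationSec.toFixed(3)},`, u]),
  ].join("\n");
}

// The DASH MPD would reference the same files, e.g. via
// <SegmentTemplate initialization="init.cmfv" media="seg_$Number$.cmfv" .../>
console.log(hlsPlaylistFor(["seg_1.cmfv", "seg_2.cmfv"], 2));
```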
  67. CTA WAVE • Content spec: CMAF + extensions • Application APIs: HTML-5 • Device playback ◦ Requirements on the performance of media playback when CMAF media is used ◦ Tests for playback, switching, splicing, overlap ◦ Spec will be released by end of 2018
  68. DVB Internet Services (DVB-I)
  69. DVB-I, the mission… • DVB-I, where the “I” stands for “Internet” ◦ In the context of audio-visual services, “the Internet” is used for “over-the-top” (OTT) delivery ◦ Well, “the Internet”, as in “CDN-overlaid, edge-assisted, adaptive-delivery media cloud” • …to enable DVB services to be discovered and consumed by devices with basic Internet connectivity, principally a non-managed broadband connection and HTTP access, providing a similar user proposition to that of a DVB broadcast service (diagram: 1..n services reaching devices across the Internet and ISP networks)
  70. DVB-I, the technologies • Harnessing foundation technologies to provide a complete DVB solution for live OTT delivery ◦ DVB-DASH (ABR, adaptive bit rate): ETSI TS 103 285 ◦ Low-latency DASH (LL-DASH): technical work ongoing ◦ Multicast ABR (MABR): technical work ongoing; reference architecture published (A176); protocols selected (aligned with 3GPP and ATSC) • Potential synergies with other ongoing DVB work items: targeted advertising; home broadcast • Potential liaison activities • (diagram: DVB-I on top of DVB-DASH, LL-DASH, MABR, signalling/metadata, service discovery and DVB AV codecs)
  71. DVB-I, the vision • Functional overview; likely roles and elements of the DVB-I specification (diagram: broadcasters and aggregators publish to a DVB-I service portal; DVB-I delivery (DASH, CMAF, WAVE) alongside e.g. DTT and 3GPP EnTV/5G; a gatekeeper and regulator where appropriate/necessary: licensed broadcasters only, protect end users from illegal/subversive services; enable integration of service lists or innovation in their management; presentation on TV and non-TV devices)
  72. Status of DVB-I • Commercial requirements completed; approval by mid-August • Main themes of the more than 50 CRs ◦ Applicability to TVs (with & without app) and non-TV devices incl. mobile & browser ◦ Over-the-top delivery possible, also optimization/management ◦ Relying on DVB-DASH for delivery, likely LL-DASH once ready ◦ A key concept is Service Lists including service information, which are semi-static and provide some equivalence to DVB-SI ◦ User experience equivalent to DVB-S/T/C/IPTV ◦ Services can be 24x7x365 and can be part-time ◦ Services can be a mix of live events and VoD assets, and personalized ◦ Hybrid services and devices are considered ◦ Trust, security and privacy aspects are considered ◦ Expected to have a receiver profile with minimum-to-implement features for FTA services ◦ And many more … • Considered a starting point to replicate the broadcast experience • Technical work has just started
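A sketch of what the Service List concept implies for a client: fetch a semi-static list describing services and how to reach them, then build a broadcast-like channel line-up. The JSON shape below is purely hypothetical; the actual DVB-I service list format was still being specified when this deck was presented.

```typescript
// Hypothetical DVB-I-style service list consumption (illustrative shape only).
interface ServiceEntry {
  name: string;
  lcn: number;          // logical channel number, for broadcast-like ordering
  dashMpdUrl: string;   // DVB-DASH presentation of the linear service
  contentGuideUrl?: string;
}

async function loadServiceList(url: string): Promise<ServiceEntry[]> {
  const res = await fetch(url);
  const services: ServiceEntry[] = await res.json();
  return services.sort((a, b) => a.lcn - b.lcn);
}
```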
  73. DVB-I reference client (diagram: a browser-based reference UI on top of a DVB-I service reference client, a content guide reference client and a DVB-DASH reference player, connected via APIs; origin HTTP servers host the DVB-I service offering, the DVB content guide and the DVB-DASH offering; validators for DASH, the DVB-I service and the DVB-I content guide)
  74. DVB-I & DASH development setup (diagram) • Live contribution encoding with time-code burn from a looping camera (AWS Elemental Live) → ABR encoding + chunked encapsulation + MPD generation → CDN (Akamai) → DASH client including decoder (dash.js + browser) • Live and low-latency live simulator: DASH-IF live sim + Amazon EC2; FFMPEG + Amazon EC2 • Source file → ABR encoding + encapsulation + MPD generation (FFMPEG, offline) → Akamai DASH services • DVB-I service list simulator (dvbi-sim + Amazon EC2) and reference DVB-I client (dvbi.js + dash.js + browser)
  75. Summary
  76. Summary • Standards are relevant for TV-grade media moving to new devices and experiences • No longer vertical services, but individual enablers • APIs, testing, reference implementations, modular designs • Global efforts • Join the efforts!
  77. Thank you! Follow us on: For more information, visit us at: www.qualcomm.com & www.qualcomm.com/blog. Nothing in these materials is an offer to sell any of the components or devices referenced herein. ©2018 Qualcomm Technologies, Inc. and/or its affiliated companies. All rights reserved. Qualcomm and Snapdragon are trademarks of Qualcomm Incorporated, registered in the United States and other countries. Other products and brand names may be trademarks or registered trademarks of their respective owners. References in this presentation to “Qualcomm” may mean Qualcomm Incorporated, Qualcomm Technologies, Inc., and/or other subsidiaries or business units within the Qualcomm corporate structure, as applicable. Qualcomm Incorporated includes Qualcomm's licensing business, QTL, and the vast majority of its patent portfolio. Qualcomm Technologies, Inc., a wholly-owned subsidiary of Qualcomm Incorporated, operates, along with its subsidiaries, substantially all of Qualcomm's engineering, research and development functions, and substantially all of its product and services businesses, including its semiconductor business, QCT.
