CAdViSE or how to find the Sweet Spots of ABR Systems

With the recent surge in Internet multimedia traffic, media players, and DASH media players in particular, have been enhanced and improved at an incredible rate. DASH media players adapt a media stream to network fluctuations by continuously monitoring the network and making decisions in near real time. The performance of the algorithms in charge of making those decisions has often been difficult to evaluate and assess objectively.

CAdViSE provides a Cloud-based Adaptive Video Streaming Evaluation framework for the automated testing of adaptive media players. In this talk, I will introduce the CAdViSE framework and its applications, and present the benefits it can bring to any web-based media player development pipeline. To demonstrate the power of CAdViSE in evaluating Adaptive Bitrate (ABR) algorithms, I will exhibit its capabilities when combined with objective Quality of Experience (QoE) models. For this talk, my team at Bitmovin/ATHENA selected the ITU-T P.1203 (mode 1) model to run experiments, calculate the Mean Opinion Score (MOS), and better understand the behavior of a set of well-known ABR algorithms in a real-life setting. The talk will show how we tested and deployed our framework with a modular architecture on cloud infrastructure. This approach greatly increases the number of concurrent experiments and the number of media players that can be evaluated and compared at the same time, enabling maximum scalability. In our most recent experiments, we used Amazon Web Services (AWS) for demonstration purposes. Another notable feature of CAdViSE discussed here is the ability to shape the test network with arbitrary network profiles. For example, we used a fluctuating network profile and a real LTE network trace recorded from the internet usage of a bicycle commuter in Belgium.

CAdViSE produces comprehensive logs for each experimental media streaming session. These logs can serve different goals, such as objective evaluation, or stitching media segments back together to conduct subjective evaluations afterwards. In addition, startup delays, stall events, and other media streaming defects can be reproduced exactly as they occurred during the experimental streaming sessions.


  1. CAdViSE or how to find the Sweet Spots of ABR Systems. November 15th, 2021. Babak Taraghi, DDRC 2021
  2. Agenda
     ● Introduction & Background
     ● What is CAdViSE?
       ○ Components and Architecture
       ○ CAdViSE In Action
       ○ Further In-depth Studies
     ● Summary
     ● Questions and Answers
  3. Introduction & Background I
     ● HTTP Adaptive Streaming (HAS) is a technique for delivering media files from an origin server to the client, adapting the delivered media properties to the current network link conditions.
     ● Media players and their ABR algorithms: the ABR algorithm is the key function that decides which bitrate segments to download, based on the current state of the network.
     ● Significant network link attributes:
       ○ Corrupted packets
       ○ Available bandwidth
       ○ Delay
       ○ Packet loss or duplicates
     Figure: "HTTP/2-Based Methods to Improve the Live Experience of Adaptive Streaming", Scientific Figure on ResearchGate. Available from: https://www.researchgate.net/figure/The-concept-of-HTTP-Adaptive-Streaming-HAS-was-introduced-As-shown-in-Figure-1-video_fig1_283073448
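As a rough illustration of the decision described above (a sketch, not CAdViSE or any specific player's code), a rate-based ABR heuristic picks the highest bitrate below a safety fraction of the measured throughput; the bitrate ladder and safety factor here are hypothetical:

```python
# Hypothetical bitrate ladder in kbps; a real ladder comes from the manifest (MPD).
BITRATE_LADDER = [300, 700, 1500, 3000, 6000]

def select_bitrate(measured_throughput_kbps, safety=0.8):
    """Pick the highest bitrate not exceeding a safety fraction of throughput.

    A simple rate-based heuristic; real ABR algorithms (e.g. BOLA, FastMPC)
    also take the buffer level and future segments into account.
    """
    budget = measured_throughput_kbps * safety
    feasible = [b for b in BITRATE_LADDER if b <= budget]
    # Fall back to the lowest rendition when even that exceeds the budget.
    return max(feasible) if feasible else BITRATE_LADDER[0]
```

For example, with 2000 kbps of measured throughput and the default safety factor, the budget is 1600 kbps and the 1500 kbps rendition is chosen.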
  4. Introduction & Background II
     ● Quality of Experience (QoE) is a measure of the delight or annoyance of a customer's experience with a service. HAS QoE metrics:
       ■ Start-up delay
       ■ Delivered media quality
       ■ Stall events (rebuffering)
     ● Mean Opinion Score (MOS) can be measured both objectively and subjectively: predicted MOS and perceived MOS.
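To make the relationship between these metrics and a MOS-like score concrete, here is a toy additive model; the penalty weights are illustrative placeholders, not the coefficients of ITU-T P.1203 or any standardized model:

```python
def simple_qoe_score(startup_delay_s, stall_count, total_stall_s, avg_quality_mos):
    """Toy additive QoE score on a 1-5 MOS-like scale.

    The weights below are invented for illustration; standardized models
    such as ITU-T P.1203 define their own, validated coefficients.
    """
    score = avg_quality_mos
    score -= 0.05 * startup_delay_s   # mild penalty for start-up delay
    score -= 0.3 * stall_count        # each stall event (rebuffering) hurts
    score -= 0.1 * total_stall_s      # longer total stalling hurts more
    return max(1.0, min(5.0, score))  # clamp to the MOS scale
```

The structure mirrors the three HAS metrics on this slide: delivered quality sets the baseline, while start-up delay and stall events subtract from it.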
  5. What is CAdViSE?*
     ● Cloud-based Adaptive Video Streaming Evaluation framework for the automated testing of media players:
       ○ A test environment (testbed) that can be instantiated in a cloud infrastructure, examines multiple media players under different network attributes, and concludes the evaluation with visualized statistics and insights into the results.
     ● Cloud deployment on Amazon Web Services (AWS)
     ● Dockerized environment
     ● Pluggable media players and ABR algorithms
     ● Integrates with modern CI/CD pipelines
     ● Shapes the network with real-life network traces (network profiles)
     * Taraghi, B., Zabrovskiy, A., Timmerer, C., & Hellwagner, H. (2020, May). CAdViSE: cloud-based adaptive video streaming evaluation framework for the automated testing of media players. In Proceedings of the 11th ACM Multimedia Systems Conference (pp. 349-352).
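Shaping the network with a profile, as in the last bullet above, can be sketched as a sequence of Linux `tc`/netem commands; this is an assumption about how such a step could look (CAdViSE's actual network emulator may differ), and the interface name and profile values are made up:

```python
def netem_commands(interface, profile):
    """Build Linux `tc` commands that replay a simple network profile.

    `profile` is a list of (duration_s, bandwidth_kbit, delay_ms) steps.
    Commands are returned as strings; a runner would execute each one
    (with root privileges) and then sleep for the step's duration.
    """
    cmds = []
    for _, bw_kbit, delay_ms in profile:
        cmds.append(
            f"tc qdisc replace dev {interface} root netem "
            f"rate {bw_kbit}kbit delay {delay_ms}ms"
        )
    return cmds

# A hypothetical fluctuation profile: 30 s at 6 Mbit/s, then 30 s at 1 Mbit/s.
fluctuation = [(30, 6000, 20), (30, 1000, 80)]
```

Replaying a recorded LTE trace would use the same mechanism, with one step per sampled bandwidth value.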
  6. CAdViSE Components and Architecture
     ● Application layer
       ○ Runner, Initializer, and Starter scripts
       ○ Written in Bash, Python, and JavaScript
     ● Cloud components
       ○ Player container (VNC and Selenium)
       ○ Network emulator
       ○ EC2 instances, SSM execution, DynamoDB, S3, and CloudWatch
     ● Logs and analytics
       ○ Bitmovin Analytics player plugin
       ○ Comprehensive logs
  7. CAdViSE In Action: Implementation and Application
     ● Understanding Quality of Experience of Heuristic-based HTTP Adaptive Bitrate Algorithms*
       ○ Seven well-known ABR algorithms and media players
       ○ Four challenging network profiles
     ● Using the CAdViSE streaming session logs:
       ○ Create JSON files as input for the QoE model
       ○ Stitch the HAS multimedia segments (audiovisual files) back together and generate a single MP4
     ● Participants from Amazon Mechanical Turk (MTurk)
       ○ 835 participants in our subjective evaluations
       ○ 5723 votes in total, of which 4704 proved to be reliable
     [Charts: the four network profiles (Ramp Up, Ramp Down, Stable, Fluctuation), bandwidth in kbps over time in seconds]
     * Babak Taraghi, Abdelhak Bentaleb, Christian Timmerer, Roger Zimmermann, and Hermann Hellwagner. 2021. Understanding quality of experience of heuristic-based HTTP adaptive bitrate algorithms. In Proceedings of the 31st ACM Workshop on Network and Operating Systems Support for Digital Audio and Video (NOSSDAV '21). ACM, New York, NY, USA, 82-89. DOI: https://doi.org/10.1145/3458306.3458875
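To give a flavor of the "stitch back" step above (a sketch under assumptions, not the actual CAdViSE implementation; the segment file names are hypothetical), the downloaded segments can be concatenated with ffmpeg's concat demuxer:

```python
from pathlib import Path

def write_concat_list(segment_paths, list_path="segments.txt"):
    """Write an ffmpeg concat-demuxer list file for the downloaded segments."""
    lines = [f"file '{p}'" for p in segment_paths]
    Path(list_path).write_text("\n".join(lines) + "\n")
    # The single stitched MP4 would then be produced with:
    #   ffmpeg -f concat -safe 0 -i segments.txt -c copy session.mp4
    return list_path
```

Because `-c copy` avoids re-encoding, the stitched MP4 preserves the exact quality levels the player selected, which is what a subjective evaluation needs to see.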
  8. Results and Findings I
     Measurement of significant metrics for each media player or ABR algorithm's performance in different network profiles.
  9. Results and Findings II
     Comparison of predicted MOS against perceived MOS for each media player or ABR algorithm with the Ramp Up network profile (Pearson's correlation coefficient: 0.94):
     Player/ABR:      BBA0  BOLA  dash.js  Elastic  FastMPC  Quetra  Shaka
     Objective MOS:   2.56  2.67  2.63     2.26     2.84     2.26    2.79
     Subjective MOS:  3.62  3.73  3.65     3.45     3.68     3.41    3.73
  10. Results and Findings III
     Comparison of predicted MOS against perceived MOS for each media player or ABR algorithm with the Fluctuation network profile (Pearson's correlation coefficient: 0.52):
     Player/ABR:      BBA0  BOLA  dash.js  Elastic  FastMPC  Quetra  Shaka
     Objective MOS:   2.22  1.86  1.99     2.07     1.91     1.98    1.98
     Subjective MOS:  3.39  3.21  3.29     3.12     3.10     3.08    3.30
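The correlation coefficients on the two preceding slides can be reproduced from the plotted MOS values with a few lines of Python (values transcribed from the slides):

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson's correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Ramp Up profile (slide 9): objective vs. subjective MOS per player.
obj_ramp = [2.56, 2.67, 2.63, 2.26, 2.84, 2.26, 2.79]
subj_ramp = [3.62, 3.73, 3.65, 3.45, 3.68, 3.41, 3.73]

# Fluctuation profile (slide 10).
obj_fluct = [2.22, 1.86, 1.99, 2.07, 1.91, 1.98, 1.98]
subj_fluct = [3.39, 3.21, 3.29, 3.12, 3.10, 3.08, 3.30]

print(round(pearson(obj_ramp, subj_ramp), 2))    # 0.94
print(round(pearson(obj_fluct, subj_fluct), 2))  # 0.52
```

The drop from 0.94 to 0.52 quantifies the slides' point: the objective model tracks subjective opinion well under the Ramp Up profile but much less so under Fluctuation.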
  11. In-Depth Studies (INTENSE)*
     ● Minimum Noticeable Stall event Duration (MNSD) evaluation: the minimum stall event duration that is noticeable by end-users.
     ● Stall event vs. Quality level switch (SvQ) evaluation: we assessed end-user preference between these two scenarios.
     ● Short stall events vs. a longer stall event (SvL) evaluation: we studied the impact of multiple short stall events in contrast with a single longer stall event on the QoE, from both predicted and perceived MOS perspectives.
     ● Relation of Stall event impact on the QoE with Video Quality level (RSVQ) evaluation.
     ● Objective QoE models comparison.
     * Taraghi, B., Nguyen, M., Amirpour, H., & Timmerer, C. (2021). Intense: In-Depth Studies on Stall Events and Quality Switches and Their Impact on the Quality of Experience in HTTP Adaptive Streaming. IEEE Access, 9, 118087-118098.
  12. Stall Events’ Patterns
  13. MNSD Evaluation I
     The decline in noticed stall events starts at durations below 0.301 seconds. More than 45% of the subjects could not notice stall events shorter than 0.051 seconds.
  14. MNSD Evaluation II
     We determined that any stall event shorter than 0.004 seconds was not noticeable to the participants in the MNSD evaluation.
  15. SvQ & RSVQ Evaluations
     Subjects tend to prefer watching a higher quality version even if it is obtained by adding a stall event with a duration of six seconds. Stall events carry only a minor QoE penalty when the video quality is low.
  16. SvL Evaluation
     The analysis results demonstrate a preference for a single longer stall event over frequent short stall events with the same total duration.
  17. QoE Models Comparison
     The ITU-T P.1203 model shows the best performance across all evaluations, with the highest PCC and SRCC (above 0.8) and the lowest RMSE (0.326).
  18. Summary
     • Overview of HTTP Adaptive Streaming
     • Measurement of Quality of Experience
     • Introducing CAdViSE: a cloud-based adaptive video streaming evaluation framework for the automated testing of media players
     • Showcase of "Understanding Quality of Experience of Heuristic-based HTTP Adaptive Bitrate Algorithms"
     • Another use case for CAdViSE: "Intense: In-Depth Studies on Stall Events and Quality Switches and Their Impact on the Quality of Experience in HTTP Adaptive Streaming"
  19. Thank you

