Understanding Quality of Experience of Heuristic-based HTTP Adaptive Bitrate Algorithms

Adaptive BitRate (ABR) algorithms play a crucial role in delivering the highest possible viewer Quality of Experience (QoE) in HTTP Adaptive Streaming (HAS). Online video streaming service providers use HAS, the dominant video streaming technique on the Internet, to deliver the best QoE for their users. Viewer satisfaction relies heavily on how well the ABR algorithm of a media player can adapt the stream's quality to the current network conditions. QoE for end-to-end video streaming sessions has been evaluated in many research projects to give better insight into the quality metrics. Objective evaluation models such as the ITU Telecommunication Standardization Sector (ITU-T) P.1203 model allow the Mean Opinion Score (MOS) to be calculated from various QoE metrics, while subjective evaluation remains the best approach for investigating the end-user opinion of a streaming session's experienced quality. We have conducted subjective evaluations with crowdsourced participants and evaluated the MOS of the same sessions using the ITU-T P.1203 quality model. This paper's main contribution is a comparison of subjective and objective evaluations for well-known heuristic-based ABR algorithms.

1. Network and Operating System Support for Digital Audio and Video (NOSSDAV'21)
Understanding Quality of Experience of Heuristic-based HTTP Adaptive Bitrate Algorithms
Babak Taraghi, Abdelhak Bentaleb, Christian Timmerer, Roger Zimmermann, and Hermann Hellwagner
2. Overview and the Motivation
• Main Idea
  – Evaluate a set of well-known heuristic-based ABR algorithms.
  – Conduct subjective evaluations with crowdsourced participants and evaluate the MOS of the streaming sessions using the ITU-T P.1203 quality model.
  – Investigate the correspondence of subjective and objective evaluations for well-known heuristic-based ABRs.
• Platforms and Participants
  – Using CAdViSE as our testbed to conduct the objective evaluations. CAdViSE provides a cloud-based platform to evaluate multiple ABR algorithms or media players under various network conditions.
  – Using Amazon Mechanical Turk (MTurk) to conduct our subjective evaluation. MTurk is a crowdsourcing website for hiring remotely located crowd-workers to perform discrete on-demand tasks.
  – 835 participants in our subjective evaluation phase.
  – We gathered 5723 votes in total, out of which 4704 proved reliable (a minimal aggregation sketch follows this slide).
• Test Sequences
  – Sintel, Valkaama, Big Buck Bunny, and Tears of Steel.
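As context for how the subjective scores on the later slides are produced, here is a minimal sketch of turning crowdsourced votes into a per-session subjective MOS with a simple reliability filter. The field names and the reliability rule (watch ratio) are assumptions for illustration; the slides do not detail the actual reliability checks.

```python
# Illustrative aggregation of crowdsourced ratings into a subjective MOS.
# Field names and the reliability rule are assumptions for this sketch,
# not the exact criteria used in the paper.
from statistics import mean

def filter_reliable(votes, min_watch_ratio=0.9):
    """Keep only votes whose workers watched (almost) the full clip."""
    return [v for v in votes if v["watch_ratio"] >= min_watch_ratio]

def subjective_mos(votes):
    """MOS is simply the mean of the 1-5 opinion scores."""
    return mean(v["score"] for v in votes)

votes = [
    {"worker": "w1", "score": 4, "watch_ratio": 1.0},
    {"worker": "w2", "score": 3, "watch_ratio": 0.95},
    {"worker": "w3", "score": 5, "watch_ratio": 0.4},   # dropped as unreliable
]

reliable = filter_reliable(votes)
print(f"{len(reliable)}/{len(votes)} reliable votes, MOS = {subjective_mos(reliable):.2f}")
```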
3. Heuristic-based ABR Algorithms
• Seven ABR algorithms and two media players in total were used for the evaluations:
  – dash.js v3.1.3 (Dynamic): a combination of throughput-based and BOLA algorithms.
  – Shaka v3.0.4 (Throughput-based): a simple throughput-based algorithm that uses throughput heuristics with an Exponential Weighted Moving Average (EWMA) smoothing function.
  – BBA0: a buffer-based ABR algorithm that uses the current buffer occupancy to select the bitrate for the next segment.
  – BOLA: a buffer-based ABR algorithm that formulates ABR decisions as a utility maximization problem solved with Lyapunov optimization.
  – Elastic: a hybrid ABR algorithm that uses feedback control theory to generate a data flow that behaves like a long-lived Transmission Control Protocol (TCP) flow.
  – FastMPC: a hybrid ABR algorithm that uses a Model Predictive Control (MPC) approach.
  – Quetra: a buffer-based ABR algorithm that formulates ABR decisions as a queuing theory model, calculating the expected buffer occupancy given a bitrate choice, network throughput, and buffer capacity.
(An illustrative sketch of the throughput-based and buffer-based heuristics follows this slide.)
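To make the two heuristic families concrete, below is a minimal sketch of a throughput-based rule with EWMA smoothing (in the spirit of Shaka's algorithm) and a buffer-based rule that maps buffer occupancy onto the bitrate ladder (in the spirit of BBA0). The bitrate ladder, smoothing factor, reservoir, and cushion values are illustrative assumptions, not the parameters of the evaluated players.

```python
# Simplified sketch of the two heuristic families used by these ABRs:
# a throughput-based rule with EWMA smoothing (Shaka-style) and a
# buffer-based rule mapping buffer occupancy to bitrate (BBA0-style).
# The bitrate ladder, smoothing factor, and thresholds are assumptions.

BITRATES_KBPS = [300, 750, 1500, 3000, 6000]  # hypothetical bitrate ladder

def ewma_update(prev_estimate, sample_kbps, alpha=0.2):
    """Smooth the latest throughput sample into the running estimate."""
    if prev_estimate is None:
        return sample_kbps
    return alpha * sample_kbps + (1 - alpha) * prev_estimate

def throughput_based(estimate_kbps, safety=0.9):
    """Pick the highest bitrate below a safety fraction of the estimate."""
    candidates = [b for b in BITRATES_KBPS if b <= safety * estimate_kbps]
    return candidates[-1] if candidates else BITRATES_KBPS[0]

def buffer_based(buffer_s, reservoir_s=10, cushion_s=30):
    """Map buffer occupancy linearly onto the bitrate ladder (BBA0 idea)."""
    if buffer_s <= reservoir_s:
        return BITRATES_KBPS[0]
    if buffer_s >= reservoir_s + cushion_s:
        return BITRATES_KBPS[-1]
    fraction = (buffer_s - reservoir_s) / cushion_s
    index = int(fraction * (len(BITRATES_KBPS) - 1))
    return BITRATES_KBPS[index]

# Example: a new throughput sample of 2.8 Mbps and 22 s of buffered media.
estimate = ewma_update(None, 2800)
print(throughput_based(estimate))   # 1500
print(buffer_based(22))             # 750
```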
4. Objective Evaluation & Network Simulations
Powered by CAdViSE (available at https://github.com/cd-athena/CAdViSE)
• Using CAdViSE, we simulated four network profiles, as shown in Figure 1:
  – Ramp Down profile
  – Ramp Up profile
  – Fluctuation profile
  – Stable profile
  (An illustrative encoding of these profiles follows this slide.)
• Recorded the streaming session logs.
• Calculated the MOS with the P.1203 quality model by utilizing the logs.
• Stitched the media segments back together into a single file for the subjective evaluations.
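One way to think of the four profiles is as bandwidth schedules applied by the testbed's network shaper. The sketch below encodes them as (duration, bandwidth) steps; the concrete values and step lengths are placeholders, not the figures from the paper or from CAdViSE.

```python
# Illustrative encoding of the four simulated network profiles as
# (duration_s, bandwidth_kbps) steps. The concrete values are placeholders.

PROFILES = {
    "ramp_down":   [(30, 6000), (30, 4000), (30, 2000), (30, 800)],
    "ramp_up":     [(30, 800), (30, 2000), (30, 4000), (30, 6000)],
    "fluctuation": [(20, 6000), (20, 800), (20, 4000), (20, 1000), (20, 5000)],
    "stable":      [(120, 3000)],
}

def total_duration(profile):
    """Total simulated time of a profile in seconds."""
    return sum(duration for duration, _ in profile)

def bandwidth_at(profile, t):
    """Bandwidth (kbps) that the shaper would apply at time t."""
    elapsed = 0
    for duration, kbps in profile:
        elapsed += duration
        if t < elapsed:
            return kbps
    return profile[-1][1]

print(total_duration(PROFILES["ramp_down"]))      # 120
print(bandwidth_at(PROFILES["fluctuation"], 45))  # 4000
```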
5. Significant Metrics
1. Startup delay
2. Stall events
3. Requested segment bitrate
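A rough sketch of how these three metrics could be derived from a streaming session log is shown below. The event-log schema is an assumption for illustration and does not reflect CAdViSE's actual log format.

```python
# Minimal sketch of deriving the three metrics from a player event log.
# The log format (list of timestamped events) is an assumption.

def startup_delay(events):
    """Seconds from the load request until playback starts."""
    load = next(e["t"] for e in events if e["type"] == "load")
    play = next(e["t"] for e in events if e["type"] == "playing")
    return play - load

def stall_events(events):
    """Number and total duration (s) of rebuffering events."""
    stalls = [e for e in events if e["type"] == "stall"]
    return len(stalls), sum(e["duration"] for e in stalls)

def mean_requested_bitrate(events):
    """Average bitrate (kbps) of the requested segments."""
    segs = [e["bitrate"] for e in events if e["type"] == "segment_request"]
    return sum(segs) / len(segs) if segs else 0

log = [
    {"t": 0.0, "type": "load"},
    {"t": 1.8, "type": "playing"},
    {"t": 2.0, "type": "segment_request", "bitrate": 1500},
    {"t": 9.5, "type": "stall", "duration": 1.2},
    {"t": 12.0, "type": "segment_request", "bitrate": 3000},
]

print(startup_delay(log))            # 1.8
print(stall_events(log))             # (1, 1.2)
print(mean_requested_bitrate(log))   # 2250.0
```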
6. Results and Findings I
• The Shaka player's default ABR algorithm and the BOLA algorithm, with a subjective MOS of 3.73, had the best performance in the Ramp Up network profile.
• The BOLA algorithm also had the best performance in the Ramp Down network profile, with a MOS of 3.65.
7. Results and Findings II
• The highest subjective MOS obtained in the challenging Fluctuation network profile was 3.39, showing the good performance of the BBA0 algorithm under tough conditions.
• The dash.js default ABR algorithm had the best performance in the Stable network profile, with a MOS of 3.98.
8. Conclusions and Future Work
• We have evaluated seven well-known ABR algorithms in this paper.
  – Learning-based ABR algorithms will be included in our future work.
• This paper's main contribution is to investigate the correspondence of subjective and objective evaluations for well-known heuristic-based ABRs.
  – Other Quality of Experience models will be compared in our future work.
• We introduced four network profiles, which were simulated using our testbed, CAdViSE.
  – Real network profiles/traces will be used in our future work.
• A deeper statistical analysis of the findings and obtained results is another direction for our future work.
9. Thanks to my co-authors and the NOSSDAV'21 conference reviewers:
Dr. Bentaleb, Professor Timmerer, Professor Zimmermann, and Professor Hellwagner.
Babak Taraghi, Klagenfurt, 2021
