Skyfire - Measuring, Quantifying and Improving QoE 2014
In this white paper, Skyfire explains the essential algorithmic components that go into determining QoE, and how Rocket Optimizer's Experience Assurance technology measures, quantifies and then instantly improves the video experience, using the power of the cloud.


Introduction – Why QoE Matters

Mobile operators are under immense competitive and investor pressure to differentiate themselves from one another and to increase data's share of ARPU, all while coping with unprecedented over-the-top (OTT) video services clogging their networks. In this environment, an ever-improving Quality of Experience (QoE) is an increasingly crucial component of both happy customers and a healthy balance sheet.

Quality of experience is fairly simply defined: QoE measures total system/network performance using subjective and objective measures of customer satisfaction.

Many operators' quality of experience has been severely tested in recent years by the explosive growth of mobile video traveling on their networks, particularly from popular OTT streaming applications such as YouTube, DailyMotion and countless others. These applications not only consume a tremendous amount of bandwidth, but must also be delivered very quickly in order to avoid rebuffering and slow start times.

In a 2012 Harris Interactive survey of US smartphone owners, 40% of customers who had switched wireless operators during the previous year said they did so because of "data quality/service", leaving their operator in search of better data connections with a new one. This was second only to "price" as a key reason for churning. In the UK, 43% of consumers who had switched cited "data quality/service" as a leading reason for having done so; again, second only to price.

The latest Cisco Visual Networking Index projects that mobile video, already 50% of the bandwidth on global wireless networks, will grow to 66% of all mobile bandwidth by 2017. It is therefore clear that a leading indicator of experience quality for operators is successful delivery of video to mobile subscribers: free of buffering, stuttering, slow starts, audio/video sync problems and excessive pixelation.

What Factors Determine Quality of Experience?

In general, there are two main elements to video quality of experience: visual quality and playback quality.

Visual Quality

Perceived video quality is determined by a number of factors, including:

• Video resolution
• Number of compression artifacts
• Audio quality
• Smoothness (i.e., frame rate)

Visual quality is based solely on the source video itself, as opposed to other factors such as device or network speed.

Playback Quality

Playback quality is determined by how well the video is played back to the user, and is impacted by:

• Device (is it capable of playing the video?)
• Network (is it fast enough to deliver the video?)

Particularly in the smartphone era, devices are generally very good at video playback; as such, the network is the primary determinant of playback quality. If the network cannot deliver video fast enough for smooth playback, the result is:

• Long start times
• Stalling during playback
What Matters Most to Consumers?

When surveyed, wireless consumers are quite clear about what "quality of experience" means to them when it comes to video. According to the same 2012 Harris Interactive survey of UK and US smartphone owners, 87% of British and 86% of US consumers indicated that when their mobile connection is poor, they care more about seeing a standard-definition video that plays smoothly than a high-definition video with slow starts, stuttering and rebuffering. In other words, users' perceived quality of experience is influenced more heavily by transmission quality than by visual quality.

Consumers are very willing to watch a video that has been optimized for their experience, with a delivery bitrate matched to their individual place on the network at that time (congested, within a building, or behind a wall), "as long as it plays". Given a choice between multiple stall-free, fast-starting videos, of course, users will choose the one with the highest visual quality.

[Chart: 86% prefer SD video with high transmission quality (e.g., faster start and smoother playback); 14% prefer HD video, but with longer start times and potential buffering.]
Measuring QoE in Skyfire's Rocket Optimizer Solution

Skyfire's mobile video optimization solution, Rocket Optimizer, incorporates three key metrics that together indicate both video quality and playback quality. These three metrics are calculated for each video played:

1) MOS (Mean Opinion Score)
2) Stall Percentage / Number of Stalls
3) Start Time

MOS (Mean Opinion Score)

The original method for calculating a MOS score was to collect subjective opinions of video quality from a panel of users and then average them for each video. Numerous methods have since been devised for approximating the subjective MOS score through automated means, and there are a number of standard algorithms for doing so. Broadly, these algorithms fall into one of two categories: they calculate either an "absolute" MOS (an absolute quality measure) or a "relative" MOS (comparing a video to the "reference" from which it was derived, and measuring the degradation relative to that reference).

Rocket Optimizer uses a relative MOS score, which is much more useful for optimization systems since it measures how much optimization has changed the original content. Specifically, it uses a Peak Signal-to-Noise Ratio (PSNR) algorithm calculated from the mean square error (MSE) between frames (http://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio):

    MSE = (1 / (m·n)) · Σ_{i=0}^{m−1} Σ_{j=0}^{n−1} [I(i,j) − K(i,j)]²

The PSNR is then defined as:

    PSNR = 10 · log10(MAX_I² / MSE)
         = 20 · log10(MAX_I / √MSE)
         = 20 · log10(MAX_I) − 10 · log10(MSE)

where I is the reference frame, K is the optimized frame, m × n is the frame size in pixels, and MAX_I is the maximum possible pixel value (e.g., 255 for 8-bit samples). Typical values range from 30 to 50 dB (higher is better). Skyfire then maps the PSNR to a value between 1 and 5 to provide a familiar MOS value. Every time Skyfire optimizes a video, a MOS score is logged so that mobile operators can monitor and adjust optimization systems based on video quality benchmarks.
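To make the calculation concrete, here is a minimal Python sketch of a relative PSNR computation and a PSNR-to-MOS mapping. The linear mapping over the typical 30–50 dB range, and the function names, are illustrative assumptions; the paper does not publish Skyfire's exact mapping.

    import numpy as np

    def relative_psnr(reference: np.ndarray, optimized: np.ndarray,
                      max_i: float = 255.0) -> float:
        # Mean square error between the reference frame and the optimized frame
        diff = reference.astype(np.float64) - optimized.astype(np.float64)
        mse = np.mean(diff ** 2)
        if mse == 0:
            return float("inf")  # frames are identical; no degradation
        return 20.0 * np.log10(max_i) - 10.0 * np.log10(mse)

    def psnr_to_mos(psnr_db: float, low_db: float = 30.0,
                    high_db: float = 50.0) -> float:
        # Illustrative assumption: linearly map the typical 30-50 dB PSNR
        # range onto the familiar 1-5 MOS scale
        clamped = min(max(psnr_db, low_db), high_db)
        return 1.0 + 4.0 * (clamped - low_db) / (high_db - low_db)

With this mapping, a heavily degraded frame at 30 dB scores a MOS of 1.0, while a near-transparent optimization at 50 dB scores 5.0.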
Stall Percentage / Number of Stalls

When running Experience Assurance, stall percentage and number of stalls can be estimated by the Rocket Optimizer system. In order to estimate stalling accurately, Skyfire empirically developed a "virtual client" based on typical client devices on a mobile network; these devices tend to be quite similar in terms of how much video they need to buffer before playback continues after a stall, and so on.

While in Experience Assurance mode, Rocket Optimizer monitors all the large and medium-sized flows running over the mobile network, and aggregates them on a per-subscriber basis in order to assess the speed of the network connection for each user. Then, when a user requests to play a video, the optimization system knows three key pieces of information:

• Bitrate of the source video (through analysis on the controller)
• Network speed of the user (through flow monitoring)
• Bitrate of the optimized video

Using this information, the virtual client can assess both the stalls and stall percentage that would have occurred at the source bitrate had there been no optimization, as well as the stalls and stall percentage that did in fact occur with the optimized video. Stall rate is defined as:

    Stall % = Stall Time(s) / Total Elapsed Time(s)
            = Stall Time(s) / (Stall Time(s) + Video Play Time(s))

Stall percentage can be roughly calculated as follows (shown here for illustrative purposes; the virtual client's algorithm is more sophisticated and more accurate):

    Stall % (Unoptimized) ≈ (Source Bitrate (kbps) − Network Speed (kbps)) / Source Bitrate (kbps)

For example, if the source bitrate is 800 kbps and the network is capable of delivering 400 kbps to the user, the stall rate will be about 50%; the network takes twice as long to deliver the video as it takes to watch the video in real time. (If network speed is greater than the source bitrate, stall time will be negligible or zero.)

The stall time percentage for optimized video, meanwhile, can be calculated as follows:

    Stall % (Optimized) ≈ (Optimized Bitrate (kbps) − Network Speed (kbps)) / Optimized Bitrate (kbps)

Now, if the 800 kbps video is 70% optimized to a bitrate of 240 kbps and network speed is 400 kbps, then network speed is greater than the optimized video bitrate, so the stall rate will again be negligible or zero. Optimization has reduced the stall rate from 50% to 0%. In general, the goal of optimization is to deliver the highest bitrate (with maximum video quality) that does not stall.

Start Time

Start time is defined as the time it takes to buffer sufficient data from the network for the client to begin playing the video. Start time is effectively a special case of stall time that occurs before any video has been played; however, users tend to perceive start time and stall time quite differently, so it is broken out as a separate metric. A typical client will buffer about 4 seconds of video before it begins playback.

Again using simplified formulas for illustrative purposes, start time for an unoptimized video can be calculated as follows:

    Start Time (Unoptimized) ≈ (Source Bitrate (kbps) / Network Speed (kbps)) × 4s

Start time will generally be significantly lower after optimization, because fewer bytes need to be downloaded in order to build up a sufficient buffer:

    Start Time (Optimized) ≈ (Optimized Bitrate (kbps) / Network Speed (kbps)) × 4s

Using the same example as above, unoptimized start time will be around 8 s (800/400 × 4 s), and start time for the optimized video will be around 2.4 s (240/400 × 4 s).
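The simplified formulas above translate directly into code. The following Python sketch implements only these illustrative estimates, not the virtual client's more sophisticated algorithm; the function names are hypothetical.

    def stall_pct(video_kbps: float, network_kbps: float) -> float:
        # Simplified stall-time percentage: zero when the network
        # delivers faster than the video's bitrate
        if network_kbps >= video_kbps:
            return 0.0
        return (video_kbps - network_kbps) / video_kbps

    def start_time_s(video_kbps: float, network_kbps: float,
                     buffer_s: float = 4.0) -> float:
        # Simplified start time: seconds needed to download the
        # initial ~4 s playback buffer
        return (video_kbps / network_kbps) * buffer_s

    # Worked example from the text: 800 kbps source, 400 kbps network,
    # 240 kbps after 70% optimization.
    print(stall_pct(800, 400))       # 0.5 -> ~50% stall time, unoptimized
    print(stall_pct(240, 400))       # 0.0 -> no stalling after optimization
    print(start_time_s(800, 400))    # 8.0 -> ~8 s start time, unoptimized
    print(start_time_s(240, 400))    # 2.4 -> ~2.4 s start time, optimized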
Is Experience Assurance the Same as Congestion Monitoring?

To date, the typical approach to optimizing video on mobile networks has been either (a) global optimization settings based on time-of-day or other static policies, or (b) using "probes" in the radio access network (RAN) to identify congested cells and focus optimization on those cells.

The static approach attempts to target optimization when it is needed: by optimizing at times when operators know congestion occurs, the hope is to relieve the load and increase average QoE. However, it is not really that simple. In a typical network, only 10% to 20% of cells (at most) will be congested at any one time, and a blanket approach such as this does not focus optimization on the cells where it is needed most.

The RAN probe approach came about in an effort to overcome the limitations of the static approach by precisely identifying congested cells so that optimization targets only those cells. The RAN probe approach is more surgical, but it still has two downsides. First, most mobile networks do not currently have RAN probes deployed, and deploying them requires a significant capital outlay. Second, the RAN probe approach is useful for reducing or eliminating RAN congestion, but many quality of experience issues have nothing to do with congestion and are instead caused by impairments at the individual user level: the user may be indoors and suffering from signal blockage; at the edge of a cell, with poor signal; or using an old phone whose CPU simply cannot handle the bitrate of a given video. There is clear benefit to a more comprehensive approach that goes beyond congestion alone.
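To make the contrast concrete, here is a hypothetical sketch of the three targeting policies described above. The function names, thresholds and inputs are illustrative assumptions, not part of any Skyfire or vendor API.

    from datetime import time

    def optimize_static(now: time, peak_start: time = time(18, 0),
                        peak_end: time = time(23, 0)) -> bool:
        # Static policy: optimize every subscriber during known busy hours
        # (the 18:00-23:00 window is an assumed example)
        return peak_start <= now <= peak_end

    def optimize_ran_probe(cell_load_pct: float,
                           threshold_pct: float = 80.0) -> bool:
        # RAN-probe policy: optimize only subscribers on cells that a
        # probe reports as congested (threshold is an assumed example)
        return cell_load_pct >= threshold_pct

    def optimize_experience(source_kbps: float,
                            subscriber_kbps: float) -> bool:
        # Per-subscriber policy: optimize whenever this user's measured
        # throughput cannot sustain the source bitrate, whatever the cause
        # (congestion, indoor signal blockage, cell edge, or a slow device)
        return subscriber_kbps < source_kbps

Only the last policy acts on the individual impairments discussed above, which is the distinction Experience Assurance draws.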
Conclusion – The Rocket Optimizer Difference

Skyfire has introduced a breakthrough approach to assuring a top-quality end-user quality of experience as part of its Rocket Optimizer system. Thanks to Rocket Optimizer's Experience Assurance, measurement and mitigation of bandwidth congestion in real time is now possible for virtually every video subscriber session on the operator network. Through key metrics that measure both video quality and playback quality, and that can juxtapose optimized and unoptimized outcomes, wireless operators can measure, quantify and instantly manage session-level quality of experience.

With Experience Assurance, the focus is on the individual, optimizing only as necessary to ensure that each individual is delivered a top-quality video viewing experience. Ensuring better QoE for users on crowded towers, inside buildings or at the edge of cells is essential for operators looking to address the challenges raised by the unprecedented over-the-top video traffic running across their networks. Skyfire's Rocket Optimizer answers this call in a flexible, lightweight and cost-effective manner that allows operators to chart their own course and win back a large measure of control, ensuring a first-rate quality of experience for every user on their network.

Related Content

For additional information on Skyfire's cloud-based solutions for wireless operators, please see the following URLs:

Skyfire.com: http://www.skyfire.com
Rocket Optimizer: http://www.skyfire.com/operator-solutions/rocket-optimizer
Skyfire Horizon browser extension platform: http://www.skyfire.com/operators-solutions/skyfire-horizon
About Skyfire

Skyfire, an Opera Software company, is dedicated to leveraging the power of cloud computing to radically improve the mobile Internet experience for both operators and their consumers. Skyfire's innovative, next-generation carrier cloud approach to mobile video and data optimization provides wireless operators with huge cost savings, elastic capacity, and the ability to surgically enhance quality of experience on a per-stream level. The company has also introduced the first mobile browser extension platform, enabling robust contextual and social browsing as well as enhanced monetization opportunities for operators. The company currently counts four of the largest mobile operators in the world as customers for its Rocket Optimizer™ and Skyfire Horizon™ solutions. Skyfire was founded in 2007 and is located in Mountain View, CA, in the heart of Silicon Valley.