Developing for Optimal VoIP Service Quality
Slide Notes
  • The VoIP customer base is growing rapidly; 2005 was a revolutionary year for VoIP growth. One of the leading barriers to VoIP growth is concern over voice call quality.
  • What do we measure? Not just MOS, but the service holistically.
  • Show how Voice Perspective measures the true end-to-end customer experience. We monitor service quality by waveform comparison: a standard sound clip is compared with the sound clip received over the voice path. Our solution does not derive voice service performance indicators from network parameters; that approach does not cover the last mile.
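The waveform-comparison idea in these notes can be illustrated with a minimal sketch (an illustration only, not Keynote's actual algorithm): align the received clip to the reference by cross-correlation, then score the aligned overlap with a normalized correlation coefficient, where 1.0 means the received audio matches the reference exactly.

```python
import numpy as np

def align_and_score(reference: np.ndarray, received: np.ndarray) -> float:
    """Align the received clip to the reference by cross-correlation,
    then return a normalized similarity score (1.0 = perfect match)."""
    # Lag at which the received signal best lines up with the reference.
    corr = np.correlate(received, reference, mode="full")
    lag = int(np.argmax(corr)) - (len(reference) - 1)
    # Trim both signals to their aligned overlap.
    ref = reference[max(-lag, 0):]
    rec = received[max(lag, 0):]
    n = min(len(ref), len(rec))
    ref, rec = ref[:n], rec[:n]
    # Normalized correlation coefficient of the aligned signals.
    denom = np.linalg.norm(ref) * np.linalg.norm(rec)
    return float(np.dot(ref, rec) / denom) if denom else 0.0

# Reference tone; the "received" copy is delayed by 100 samples and
# attenuated, as a real voice path might delay and attenuate the probe clip.
t = np.linspace(0, 1, 8000, endpoint=False)
ref = np.sin(2 * np.pi * 440 * t)
rec = np.concatenate([np.zeros(100), 0.5 * ref])[:8000]
print(round(align_and_score(ref, rec), 2))  # 1.0
```

A real measurement system maps such raw comparisons onto a perceptual scale like MOS; this sketch only shows the align-and-compare core that makes last-mile impairments visible.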

Transcript

  • 1.  
  • 2. Developing for Optimal VoIP Service Quality Understanding Quality From User’s Perspective Arun Bhardwaj Director, Business Development Keynote Systems, Inc. arun.bhardwaj@keynote.com
  • 3. Agenda
    • VoIP Market View
    • Unique Approach to Voice Service Quality Measurement
    • Comparative Analysis of Various Voice Technologies
    • Often Ignored Factors Affecting Quality
    • Service Quality Trends in the VoIP Industry
  • 4. Unique Nature of VoIP
  • 5. Voice Communication Landscape [diagram: PSTN users reach each other over PSTN and LDC networks; mobile users via BTS/BSC/MSC; and IP users (SoftPhone, VoIP phone, WiFi phone, PacketCable phone, POTS behind a gateway) connect over access networks and an IP network, with IP-PSTN gateways bridging the two worlds.]
  • 6. Why is Voice Quality Essential? VoIP Growth vs. Barriers to VoIP Adoption
    • Subscribers: 1.3 million (2004) to 4.5 million (2005)
    • Revenue: $200 million (2004) to $1 billion (2005)
    • US RBOCs are losing 150,000 subscriber lines per month
    • VoIP providers are gaining 100,000 subscribers per month
    • Wireless-only service is gaining 50,000 subscribers per month
    • VoIP will capture 22% of LECs' existing market; LECs will lose $18.2 billion between 2006 and 2010
  • 7.
    • Rank the relative performance of PSTN, PacketCable, VoIP hard phone, and VoIP soft phone service providers.
    • Identify industry trends in service level performance since the last Keynote study.
    • Identify the range of performance between the best voice service providers and the worst.
    • Examine peak and prime-time performance variations.
    • Identify the strengths and weaknesses of each service provider and voice service technology.
    Competitive Research Study Objectives
  • 8. Measurement Topology
  • 9. Monitoring Scope [diagram: PacketCable services, VoIP Hard Phone services, and VoIP Soft Phone services; some services were monitored from the New York location only, others from the San Francisco location only.]
  • 10. Audio Characteristics Analysis – Last Mile Impairments: Measuring Within the Network is Not Enough
    • Of 18,456 total calls, 571 had MOS < 3.0. Impairment counts (with each count's share of its group):

      Impairment           All Calls (18,456)    Calls with MOS < 3.0 (571)
      Holdover             1,099  (5.6%)            32  (5.6%)
      Front Clipping           5  (0.0%)             1  (0.2%)
      Other Clipping         607  (3.3%)           520  (91.1%)
      Back Clipping          422  (2.3%)           405  (70.9%)
      Static                 539  (2.9%)           407  (71.3%)
      Hum                      7  (0.0%)             0  (0.0%)
      Hiss                     0  (0.0%)             0  (0.0%)
      Frequency Clipping       0  (0.0%)             0  (0.0%)
  • 11. Keynote Voice Perspective Agent Technology
    • Caller and responder agents in New York and San Francisco, on Cable, DSL, and Sprint connections.
    • The caller agent initiates calls and requests an audio sample.
    • The responder agent accepts calls and sends the audio sample.
    • The caller agent compares the received audio sample against the reference sample.
  • 12. Measured Parameters – Voice Service Quality as a Holistic Customer Experience
    • Reliability: Service Availability; # Dropped Calls; Average Answer Time
    • Audio Clarity: Average Mean Opinion Score (MOS); % Calls > Acceptable MOS; MOS Geographic Variability
    • Responsiveness: Average Audio Delay; % Calls > Acceptable Delay; Audio Delay Geographic Variability
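As a sketch of how the indicators above roll up from individual calls, here is a toy aggregation over hypothetical call records (the record layout and the thresholds of MOS >= 4.0 and delay <= 150 ms are assumptions for illustration):

```python
from statistics import mean

# Hypothetical call records: (connected, mos, one_way_delay_ms)
calls = [
    (True, 4.2, 140), (True, 3.8, 180), (False, None, None),
    (True, 4.1, 120), (True, 2.9, 310),
]

connected = [c for c in calls if c[0]]
availability = len(connected) / len(calls)          # Service Availability
avg_mos = mean(c[1] for c in connected)             # Average MOS
pct_mos_ok = sum(c[1] >= 4.0 for c in connected) / len(connected)    # acceptable MOS
pct_delay_ok = sum(c[2] <= 150 for c in connected) / len(connected)  # acceptable delay

print(f"availability={availability:.0%} avg_mos={avg_mos:.2f} "
      f"mos_ok={pct_mos_ok:.0%} delay_ok={pct_delay_ok:.0%}")
# availability=80% avg_mos=3.75 mos_ok=50% delay_ok=50%
```

Note that clarity and responsiveness metrics are computed only over connected calls, while availability is computed over all attempts.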
  • 13. True End-to-End Monitoring Methodology [diagram: the voice path runs from the customer device (POTS, SoftPhone, VoIP phone) through the access network, IP-PSTN gateway, core network, and PSTN to the far-end PSTN user. Others measure only the core network; Keynote measures the full path the customer experiences.]
  • 14. Competitive Research Report VoIP Phone Digital Cable Phone Soft Phone PSTN
  • 15. Case Study: Invisible Annoyance
    • Customer problem: low MOS score for > 90% of calls.
    • Keynote analysis: analyzed the audio characteristics of all calls in the problem period using Voice Perspective; the silence-period frequency profile showed audible hum on 70% of the VoIP agents.
  • 16. Case Study: Invisible Annoyance – Diagnosis
    • The hum problem correlated strongly with the hardware ATA model type:

      VoIP Perspective Agent         ATA Model   % Calls with Hum
      New York AT&T                  Model B     96.9%
      New York Sprint                Model B     87.6%
      New York Time Warner Cable     Model B     97.6%
      New York UUNet                 Model B     97.7%
      New York Verizon DSL           Model B     92.3%
      San Francisco AT&T             Model A     0.0%
      San Francisco Comcast Cable    Model B     97.1%
      San Francisco SBC DSL          Model B     97.4%
      San Francisco Sprint           Model A     0.0%
      San Francisco UUNet            Model A     0.1%
  • 17. Case Study: Invisible Annoyance – Improvement
    • The problem was isolated to a specific telephone adapter model type.
    • After the adapters were replaced, the Audio Clarity ranking improved by two places.
    • Customer satisfaction increased: the Mean Opinion Score rose by 0.3.
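The "silence period frequency profile" check this case study relies on can be approximated with a simple spectral measurement. A hypothetical sketch (not Keynote's implementation): take a segment of the call that should be silent and measure how much of its energy sits near the mains-hum frequency (60 Hz in North America).

```python
import numpy as np

FS = 8000  # telephony sampling rate, Hz

def hum_fraction(silence: np.ndarray, hum_hz: float = 60.0) -> float:
    """Fraction of spectral energy within +/-5 Hz of the hum frequency,
    computed over a segment of the call that should be silent."""
    power = np.abs(np.fft.rfft(silence)) ** 2
    freqs = np.fft.rfftfreq(len(silence), d=1.0 / FS)
    band = (freqs >= hum_hz - 5) & (freqs <= hum_hz + 5)
    return float(power[band].sum() / power.sum())

rng = np.random.default_rng(0)
t = np.arange(FS) / FS                     # one second of "silence"
floor = 0.01 * rng.standard_normal(FS)     # low-level background noise
hum = 0.1 * np.sin(2 * np.pi * 60.0 * t)   # injected mains hum

print(hum_fraction(floor) < 0.1)        # clean silence: energy spread out
print(hum_fraction(floor + hum) > 0.9)  # hum dominates the silent period
```

Measuring during silence is what makes the hum "invisible" impairment detectable: during speech, the 60 Hz component is masked by the talker's energy.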
  • 18. Data Collection Period and Size
    • Data collected August 1–31, 2006.
    • Long-distance and local VoIP-to-PSTN calls placed between New York and San Francisco every 30 minutes.
    • Calls placed on every VoIP provider and network combination.
    • More than 125,000 phone calls placed during the one-month period.
  • 19. Study Results Overview
  • 20. Summary of Results
    • Key performance indicators such as Service Availability and Average MOS improved for most providers.
    • Average one-way audio delay between 150 and 250 ms [Best 62 ms; Worst 335 ms].
    • Average Mean Opinion Score range 3.0 to 4.0. [Best 4.24; Worst 2.64].
    • Calls on most providers exhibit clipping or audio holdover, causing service degradation.
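To see why the 62–335 ms delay spread above matters for MOS, the simplified ITU-T G.107 E-model can be sketched (the Cole–Rosenbluth approximation of the delay impairment Id, plus the standard R-to-MOS mapping). This is background theory for interpreting the numbers, not the study's own scoring method:

```python
def delay_impairment(d_ms: float) -> float:
    """Simplified E-model delay impairment Id (Cole-Rosenbluth form)."""
    extra = 0.11 * (d_ms - 177.3) if d_ms > 177.3 else 0.0
    return 0.024 * d_ms + extra

def r_to_mos(r: float) -> float:
    """Standard E-model mapping from R factor to an estimated MOS."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

# Delay alone (default R0 = 93.2; loss and codec impairments ignored):
for d_ms in (62, 150, 250, 335):  # best, budget edge, mid, worst from the study
    print(f"{d_ms} ms one-way -> MOS ~ {r_to_mos(93.2 - delay_impairment(d_ms)):.2f}")
```

The knee at 177.3 ms in Id is why delays beyond roughly 150–200 ms degrade perceived quality much faster than shorter ones.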
  • 21. Summary of Results
    • Prime-time versus non-prime-time performance:
      • Higher variation in audio delay than in Mean Opinion Score.
      • DSL connections offered less audio delay variance.
      • Cable modem connections delivered more consistent MOS.
  • 22. Reliability Overview – Service Types
    • PacketCable service providers are more reliable than PSTN, VoIP Hard Phone, and VoIP Soft Phone service providers.
  • 23. Audio Quality Overview – Service Types
    • PacketCable service providers had better overall audio quality than PSTN, VoIP Hard Phone, and VoIP Soft Phone service providers.
  • 24. Trends – Service Availability
    • Of the eleven service providers measured in previous studies, seven had a better Service Availability percentage in August 2006 than in any previous Keynote study.
  • 25. Trends – Average MOS
    • Of the eleven service providers measured in previous studies, seven had a higher Average MOS in August 2006 than in any previous Keynote study.
  • 26. Audio Delay and MOS Trends
  • 27. Performance Ranges – Audio Delay
    • Only four of the fourteen providers measured in August had an Average Audio Delay below 150 ms.
    • The best Average Audio Delay was 62 ms; the worst was 335 ms.
  • 28. Performance Ranges – Average MOS
    • Only four of the fourteen providers measured had an Average MOS above the 4.0 “toll quality” threshold.
    • The best Average MOS was a 4.24; the worst was a 2.64.
  • 29. Codecs Used
    • The most commonly used codec is ITU-T G.711 PCMU.
    • Every VoIP Hard Phone provider with an Average MOS over 4.0 used the ITU-T G.711 PCMU codec.
    • ITU-T G.721 and ITU-T G.729 are still in use by a few VoIP service providers.
    [Note: Codec used cannot be determined for PacketCable providers and some VoIP Soft Phone software clients with proprietary signaling protocols. There is no codec used in the customer premises equipment for analog PSTN service.]
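For reference, the G.711 PCMU (mu-law) companding named above is simple enough to sketch from the standard's segment-and-mantissa rules. This is an illustrative from-scratch implementation, not a production codec:

```python
BIAS, CLIP = 0x84, 32635  # G.711 mu-law bias and clipping level

def mulaw_encode(sample: int) -> int:
    """Encode a 16-bit linear PCM sample to one 8-bit mu-law byte."""
    sign = 0x80 if sample < 0 else 0x00
    magnitude = min(abs(sample), CLIP) + BIAS
    exponent = magnitude.bit_length() - 8          # segment number, 0..7
    mantissa = (magnitude >> (exponent + 3)) & 0x0F
    return ~(sign | (exponent << 4) | mantissa) & 0xFF  # bits sent inverted

def mulaw_decode(byte: int) -> int:
    """Expand one mu-law byte back to a 16-bit linear PCM sample."""
    byte = ~byte & 0xFF
    exponent = (byte >> 4) & 0x07
    mantissa = byte & 0x0F
    magnitude = (((mantissa << 3) + BIAS) << exponent) - BIAS
    return -magnitude if byte & 0x80 else magnitude

for s in (0, 1000, -5000, 32000):
    print(s, "->", mulaw_decode(mulaw_encode(s)))
# 0 -> 0, 1000 -> 988, -5000 -> -5116, 32000 -> 32124
```

Mu-law spends its 8 bits logarithmically, so quantization error grows with amplitude: near-silence samples round-trip almost exactly, while a full-scale sample may be off by a hundred or so counts.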
  • 30. Analog Telephone Adaptors and Software Clients
  • 31. Summary
  • 32. Industry Trends
    • Most providers measured in previous studies are improving their reliability.
    • PacketCable providers now exceed the overall reliability of PSTN service.
    • VoIP providers as an industry need to improve service availability.
    • As a whole, the industry standards in responsiveness and audio clarity continue to improve, and PacketCable service providers lead the other voice technologies.
  • 33. How to Improve VoIP Quality
      • Watch the competition – Ensure that your service not only performs well all the time, but also performs better than or at par with your competition.
      • Focus on end user experience – Measure VoIP performance as close as possible to the end-user experience. Actual waveform analysis of call audio brings the measurement perspective as close as possible to what your customers are experiencing.
      • Measure service holistically – Small things can ruin the best service experience. Focus on measuring every aspect of your VoIP call experience, and use the insight gained from the measurements to tune your network infrastructure to ensure few outages and excellent call quality.
  • 34. Public-Agent-Based Contact Center Monitoring [diagram: Keynote public caller agents in SF, CHI, NY, DAL, and FL place calls across the PSTN and VoIP networks, through an IP-PSTN gateway, to the ABC Enterprise contact center; a Keynote responder terminates VoIP or PSTN calls.]
  • 35. Appendix B: Measurement Technology
  • 36.
    • The Audio Quality index ranking is based on Keynote extensions of the Apdex * standard to represent user satisfaction with audio quality:
        • Mean Opinion Score (MOS) [T, F] = [4.0, 3.1] **
        • Audio Delay (ms) [T, F] = [150, 400] ***
    • Each call is determined to be in the Satisfied, Tolerating, or Frustrated performance ranges for MOS and audio delay, based upon industry standard thresholds.
    • * See http://www.apdex.org/
    • ** Thresholds based on Telecommunications Industry Association Technical Services Bulletin 116 “Voice Quality Recommendations for IP Telephony”.
    • *** Thresholds based on International Telecommunications Unions standard ITU-T G.114 “One-way transmission time”.
    Ranking Methodology – Audio Quality: Audio Quality Index = 1000 × (Satisfied count + Tolerating count / 2) / Total samples
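The ranking formula (1000 × (Satisfied + Tolerating / 2) / Total samples) can be applied directly. A small sketch using the slide's [T, F] thresholds and made-up per-call values:

```python
def apdex_1000(samples, t, f, higher_is_better=False):
    """Apdex-style index scaled to 1000:
    1000 * (Satisfied + Tolerating / 2) / Total samples."""
    if higher_is_better:          # e.g. MOS: bigger scores are better
        satisfied = sum(1 for s in samples if s >= t)
        frustrated = sum(1 for s in samples if s < f)
    else:                         # e.g. delay: smaller values are better
        satisfied = sum(1 for s in samples if s <= t)
        frustrated = sum(1 for s in samples if s > f)
    tolerating = len(samples) - satisfied - frustrated
    return 1000 * (satisfied + tolerating / 2) / len(samples)

mos_scores = [4.2, 3.9, 4.1, 2.8, 3.5]   # hypothetical per-call MOS
delays_ms = [120, 160, 90, 420, 300]     # hypothetical one-way delays (ms)

print(apdex_1000(mos_scores, 4.0, 3.1, higher_is_better=True))  # 600.0
print(apdex_1000(delays_ms, 150, 400))                          # 600.0
```

For MOS, higher is better (Satisfied at >= 4.0, Frustrated below 3.1); for delay, lower is better (Satisfied at <= 150 ms, Frustrated above 400 ms), which is why the function needs the direction flag.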