Slow Cool 20081009 Final



  1. Slow Cool, Ain't Cool. Hon Wong, Symphoniq Corporation
  2. Agenda
     • Ajax/Web 2.0 apps… new opportunities, new performance management challenges
     • Real-user approach needed
     • New approach in action
     • Q&A
     © 2008 Symphoniq Corporation
  3. The Evolving Web
  4. All in the Name of the End User: "The Next Big Killer App / Feature"
     • Rich Internet Applications (RIA) and ubiquitous computing and access
     • From brittle architectures to loosely coupled, dynamic architectures
     • From rigid taxonomies ("Semantic Web") to loose folksonomies
     • From web as information source, to web as information synthesis, to web as collaboration
  5. Challenges of the Rich User Experience
  6. Dealing with the Challenge (Web / App / DB / External App)
     Traditional tools and techniques generate oceans of data but no solutions:
     • Log file analysis
     • HTTP watch
     • Network sniffers
     • Load testers
     • Server monitors
     • Monitoring services
     • Many more…
  7. Reality Check
     "43% of all application outages aren't detected by the tools put in place to manage them, but by the end-users who are subjected to them."
     Dennis Drogseth, VP, Enterprise Management Associates
  8. And Even if a Problem Is Discovered…
     While most problems are solved in less than a day, 30% of problems take more than a day to solve.
     [Chart: distribution of time-to-resolution, in buckets from under 1 hour to more than 5 days. Source: Forrester Research]
  9. Performance Management: Practical Question #1
     "Living is a form of not being sure, not knowing what next or how… We guess. We may be wrong, but we take leap after leap in the dark." Agnes de Mille
     When user satisfaction has direct business impact, do you have the luxury of blindly assuming users are satisfied with application performance?
  10. Performance Management: Practical Question #2
      "I'm an ocean, because I'm really deep. If you search deep enough you can find rare exotic treasures." Christina Aguilera
      When business happens in Web time, do you have time to search oceans of performance data to pinpoint the cause of slowdowns?
  11. Performance Management: Practical Question #3
      "Strive for continuous improvement, instead of perfection." Kim Collins
      When complexity and high-speed change make perfection unattainable, do you have the actionable information required to drive performance improvements?
  12. Holistic Approach to Performance Management
      Real User Monitoring:
      • How can I avoid being blind-sided by performance issues?
      • Which users are being affected?
      • How can I troubleshoot specific user issues?
      Web App Performance:
      • Why is my application slow?
      • Which tier is causing the slowdown?
      • Is it inside or outside the data center?
      • How can I recreate or validate problems?
      Service Level Assurance:
      • What is the impact of performance problems on the business?
      • How do I link performance criteria to specific business units?
  13. Bottom-Line Impact of "Do Nothing"
      • Wasted resources: time and resources consumed trying to isolate problems, blame game, downtime triage, wasted IT budget, reduced employee productivity, distraction from core business
      • Slow = Off: customer abandonment, incomplete transactions, lost revenue, brand damage
      • Blindsided by performance problems: inadequate tools to detect and diagnose issues, compromised strategic initiatives
      • Costs 10x to fix the problem in production
  14. Why Monitor from the Real User Perspective?
      Calculating end-user response time is not practical…
      RT ≈ (Payload / Bandwidth) + (AppTurns × RTT) + Cs + Cc
      • RT: response time of the transaction, in seconds
      • Payload: the amount of information (bytes) that must be delivered to the user
      • Bandwidth: minimal bandwidth across all network links between the user and the data center
      • AppTurns: number of user and web site interactions needed to generate a user-level system response to a transaction
      • RTT: round-trip time (in seconds) between the user and the data center
      • Cs: total processing time required by the data center, consisting of web servers, application servers and database servers
      • Cc: total processing time required by the user's PC
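The approximation above can be turned into a quick back-of-envelope calculator. This is only a sketch; the sample values below are illustrative assumptions, not figures from the deck.

```python
def estimate_rt(payload_bytes, bandwidth_bytes_per_s, app_turns, rtt_s, cs_s, cc_s):
    """Back-of-envelope end-user response time, per the slide's formula:
    RT ≈ (Payload / Bandwidth) + (AppTurns × RTT) + Cs + Cc
    All times in seconds, payload and bandwidth in bytes and bytes/second."""
    return payload_bytes / bandwidth_bytes_per_s + app_turns * rtt_s + cs_s + cc_s

# Illustrative inputs: 500 KB payload over 1 Mbps (125,000 bytes/s),
# 20 app turns at 80 ms RTT, 0.5 s server processing, 0.3 s client processing.
rt = estimate_rt(500_000, 125_000, 20, 0.08, 0.5, 0.3)  # ≈ 6.4 seconds
```

The next slide explains why such a calculation breaks down in practice: nearly every input varies per user, per transaction, and per moment.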
  15. How Real-Time Apps Derail RT Calculations
      • Payload: varies greatly transaction to transaction; 3rd-party or cached content; non-page content like AJAX, Flash
      • Bandwidth: varies greatly from user to user; varies from moment to moment
      • AppTurns: varies greatly transaction to transaction; 3rd-party or cached content; non-page content like AJAX, Flash
      • RTT: varies from moment to moment
      • Cs: varies from transaction to transaction; dynamic data center (what "path" will the transaction take?); difficult to instrument applications, esp. 3rd-party code
      • Cc: varies from user to user and moment to moment; impacted by "last mile" conditions
  16. Methods of Measuring RT
      • Empirical: RT derived through measurement of surrogate parameters
      • Direct: measuring RT directly at the browser (installed agent or dynamic injection)
      • Passive: measuring RT by "listening in", not adding load (sniffer)
      • Active: measuring RT of artificially created transactions (synthetic monitoring)
  17. Direct Measurement at Browser: the Only Viable Approach for Ajax Apps
      • JavaScript that delivers Ajax features is executed on the client's machine
      • Non-page content
      • Last-mile connectivity impacts the end-user experience: chatty protocols, 3rd-party content delivery networks, client-side caching
      • Mash-ups, SaaS and 3rd-party content mask performance issues
  18. Installed vs. Dynamic Injection Approaches
      Installed agent:
      • Download a monitoring agent to PCs
      • Measures RT, errors and desktop perfmon statistics
      • Challenges: convincing users to download, maintaining agents, potential compatibility issues
      • Only suitable for PCs under IT's direct control, or for enterprise applications
      Dynamic agent:
      • Inject instrumentation onto the page via the web server or an App Delivery Controller
      • Non-intrusive: no agent download, no source-code changes
      • Measures RT and errors
      • Applicable to all customer-facing applications
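The dynamic-injection approach can be sketched as a web-tier response filter that adds a small timing script to every outgoing HTML page. The function name and script URL below are illustrative assumptions, not Symphoniq's actual mechanism.

```python
def inject_monitor(html: str, script_url: str = "/monitor/rum.js") -> str:
    """Sketch of web-tier dynamic injection: insert a monitoring <script>
    tag just before </head> so it loads before the page content renders.
    Non-HTML or already-instrumented responses pass through unchanged."""
    tag = f'<script src="{script_url}"></script>'
    if tag in html or "</head>" not in html:
        return html
    return html.replace("</head>", tag + "</head>", 1)

page = "<html><head><title>Demo</title></head><body>hi</body></html>"
instrumented = inject_monitor(page)
```

Because the injection happens at the web server or App Delivery Controller, no agent download and no application source-code change is needed, which is the core advantage the slide claims for this approach.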
  19. Beyond Monitoring: End-to-End Management
      [Diagram: browser content (HTML, AJAX, Flash, Silverlight) traced across Web, App, DB, SaaS and external tiers, with per-tier time detail rolled up to a total by a management server and database]
  20. Meaningful, Correlated and Actionable Data
      Everything measured from the real user's perspective:
      • RT (as experienced by the end user)
      • Affected party's IP address and URL
      • Network latency, parsing time
      • Objects per page, object response time
      • Error or abort rate
      Correlated across all tiers of network and infrastructure:
      • Base-page response time
      • Response time at the web, application and database tiers
      • Server responsible at each tier
      • Server parameters: CPU utilization, memory, I/O, etc.
      Insight into the application:
      • Web service calls
      • Method call tree
      • SQL queries
  21. Real-Time, End-User-Experience-Driven Problem Resolution
      • Detect the problem based on RT; assess impact and prioritize issues
      • Outside or inside the data center?
      • Outside: client or network? Identify the individual user or the individual IP
      • Inside: front or back end? Front end: which page, object, web service and server? Back end: which object and server? Trace the call stack: method call or SQL query?
      • Solve the problem
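The drill-down flow above can be sketched as a classifier over per-tier timings. The field names and the 2-second budget are illustrative assumptions.

```python
def triage(client_s, network_s, front_end_s, back_end_s, budget_s=2.0):
    """Mirror the slide's drill-down: first outside vs. inside the data
    center, then client vs. network (outside) or front vs. back end
    (inside), attributing the slowdown to the dominant contributor."""
    total = client_s + network_s + front_end_s + back_end_s
    if total <= budget_s:
        return "ok"                          # within the response-time budget
    outside = client_s + network_s           # outside the data center
    inside = front_end_s + back_end_s        # inside the data center
    if outside >= inside:
        return "client" if client_s >= network_s else "network"
    return "front_end" if front_end_s >= back_end_s else "back_end"
```

A real product would attribute time per page, object, web service, method call and SQL query; the sketch only shows the top-level branching logic.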
  22. Performance Measurement Based on Real Users
  23. Quick Triage
      • Directly relate real-user RT to IT issues: not impacted by infrastructure configuration; accommodates 3rd-party content, SOA, etc.
      • Focus resources on fixing the problem instead of reproducing the problem or pointing fingers
  24. Tuning Web App Performance Using Real Data
      • Development phase (requirements, design, build): discover and fix performance bottlenecks under load prior to rollout
      • Production phase (deploy, operate, optimize): real-time detection and mitigation of performance issues
  25. Requirements of a Comprehensive Tool
      Detect:
      • Provides visibility into browser-level performance, including RIAs
      • Detects performance problems in real time to minimize impact
      Isolate:
      • Isolates problems by tagging and tracing transactions through internal and 3rd-party J2EE and .NET services
      • Visibility into problem servers, services, method calls and SQL queries
      Optimize:
      • Reports on the business impact of performance problems
      • Optimizes application performance with historical trending and analysis
  26. Complexity Creates a Spectrum of User Experiences
      [Chart: histogram of response times (number of occurrences vs. response time) for HTML, AJAX and Flash/Silverlight content served across the web, app, DB and external-app tiers]
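A distribution like the one on this slide is better summarized by percentiles than by an average, since the slow tail is what drives abandonment. A minimal sketch, with illustrative sample data:

```python
def percentile(samples, p):
    """Nearest-rank percentile of a list of response times (seconds)."""
    s = sorted(samples)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

# Illustrative real-user response times: most users are fine,
# but the tail stretches out to several seconds.
rts = [0.4, 0.5, 0.6, 0.7, 0.9, 1.1, 1.4, 2.0, 3.5, 8.0]
median = percentile(rts, 50)   # the typical user's experience
p95 = percentile(rts, 95)      # the slow tail
```

Reporting both numbers, rather than one average, makes the "spectrum of user experiences" visible to whoever reads the report.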
  27. How to Report App Performance to Business Owners
      One approach: Application Performance Index (Apdex)
      • Standardized method for reporting application performance, as defined by an alliance of companies and users
      • Reduces the myriad of performance metrics to a 0-to-1 scale (0 = no user satisfied, 1 = all users satisfied)
      ApdexT = (num. satisfied users + ½ num. tolerating users) / total num. users
      (T is the target response-time threshold; users between T and F = 4T count as tolerating)
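The Apdex formula above can be computed directly: with target threshold T, samples at or under T count as satisfied and samples between T and 4T as tolerating. The sample response times below are illustrative.

```python
def apdex(samples, t):
    """Apdex_T = (satisfied + tolerating / 2) / total.
    satisfied: rt <= T; tolerating: T < rt <= 4T; the rest are frustrated."""
    satisfied = sum(1 for rt in samples if rt <= t)
    tolerating = sum(1 for rt in samples if t < rt <= 4 * t)
    return (satisfied + tolerating / 2) / len(samples)

# Five illustrative response times against a 1-second target:
# two satisfied, two tolerating, one frustrated -> (2 + 1) / 5 = 0.6
score = apdex([0.3, 0.8, 1.2, 2.5, 6.0], t=1.0)
```

The single 0-to-1 score is what makes the metric reportable to business owners who do not want raw response-time distributions.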
  28. Aligning App Performance to Business Goals
  29. Sample Apdex Report
  30. Contact Information
      THANK YOU!
      Hon Wong, Symphoniq Corporation, Palo Alto, CA
      Tel: (650) 213-8889
      e-mail:
      Web: