Performance Testing Mobile and Multi-Tier Applications

Invited Talk, Chicago Quality Assurance Association, Chicago, June 26, 2007. Overview of performance testing strategy for handheld devices and multi-tier systems.


Transcript

  • 1. Performance Testing Mobile and Multi-Tier Applications
    mVerify® "A Million Users in a Box"®
    Chicago Quality Assurance Association, June 26, 2007
    Robert V. Binder, mVerify Corporation
    Bob_Binder@mverify.com, 312 881-7337 x1001, www.mverify.com
  • 2. Goals of Performance Testing
    - Validate time requirements/expectations
    - Validate utilization requirements/expectations
    - Validate capacity requirements/expectations
    - Reveal load-related bugs
    - Prove compliance: SLAs, contracts, competitive rankings
    - Fire drill for recovery
    - Assess robustness to shocks
  • 3. Business Impact/ROI
    - In 2002, slow e-commerce downloads led to an estimated $25 billion in abandoned transactions
    - In 2005, a 15-minute Google outage was estimated to have cost at least $150,000 in lost ad revenue
    - Recent study: nearly two-thirds of mobile employees rank poor response time as a "significant" inhibitor to working remotely over a VPN
  • 4. Business Impact/ROI
    - Avoidable costs: lost revenue, lost user productivity, lost IT productivity, overtime payments, wasted goods, fines
    - Solution costs: software tools, hardware, staffing, services, training
    - Risk mitigation: enterprise demise, lawsuits, negative publicity, personnel morale
    [Chart: cost versus availability level, from one 9 through six 9s. Source: IBM, "Maximizing Web Site Availability," February 2002]
  • 5. Basic Objectives: RASP
    - Reliability: probability of a failure occurring within a certain period of time
    - Availability: percent achieved up-time, not including scheduled downtime
    - Scalability: the range of load over which an incremental input consumes the same resources
    - Performance: the rate at which work is done
  • 6. Reliability
    - Reliability: probability of non-failure
    - Measured over total operational hours or transactions
    - Covers the entire user population
    - Can be estimated during test, if tests are sufficiently realistic
  • 7. Availability: the "nines"
    Annual unscheduled downtime:
    - Six nines (99.9999%): 32 seconds
    - Five nines (99.999%): 5 minutes
    - Four nines (99.99%): 53 minutes
    - Three nines (99.9%): 8.8 hours
    - Two nines (99%): 87 hours (3.6 days)
    - One nine (90%): 876 hours (36 days)
    Availability = percent up-time
    MTTR: mean time to recover, repair, restart, ...
    Availability = 1 / (1 + MTTR × failure rate)
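The arithmetic behind the "nines" is easy to check. A minimal sketch, assuming the availability formula above with the failure rate expressed in failures per million hours; the function names and sample inputs are illustrative, not from the talk:

```python
def availability(failures_per_million_hours: float, mttr_hours: float) -> float:
    """Availability = 1 / (1 + MTTR x failure rate)."""
    failure_rate = failures_per_million_hours / 1_000_000  # failures per hour
    return 1.0 / (1.0 + mttr_hours * failure_rate)

def annual_downtime_seconds(avail: float) -> float:
    return (1.0 - avail) * 8766 * 3600  # 8766 = average hours per year

# Hypothetical device: 10 failures per million hours, 6-minute MTTR.
a = availability(10, 6 / 60)
print(f"availability {a:.7f}, ~{annual_downtime_seconds(a):.0f} s downtime/year")
# Prints roughly 0.9999990 and ~32 s/year, i.e. the six-nines row above.
```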
  • 8. Some Data Points
    Reliability (failures/million hours) and availability at a 6-minute MTTR:
    - NT 4.0 Desktop: 82,000 failures/million hours, availability 0.999000000
    - Windows 2K Server: 36,013 failures/million hours, availability 0.999640000
    - Common light bulb: 1,000 failures/million hours, availability 0.999990000
    - Stepstone OO Framework: 5 failures/million hours, availability 0.999999500
    - Tellabs Digital Cross Connect: 3 failures/million hours, availability 0.999999842
  • 9. Performance Metrics
    - Response time: round-trip time (average response, in seconds)
    - Throughput: aggregate transaction processing rate (transactions/second)
    - Utilization: average percent busy
    - Failure intensity
    - Recovery time
    [Chart: average response time (seconds) versus utilization, 0% to 100%]
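As a concrete illustration, here is a minimal sketch of how these metrics fall out of raw test measurements; the function and field names are hypothetical, not part of any tool named in the talk:

```python
import statistics

def summarize(latencies_s: list[float], window_s: float, busy_s: float) -> dict:
    """Reduce one measurement window to the metrics above.

    latencies_s: per-transaction round-trip times observed in the window
    window_s:    length of the measurement window, in seconds
    busy_s:      time the measured resource was busy during the window
    """
    return {
        "avg_response_s": statistics.fmean(latencies_s),
        "p95_response_s": statistics.quantiles(latencies_s, n=20)[18],
        "throughput_tps": len(latencies_s) / window_s,
        "utilization_pct": 100.0 * busy_s / window_s,
    }

print(summarize([0.12, 0.15, 0.11, 0.40, 0.13], window_s=1.0, busy_s=0.55))
```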
  • 10. Strategies
    Performance testing:
    - Assess compliance with performance goals
    - Assess compliance with resource utilization goals
    - Provides data to estimate reliability and availability
    Stress testing, load testing:
    - Assess response to overload scenarios
    - Assess recovery from failure modes
  • 11. Strategies
    - Benchmarks: assess throughput for an open standard test suite
    - Scalability: assess performance linearity
    - Profiling: identify utilization bottlenecks by component
  • 12. Typical Server-Side Setup
    [Diagram: several emulated clients on an internal LAN driving the server(s) under test]
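A minimal sketch of one emulated client in this setup, assuming the system under test exposes an HTTP endpoint; the URL, client count, and duration are illustrative:

```python
import threading
import time
import urllib.request

TARGET = "http://sut.example.lan/ping"  # hypothetical endpoint on the test LAN
CLIENTS, DURATION_S = 10, 30
latencies: list[float] = []
lock = threading.Lock()

def emulated_client(stop_at: float) -> None:
    """Issue requests in a loop and record round-trip times."""
    while time.monotonic() < stop_at:
        start = time.monotonic()
        try:
            urllib.request.urlopen(TARGET, timeout=5).read()
        except OSError:
            continue  # a real harness would count failures separately
        with lock:
            latencies.append(time.monotonic() - start)

stop_at = time.monotonic() + DURATION_S
threads = [threading.Thread(target=emulated_client, args=(stop_at,))
           for _ in range(CLIENTS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
if latencies:
    print(f"{len(latencies)} requests, avg {sum(latencies) / len(latencies):.3f} s")
```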
  • 13. Issues
    Client emulation machines:
    - Synchronization, overall test execution
    - Capacity
    - Multi-homed, with a test control subnet
    Test vs. production systems:
    - Separate server farm?
    - Network contention
    - Isolation versus scale/scope
    Version/configuration control:
    - System under test
    - Test environment
    Database set/reset
  • 14. Issues
    Actual end-user/customer experience?
    - Network latency, QoS, ...
    - Thin clients?
    - Browser, client software versions
    - Client OS?
  • 15. Edge Monitoring
    [Diagram: emulated clients on the internal LAN, plus monitored clients reaching the server(s) under test across the Internet]
  • 16. Issues
    Client monitoring machines:
    - Synchronization: achieving the desired test input at the desired time
    - Capacity
    - Data collection
    - Availability
    - Security (beta test agreement?)
    Network configuration:
    - DMZ
    - Equipment, setup, security considerations
  • 17. Connectivity: a Wild Card
    [Diagram: the edge-monitoring setup, with the Internet path introducing random latency, jitter, lost packets, re-ordered packets, re-routed packets, duplicate packets, bandwidth restrictions, bit errors, background load, QoS effects, and operational events]
  • 18. Controlled Connectivity
    [Diagram: a network emulator inserted between the emulated/monitored clients and the server(s) under test, applying controlled latency, jitter, lost packets, re-ordered packets, re-routed packets, duplicate packets, bandwidth restrictions, bit errors, background load, QoS effects, and operational events]
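Where a dedicated network emulator is not available, Linux's built-in netem queueing discipline can impose similar controlled impairments. A minimal sketch, assuming a Linux host routes the test traffic; the interface name and impairment values are illustrative, and the tc commands require root:

```python
import subprocess

def impair(dev: str = "eth0") -> None:
    """Apply controlled impairments: 100 ms +/- 20 ms delay, 1% loss,
    0.5% duplication, 25% reordering (all values illustrative)."""
    subprocess.run(
        ["tc", "qdisc", "add", "dev", dev, "root", "netem",
         "delay", "100ms", "20ms",
         "loss", "1%", "duplicate", "0.5%", "reorder", "25%"],
        check=True,
    )

def restore(dev: str = "eth0") -> None:
    """Remove the impairments after the test run."""
    subprocess.run(["tc", "qdisc", "del", "dev", dev, "root", "netem"],
                   check=True)
```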
  • 19. Issues
    Complexity:
    - Impairment modeling
    - Impairment emulator programming
    - Coordination with emulated clients
    - Coordination with monitored clients
    Specialized skills:
    - Wireshark (formerly Ethereal)
    - TCP log analysis
  • 20. How to Maximize Reliability
    Combine realistic functional and load testing:
    - Representative variation in load and usage
    - Supports reliability/availability estimation
    - Saves time: more test goals supported with fewer tests
    - Typically effective in finding "weird" bugs
    Security? Add abuse cases to the usage profile and interleave them with normal traffic
    "You play like you practice"
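One way to realize such a usage profile is to draw each simulated operation from a weighted distribution, with abuse cases carrying small weights so they interleave naturally with normal traffic. A minimal sketch; the operation names and weights are illustrative:

```python
import random

USAGE_PROFILE = {            # relative frequency of each user operation
    "browse_catalog": 55,
    "search": 25,
    "add_to_cart": 12,
    "checkout": 5,
    "oversized_input": 2,    # abuse case mixed into normal traffic
    "malformed_request": 1,  # abuse case
}

def next_operation(rng: random.Random) -> str:
    """Pick the next simulated operation according to the profile."""
    ops, weights = zip(*USAGE_PROFILE.items())
    return rng.choices(ops, weights=weights, k=1)[0]

rng = random.Random(42)  # seeded so a failing run can be replayed exactly
print([next_operation(rng) for _ in range(10)])
```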
  • 21. Use Dynamic Loading
    The real world isn't flat: vary the behavior rate for each actor/actor group
    Load shapes: arc, flat, Internet fractal, negative ramp, positive ramp, random, spikes, square wave, waves
    [Chart: an actual "waves" loading profile, events per second versus time in seconds]
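A minimal sketch of generating one such shape: a "waves" profile produced by modulating Poisson arrivals with a sinusoidal rate, using the standard thinning technique. All constants are illustrative:

```python
import math
import random

BASE_TPS, SWING, PERIOD_S = 100.0, 60.0, 3600.0  # illustrative constants

def wave_rate(t: float) -> float:
    """Target events per second at time t: a wave around a base rate."""
    return BASE_TPS + SWING * math.sin(2 * math.pi * t / PERIOD_S)

def arrival_times(duration_s: float, rng: random.Random):
    """Non-homogeneous Poisson arrivals via thinning against the peak rate."""
    peak = BASE_TPS + SWING  # upper bound on wave_rate
    t = 0.0
    while True:
        t += rng.expovariate(peak)      # candidate arrival at the peak rate
        if t >= duration_s:
            return
        if rng.random() < wave_rate(t) / peak:
            yield t                     # accept with probability rate/peak

rng = random.Random(7)
events = list(arrival_times(60.0, rng))
print(f"{len(events)} events in 60 s ({len(events) / 60:.1f}/s average)")
```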
  • 22. Case Study
    [Architecture diagram. Named components: event simulator, script writer, test object serializer, Java DB, SilkTest GUI driver, Java API driver, 3270 mainframe formatter and driver, TX servers, test oracle, test run comparator, test run reports; a mix of custom test components and 3rd-party products surrounding the system under test.]
  • 23. Case Study
    Every test run unique and realistic:
    - Simulated user behavior to generate transactions
    - Automatically submitted in real time
    - ~100,000 test cases per hour
    - ~200 complete daily cycles
    - Evaluated functionality and performance
    Controlled distributed heterogeneous test agents (Java, 4Test, Perl, SQL, Prolog) driving a Java/CORBA GUI/API
    Five-person team, huge productivity increase
    Achieved proven high reliability:
    - Last pre-release test run: ~500,000 events in two hours, no failures detected
    - No production failures
  • 24. Notes
    Capture/replay scripts:
    - Static think time can distort load and response time
    Performance analysis:
    - Neil Gunther's books and web site
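To illustrate the think-time point: a replayed capture waits a fixed interval between steps, which synchronizes the virtual users into artificial request waves; drawing think times from a distribution with the same mean keeps the users staggered. A minimal sketch, with an illustrative distribution and parameters:

```python
import random

def static_think_time() -> float:
    """What a capture/replay script does: every user waits exactly 5 s."""
    return 5.0

def randomized_think_time(rng: random.Random, mean_s: float = 5.0) -> float:
    """Same mean, but exponentially distributed, so users stay staggered."""
    return rng.expovariate(1.0 / mean_s)

rng = random.Random(1)
print([round(randomized_think_time(rng), 1) for _ in range(5)])
```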
  • 25. Tools
    Open source:
    - openSTA
    - PushToTest
    - Grinder
    - http://opensourcetesting.org/performance.php
    Scripting systems: Tcl, Perl, Ruby, Python
    Built in:
    - Windows Perfmon
    - *nix: SNMP, others
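For the built-in Windows counters, Perfmon data can also be captured from a script through the typeperf command-line tool, which makes it easy to correlate server utilization with the load generator's response-time logs. A minimal sketch; the counter paths and sampling parameters are illustrative:

```python
import subprocess

COUNTERS = [
    r"\Processor(_Total)\% Processor Time",
    r"\Memory\Available MBytes",
]

# 12 samples at 5-second intervals, written to CSV for later correlation
# with response-time logs ("-y" overwrites an existing output file).
subprocess.run(
    ["typeperf", *COUNTERS, "-si", "5", "-sc", "12", "-o", "perf.csv", "-y"],
    check=True,
)
```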
  • 26. mVerify Testing System
    End to end, edge to core
    Integrated functional and performance testing:
    - Test objects
    - XML performance measurements
    Adapters for Windows Mobile, Web Services, ODBC, and the *nix command line
    Forthcoming:
    - Profile-based test generation
    - Adapters and plug-ins for many other platforms
  • 27. MTS/RPM
    [Architecture diagram: an MTS console on a console host drives MTS test agents on agent hosts; MTS remote agents with RPM client and server plug-ins run on the hosts under test; the console collects test run reports.]
    A host under test may be a cell phone, PDA, desktop, server, embedded processor, network equipment, an access point, or a base station.
  • 28. MTS/RPM Integrates Functional and Performance Test
    - MTS 1.5: RPM plug-in for Windows Mobile
    - Plug-ins for Win32 and *nix coming soon
  • 29. Q&A
    © 2007 mVerify Corporation, www.mverify.com
