Loadtesting Web(GUI) sites:
    why, what and how


    Mark Rotteveel (Pluton IT)
Who am I?
• Mark Rotteveel, born in 1979
• In IT since 1999; working as a tester since 2006
• Joined Pluton in 2008
• Loadtesting CRM systems at T-Systems / T-Mobile NL
• Hobbies: reading, working with Firebird and more
Overview
• Introduction testing
• Why : reasons for testing
     – Verifying performance requirements
     – Resource planning
• What : targets for testing
     – Designing tests
     – Metrics to use
• How : methods of testing
     –   Tools
     –   Recording / writing tests
     –   Execution
     –   Results
Introduction testing


• Finding errors
    – Verifying software matches requirements
    – Validating software meets user needs
    – Other errors
• Mitigating (business) risks
• Minimizing cost
Loadtesting


•   Verifying performance requirements
•   Checking behavior under load
•   Resource planning
•   Representative load (e.g. normal, peak, max
    transaction rate with 'normal' resource use)
Stress testing


• Finding limits of software (or hardware)
• Behavior beyond expected load or with limited
  resources
    – Bugs under high load / constrained resources
    – Error handling / graceful degradation or failure
• Exaggerated load (e.g. 2× peak, or until the
  application breaks)
Why : Reasons for testing
Verifying requirements (1)


• Requirements state objectives for the
  application (and the test)
    – Response time
    – Load to handle
    – Etc
• Test to verify application meets those
  requirements!
Verifying requirements (2)

Target metrics (example)
• Response time
• Transactions per second
• Concurrent users

Source
• List of non-functional requirements
• Usage model
    – Operational profile
    – Load profile
Requirements : Example


• Scenario : login
    – Process 5 logins per minute
    – 500 ms response time target
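The requirement above can be turned into concrete load-model numbers before any tool is involved. A minimal sketch, using Little's Law to estimate the number of virtual users needed; the 11.5 s think time is an assumed value for illustration, not part of the requirement:

```python
def pacing_seconds(transactions_per_minute: float) -> float:
    """Interval between transaction starts needed to hit the target rate."""
    return 60.0 / transactions_per_minute

def required_users(tx_per_sec: float, response_s: float, think_s: float) -> float:
    """Little's Law: concurrent users = arrival rate * time per iteration."""
    return tx_per_sec * (response_s + think_s)

# 5 logins per minute -> one login started every 12 seconds
print(pacing_seconds(5))                   # 12.0

# 5/60 logins per second, 0.5 s response + 11.5 s think time (assumed)
print(required_users(5 / 60, 0.5, 11.5))   # 1.0 -> a single virtual user suffices
```

With a harder target (say 300 logins per minute) the same two functions immediately show how many virtual users the scenario needs.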
Resource planning (1)


• Resource planning: knowing when to upgrade
  your hardware or add new servers
• Test to check or refine your upgrade planning,
  or to make a contingency plan for an
  unexpected increase in traffic
Resource planning (2)

What
• Current usage: required resources
• Expected growth
• Determine triggers / warnings on resource usage

Why
• Plan ahead
• Contingency plan for unexpected growth
• Resource impact of new version
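The trigger/warning idea above amounts to comparing measured usage against agreed thresholds. A sketch; the resource names and threshold values are illustrative assumptions, not figures from the talk:

```python
def resource_warnings(usage: dict, thresholds: dict) -> list:
    """Return the resources whose usage meets or exceeds its warning threshold."""
    return [name for name, value in usage.items()
            if value >= thresholds.get(name, float("inf"))]

# Fractions of capacity currently in use (illustrative numbers)
usage = {"cpu": 0.85, "memory": 0.60, "bandwidth": 0.92}
thresholds = {"cpu": 0.80, "memory": 0.75, "bandwidth": 0.90}

print(resource_warnings(usage, thresholds))  # ['cpu', 'bandwidth']
```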
What : Targets of testing
Designing tests (1)


• Select requirements to test
• Select (or write) use cases
• Determine user count or load target
• Amount of 'thinktime'
Designing tests (2)


• Order tests by business risk or importance
• Most important tests first!
• Test individual and combined
• Aggregate test cases
• Realistic load for combined scenario
Test : Example


• Scenario: login
    – Access main page
    – Enter login information
    – Click login
• Target: 5 logins per minute
• Determine response time (should be < 500 ms)
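The response-time check in this example can be sketched independently of any tool: time the step, then assert against the target. Here `fake_login` is a stub standing in for the real WebGUI login request:

```python
import time

def timed(step):
    """Run one test step and return (result, elapsed milliseconds)."""
    start = time.perf_counter()
    result = step()
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms

def fake_login():
    """Stand-in stub for the real HTTP login request (assumption)."""
    time.sleep(0.05)  # pretend the server answers in roughly 50 ms
    return "Welcome"

response, elapsed_ms = timed(fake_login)
assert response == "Welcome"
assert elapsed_ms < 500, f"login took {elapsed_ms:.0f} ms, target is 500 ms"
```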
Metrics (1)


Metrics are measurements or performance indicators
used to monitor the test and to determine its
success or failure.
Metrics (2)


• Select metrics for the requirements
• Too many metrics is better than too few
• But more measurements influence the test
  (e.g. additional load)
Metrics (3): examples


• Resource planning: memory, CPU, bandwidth
• User experience: response time, latency,
  throughput
• Correctness: % failed tests
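These example metrics can be computed from raw samples once the test has run. A sketch aggregating (elapsed ms, success) pairs into average, 90th percentile, and failure rate; the percentile formula is deliberately simple and not the exact algorithm any particular tool uses:

```python
def summarize(samples):
    """Aggregate (elapsed_ms, ok) samples into summary metrics."""
    times = sorted(ms for ms, _ in samples)
    n = len(times)
    p90 = times[int(0.9 * (n - 1))]  # simple index-based percentile (sketch)
    failed = sum(1 for _, ok in samples if not ok)
    return {
        "avg_ms": sum(times) / n,
        "p90_ms": p90,
        "fail_pct": 100.0 * failed / n,
    }

samples = [(120, True), (200, True), (180, True), (90, True),
           (450, True), (300, True), (150, True), (210, True),
           (170, True), (800, False)]
print(summarize(samples))  # {'avg_ms': 267.0, 'p90_ms': 450, 'fail_pct': 10.0}
```

Note how one slow failed sample drags the average up; this is why percentiles are usually a better target metric than the mean.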
How : Methods for testing
Testing needs


•   Tools : open source or proprietary
•   Recording / writing actual test
•   Preparation
•   Execution
•   Reporting on results
Tools


• Open source tools: JMeter, The Grinder (and
  more)
• Proprietary tools: HP LoadRunner,
  SilkPerformer (and more)
• Select the tool that matches your budget and
  needs
Open source vs proprietary

Open source
• Cheap
• Extendable (not always easy)

Proprietary
• Support, documentation
• More functionality
Recording / writing tests
• Translate test design to executable test
• Record and playback vs 'handwritten'
• Parametrize user data
    – Scenario 'consumables' (e.g. usernames,
      order information)
    – Dynamic session data (e.g. cookies)
• Add assertions for expected results
• Verify playback (both single and multiple user)
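The parametrization and assertion steps above can be sketched outside any particular tool. Usernames come from a shared pool of scenario 'consumables' so no two iterations send the same data, and an assertion checks for expected text in the response; `login` is a stub standing in for the real HTTP request:

```python
from collections import deque

# Scenario 'consumables': each value may be used only once
usernames = deque(["alice", "bob", "carol"])

def login(username):
    """Stub for the real HTTP login request (assumption for illustration)."""
    return f"Welcome, {username}"

def run_scenario():
    user = usernames.popleft()   # each iteration gets unique data
    response = login(user)
    # Assertion on expected result, not just on "no HTTP error"
    assert f"Welcome, {user}" in response, f"unexpected response: {response!r}"
    return user

ran = [run_scenario() for _ in range(3)]
print(ran)  # ['alice', 'bob', 'carol'] -- every consumable used exactly once
```

In JMeter the same ideas map onto a CSV Data Set Config for the consumables and a Response Assertion for the expected text.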
Using JMeter


• Simple example: recording a login into WebGUI
Preparation for execution


• Configure scenario (number of users,
  'thinktime', ramp-up, duration)
• Combine scenarios for 'realistic' load
• Set up monitoring of the target application(s) /
  server(s)
• Collect data for scenario 'consumables'
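The ramp-up setting above determines when each virtual user starts. A sketch of the common interpretation (start times spread evenly over the ramp-up period, roughly as JMeter's Thread Group does):

```python
def ramp_up_offsets(users: int, ramp_up_s: float) -> list:
    """Start delay for each virtual user, spread evenly over the ramp-up."""
    step = ramp_up_s / users
    return [round(i * step, 3) for i in range(users)]

# 5 users ramped up over 60 seconds -> one new user every 12 seconds
print(ramp_up_offsets(5, 60))  # [0.0, 12.0, 24.0, 36.0, 48.0]
```

Starting all users at once instead would hammer the target with an unrealistic burst, which is exactly what the ramp-up is there to avoid.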
Execution


•   Start the test
•   Keep an eye on metrics
•   Manually check the application during test
•   Verify that the loadtest server(s) are not overloaded
Results (1)


• Correlate metrics :
  dependencies and
  bottlenecks
• Draw conclusions
Results (2)



• Verify requirements
  ('did we reach our
  targets?')
Results (3)


• Communicate results and findings to
  development, management
• Record lessons learned for future reference
• Preserve testware for future use
Pitfalls (1)
• Using one loadserver: the bottleneck becomes
  the testserver itself
• Flooding the target machine
• Not parametrizing data (caching effects, errors
  due to duplicate or incorrect data)
Pitfalls (2)


• Loadtest software is not a browser (usually no
  or only limited JavaScript support!)
• Not checking for expected data (or the absence
  of errors) in the server's response
• Test environment not production-like enough
  (makes it impossible to predict behavior in
  production)
WebGUI specifics
• Manage session cookies (e.g. use the HTTP
  Cookie Manager in JMeter)
• WebGUI's parameter separator (';') is not
  supported by all software; many tools only
  handle the standard '&' (minor problems when
  recording; WebGUI does support '&' on playback)
• JMeter bug: the recording proxy sometimes
  incorrectly encodes GET requests (stop and
  restart the proxy before recording the next step)
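The ';' separator issue can be reproduced with Python's own query-string parser, which, like most loadtest tools, assumes '&' unless told otherwise; the parameter names here are made up for illustration:

```python
from urllib.parse import parse_qs

query = "func=setLanguage;lang=en"  # WebGUI-style ';' separator (illustrative names)

# Default parsing assumes '&', so the ';' pair is not split at all
print(parse_qs(query))                 # {'func': ['setLanguage;lang=en']}

# Telling the parser about ';' recovers both parameters
print(parse_qs(query, separator=";"))  # {'func': ['setLanguage'], 'lang': ['en']}
```

A tool that silently does the first parse will replay broken parameters, which is why recording against WebGUI needs extra attention.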
Questions?

Loadtesting wuc2009v2
