EuroSTAR Software Testing Conference 2012 presentation on Performance Testing of a Road Tolling System by Siegfried Goeschl. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
14. Project Overview
• Vehicles pass through a tolling point
• Each vehicle is identified either by its license plate number (via OCR) or by a tag
• Vehicle passages are transmitted to the Open Road Tolling Back Office (ORTBO)
• 90% are video passages with three images
• 10% are tag passages
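As a rough illustration of the load model these bullets imply (not the project's actual message format; all class and field names below are hypothetical), a generator for the 90% video / 10% tag passage mix could look like this:

    import java.util.UUID;
    import java.util.concurrent.ThreadLocalRandom;

    /** Hypothetical model of a single passage record used by a load generator. */
    class VehiclePassage {
        enum Type { VIDEO, TAG }

        final String passageId = UUID.randomUUID().toString();
        final Type type;
        final int imageCount;   // video passages carry three images, tag passages none

        VehiclePassage(Type type) {
            this.type = type;
            this.imageCount = (type == Type.VIDEO) ? 3 : 0;
        }
    }

    /** Produces the 90% video / 10% tag mix described above. */
    class PassageGenerator {
        VehiclePassage next() {
            boolean video = ThreadLocalRandom.current().nextInt(100) < 90;
            return new VehiclePassage(video ? VehiclePassage.Type.VIDEO : VehiclePassage.Type.TAG);
        }
    }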
15. Project Overview
• Vehicle passages are collected and assigned
to road-user accounts in the Transaction
Clearing House (TCH)
• The TCH is responsible for billing the customer
• Passages of unregistered or black-listed
cars are processed by the Violation
Processing Center (VPC)
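The routing rule behind these bullets can be summarised in a few lines; the sketch below is illustrative only, and the types and method names are not the project's API:

    /** Illustrative routing of a processed passage, based on the bullets above. */
    class PassageRouter {
        enum Destination { TCH_BILLING, VPC }

        Destination route(boolean registered, boolean blackListed) {
            // Unregistered or black-listed vehicles go to the Violation Processing Center;
            // everything else is assigned to a road-user account and billed by the TCH.
            if (!registered || blackListed) {
                return Destination.VPC;
            }
            return Destination.TCH_BILLING;
        }
    }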
16. Project Overview
• The overall system contains multiple Oracle RACs and MSSQL clusters
• Interfaces between the subsystems are mostly web services
• Backup data center replicates the “Central
Operation Center” over dark fiber
17. Schedule Phase One
• Performance testing of CRM Server
simulating 771 concurrent & active CRM
client sessions
• Performance testing of public-facing web
portal simulating 400 concurrent & active
web user sessions
• On success, the CRM subsystem goes live with real user accounts
18. Schedule Phase Two
• Simulating up to 162 vehicle passages per second
• Validating the end-to-end processing under
load (passage processing, customer billing
and violation processing)
• On success the overall system goes live
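To put the target in perspective: sustaining 162 passages per second over the 24-hour endurance test amounts to roughly 14 million passages (162 × 3,600 × 24 = 13,996,800), each of which has to flow through passage processing, customer billing and violation processing.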
19. Performance Test Scope
• End-to-end performance acceptance test
• Mandatory and client-witnessed test
• Strict acceptance criteria regarding
‣ throughput
‣ number and severity of errors
‣ validation of test runs
20. Performance Test Types
Baseline Test: one hour at average load
Stress Test: one hour at peak load
Endurance Test: 24 hours at maximum load
21. Performance Test Hardware
• Six dedicated load injectors
‣ Windows 2008 Server
‣ 3 physical and 3 virtual boxes
‣ Dual quad-cores with 8 GB RAM
• No direct access from outside the data center
‣ Citrix over RDP over RDP
22. Performance Test Tools
• All performance test tools have issues
‣ Make sure that they work for you
• We prefer JVM-based test tools
‣ Easy to migrate between different OS
‣ We are Java developers
• We prefer command-line invocation
‣ RDP over a slow network is really slow
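Since the tool chain is JVM-based and driven from the command line, a run can be scripted with a small Java launcher like the sketch below; only the standard JMeter non-GUI flags (-n, -t, -l) are assumed, and the test-plan and result-file names are placeholders:

    import java.io.IOException;

    /** Minimal sketch: launch a JMeter test plan in non-GUI mode. */
    public class RunJMeter {
        public static void main(String[] args) throws IOException, InterruptedException {
            // -n = non-GUI mode, -t = test plan, -l = result (JTL) file; file names are placeholders
            Process jmeter = new ProcessBuilder(
                    "jmeter", "-n",
                    "-t", "baseline-test.jmx",
                    "-l", "baseline-results.jtl")
                    .inheritIO()
                    .start();
            int exitCode = jmeter.waitFor();
            System.out.println("JMeter finished with exit code " + exitCode);
        }
    }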
23. Performance Test Tools
• We use a non-distributed installation
‣ Complex network and routing
‣ Firewall not under our control
• Check the license and price tags
‣ Virtual users can be expensive
‣ Connectivity to license server?
24. The Test Tools We Used
Subsystem                      Interface     Planned    Delivered
CRM Server                     WebService
Public Web Portal              HTML
Vehicle Passage Processing     WebService
25. Apache JMeter
• Open Source and free
• Implemented a new reporting backend to cope with huge result files
• Rock solid - only minor issues
• Extensible using scripting and Java libraries
• Rough GUI compared to commercial tools
• Unsuitable for complex ASP.NET web sites
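For reference, one standard way to extend JMeter with Java code is a custom Java Sampler (this is JMeter's generic extension point, not the custom reporting backend mentioned above); the sampled call below is a placeholder:

    import org.apache.jmeter.config.Arguments;
    import org.apache.jmeter.protocol.java.sampler.AbstractJavaSamplerClient;
    import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext;
    import org.apache.jmeter.samplers.SampleResult;

    /** Minimal JMeter Java Sampler; the actual request to the system under test is omitted. */
    public class PassageSampler extends AbstractJavaSamplerClient {

        @Override
        public Arguments getDefaultParameters() {
            Arguments args = new Arguments();
            args.addArgument("endpoint", "http://example.invalid/passages"); // placeholder
            return args;
        }

        @Override
        public SampleResult runTest(JavaSamplerContext context) {
            String endpoint = context.getParameter("endpoint");
            SampleResult result = new SampleResult();
            result.sampleStart();
            try {
                // Call the system under test here, e.g. submit one vehicle passage to 'endpoint'.
                result.setSuccessful(true);
                result.setResponseCodeOK();
            } catch (Exception e) {
                result.setSuccessful(false);
                result.setResponseMessage(e.toString());
            } finally {
                result.sampleEnd();
            }
            return result;
        }
    }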
26. WAPT Pro
• Excellent value for money
• Good support
• The ASP.NET module saved the day
• Extensible through JavaScript
• One major issue with generation of test
values under high load (requires restart)
27. soapUI Pro
• Excellent for prototyping and functional
testing of web services
• WebService mocking saved the day
• Extensible using Groovy and Java libraries
• Various issues during load testing
• soapUI was replaced with JMeter for
performance tests
28. Some More Thoughts
• No software installation
• Take two (or more)
• Automation is your friend
• Performance test for everyone
• Performance tests are a valuable asset
29. No Software Installation
• Portable Apps on USB Stick
‣ Run tests directly from USB Stick
‣ Clone USB Stick to hard disk
• Have all your tools on the USB stick
‣ Java, JMeter, Editor, Git, ...
30. Take Two (or more)
• A minimum of two share-nothing load injectors is required to prove that the production servers (and not your test tool) are causing the performance bottleneck
• We used a maximum of five share-nothing
load injectors simultaneously
31. Automation Is Your Friend
• Performance tests were mostly run at night and over the weekend
• Test execution is managed by Hudson
• Test failures trigger an email notification
• Test protocols are copied to an FTP server
32. Performance Tests for Everyone
• All test scenarios are configured as Hudson
jobs
‣ Baseline, stress & endurance test
‣ Some other internal tests
• Everyone can start and monitor a performance test scenario from a web browser
• Even developers can run performance tests
33. Performance Tests as Asset
• Performance tests can be used in creative and unplanned ways
‣ Smoke test for new deployment
‣ Testing database failover
‣ Testing data center replication & failover
‣ Testing different database setups
34. The Few Things You Should Take Home
35. Lessons Learned
• Test tools are buggy
• Client-witnessed tests are hard
• Performance tests are a valuable asset
• Creation and maintenance of a complex performance test suite is a project of its own