When modernizing systems, software is migrated from one platform to another. This entails significant risks concerning the performance of the system on the new platform: the new system cannot take longer to perform the same operations than the previous one, or users will reject it. Preventive performance testing is therefore crucial to guarantee the success of a modernization project. However, the automation tasks for performance testing are highly demanding in time and effort, because the tools work at the communication-protocol level. Functional test automation, though not free, is easier to accomplish than performance test automation because the tools work at the graphical-user-interface level; they are therefore more intuitive and have to handle fewer variables and technical issues. In this article we present a tool, developed for industrial use, that automatically generates performance test scripts from automated functional tests. The tool has been used in several industrial projects, achieving significant effort savings and improved flexibility.
ENASE 2013 - SEM - (France) From Functional Test Scripts to Performance Test Scripts for Web Systems
3. Agenda
• Introduction and background
– Functional Testing Automation
– Performance Testing Automation
– Motivation
• Proposal
– From functional test scripts to performance test scripts
• Related Work
• Conclusion and future work
5. Just in case you don’t know Selenium
Tester / User
SUT: System Under Test
Manual Test Case
Execution
6. Just in case you don’t know Selenium
Functional
Test Scripts
Selenium captures
User Interactions
Tester / User
Executes and reports
SUT: System Under Test
Manual Test Case
Execution
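The record-and-playback idea on this slide can be sketched in a few lines. This is a minimal, browser-free illustration: Selenium records user interactions as command/target/value steps and replays them through a real browser, while here a recorded script is replayed against a toy in-memory form; all names are illustrative, not Selenium's API.

```python
# Sketch of record-and-playback at the UI level (illustrative, not
# the Selenium API). A capture tool records each user interaction as a
# (command, target, value) step; replaying the steps re-executes the
# functional test case. Here the "page" is just a dict standing in for
# a real browser session.

recorded_script = [                 # what a capture tool would record
    ("type",  "username", "jsmith"),
    ("type",  "password", "secret"),
    ("click", "login",    None),
]

def replay(script, page):
    """Replay a recorded functional script against a page model."""
    for command, target, value in script:
        if command == "type":
            page[target] = value            # fill a form field
        elif command == "click" and target == "login":
            # emulate the check a functional script would assert on
            page["logged_in"] = bool(page.get("username") and page.get("password"))
    return page

result = replay(recorded_script, {})
```

The same recorded steps can be replayed unattended any number of times, which is what makes the functional scripts reusable as input for performance script generation later on.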
8. Performance Testing Automation
• Record and playback, but at a Protocol level
• Load generator (OpenSTA)
– Simulation of hundreds of
Concurrent Virtual Users from
few machines
– They cannot be simulated with real browsers
– So, the tool executes processes that simulate the HTTP traffic
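The load-generation idea above can be sketched with standard-library code: several "virtual users" issue raw HTTP requests concurrently from one machine, with no browser involved, exactly the OpenSTA-style protocol-level approach. Numbers are kept small so the sketch runs quickly; the server is a local stand-in for the system under test.

```python
# Sketch of a protocol-level load generator: N virtual users issue
# HTTP requests concurrently from a single machine (no browsers).
# A local HTTP server stands in for the system under test.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):       # keep the demo quiet
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

responses = []
lock = threading.Lock()

def virtual_user(requests_per_user=3):
    """One simulated user: a short sequence of raw HTTP requests."""
    for _ in range(requests_per_user):
        with urllib.request.urlopen(url) as r:
            with lock:
                responses.append(r.status)

users = [threading.Thread(target=virtual_user) for _ in range(10)]
for u in users:
    u.start()
for u in users:
    u.join()
server.shutdown()
# 10 virtual users x 3 requests each = 30 recorded responses
```

Because each virtual user is just a thread issuing requests, hundreds can be run from a few machines, which is infeasible with hundreds of real browser instances.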
9. How do we capture the traffic?
Performance
Test Scripts
OpenSTA captures
HTTP traffic
Tester / User
Executes and reports
SUT: System Under Test
HTTP traffic
Web Server
Manual Test Case
Execution
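The capture step on this slide can also be sketched. OpenSTA records through a local proxy sitting between the browser and the web server; as a simpler stand-in, here the server itself records each incoming request line while a manual test-case execution drives the traffic, which yields the raw material of a protocol-level script.

```python
# Sketch of HTTP traffic capture. Recording tools interpose a proxy
# between browser and server; as a stand-in, this server records each
# request line itself. One manual execution of the test case produces
# the captured request sequence a performance script is built from.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

captured = []                       # the "recorded script": one entry per request

class RecordingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        captured.append(f"GET {self.path}")   # record the request line
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), RecordingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# one manual test-case execution drives the traffic to be captured
for path in ("/login", "/search?q=abc", "/logout"):
    urllib.request.urlopen(base + path).read()
server.shutdown()
```

The captured request lines are exactly what makes protocol-level scripts brittle: any change in the application's URLs or parameters invalidates them, which motivates generating them from the functional scripts instead.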
11. Performance Testing Methodology
• Vázquez, G., Reina, M., Toledo, F., de Uvarow, S., Greisin, E., López, H.:
Metodología de Pruebas de Performance. Presented at the JCC (2008).
Test Design → Automation → Execute → Analyze → Fix
Between 30% and 50% of the effort is spent in automation tasks
12. Performance Testing Automation
• What’s the problem?
• DEMO: record one click with Selenium. See
the generated HTTP traffic with Fiddler
• What do you think is easier? Functional or
performance test scripts?
13. Motivation
• Performance testing is too expensive
• No flexibility
– If the application changes, you need to rebuild the
scripts
Goals for performance testing automation
• Reduce costs
• Improve flexibility
14. Why is it important for Modernization?
• When do we do performance testing?
– New system into production environment
– Architectural changes
• Windows to Web
– Platform changes
• Operating System
• DBMS
• Web application
– Hardware changes
15. Overview
Tester / User
Manual Test Case
Execution
Functional
Test Scripts
Selenium captures
User Interactions
System Under
Test Interface
21. Automatic generation
• Addition of Timers
• Validations as in the Selenium Script
• Modularization as in the Selenium Script
• Parameterizations (data-driven testing) as in
the Selenium Script
• All of those actions require more effort at the
protocol level than at the UI level
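The generation step itself can be sketched as a small transformation: walk the steps of a functional (Selenium-level) script and emit a protocol-level script, carrying over validations and data-driven parameters and inserting think-time timers. Both script formats below are illustrative, not the tool's actual formats, and the POST endpoint is a hardcoded placeholder.

```python
# Sketch of automatic generation: functional steps in, protocol-level
# steps out, with timers added and validations/parameterizations
# preserved. Formats and the "/login" POST endpoint are illustrative.
import string

functional_steps = [
    {"action": "open",  "url": "/login"},
    {"action": "type",  "field": "user", "value": "${username}"},   # data-driven
    {"action": "click", "target": "submit", "assert_text": "Welcome"},
]

def generate_performance_script(steps, think_time_ms=1000):
    perf, form = [], {}
    for step in steps:
        if step["action"] == "open":
            perf.append({"request": "GET " + step["url"]})
            perf.append({"timer": think_time_ms})            # add think time
        elif step["action"] == "type":
            form[step["field"]] = step["value"]              # accumulate form data
        elif step["action"] == "click":
            body = "&".join(f"{k}={v}" for k, v in form.items())
            perf.append({"request": "POST /login",           # placeholder endpoint
                         "body": body,
                         "validate": step.get("assert_text")})  # keep validation
            perf.append({"timer": think_time_ms})
    return perf

def bind_data(perf, row):
    """Data-driven parameterization: substitute ${vars} from a data row."""
    return [{k: string.Template(v).substitute(row) if isinstance(v, str) else v
             for k, v in item.items()}
            for item in perf]

script = generate_performance_script(functional_steps)
bound = bind_data(script, {"username": "vu01"})
```

Because the protocol-level script is regenerated from the functional one, adjusting a Selenium script and regenerating replaces the costly manual rework of timers, validations and parameterizations at the protocol level.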
22. Industrial Use
Project | SUT | # Scripts | # VU
Human Resources System | AS400 database, Java Web system on Websphere | 14 | 317
Production Management System | AS400 database, C# Web system on Internet Information Services | 5 | 55
Courts Management System | Java Web system on Tomcat with Oracle database | 5 | 144
Auction System | Java Web system on Tomcat with MySQL database | 1 | 2000
Logistics System | Java Web system on Weblogic with Oracle database | 9 | 117
23. Model driven development tools
• GeneXus (www.genexus.com)
– Model driven / code generator
• Main difficulties
– Small changes in the models lead to big changes in the
generated code and, in turn, big changes at the protocol level
– Paradox: if you find problems, changes must be made;
if you make changes, scripts must be rebuilt. Teams end up
avoiding changes just to avoid rework in testing.
24. Results
• The effort required with our framework was
reduced by a factor of more than five
– Traditional approach: 6 to 10 hours per script
– Our approach: 1 to 5 hours per script
• Flexibility
– Maintenance in traditional approach: rebuild the
script from scratch
– Our approach: adjust Selenium script, regenerate
25. Related Work
• Generation of performance tests (Web Services)
– García Domínguez et al.: Performance Test Case
Generation for Java and WSDL-based Web
Services from MARTE. Advances in Internet
Technology. 2012.
26. Related Work
• Generation of performance tests (Web Systems)
– Use Selenium scripts for performance testing
• TestMaker (www.pushtotest.com)
• Scaleborn (www.scaleborn.com)
27. Related Work
• Generation of performance tests (Web Systems)
– De Sousa: Reusing Functional Testing in order to
Decrease Performance and Stress Testing Costs.
SEKE 2011.
• Statically translates Selenium scripts into JMeter
scripts. It does not consider the actual HTTP traffic:
– Secondary requests
– Requests issued by JavaScript
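The secondary-requests gap can be made concrete: a single UI-level page load fans out into extra HTTP requests for embedded resources, which a static Selenium-to-JMeter translation never sees. Capturing the real traffic (or, as in this stand-in sketch, parsing the response HTML) recovers them. The page below is illustrative; only the standard-library HTML parser is used.

```python
# Sketch of why secondary requests matter: one UI-level navigation
# implies extra protocol-level requests for embedded resources
# (stylesheets, scripts, images). A static script translation that
# ignores real traffic misses all of them. Illustrative page content.
from html.parser import HTMLParser

class ResourceExtractor(HTMLParser):
    """Collect URLs of embedded resources referenced by a page."""
    def __init__(self):
        super().__init__()
        self.resources = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and "href" in attrs:
            self.resources.append(attrs["href"])

page = """<html><head>
<link rel="stylesheet" href="/css/site.css">
<script src="/js/app.js"></script>
</head><body><img src="/img/logo.png"></body></html>"""

extractor = ResourceExtractor()
extractor.feed(page)
# one UI-level page load -> three secondary requests to replay
```

Requests issued dynamically by JavaScript are even harder to recover statically, which is why traffic capture remains part of our generation approach.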
28. Conclusions
• Objectives for performance testing
– Improve flexibility
– Reduce costs
• Future work
– Generate for different Load Generators
• JMeter (jmeter.apache.org)
• Considering different protocols (FTP, SOAP, etc.)