ENASE 2013 - SEM - (France) From Functional Test Scripts to Performance Test Scripts for Web Systems


When modernizing a system, the software is migrated from one platform to another. This carries significant performance risks: the new system cannot take longer to perform the same operations than the previous one, or users will reject it. Preventive performance testing is therefore crucial to guarantee the success of a modernization project. However, automating performance tests is demanding in time and effort, because the tools work at the communication-protocol level. Functional test automation, though not free, is easier to accomplish, because the tools work at the graphical-user-interface level; they are more intuitive and have fewer variables and technical issues to handle. In this article we present a tool, developed for industrial use, that automatically generates performance test scripts from automated functional tests. The tool has been used in several industrial projects, achieving important effort savings and improving flexibility.



  1. Agenda
     • Introduction and background
       – Functional Testing Automation
       – Performance Testing Automation
       – Motivation
     • Proposal
       – From functional test scripts to performance test scripts
     • Related Work
     • Conclusion and future work
  2. Functional Testing Automation
     • Record and playback
     • Automation at the user-interface level
     • Selenium
  3. Just in case you don't know Selenium (diagram): the tester/user manually executes a test case against the SUT (System Under Test).
  4. Just in case you don't know Selenium (diagram): while the tester/user manually executes the test case against the SUT, Selenium captures the user interactions into functional test scripts, which can then be executed and reported on.
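The record-and-playback idea in the Selenium slides above can be sketched in plain Python. The recorded steps mimic Selenium IDE's Selenese triples (command, target, value) and are rendered as WebDriver-style calls; the URL, locators, and data values are invented for illustration.

```python
# Recorded UI-level steps, in Selenese-style (command, target, value) triples.
# All URLs and element IDs here are hypothetical examples.
recorded_steps = [
    ("open", "http://example.test/login", ""),
    ("type", "id=user", "jsmith"),
    ("type", "id=password", "secret"),
    ("clickAndWait", "id=loginButton", ""),
    ("verifyTextPresent", "Welcome", ""),
]

def to_webdriver(steps):
    """Render Selenese-style steps as Selenium WebDriver (Python) call text."""
    lines = []
    for command, target, value in steps:
        if command == "open":
            lines.append(f"driver.get({target!r})")
        elif command == "type":
            locator = target.split("=", 1)[1]
            lines.append(f"driver.find_element(By.ID, {locator!r}).send_keys({value!r})")
        elif command == "clickAndWait":
            locator = target.split("=", 1)[1]
            lines.append(f"driver.find_element(By.ID, {locator!r}).click()")
        elif command == "verifyTextPresent":
            lines.append(f"assert {target!r} in driver.page_source")
    return "\n".join(lines)

print(to_webdriver(recorded_steps))
```

Each recorded UI action maps to exactly one readable line of script, which is why UI-level automation stays small and intuitive.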
  5. 5. Improved Selenium Scripts
  6. Performance Testing Automation
     • Record and playback, but at a protocol level
     • Load generator (OpenSTA)
       – Simulation of hundreds of concurrent virtual users from a few machines
       – They cannot be simulated with real browsers
       – So the tool executes processes that simulate the HTTP traffic
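A minimal sketch of the virtual-user idea above: instead of real browsers, each "user" is a lightweight thread replaying a recorded request sequence. `send_request` is a stand-in for the load generator's protocol-level HTTP engine, not a real network call.

```python
import threading

results = []
lock = threading.Lock()

def send_request(user_id, url):
    # Placeholder: a real load generator would issue protocol-level
    # HTTP traffic here instead of just recording the call.
    with lock:
        results.append((user_id, url))

def virtual_user(user_id, script):
    # Replay the recorded request sequence for one simulated user.
    for url in script:
        send_request(user_id, url)

script = ["/login", "/home", "/logout"]
users = [threading.Thread(target=virtual_user, args=(i, script))
         for i in range(100)]
for t in users:
    t.start()
for t in users:
    t.join()

# 100 virtual users x 3 requests each = 300 protocol-level requests,
# generated from a single machine with no browser involved.
assert len(results) == 300
```

This is why a few load-injection machines can simulate hundreds of concurrent users: each virtual user costs a thread or process, not a full browser.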
  7. How do we capture the traffic? (diagram): while the tester/user manually executes the test case, OpenSTA captures the HTTP traffic between the browser and the web server (SUT); the resulting performance test scripts can then be executed and reported on.
  8. Performance Test Script
     • Depending on the application, 1 line in Selenium is equivalent to ~200 lines in OpenSTA
  9. Performance Testing Methodology
     • Vázquez, G., Reina, M., Toledo, F., de Uvarow, S., Greisin, E., López, H.: Metodología de Pruebas de Performance (Performance Testing Methodology). Presented at the JCC (2008).
     • Cycle: Test Design → Automation → Execute → Analyze → Fix
     • Between 30% and 50% of the effort goes into automation tasks
  10. Performance Testing Automation
     • What's the problem?
     • DEMO: record one click with Selenium; see the generated HTTP traffic with Fiddler
     • Which do you think is easier to build: functional or performance test scripts?
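The demo's point can be modeled in a few lines: one UI click produces not just the primary request but every secondary resource the returned page references, which is what a protocol-level recorder captures. The page and resource names are invented for the example.

```python
# Hypothetical map from a page to the secondary resources it references
# (stylesheets, scripts, images) -- what a browser fetches automatically.
PAGE_RESOURCES = {
    "orders": ["css/site.css", "js/app.js", "img/logo.png", "img/spinner.gif"],
}

def requests_for_click(page):
    """Return the HTTP requests a browser issues for one click on `page`."""
    primary = f"GET /{page}.html"
    secondary = [f"GET /{res}" for res in PAGE_RESOURCES.get(page, [])]
    return [primary] + secondary

traffic = requests_for_click("orders")
# One UI-level action ("click") became five protocol-level requests, every one
# of which a tool like OpenSTA records and the tester must maintain by hand.
```

Scale this up by headers, cookies, session tokens, and redirects, and the 1-to-200 line ratio mentioned earlier stops being surprising.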
  11. Motivation
     • Performance testing is too expensive
     • No flexibility: if the application changes, you need to rebuild the scripts
     • Goals for performance testing automation:
       – Reduce costs
       – Improve flexibility
  12. Why is it important to modernization?
     • When do we do performance testing?
       – New system into the production environment
       – Architectural changes (e.g., Windows to Web)
       – Platform changes (operating system, DBMS, web application)
       – Hardware changes
  13. Overview (diagram): the tester/user manually executes the test case against the system under test's interface while Selenium captures the user interactions into functional test scripts.
  14. Overview (diagram): the functional test scripts drive an automatic test case execution against the system under test's interface; an HTTP sniffer captures the HTTP traffic, producing an HTTP session.
  15. Overview (diagram): from the functional test scripts and the HTTP session, an HTTP session model is generated; from that model, the performance test scripts are generated.
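The generation step in the overview above can be sketched as follows: a captured HTTP session, tagged with the functional step that caused each request, is grouped into modules that mirror the Selenium script and rendered as a performance script. The MODULE/END MODULE output format is invented for the sketch; the real tool targets a load generator such as OpenSTA.

```python
# Hypothetical captured session: each request remembers which Selenium
# step produced it, so the generated script keeps the same modular structure.
http_session = [
    {"step": "login",  "method": "GET",  "url": "/login"},
    {"step": "login",  "method": "POST", "url": "/login"},
    {"step": "search", "method": "GET",  "url": "/search?q=item"},
]

def generate_script(session):
    """Render a captured HTTP session as a modularized performance script."""
    modules = {}
    for req in session:                       # group requests by functional step
        modules.setdefault(req["step"], []).append(req)
    lines = []
    for step, reqs in modules.items():
        lines.append(f"MODULE {step}")
        for r in reqs:
            lines.append(f"  {r['method']} {r['url']}")
        lines.append("END MODULE")
    return "\n".join(lines)

print(generate_script(http_session))
```

Because the modules come from the functional script, a change to the application only requires adjusting the Selenium script and regenerating, rather than re-recording protocol traffic by hand.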
  16. Artifacts of the Process
  17. Meta-model
  18. Test code generation
  19. Automatic generation
     • Addition of timers
     • Validations as in the Selenium script
     • Modularization as in the Selenium script
     • Parameterizations (data-driven testing) as in the Selenium script
     • Any of these actions requires more effort at the protocol level than in the UI
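Two of the enrichments listed above, think-time timers and data-driven parameterization, can be sketched like this. The `WAIT` syntax and `{user}` placeholder convention are invented for the example.

```python
# A recorded request sequence with a literal value baked in, plus a
# hypothetical data set for data-driven replay.
recorded = ["POST /login?user=jsmith", "GET /home"]
dataset = [{"user": "jsmith"}, {"user": "mbrown"}]

def parameterize(requests, variables):
    """Replace each recorded literal value with a named placeholder."""
    out = []
    for req in requests:
        for name, literal in variables.items():
            req = req.replace(literal, "{" + name + "}")
        out.append(req)
    return out

def add_timers(requests, think_time_s=3):
    """Insert a fixed think-time timer before every request."""
    timed = []
    for req in requests:
        timed.append(f"WAIT {think_time_s}s")
        timed.append(req)
    return timed

script = add_timers(parameterize(recorded, dataset[0]))
# script == ["WAIT 3s", "POST /login?user={user}", "WAIT 3s", "GET /home"]
# The same script can now be replayed once per row of `dataset`.
```

Doing these substitutions by hand across hundreds of protocol-level lines is exactly the effort the automatic generation avoids.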
  20. Industrial Use

      Project                       | SUT                                                            | # Scripts | # VU
      ------------------------------|----------------------------------------------------------------|-----------|-----
      Human Resources System        | AS400 database, Java Web system on WebSphere                   | 14        | 317
      Production Management System  | AS400 database, C# Web system on Internet Information Services | 5         | 55
      Courts Management System      | Java Web system on Tomcat with Oracle database                 | 5         | 144
      Auction System                | Java Web system on Tomcat with MySQL database                  | 1         | 2000
      Logistics System              | Java Web system on WebLogic with Oracle database               | 9         | 117
  21. Model-driven development tools
     • GeneXus (www.genexus.com): model-driven / code generator
     • Main difficulties:
       – Small changes in the models cause big changes in the generated code, and big changes at the protocol level
       – Paradox: if you find problems, changes must be made; if you make changes, the scripts must be rebuilt. Teams are tempted not to change anything in order to avoid rework in testing.
  22. Results
     • The effort required with our framework was reduced by more than 5 times
       – Traditional approach: 6 to 10 hours per script
       – Our approach: 1 to 5 hours per script
     • Flexibility
       – Maintenance in the traditional approach: rebuild the script from scratch
       – Our approach: adjust the Selenium script and regenerate
  23. Related Work
     • Generation of performance tests (Web Services)
       – García Domínguez et al.: Performance Test Case Generation for Java and WSDL-based Web Services from MARTE. Advances in Internet Technology, 2012.
  24. Related Work
     • Generation of performance tests (Web Systems): using Selenium scripts for performance testing
       – TestMaker (www.pushtotest.com)
       – Scaleborn (www.scaleborn.com)
  25. Related Work
     • Generation of performance tests (Web Systems)
       – De Sousa: Reusing Functional Testing in order to Decrease Performance and Stress Testing Costs. SEKE 2011.
       – Statically translates Selenium scripts into JMeter scripts; the actual HTTP traffic is not considered, so secondary requests and JavaScript-generated requests are missed.
  26. Conclusions
     • Objectives for performance testing: improve flexibility, reduce costs
     • Future work
       – Generate scripts for different load generators, e.g., JMeter (jmeter.apache.org)
       – Consider different protocols (FTP, SOAP, etc.)
  27. Acknowledgements
      From Functional Test Scripts to Performance Test Scripts for Web Systems
      MSc. Federico Toledo (ftoledo@abstracta.com.uy)
      Eng. Matías Reina (mreina@abstracta.com.uy)
      Eng. Fabián Baptista (fbaptista@abstracta.com.uy)
      PhD. Macario Polo Usaola (macario.polo@uclm.es)
      PhD. Beatriz Pérez Lamancha (beatriz.plamancha@uclm.es)
      Gracias / Merci / Thank you for your attention! Questions?