Agenda
• Introduction and background
– Functional Testing Automation
– Performance Testing Automation
– Motivation
• Proposal
– From functional test scripts to performance test
scripts
• Related Work
• Conclusion and future work
Functional Testing Automation
• Record and Playback
• User interface level automation
• Selenium
Just in case you don’t know Selenium
Tester / User
SUT: System Under Test
Manual Test Case
Execution
Just in case you don’t know Selenium
Functional
Test Scripts
Selenium captures
User Interactions
Tester / User
Executes and reports
SUT: System Under Test
Manual Test Case
Execution
Improved Selenium Scripts
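Record and playback can be sketched with a tiny interpreter over Selenium-IDE-style (command, target, value) rows. The stub driver below is purely illustrative; in a real run the same commands drive an actual browser:

```python
# Replay a Selenium-IDE-style recorded script: each step is a
# (command, target, value) row executed against a pluggable driver.

RECORDED_SCRIPT = [
    ("open", "/login", ""),
    ("type", "id=user", "jsmith"),
    ("type", "id=pass", "secret"),
    ("clickAndWait", "id=submit", ""),
    ("verifyTextPresent", "Welcome", ""),
]

class StubDriver:
    """Illustrative in-memory driver; a real run would drive a browser."""
    def __init__(self):
        self.log = []
        self.page_text = "Welcome jsmith"

    def open(self, target, value):
        self.log.append(("GET", target))

    def type(self, target, value):
        self.log.append(("TYPE", target, value))

    def clickAndWait(self, target, value):
        self.log.append(("CLICK", target))

    def verifyTextPresent(self, target, value):
        # Validation step: checks the page, performs no browser action
        assert target in self.page_text, f"missing text: {target}"

def replay(script, driver):
    for command, target, value in script:
        getattr(driver, command)(target, value)

driver = StubDriver()
replay(RECORDED_SCRIPT, driver)
print(len(driver.log))  # 4 driver actions (the verify step only checks)
```

The point of the pluggable driver is exactly the UI-level abstraction: the recorded script names widgets, not HTTP requests.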
Performance Testing Automation
• Record and playback, but at a Protocol level
• Load generator (OpenSTA)
– Simulation of hundreds of
concurrent virtual users from a
few machines
– They cannot be simulated with
real browsers
– So the tool executes processes
that simulate the HTTP traffic
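What the load generator does can be sketched without assuming anything about OpenSTA itself: each virtual user is a lightweight thread replaying HTTP requests, with a throwaway in-process server standing in for the SUT:

```python
# Protocol-level load generation: virtual users are lightweight threads
# replaying HTTP requests -- no real browser is launched.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    """Throwaway server standing in for the system under test."""
    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence per-request logging
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

USERS, ITERATIONS = 20, 5
responses = []
lock = threading.Lock()

def virtual_user():
    # Each virtual user replays the recorded requests ITERATIONS times
    for _ in range(ITERATIONS):
        with urllib.request.urlopen(f"{base}/home") as r:
            with lock:
                responses.append(r.status)

threads = [threading.Thread(target=virtual_user) for _ in range(USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
server.shutdown()
print(len(responses))  # 100 responses: 20 virtual users x 5 iterations
```

A thread per virtual user is far cheaper than a browser per user, which is why hundreds of users fit on a few machines.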
How do we capture the traffic?
Performance
Test Scripts
OpenSTA captures
HTTP traffic
Tester / User
Executes and reports
SUT: System Under Test
HTTP traffic
Web Server
Manual Test Case
Execution
Performance Test Script
Depending on the
application
1 line in Selenium is
equivalent to 200
lines in OpenSTA
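The blow-up has a simple cause: the protocol script must spell out every request the browser issues behind one recorded action. A toy illustration (the paths are made up):

```python
# Why protocol-level scripts balloon: one recorded UI action maps to the
# primary request plus every secondary request the browser would issue.
def expand_click(page_url, secondary_resources):
    """Expand one Selenium click into the HTTP requests behind it.
    Paths are illustrative, not from any real application."""
    return [("GET", page_url)] + [("GET", r) for r in secondary_resources]

requests = expand_click(
    "/orders",
    ["/static/app.css", "/static/app.js", "/img/logo.png"],
)
print(len(requests))  # 4 HTTP requests for a single click
```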
Performance Testing Methodology
• Vázquez, G., Reina, M., Toledo, F., de Uvarow, S., Greisin, E., López, H.:
Metodología de Pruebas de Performance. Presented at the JCC (2008).
Test Design Automation
Execute
AnalyzeFixBetween 30% and 50% in
automation tasks
Performance Testing Automation
• What’s the problem?
• DEMO: record one click with Selenium. See
the generated HTTP traffic with Fiddler
• What do you think is easier? Functional or
performance test scripts?
Motivation
• Performance testing is too expensive
• No flexibility
– If the application changes, you need to rebuild the
scripts
Goals for performance testing automation
• Reduce costs
• Improve flexibility
Why it is important for Modernization
• When do we do performance testing?
– New system into production environment
– Architectural changes
• Windows to Web
– Platform changes
• Operating System
• DBMS
• Web application
– Hardware changes
Overview
Tester / User
Manual Test Case
Execution
Functional
Test Scripts
Selenium captures
User Interactions
System Under
Test Interface
Overview
Functional
Test Scripts
System Under
Test
HTTP traffic
HTTP Sniffer
Automatic Test Case
Execution
HTTP session
System Under
Test Interface
Overview
Functional
Test Scripts
HTTP session
HTTP session
model
Generate
Performance Test
Scripts
Generate
Artifacts of the Process
Meta-model
Test code generation
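A minimal sketch of these artifacts, with an illustrative meta-model and an OpenSTA-flavoured output format (the actual meta-model and generation templates are richer):

```python
# Sketch: HTTP-session meta-model plus a tiny test-code generator.
# The model fields and the generated syntax are illustrative.
from dataclasses import dataclass, field

@dataclass
class Request:
    method: str
    path: str
    params: dict = field(default_factory=dict)

@dataclass
class HttpSession:
    name: str
    requests: list

def generate_script(session: HttpSession) -> str:
    """Emit a load-generator script in an OpenSTA-like pseudo syntax."""
    lines = [f"! Generated from session: {session.name}"]
    for req in session.requests:
        query = "&".join(f"{k}={v}" for k, v in req.params.items())
        target = req.path + ("?" + query if query else "")
        lines.append(f'  {req.method} URI "{target} HTTP/1.1"')
    return "\n".join(lines)

session = HttpSession("login_flow", [
    Request("GET", "/login"),
    Request("POST", "/login", {"user": "jsmith"}),
])
print(generate_script(session))
```

Because the generator consumes a model rather than raw traffic, regenerating after an application change only requires re-capturing the session.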
Automatic generation
• Addition of Timers
• Validations as in the Selenium Script
• Modularization as in the Selenium Script
• Parameterizations (data-driven testing) as in
the Selenium Script
• All of these actions require more effort at the
protocol level than at the UI level
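These enrichments are plain transformations over the captured session; a sketch under illustrative names, deriving think times from recorded timestamps and parameter values from a data pool:

```python
# Sketch: enrich generated requests with timers (from recorded
# timestamps) and parameterize recorded values from a data pool.
# Field names and values are illustrative.

recorded = [
    {"t": 0.0, "method": "GET",  "path": "/login"},
    {"t": 2.5, "method": "POST", "path": "/login", "params": {"user": "jsmith"}},
]
data_pool = [{"user": "vu_{:03d}".format(i)} for i in range(3)]

def add_timers(requests):
    """Insert think-time steps matching the recorded pauses."""
    steps, prev = [], 0.0
    for r in requests:
        wait = r["t"] - prev
        if wait > 0:
            steps.append({"sleep": wait})
        steps.append(r)
        prev = r["t"]
    return steps

def parameterize(requests, row):
    """Replace recorded values with one virtual user's data-pool row."""
    out = []
    for r in requests:
        r = dict(r)
        if "params" in r:
            r["params"] = {k: row.get(k, v) for k, v in r["params"].items()}
        out.append(r)
    return out

steps = add_timers(parameterize(recorded, data_pool[0]))
print(steps[1])  # the 2.5 s think time between login page and submit
```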
Industrial Use
Project                      | SUT                                                             | # Scripts | # VU
Human Resources System       | AS400 database, Java Web system on WebSphere                    | 14        | 317
Production Management System | AS400 database, C# Web system on Internet Information Services  | 5         | 55
Courts Management System     | Java Web system on Tomcat with Oracle database                  | 5         | 144
Auction System               | Java Web system on Tomcat with MySQL database                   | 1         | 2000
Logistics System             | Java Web system on WebLogic with Oracle database                | 9         | 117
Model driven development tools
• GeneXus (www.genexus.com)
– Model driven / code generator
• Main difficulties
– Small changes in the models lead to big changes in
the generated code, and big changes at the protocol
level
– Paradox: if you find problems, changes must be
made; if you make changes, scripts must be rebuilt.
Teams avoid changes in order to avoid rework in
testing.
Results
• The effort required with our framework was
reduced more than 5 times
– Traditional approach: 6 to 10 hours per script
– Our approach: 1 to 5 hours per script
• Flexibility
– Maintenance in traditional approach: rebuild the
script from scratch
– Our approach: adjust Selenium script, regenerate
Related Work
• Generation of performance tests (Web Services)
– García Domínguez et al.: Performance Test Case
Generation for Java and WSDL-based Web
Services from MARTE. Advances in Internet
Technology. 2012.
Related Work
• Generation of performance tests (Web Systems)
– Use Selenium scripts for performance testing
• TestMaker (www.pushtotest.com)
• Scaleborn (www.scaleborn.com)
Related Work
• Generation of performance tests (Web Systems)
– De Sousa: Reusing Functional Testing in order to
Decrease Performance and Stress Testing Costs.
SEKE 2011.
• Statically translates Selenium scripts into JMeter
scripts; it does not consider the actual HTTP traffic
– Secondary requests
– JavaScript-generated requests
Conclusions
• Objectives for performance testing
– Improve flexibility
– Reduce costs
• Future work
– Generate for different Load Generators
• JMeter (jmeter.apache.org)
• Considering different protocols (FTP, SOAP, etc.)
Acknowledgement:
(ftoledo@abstracta.com.uy)
(mreina@abstracta.com.uy)
(fbaptista@abstracta.com.uy)
(macario.polo@uclm.es)
(beatriz.plamancha@uclm.es)
From Functional Test Scripts to
Performance Test Scripts for Web Systems
MSc. Federico Toledo
Eng. Matías Reina
Eng. Fabián Baptista
PhD. Macario Polo Usaola
PhD. Beatriz Pérez Lamancha
Gracias / Merci / Thank you for your attention!
Questions?

ENASE 2013 - SEM - (France) From Functional Test Scripts to Performance Test Scripts for Web Systems
