Using QTP for Load and Performance Testing of Rich Client Applications
An overview by Capgemini on using HP QTP for Load and Performance Testing


Slide notes
  • © 2010 Capgemini - All rights reserved
  • The problem is IDs and batches
  • Transcript

    • 1. Twitter hashtag #HPSWU BTOT-WE-1145-8
    • 2. Stephan Wiesner Barcelona, 2011 Using QTP for Load and Performance Testing of Rich Client Applications
    • 3. © 2010 Capgemini - All rights reserved HP_BARCELONA_WIESNER_06.PPT
    • 4. AGENDA
          • Introduction
          • Challenges
          • Solutions
          • Lessons Learned
    • 5. Performance testing is testing performed to determine how fast some aspect of a system performs under a particular workload
      • Performance testing definition
        • Endurance Testing: usually done to determine if the application can sustain the continuous expected load
        • Load Testing: usually conducted to understand the behaviour of the application under a specific expected load
        • Stress Testing: used to understand the upper limits of capacity within the application landscape
        • Evaluation: evaluate results, perform bottleneck analysis, perform optimisation
    • 6. Customer performed a major hardware upgrade for its critical trading system based on standard software (Customer / Changes / Test project)
        • Large Swiss utility company
        • Central trading department
        • 7x24 business
        • approx. 40 active traders
        • Performance crucial for business transactions
        • Replace host database with modern Oracle based database system
        • Replace all server hardware
        • Virtualisation of all windows based servers
        • Implement high availability measures
        • Perform functional regression testing
        • Define and test high availability scenarios
        • Plan, test and manage the system migration
        • Plan, execute and evaluate extensive performance tests
      ! Overview of the situation
    • 7. The (new) system under test is built on a typical 3-tier architecture
      • Simplified system view
    • 8. QTP 9.2 was already used for automation of functional regression tests on test systems
      • Overview of test systems and test tools
      Test Systems / Test Tools
        • New production system
        • 5 test systems, one of which was an exact duplicate of production
        • QC 9.2 for requirements, test cases and test scripts
        • QTP 9.2 for automated regression testing
      Reqs / Test Cases
        • No formal requirements (standard software)
        • Some test cases from old projects
    • 9. AGENDA
          • Introduction
          • Challenges
          • Solutions
          • Lessons Learned
    • 10. The performance testing faced three major challenges: unclear requirements, budget constraints and old technologies
      • Major challenges and derived measures
      • Time & Budget
      • 4 Weeks of total time
      • No money for new tools
      • No in-house expertise
      Successful Performance Testing
      • Requirements
      • "double the speed"
      • critical workflows
      • number of users
      • Technology
      • Client: Oracle Forms 6.0
      • No API for automation
      • No test data generation
      Perform GUI-based test automation / Reuse as much as possible / Perform requirements gathering and prioritisation
    • 11. The (performance) testing of standard software has some very specific challenges
      • Comparison of custom and standard software concerning testing
      Requirements and change management are crucial for performance testing of standard software. Custom Software vs. Standard Software: Functional requirements
      • Customer specifies requirements
      • Functional test cases derived from requirements
      • Customer has “expectations”
      • Test cases derived from the manual or best practices
      Nonfunctional requirements
      • Nonfunctional requirements part of contract
      • Software is provided on an "as is" basis
      Implementing changes
      • Contract specifies change process
      • Custom release cycle
      • Changes can affect all customers
      • Regular releases
      Test automation
      • Customer can influence development, and test automation can be part of the contract (e.g. automatic smoke tests per release)
      • Customer has low influence
      • Changes have high impact on automation by customer
    • 12. Reuse of existing automated workflow tests and lack of API support can be a challenge (diagram: workflow functions and test data reach the system under test, a black box, only via the GUI and the APIs connecting System A, System B and the database)
    • 13. QTP simulates real users – using mouse, keyboard and screen. Only ONE instance of QTP can run on any single computer!
    • 14. AGENDA
          • Introduction
          • Challenges
          • Solutions
          • Benefits and Lessons Learned
    • 15. A subset of already automated workflow test cases was selected for performance testing
      • Test case transformation process
      Customer & Test Team / Automated Workflow Tests / Performance Tests. Select test cases that simulate a realistic workload
      • Select test cases with customer
      • Adapt for concurrent execution
      • Adapt for perf. testing (data, logging, etc.)
    • 16. Using 20 QTP licences and virtual systems enabled us to simulate 20 concurrent users. Overview of test system and usage of QTP
    • 17. Using QTP for performance tests means a lot of manual work
      • List of Manual Tasks
      • Log into virtual client
      • Start QTP
      • Load script
      • Start execution
      • Stop execution
      • Aggregate log files
      • Perform analysis
      • Optional: reporting
      x 20 Can we automate this?
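Several of these per-client steps can be scripted. As a hedged sketch in Python (the project's own scripts were QTP VBScript; the folder layout, file names and the semicolon-separated log format are assumptions), the "aggregate log files" step might look like:

```python
from pathlib import Path

def aggregate_logs(client_dirs, out_path):
    """Merge the per-client QTP log files into one file,
    prefixing each line with the client it came from."""
    merged = []
    for client in client_dirs:
        for log in sorted(Path(client).glob("*.log")):
            for line in log.read_text().splitlines():
                merged.append(f"{Path(client).name};{line}")
    Path(out_path).write_text("\n".join(merged) + "\n")
    return len(merged)
```

Run once after a test, this replaces 20 manual copy-and-paste sessions with a single call over the 20 client directories.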
    • 18. QualityCenter (QC) can execute distributed test scripts. Overview of test system and usage of QTP
    • 19. Performance tests were performed on production data
      • Export production
      • Import into test systems and isolation
      • DB migration
      • Optional: perform data validation
      • Optional: duplicate data
      • Optional: import into reference database
      Overview of the data replication and migration process: Production → T03, T05, T06, T07
    • 20. The use of production data on test systems requires some kind of isolation to ensure safe testing
      • Isolation of Production Data
      Test system / Internal systems with external connections / Internal systems / Partner systems
      • Export productive database
      • Select isolation script depending on target system
      • Isolation
      The scripts are reusable, under version control, and use parameters. The goal is to control the communication with internal and external systems. (Diagram: Productive Data → 1. Isolation script → Isolated Data; [email_address] [email_address])
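An isolation script of this kind essentially rewrites every reference that could reach a real internal or partner system. A minimal sketch, assuming text-based export rows and purely hypothetical patterns and stub addresses (the real scripts were parameterized per target system such as T03 or T05):

```python
import re

# Hypothetical isolation rules: pattern in the productive export
# mapped to a safe replacement on the test system.
ISOLATION_RULES = [
    (r"[\w.+-]+@[\w.-]+", "test@example.invalid"),            # mail addresses
    (r"https?://partner\.example\.com\S*", "http://localhost:8080/stub"),
]

def isolate(line):
    """Rewrite one exported data row so it can no longer reach
    internal or partner systems."""
    for pattern, replacement in ISOLATION_RULES:
        line = re.sub(pattern, replacement, line)
    return line
```

Keeping the rule table separate from the rewriting loop is what makes such scripts reusable across target systems: only the table needs to change per environment.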
    • 21. All tests were executed on old and new system to compare the results
      • Test execution workflow
      QC 9.2 / Old system / New system
      • Reset data
      • Execute scripts
      • Perform reporting
      QTP 9.2 on VM clients; identical data snapshot (1.10.200X) on both systems; QTP scripts
    • 22. Although the requirements only demanded End2End performance, we introduced several additional points of measurement . . .
      • Examples of points of measurement
      Log In and Out should be measured separately
      • Start script
      • Log in
      • Screen 1
      • Screen 2
        • Start calculation
        • End calculation
      • Screen N
      • Log out
      Good entry point for bottleneck analysis. Customer demanded End2End execution time (2x faster)
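Points of measurement like these amount to a thin timing wrapper around each script block. A sketch in Python (the project instrumented its QTP VBScript library instead; the record shape here is an assumption):

```python
import time

class Measure:
    """Times a named step (log-in, screen, calculation) and appends
    (name, duration_in_seconds) to a shared record list."""
    records = []

    def __init__(self, name):
        self.name = name

    def __enter__(self):
        self.start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        Measure.records.append((self.name, time.perf_counter() - self.start))
        return False  # never swallow errors raised inside the measured block
```

Wrapping each screen as `with Measure("screen_1"): ...` yields per-step durations for free, which is exactly what makes bottleneck analysis possible beyond the End2End number.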
    • 23. Log files contained status, duration, workflow ID, etc.
      • Example extract from log file
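A log with those fields is straightforward to evaluate afterwards. Assuming semicolon-separated columns status;duration;workflow-ID (the project's exact layout may have differed), average durations per workflow can be computed like this:

```python
import csv
from collections import defaultdict

def average_durations(log_path):
    """Average duration per workflow ID over all successful runs,
    read from a log with semicolon-separated
    status;duration;workflow columns."""
    totals = defaultdict(lambda: [0.0, 0])
    with open(log_path, newline="") as f:
        for status, duration, workflow in csv.reader(f, delimiter=";"):
            if status == "OK":   # ignore failed runs in the timing stats
                totals[workflow][0] += float(duration)
                totals[workflow][1] += 1
    return {wf: total / count for wf, (total, count) in totals.items()}
```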
    • 24. We defined test scenarios for full load, base load and average load by varying the number of concurrent users (1/2)
      • Three test scenarios
      Don't change think times!
    • 25. We defined test scenarios for full load, base load and average load by varying the number of concurrent users (2/2)
      • Example report for different test scenarios
      Old system was highly sensitive to load differences; new system was not. The new database had 32 CPUs (vs. 4)  20 QTP instances proved that the application supported concurrent execution!
    • 26. AGENDA
          • Introduction
          • Challenges
          • Solutions
          • Lessons Learned
    • 27. Lesson learned: capture/replay is not sufficient – use a central library and global configurations
      • Use of central library and parameterization
      iterations = Environment("v_NumberOfIterations")
      InitLog
      While iterations <> 0
        RunWorkflow("FU")
        RunWorkflow("SF")
        [. . .]
        If (iterations > 0) Then
          iterations = iterations - 1
        End If
      Wend

      <Variable>
        <Name>v_NumberOfIterations</Name>
        <Value>2</Value>
      </Variable>

      Central configuration file / Central execution flow
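The configuration snippet above matches the shape of QTP's external environment-variable XML files. Reading such a file outside of QTP (for reporting scripts, say) is simple; this sketch assumes the `<Variable>` entries sit inside a single root element:

```python
import xml.etree.ElementTree as ET

def load_variables(xml_text):
    """Read <Variable><Name>/<Value> pairs from an environment
    configuration file into a plain dict."""
    root = ET.fromstring(xml_text)
    return {v.findtext("Name"): v.findtext("Value")
            for v in root.iter("Variable")}
```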
    • 28. Lesson learned: find a way to stop test execution “smartly”
      • How to stop running tests using “stop-files”
      • Modes of performance test execution:
        • Number of runs
        • Time frame
        • Ramp up (and down)
      And in case of error? A hard stop of QTP scripts using the stop button results in "probably" corrupt data
      • Better: after each block of code:
        • Check for existence of stop-file
        • If file exists perform controlled stop
        • Else go to next block
      Try to preserve consistent data
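The stop-file pattern described above needs very little code. A sketch in Python (the project's scripts were QTP VBScript; the function and file names here are illustrative):

```python
import os

def run_workflows(workflows, stop_file):
    """Execute workflow blocks in order, checking for the stop-file
    between blocks so the run ends at a consistent point instead of
    being killed mid-workflow."""
    executed = []
    for workflow in workflows:
        if os.path.exists(stop_file):   # controlled stop requested
            break
        workflow()
        executed.append(workflow.__name__)
    return executed
```

Because the check happens only between blocks, every workflow either runs to completion or not at all, which is what preserves consistent test data.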
    • 29. Lessons learned
        • Test data
          • Use production data (anonymized and/or isolated)
          • Define a way to reset the data (fast)
          • Clean data after test runs wherever possible
        • Test systems
          • Virtualisation cuts costs and eases system management
          • Use QC test execution controls
        • QTP
          • Define central library (logging, bottleneck analysis, error handling, etc.)
          • Use read-only tests (where possible)
    • 30.
    • 31. www.de.capgemini.com Thanks! [email_address]
    • 32.
        • Continue the conversation with your peers at the HP Software Community hp.com/go/swcommunity