PowerPoint Handouts
PowerPoint Handouts
  • Transcript

    • 1. Performance Testing Process, SASQAG, March 2007, Emily Ren, T-Mobile
    • 2. Why Do We Need Performance Testing?
      • Before release, managers need to know:
        • Do we have enough hardware?
        • Can we handle the target load?
        • How many users can we handle?
        • Is the system fast enough to make customers happy?
    • 3. Nature of Performance Testing
      • It is very different from functional testing, and a very challenging job
      • It requires stellar cooperation and coordination: it is a whole team effort!
      • Automation tools are very powerful, but expensive and complex; training is needed
      • It can be fun too!
    • 4. Why Do We Need Performance Testing?
      • The failure of an application can be costly
      • Assure performance and functionality under real-world conditions
      • Locate potential problems before our customers do
      • Reduce development time – multiple rounds of load testing
      • Reduce infrastructure cost
    • 5. When we do it
      • During design and development
        • What is the best server to support target load?
        • Define system performance requirements
      • Before release
        • Is the system reliable enough to go into production?
        • After functional testing is done
      • Post-deployment
        • What is the cause of performance degradation?
    • 6. What we are doing
      • Performance testing before release :
          • Application response times - How long does it take to complete a task?
          • Configuration sizing - Which configuration provides the best performance level?
          • Capacity planning - How many users can the system handle?
          • Regression - Does the new version of the software adversely affect response time?
          • Reliability - How stable is the system under heavy workload?
    • 7. Load Testing Process: Plan Test → Create Scripts → Scenario Creation → Scenario Execution → Result Analysis → Performance Tuning
    • 8. Perf. Test Planning Documents
      • Performance Testing Initial Assessment
      • - Pre-test plan document
      • - Help project team to brainstorm their test scope
      • Performance Test Request Form
      • - Detailed information related to the whole performance testing process, including setup goals, environment, business processes, performance requirements (e.g., response time), usage information, internal support team, etc.
    • 9. What we are doing
      • 1. Test Planning - Before we run load testing
      • - Setup goals
          • Measure application response time
          • Configuration sizing
          • Capacity planning
          • Regression
          • Reliability
      • - Type of testing
          • Load Testing (System performance testing with SLA target load)
          • Stress Testing (Capacity testing to find out breaking point)
          • Duration Testing (Reliability testing to test the system under load)
    • 10. What we are doing – Cont.
      • - Identify usage information - Business Profile
          • Which business processes to use
            • BA, Dev team responsible for definition
          • Isolate peak load and peak time
            • BA, Dev, application support responsible for definition
          • Document user actions and input data for each business process
            • SME/Functional Testing team responsible for creation of business process document
    • 11. Sample: Business Profile 1 - HR App. Business Processes

      | Business Process      | Total Users (%) | Avg concurrent users | Peak concurrent users | Unacceptable Response Time (total, incl. think time) | Preferred Response Time (total, incl. think time) |
      |-----------------------|-----------------|----------------------|-----------------------|------------------------------------------------------|---------------------------------------------------|
      | Browse                | 20%             | 100                  | 2000                  | > 5 min                                              | 2-3 min                                           |
      | Time Entry            | 60%             | 200                  | 6000                  | > 5 min                                              | 3 min                                             |
      | Update personal info. | 20%             | 50                   | 1000                  | > 3 min                                              | 1 min                                             |
      | Total                 | 100%            | 350                  | 9000                  |                                                      |                                                   |
    • 12. Sample: Business Profile 2 - eCommerce Business Processes

      | Business Process | Total Users (%) | Peak Time | Peak Load (# of users) | Unacceptable Response Time (each transaction) | Preferred Response Time (each transaction) |
      |------------------|-----------------|-----------|------------------------|-----------------------------------------------|--------------------------------------------|
      | Create Order     | 20%             | 4-6 pm    | 1000                   | > 8 sec                                       | 3-5 sec                                    |
      | Browse Catalog   | 60%             | 4-6 pm    | 6000                   | > 8 sec                                       | 3-5 sec                                    |
      | Display order    | 20%             | 4-6 pm    | 1000                   | > 8 sec                                       | 3-5 sec                                    |
      | Total            | 100%            |           | 8000                   |                                               |                                            |
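A business profile like the one above feeds directly into scenario sizing: the same traffic mix should hold whether a run is a small trial, the full SLA load, or an overdrive test. A minimal sketch of that scaling step (the function name is ours, not LoadRunner's; the numbers come from the eCommerce profile above):

```python
# Scale the peak load of each business process for a different-sized run,
# keeping the traffic mix from the business profile intact.
def scale_profile(peak_by_process, factor):
    return {process: round(users * factor) for process, users in peak_by_process.items()}

# Peak concurrent users from the eCommerce profile (total 8000).
peak = {"Create Order": 1000, "Browse Catalog": 6000, "Display Order": 1000}

print(scale_profile(peak, 0.25))  # quarter-load trial run
# {'Create Order': 250, 'Browse Catalog': 1500, 'Display Order': 250}
print(scale_profile(peak, 1.2))   # 120% overdrive test
# {'Create Order': 1200, 'Browse Catalog': 7200, 'Display Order': 1200}
```

The 120% factor matches the overdrive test mentioned later in the deck for isolating software and hardware limits.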
    • 13. What we are doing – Cont.
      • - Business Profile is the basis for load testing
          • It is the traffic model of the application
          • The better the documentation of the business processes, the better the test scripts and scenarios.
          • Save time on script and scenario creation
          • A good business profile makes it possible to reuse existing load testing scripts and results later.
    • 14. What we are doing – Cont.
      • 2. Create Scripts
      • - Automate business processes in LoadRunner VUGen (Virtual User Generator):
        • Scripts are C, C++-like code
        • Scripts differ with different protocols/technologies
        • LoadRunner has about 50 protocols, including WAP
      • Record user actions
        • Need assistance of SME/Functional Testing group
      • Add programming and test data in the scripts
        • E.g., add correlation to handle dynamic data such as session IDs
        • Test data may need a lot of work from the project team
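Correlation means capturing a dynamic value (such as a session ID) from one server response and substituting it into later requests. The concept can be sketched outside LoadRunner in a few lines of Python; the hidden-field pattern and URL below are hypothetical, but the idea is the same one LoadRunner implements with `web_reg_save_param`:

```python
import re

# Capture a dynamic session id from a server response so it can be reused
# in the next request -- the idea behind LoadRunner correlation.
def extract_session_id(response_body):
    match = re.search(r'name="session_id" value="([^"]+)"', response_body)
    if match is None:
        raise ValueError("session id not found; correlation would fail here")
    return match.group(1)

response = '<input type="hidden" name="session_id" value="A1B2C3"/>'
sid = extract_session_id(response)
next_url = f"/bd/startEBPP.sap?session_id={sid}"  # substituted into the next step
print(next_url)  # /bd/startEBPP.sap?session_id=A1B2C3
```

A recorded script replays the literal value captured at record time; without this substitution step, every replay after the session expires fails.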
    • 15. Sample Script

        web_submit_data("logon.sap",
            "Action=http://watstwscrm02:50000/bd/logon.sap",
            "Method=POST",
            "RecContentType=text/html",
            "Referer=http://watstwscrm02:50000/bd/startEBPP.sap",
            "Snapshot=t3.inf",
            "Mode=HTML",
            ITEMDATA,
            "Name=login_submit", "Value=true", ENDITEM,
            "Name=j_authscheme", "Value=default", ENDITEM,
            "Name=j_alias", "Value={UserName}", ENDITEM,
            "Name=j_password", "Value=coffee@2", ENDITEM,
            "Name=j_language", "Value=EN", ENDITEM,
            "Name=AgreeTerms", "Value=on", ENDITEM,
            "Name=Login", "Value=Log on", ENDITEM,
            LAST);
    • 16. What we are doing – Cont.
      • 3. Create Test Scenario
      • - Build test scenario according to usage information in Business Profile
      • Load Calculation
      • Can use rendezvous point, IP Spoofing, etc.
      • - Run-Time setting
      • Think time
      • Pacing
      • Browser Emulation: simulate browser cache, new user each iteration
      • Browser version, bandwidth, etc.
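Think time and pacing settings determine how much load a given number of virtual users actually generates. Little's Law ties the quantities together: concurrent users = throughput × time per iteration (response time + think time + any pacing delay). A quick sanity-check sketch (function and variable names are ours):

```python
# Little's Law: N = X * R, where N is concurrent users, X is throughput
# (transactions/sec) and R is the time each user spends per iteration
# (response time + think time + pacing delay).
def required_vusers(target_tps, response_time_s, think_time_s):
    return target_tps * (response_time_s + think_time_s)

def achieved_tps(vusers, response_time_s, think_time_s):
    return vusers / (response_time_s + think_time_s)

# e.g. 5 s response time plus 25 s think time per iteration:
print(required_vusers(target_tps=200, response_time_s=5, think_time_s=25))  # 6000
print(achieved_tps(vusers=6000, response_time_s=5, think_time_s=25))        # 200.0
```

This is why cutting think time in a scenario multiplies the effective load even when the virtual-user count stays fixed.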
    • 17. What we are doing – Cont.
      • 4. Execute Load Testing
      • Execute test scenarios with automated test scripts in LoadRunner Controller
      • Isolate top time transactions with low load
      • Overdrive test (120% of full load) to isolate SW & HW limitations
      • - Work with Internal Support Team to monitor the whole system, e.g., web server, DB server, middleware, etc.
    • 18. Example Parameters to Monitor
      • System - % total processor time
      • Memory - page faults/sec
      • Server work queues - bytes transferred/sec
      • HTTP Response
      • Number of connections
      • Support team will have better ideas for what to monitor
      • An individual write-up is highly recommended as part of the test report
      • Need to get CSV files, then import them into LoadRunner
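Counters collected outside LoadRunner (e.g., Windows perfmon logs from the support team) usually arrive as CSV. A minimal sketch of summarizing one counter column before importing the file, assuming a header row with named columns (the file name and column header below are hypothetical):

```python
import csv
import statistics

# Summarize one counter column from a perfmon-style CSV export:
# average and maximum over the test run, skipping empty cells.
def summarize_counter(path, column):
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f) if row[column]]
    return {"avg": statistics.mean(values), "max": max(values)}

# Example (hypothetical file and counter name):
# summarize_counter("webserver_counters.csv", "% Processor Time")
```

Even a two-line summary like this per server makes it easy to spot which tier saturated during a run before drilling into the full graphs.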
    • 19. What we are doing – Cont.
      • 5. Analyze Test Result - Analysis
      • - Collect statistics and graphs from LoadRunner
      • - Report results
        • - Most commonly requested results:
        • Transaction Response time
        • Throughput
        • Hits per sec
        • HTTP response
        • Network Delay
        • *Server Performance
        • - Merge graphs to make it more meaningful
        • Transaction response time under load
        • Response time/Vuser vs CPU utilization
        • Cross scenario graphs
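Average response time alone can hide outliers, which is why percentile figures are usually reported alongside it. A sketch of the computation from raw transaction samples (not a LoadRunner API; sample data is made up):

```python
# Percentile response time from raw transaction samples -- the 90th
# percentile is the figure most often quoted next to the average.
def percentile(samples, pct):
    ordered = sorted(samples)
    index = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[index]

# Hypothetical per-transaction response times in seconds:
samples = [1.2, 1.4, 1.1, 5.8, 1.3, 1.6, 1.2, 1.5, 1.4, 9.9]
print(f"avg = {sum(samples) / len(samples):.2f} s")  # avg = 2.64 s
print(f"p90 = {percentile(samples, 90):.1f} s")      # p90 = 5.8 s
```

Here two slow outliers more than double the average while most users saw sub-2-second responses; the percentile view makes that distinction visible in the report.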
    • 20. What we are doing – Cont.
      • 6. Test Report
      • - Don’t send LoadRunner result and graphs directly
      • - Send summary to the whole team
      • - Report key performance data and back end performance data
      • - Add notes for each test run
      • - Keep test history: for team to compare test runs
    • 21. What we are doing – Cont.
      • 7. Performance Tuning
      • - Help identify the bottlenecks and degradation points to build an optimal system
        • - Hardware, Configuration, Database, Software, etc
      • - Drill down on transaction details,
      • - e.g. webpage breakdown
      • - Diagnostics
      • - Show Extended Log to dev team
      • - Data returned by server
      • - Advanced Trace: Show logs of all VUser messages and function calls
    • 22. What we are doing – Cont.
      • 8. Communication Plan
      • - Internal Support Team:
      • - PM, BA, environment / development / architect, network, DBA, functional test lead, etc.
      • - Resource plan
    • 23. Timeline/Activities - Example
      • Test Planning, Script Creation – 4 weeks
      • Test Execution – 4 weeks
      • Trial run - 2 days
      • Round 1 – Load Testing: Response time with SLA target load: 1 week
      • Round 2 – Stress Testing: find breaking point: 1 week
      • Round 3 – Duration (Reliability) test: 2 days
      • More performance tuning – 3 days
      • Document and deliver final report – 2-3 days
    • 24. Projects
      • Projects :
      • All performance testing projects in
      • T-Mobile’s IT dept
      • 40+ projects in <3 years
      • The Standard Performance Testing Process has worked very well on all projects
    • 25. Automation Tools - Mercury LoadRunner
        • Scripting : VUGen (Virtual User Generator)
        • Performance test execution:
          • Controller – build test scenarios according to business profile and load calculation
          • Load Generator – run virtual users
        • Performance test result analysis
          • Analysis
            • provides test reports and Graphs
            • Summarize the system performance
    • 26. Automation Tools – Performance Center
      • A Web-enabled global load testing tool; the Performance Testing team can manage multiple, concurrent load testing projects across different geographic locations
          • User Site - conduct and monitor load tests
          • Privilege Manager - manage user and project access rights
          • Administration Site - overall resource management and technical supervision
    • 27. Automation Tools - Diagnostics
      • - Pinpoint Root Cause
        • Solve tough problems
        • Memory leaks and thrashing
        • Thread deadlock and synchronization
        • Instance tracing
        • Exceptions
    • 28. Diagnostics Methodology in Pre-production
      • Start with monitoring of business process
        • Which transactions are problematic
      • Eliminate system and network components
        • Infrastructure monitors and metrics
      • Isolate application Tier and method
        • Triage (using Transaction Breakdown)
      • Correct behavior and re-test
    • 29. Broad Heterogeneous Platform Support
      • WebSphere J2EE/Portal Server
      • WebLogic J2EE/Portal Server
      • JBoss, Tomcat, JServ
      • Oracle Application Server J2EE
      • MS .NET
      • Generic/Custom JAVA
      • SAP NetWeaver J2EE/Portal
      • Oracle 11i Applications
      • Siebel
    • 30. Performance Engineering - Bridge the Gap
      • 80% of IT organizations experience failures in apps that passed the test phases and rolled into production
      • HyPerformix – Performance Engineering
        • Product line: Designer, Optimizer and Capacity Manager
      • HyPerformix Optimizer (Capacity Planning): can bridge the gap between testing and production environments and leverage load test data to accurately show how the application will perform when in production.
    • 31. Performance Engineering - HyPerformix Optimizer
      • Configuration sizing, Capacity planning
      • Create production-scale models
        • – Perf. Test team and Architect team work together
      • Load test and production perf. data are seamlessly integrated with Optimizer
      • Ensure capacity is matched to current and future business requirements
      • Reduce risk before application deployment
    • 32. What Can Performance Testing Do for the Business?
      • Performance testing is critical. Competition in the market is high: customer switching cost is low, and the cost to keep customers is high
      • Performance Testing can protect revenue by helping to isolate and fix problems in the software infrastructure
      • Improve availability, functionality, and scalability of business critical applications
      • Ensure products are delivered to market with high confidence that system performance will be acceptable
      • Proactive performance testing can decrease costs of production support and help desk
      • A good Performance Testing Process is essential to get performance testing done right and on time!
    • 33. Questions?
      • [email_address]
      • [email_address]
      • Tel: (425)748-6655 (desk)
          • (425)922-7100 (cell)