PowerPoint Handouts – Presentation Transcript

  • Performance Testing Process – SASQAG, March 2007 – Emily Ren, T-Mobile
  • Why Do We Need Performance Testing?
    • Before release, managers need to know:
      • Do we have enough hardware?
      • Can we handle the target load?
      • How many users can we handle?
      • Is the system fast enough to make customers happy?
  • Nature of Performance Testing
    • It is very different from functional testing, and a very challenging job
    • It requires stellar cooperation and coordination: it is a whole team effort!
    • Automation tools are very powerful, but they are expensive and complex; training is needed
    • It can be fun too!
  • Why Do We Need Performance Testing?
    • The failure of an application can be costly
    • Assure performance and functionality under real-world conditions
    • Locate potential problems before our customers do
    • Reduce development time – multiple rounds of load testing
    • Reduce infrastructure cost
  • When we do it
    • During design and development
      • What is the best server to support target load?
      • Define system performance requirements
    • Before release
      • Is the system reliable enough to go into production?
      • After functional testing is done
    • Post-deployment
      • What is the cause of performance degradation?
  • What we are doing
    • Performance testing before release:
        • Application response times – How long does it take to complete a task?
        • Configuration sizing – Which configuration provides the best performance level?
        • Capacity planning – How many users can the system handle?
        • Regression – Does the new version of the software adversely affect response time?
        • Reliability – How stable is the system under heavy workload?
  • Load Testing Process: Plan Test → Create Scripts → Scenario Creation → Scenario Execution → Result Analysis → Performance Tuning
  • Perf. Test Planning Documents
    • Performance Testing Initial Assessment
    • - Pre-test plan document
    • - Helps the project team brainstorm their test scope
    • Performance Test Request Form
    • - Detailed information covering the whole performance testing process, including goals, environment, business processes, performance requirements (e.g., response time), usage information, internal support team, etc.
  • What we are doing
    • 1. Test Planning - Before we run load testing
    • - Set up goals
        • Measure application response time
        • Configuration sizing
        • Capacity planning
        • Regression
        • Reliability
    • - Type of testing
        • Load Testing (System performance testing with SLA target load)
        • Stress Testing (Capacity testing to find out breaking point)
        • Duration Testing (Reliability testing to test the system under load)
  • What we are doing – Cont.
    • - Identify usage information - Business Profile
        • Which business processes to use
          • BA, Dev team responsible for definition
        • Isolate peak load and peak time
          • BA, Dev, application support responsible for definition
        • Document user actions and input data for each business process
          • SME/Functional Testing team responsible for creation of business process document
  • Sample: Business Profile 1 – HR App. Business Processes (response times are totals, including think time)

        Business Process       | Total Users (%) | Avg concurrent users | Peak concurrent users | Unacceptable resp. time | Preferred resp. time
        Browse                 | 20%             | 100                  | 2000                  | > 5 min                 | 2-3 min
        Time Entry             | 60%             | 200                  | 6000                  | > 5 min                 | 3 min
        Update personal info.  | 20%             | 50                   | 1000                  | > 3 min                 | 1 min
        Total                  | 100%            | 350                  | 9000                  |                         |
  • Sample: Business Profile 2 – eCommerce Business Processes (response times are per transaction)

        Business Process | Total Users (%) | Peak Time | Peak Load (# of users) | Unacceptable resp. time | Preferred resp. time
        Create Order     | 20%             | 4-6 pm    | 1000                   | > 8 sec                 | 3-5 sec
        Browse Catalog   | 60%             | 4-6 pm    | 6000                   | > 8 sec                 | 3-5 sec
        Display order    | 20%             | 4-6 pm    | 1000                   | > 8 sec                 | 3-5 sec
        Total            | 100%            |           | 8000                   |                         |
  • What we are doing – Cont.
    • - The Business Profile is the basis for load testing (a worked load calculation follows this list)
        • It is the traffic model of the application
        • The better the documentation of the business processes, the better the test scripts and scenarios.
        • Save time on script and scenario creation
        • A good business profile makes it possible to reuse existing load testing scripts and results later.
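
    As a worked example of how the Business Profile drives the load calculation: an average-load scenario takes its Vuser counts straight from Business Profile 1 above (100 Browse + 200 Time Entry + 50 Update personal info. = 350 Vusers), while a peak or stress scenario uses the peak column instead (2000 + 6000 + 1000 = 9000 Vusers).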
  • What we are doing – Cont.
    • 2. Create Scripts
    • - Automate business processes in LoadRunner VUGen (Virtual User Generator):
      • Scripts are C/C++-like code
      • Scripts differ with different protocols/technologies
      • LoadRunner has about 50 protocols, including WAP
    • Record user actions
      • Need assistance of SME/Functional Testing group
    • Add programming and test data in the scripts
      • E.g., add correlation to handle dynamic data such as session IDs (see the sketch after the sample script)
      • Test data may need a lot of work from the project team
  • Sample Script

        web_submit_data("logon.sap",
            "Action=http://watstwscrm02:50000/bd/logon.sap",
            "Method=POST",
            "RecContentType=text/html",
            "Referer=http://watstwscrm02:50000/bd/startEBPP.sap",
            "Snapshot=t3.inf",
            "Mode=HTML",
            ITEMDATA,
            "Name=login_submit", "Value=true", ENDITEM,
            "Name=j_authscheme", "Value=default", ENDITEM,
            "Name=j_alias", "Value={UserName}", ENDITEM,
            "Name=j_password", "Value=coffee@2", ENDITEM,
            "Name=j_language", "Value=EN", ENDITEM,
            "Name=AgreeTerms", "Value=on", ENDITEM,
            "Name=Login", "Value=Log on", ENDITEM,
            LAST);
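
    The script above hard-codes most of its values; dynamic data such as a session ID has to be correlated instead, as noted under Create Scripts. A minimal sketch of how that typically looks in VUGen follows; the parameter name, boundaries, and follow-up URL are hypothetical, not from the deck:

        // Register the capture BEFORE the request whose response contains
        // the dynamic value; LoadRunner fills {SessionID} from the response.
        web_reg_save_param("SessionID",
            "LB=jsessionid=",     // hypothetical left boundary
            "RB=\"",              // hypothetical right boundary
            "Ord=1",
            "Search=Body",
            LAST);

        // ... the web_submit_data("logon.sap", ...) call shown above ...

        // Later steps replay the captured value the same way {UserName}
        // is parameterized in the sample:
        web_url("startEBPP",
            "URL=http://watstwscrm02:50000/bd/startEBPP.sap;jsessionid={SessionID}",
            "Mode=HTML",
            LAST);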
  • What we are doing – Cont.
    • 3. Create Test Scenario
    • - Build test scenario according to usage information in Business Profile
    • Load Calculation
    • Can use rendezvous points, IP spoofing, etc. (see the sketch after this list)
    • - Run-Time settings
    • Think time
    • Pacing
    • Browser Emulation: simulate browser cache, new user each iteration
    • Browser version, bandwidth, etc.
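
    A minimal sketch of the in-script side of these controls, reusing a URL from the sample script (the rendezvous name and pause length are hypothetical). Pacing, browser cache emulation, browser version, and bandwidth are set per scenario in the Run-Time Settings rather than in code:

        // Hold each Vuser here until the quorum configured in the
        // Controller arrives, so the next step hits the server together.
        lr_rendezvous("peak_start");

        web_url("startEBPP",
            "URL=http://watstwscrm02:50000/bd/startEBPP.sap",
            "Mode=HTML",
            LAST);

        // Replay a 15-second user pause; Run-Time Settings can replay
        // recorded think time as-is, scale it, or ignore it.
        lr_think_time(15);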
  • What we are doing – Cont.
    • 4. Execute Load Testing
    • Execute test scenarios with automated test scripts in LoadRunner Controller
    • Isolate the slowest transactions under low load (see the transaction sketch after this list)
    • Overdrive test (120% of full load) to isolate SW & HW limitations
    • - Work with Internal Support Team to monitor the whole system, e.g., web server, DB server, middleware, etc.
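
    Transaction response times are what the Controller and Analysis report, so each business step is wrapped in transaction markers; a minimal sketch (the transaction name and check text are hypothetical). The text check fails the transaction if the expected page never comes back, which keeps timing data honest under heavy load:

        // Fail the transaction if the expected content is missing.
        web_reg_find("Text=Welcome", "Fail=NotFound", LAST);

        lr_start_transaction("logon");
        web_url("startEBPP",
            "URL=http://watstwscrm02:50000/bd/startEBPP.sap",
            "Mode=HTML",
            LAST);
        // LR_AUTO records pass/fail based on the step's outcome.
        lr_end_transaction("logon", LR_AUTO);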
  • Example Parameters to Monitor
    • System – % Total Processor Time
    • Memory – Page Faults/sec
    • Server Work Queues – Bytes Transferred/sec
    • HTTP Response
    • Number of connections
    • Support team will have better ideas for what to monitor
    • An individual write-up is highly recommended as part of the test report
    • - Need to get CSV files, then import them into LoadRunner (one way to capture them is shown below)
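
    One common way to capture such counters to a CSV file on a Windows host is the built-in typeperf tool; a minimal sketch, where the two counter paths match the counters named on this slide and the 15-second interval and file name are arbitrary:

        typeperf "\Processor(_Total)\% Processor Time" "\Memory\Page Faults/sec" -si 15 -o counters.csv

    The resulting CSV can then be imported into LoadRunner Analysis alongside the load test results.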
  • What we are doing – Cont.
    • 5. Analyze Test Results - Analysis
    • - Collect statistics and graphs from LoadRunner
    • - Report results
      • - Most commonly requested results:
      • Transaction Response time
      • Throughput
      • Hits per sec
      • HTTP response
      • Network Delay
      • Server Performance
      • - Merge graphs to make them more meaningful
      • Transaction response time under load
      • Response time/Vuser vs CPU utilization
      • Cross scenario graphs
  • What we are doing – Cont.
    • 6. Test Report
    • - Don’t send raw LoadRunner results and graphs directly
    • - Send a summary to the whole team
    • - Report key performance data and back-end performance data
    • - Add notes for each test run
    • - Keep a test history so the team can compare test runs
  • What we are doing – Cont.
    • 7. Performance Tuning
    • - Help identify the bottlenecks and degradation points to build an optimal system
      • - Hardware, Configuration, Database, Software, etc
    • - Drill down on transaction details, e.g., web page breakdown
    • - Diagnostics
    • - Show the Extended Log to the dev team (a sketch of toggling it in-script follows this list)
    • - Data returned by server
    • - Advanced Trace: shows logs of all Vuser messages and function calls
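
    Rather than running a whole test with verbose logging, the extended log classes can be toggled from the script around just the suspect step; a minimal sketch using lr_set_debug_message (the placement is illustrative):

        // Turn on extended logging: data returned by server plus a full
        // trace of every Vuser message and function call.
        lr_set_debug_message(LR_MSG_CLASS_EXTENDED_LOG |
                             LR_MSG_CLASS_RESULT_DATA |
                             LR_MSG_CLASS_FULL_TRACE,
                             LR_SWITCH_ON);

        // ... the suspect request goes here ...

        // Switch it back off so the rest of the run stays lightweight.
        lr_set_debug_message(LR_MSG_CLASS_EXTENDED_LOG |
                             LR_MSG_CLASS_RESULT_DATA |
                             LR_MSG_CLASS_FULL_TRACE,
                             LR_SWITCH_OFF);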
  • What we are doing – Cont.
    • 8. Communication Plan
    • - Internal Support Team:
    • - PM, BA, environment / development / architect, network, DBA, functional test lead, etc.
    • - Resource plan
  • Timeline/Activities - Example
    • Test Planning, Script Creation – 4 weeks
    • Test Execution – 4 weeks
    • Trial run - 2 days
    • Round 1 – Load Testing: Response time with SLA target load: 1 week
    • Round 2 – Stress Testing: find breaking point: 1 week
    • Round 3 – Duration (Reliability) test: 2 days
    • More performance tuning – 3 days
    • Document and deliver final report – 2-3 days
  • Projects
    • All performance testing projects in T-Mobile’s IT dept
    • 40+ projects in <3 years
    • The Standard Performance Testing Process has worked very well on all projects
  • Automation Tools - Mercury LoadRunner
      • Scripting : VUGen (Virtual User Generator)
      • Performance test execution:
        • Controller – build test scenarios according to business profile and load calculation
        • Load Generator – run virtual users
      • Performance test result analysis
        • Analysis
          • Provides test reports and graphs
          • Summarizes system performance
  • Automation Tools – Performance Center
    • Web-enabled global load testing tool; the Performance Testing team can manage multiple concurrent load testing projects across different geographic locations
        • User Site - conduct and monitor load tests.
        • Privilege Manager – manage user and project access rights
        • Administration Site - for overall resource management and technical supervision
  • Automation Tools - Diagnostics
    • - Pinpoint Root Cause
      • Solve tough problems
      • Memory leaks and thrashing
      • Thread deadlock and synchronization
      • Instance tracing
      • Exceptions
  • Diagnostics Methodology in Pre-production
    • Start with monitoring of business process
      • Which transactions are problematic
    • Eliminate system and network components
      • Infrastructure monitors and metrics
    • Isolate application tier and method
      • Triage (using Transaction Breakdown)
    • Correct behavior and re-test
  • Broad Heterogeneous Platform Support
    • WebSphere J2EE/Portal Server
    • WebLogic J2EE/Portal Server
    • JBoss, Tomcat, JServ
    • Oracle Application Server J2EE
    • MS .NET
    • Generic/Custom Java
    • SAP NetWeaver J2EE/Portal
    • Oracle 11i Applications
    • Siebel
  • Performance Engineering - Bridge the Gap
    • 80% of IT Organizations experience failures in apps that passed the test phases and rolled into production
    • HyPerformix – Performance Engineering
      • Product line: Designer, Optimizer, and Capacity Manager
    • HyPerformix Optimizer (Capacity Planning): can bridge the gap between testing and production environments and leverage load test data to accurately show how the application will perform when in production.
  • Performance Engineering - HyPerformix Optimizer
    • Configuration sizing, Capacity planning
    • Create production-scale models
      • – Perf. Test team and Architect team work together
    • Load test and production perf. data are seamlessly integrated with Optimizer
    • Ensure capacity is matched to current and future business requirements
    • Reduce risk before application deployment
  • What Can Performance Testing Do for the Business?
    • Performance testing is critical. Competition in the market is high: customers’ switching costs are low, and the cost of keeping customers is high
    • Performance Testing can protect revenue by helping to isolate and fix problems in the software infrastructure
    • Improve availability, functionality, and scalability of business critical applications
    • Ensure products are delivered to market with high confidence that system performance will be acceptable
    • Proactive performance testing can decrease costs of production support and help desk
    • A good Performance Testing Process is essential to get performance testing done right and on time!
  • Questions?
    • [email_address]
    • [email_address]
    • Tel: (425)748-6655 (desk)
        • (425)922-7100 (cell)