Performance Testing Secrets in Context
Presentation Transcript

  • Performance Testing Software Systems: A Heuristic Approach. Derived from the Microsoft patterns & practices book Performance Testing Guidance for Web Applications, by J.D. Meier, Carlos Farre, Prashant Bansode, Scott Barber, and Dennis Rea. © 2007 Microsoft Corporation. All rights reserved. http://www.codeplex.com/PerfTestingGuide
    Scott Barber, Chief Technologist, PerfTestPlus, Inc., www.PerfTestPlus.com
  • Performance Testing Software Systems. Scott Barber (a.k.a. "The Perf Guy"), sbarber@perftestplus.com. Chief Technologist, PerfTestPlus, Inc. (www.perftestplus.com). Vice President & Executive Director, Association for Software Testing (www.associationforsoftwaretesting.org). Contributing Author & Expert Advisor, Performance Testing Guidance for Web Applications (www.codeplex.com/PerfTestingGuide).
  • Credits: Some of this material was developed for, or inspired by, Performance Testing Guidance for Web Applications, a Microsoft patterns & practices book by J.D. Meier, Scott Barber, Carlos Farre, Prashant Bansode, and Dennis Rea. Many ideas in this course were inspired or enhanced by colleagues including Alberto Savoia, Roland Stens, Richard Leeke, Mike Kelly, Nate White, Rob Sabourin, Chris Loosley, Ross Collard, Jon Bach, James Bach, Jerry Weinberg, Cem Kaner, Dawn Haynes, Karen Johnson, and the entire WOPR community. Most of the concepts in this presentation are derived from publications, presentations, and research written and/or conducted by Scott Barber. Many ideas were improved by students who took previous versions of this course, dating back to 2001. This course has been heavily influenced by Rapid Software Testing (James Bach & Michael Bolton, © 1995-2007 Satisfice, Inc.) and Just-In-Time Testing (Robert Sabourin, © 1998-2007 Amibug, Inc.).
  • I Assume That You:
    - Test software performance or manage someone who does.
    - Have at least some control over the design of your tests and some time to create new tests.
    - Have at least some influence over your test environment.
    - Are worried that your test process is spending too much time and resources on things that aren't important, and/or that it doesn't leave enough time and resources to determine what IS important.
    - Believe that good testing requires thinking.
    - Test under uncertainty, resource limitations, and time pressure.
    - Have a major goal of finding important problems quickly.
    - Want to get very good at testing software performance.
  • "There is no such thing as a 'junior performance tester'... but there are people who are new to performance testing." --Scott Barber
  • Instructional Methods That I Use:
    - The Class Presents My Editorial Opinions: I do not make appeals to authority; I speak only from my experiences, and I appeal to your experience and intelligence.
    - Not All Slides Will Be Discussed: There is much more material here than I can cover in detail, so I may skip some of it. (If you want me to go back to something that I skipped, just ask.)
    - I Need to Hear from You: You control what you think and do, so I encourage your questions about, and challenges to, the lecture. (Talk to me during the break, too.)
    - If You Want Specifics, Bring Specifics: I invite you to bring real examples of testing problems and test documents to class. (I am happy to show you how I would work through them.)
    - The Exercises Are the Most Important Part: I sometimes use immersive Socratic exercises that are designed to fool you if you don't ask questions. I usually do not provide all the information you need. Asking questions is a fundamental testing skill!
    (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Instructional Methods That I Use: If I call on you and you don't want to be put on the spot, just say "Pass!" (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • What Not to Expect From Me:
    - Untested theory.
    - Marketing fluff.
    - Pulled punches to protect the guilty.
    - The "One True Answer" to anything.
    - Every concept to apply, precisely as presented, to every context.
    - Oversimplifications without acknowledgment.
    - A dispassionate, boring instructor!
  • Primary Goal of This Course: To teach you how to think about, organize, and manage performance testing effectively, under time and resource constraints, by examining nine core principles common to successful performance testing projects and how you can rapidly apply those principles to your project context.
  • Secondary Goal of This Course: To introduce you to applying heuristics and oracles so that you can more efficiently and effectively achieve the objectives of your performance testing projects.
  • "Let's face the truth: performance testing *IS* rocket science." --Dawn Haynes
  • Mnemonic: Mnemonics make difficult-to-remember information less difficult to remember. Adjective: "assisting or intended to assist the memory." Noun: "thing intended to assist the memory, as a verse or formula." "Mnemonics Neatly Eliminate Man's Only Nemesis: Insufficient Cerebral Storage" is an acrostic mnemonic for remembering how to spell "mnemonics." (http://www.mnemonic-device.eu)
  • Mnemonics in this Course:
    - Were created by Scott Barber to help Scott organize and remember stuff.
    - Have been, and will be, revised when revisions help Scott remember stuff better, or remember better stuff.
    - Have been used and liked by some people other than Scott.
    - Have proven to be "not so memorable" for some people.
    - If these mnemonics don't work for you, create your own!
  • Heuristics: Heuristics bring useful structure to problem-solving skill. Adjective: "serving to discover." Noun: "a fallible method for solving a problem or making a decision." "Heuristic reasoning is not regarded as final and strict but as provisional and plausible only, whose purpose is to discover the solution to the present problem." - George Polya, How to Solve It. (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Types of Heuristics:
    - Guideword Heuristics: Words or labels that help you access the full spectrum of your knowledge and experience as you analyze something.
    - Trigger Heuristics: Ideas associated with an event or condition that help you recognize when it may be time to take an action or think a particular way. Like an alarm clock for your mind.
    - Subtitle Heuristics: Help you reframe an idea so you can see alternatives and bring out assumptions during a conversation.
    - Heuristic Model: A representation of an idea, object, or system that helps you explore, understand, or control it.
    - Heuristic Procedure or Rule: A plan of action that may help solve a class of problems.
    (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Some Everyday Heuristics:
    - It's dangerous to drink and drive.
    - A bird in hand is worth two in the bush.
    - Nothing ventured, nothing gained.
    - Sometimes people stash their passwords near their computers. Try looking there.
    - Stores are open later during the holidays.
    - If your computer is behaving strangely, try rebooting. If it's very strange, reinstall Windows.
    - If it's a genuinely important task, your boss will follow up; otherwise, you can ignore it.
    (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Heuristics ≠ Process:
    - A heuristic is not an edict.
    - Heuristics require the guidance and control of a skilled practitioner.
    - Heuristics are context-dependent.
    - Heuristics may be useful even when they contradict each other; especially when they do!
    - Heuristics can substitute for complete and rigorous analysis.
    (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Heuristics in this Course:
    - Were created by Scott to help Scott organize and remember stuff.
    - Are fallible (ask Scott about times when they have failed him).
    - Are incomplete (ask Scott about times when he used these and still missed "stuff").
    - Are not all relevant to every project and every context.
    - If these heuristics don't work for you, create your own (and update the mnemonic)!
  • Principles ≠ Process: "One-size-fits-all" approaches fit performance testing poorly. In my experience, successful performance testing projects involve active decisions related to each of the core principles in this course. Core principles are neither exclusive nor sequential; they go by many different names, have varying priorities, and may be implicit or explicit. Core principles are not in themselves an approach or process; they represent a foundation upon which to build a process or approach based on the context of your project.
  • Performance Testing Principles: (A mnemonic of guideword heuristics)
  • Performance Testing Principles:
    - Project context is central to successful performance testing.
    - Business, project, system, & user success criteria.
    - Identify system usage and key metrics; plan and design tests.
    - Install and prepare the environment, tools, & resource monitors.
    - Script the performance tests as designed.
    - Run and monitor tests.
    - Validate tests, test data, and results.
    - Analyze the data individually and as a cross-functional team.
    - Consolidate and share results, customized by audience.
    - "Lather, rinse, repeat" as necessary.
  • When assessing project context, I use COPE in PUBS. (Another mnemonic of guideword heuristics)
  • Why do we test software system performance?
    - To determine compliance with requirements?
    - To evaluate release readiness?
    - To assess user satisfaction?
    - To assist in performance tuning?
    - To estimate capacity?
    - To validate assumptions?
    - To generate marketing statements?
  • Do you know your performance testing mission? Do you know the "Commander's Intent"? Can you find out? Might COPE in PUBS help? An example from my days as a U.S. Army LT:
    - Mission: Secure hilltop 42 NLT 0545 tomorrow.
    - Commander's Intent: It is my intent that the supply convoy safely cross the bridge spanning the gorge between hilltop 42 and hilltop 57 between 0553 and 0558 tomorrow.
  • Instructions: Assemble into groups of 3-5 people (4 is ideal) that you don't normally work with. Use COPE in PUBS to describe your current or most recent performance testing project to one another. Pick the most interesting and jot down some notes about its context using COPE in PUBS. Be prepared to brief the class on the context of the project you chose AND be prepared to answer questions about the contexts of your teammates' projects.
  • Performance Criteria are boundaries dictated or presumed by someone or something that matters.
    - Goals: Soft boundaries (user satisfaction).
    - Requirements: Firm boundaries (business or legal).
    - Thresholds: Hard boundaries (laws of physics).
    - Constraints: Arbitrary boundaries (budget or timeline).
  • Performance Goals are soft boundaries, typically representing the opinion of someone who matters.
    - Typically target whole-system performance characteristics.
    - Are often "unsubstantiated opinions."
    - Are often unattainable.
    - Must be well qualified, but can be loosely quantified.
  • Performance Requirements are firm boundaries, frequently derived from contracts (that matter).
    - Are actually required to pass in order to go live.
    - Are often externally dictated.
    - Are often continually monitored in production.
    - Are typically legally enforceable.
    - Must be both well qualified and quantified.
  • Performance Thresholds are hard boundaries that represent physical properties of a system (that matters).
    - Are the maximum acceptable values for component-level resources of interest.
    - Are based on published hardware or software performance recommendations, or on direct observation.
    - Are practically non-negotiable without an environmental change.
  • Performance Constraints are arbitrary boundaries set or presumed by someone who matters (but likely doesn't get "it").
    - Are boundaries imposed by people, traditions, and/or assumptions.
    - Are sometimes exceedingly difficult to challenge.
    - Are almost always worth questioning if they jeopardize the project.
  • Performance Testing Objectives: what we actually hope to gain by testing performance.
    - Are sometimes completely unrelated to stated requirements, goals, thresholds, or constraints.
    - Should be the main drivers behind performance test design and planning.
    - Usually indicate the performance-related priorities of project stakeholders.
    - Will frequently override goals in "go-live" decisions.
    How do we know if we're meeting our objectives?
  • For a video lecture, see: http://www.satisfice.com/bbst/videos/BBSTORACLES.mp4 (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • An oracle is the principle or mechanism by which you recognize a problem. "...it works" means "...it appeared at least once to meet some requirement to some degree." (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Without an oracle you cannot recognize a problem. If you think you see a problem, you must be using an oracle. (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Consistency ("this agrees with that") is an important theme in oracles:
    - History: The present version of the system is consistent with past versions of it.
    - Image: The system is consistent with an image that the organization wants to project.
    - Comparable Products: The system is consistent with comparable systems.
    - Claims: The system is consistent with what important people say it's supposed to be.
    - Users' Expectations: The system is consistent with what users want.
    - Product: System elements are consistent with comparable elements in the system.
    - Purpose: The system is consistent with its purposes, both explicit and implicit.
    - Statutes: The system is consistent with applicable laws and legal contracts.
    - Familiarity: The system is not consistent with the pattern of any familiar problem.
    Consistency heuristics rely on the quality of your models of the product and its context. (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • All Oracles Are Heuristic:
    - We often do not have oracles that establish a definite correct or incorrect result in advance. That's why we use abductive inference.
    - No single oracle can tell us whether a program (or a feature) is working correctly at all times and in all circumstances. That's why we use a variety of oracles.
    - Any program that looks like it's working, to you, may in fact be failing in some way that happens to fool all of your oracles. That's why we proceed with humility and critical thinking.
    - You (the tester) can't know the deep truth about any result. That's why we report whatever seems likely to be a bug.
    (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Coping With Difficult Oracle Problems:
    - Ignore the Problem: Ask "so what?" Maybe the value of the information doesn't justify the cost.
    - Simplify the Problem: Ask for testability; it usually doesn't happen by accident. Built-in oracle: internal error detection and handling. Lower the standards: you may be using an unreasonable standard of correctness or goodness.
    - Shift the Problem: Parallel testing: compare with another instance of a comparable algorithm. Live oracle: find an expert who can tell if the output is correct. Reverse the function (e.g., if 2 x 2 = 4, then 4 / 2 = 2).
    (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
  • Coping With Difficult Oracle Problems:
    - Divide and Conquer the Problem: Spot check: perform a detailed inspection on one instance out of a set of outputs. Blink test: compare or review overwhelming batches of data for patterns that stand out. Easy input: use input for which the output is easy to analyze. Easy output: some output may be obviously wrong, regardless of input. Unit test first: gain confidence in the pieces that make the whole. Test incrementally: gain confidence by testing over a period of time.
    (Slide adapted from Rapid Software Testing by James Bach & Michael Bolton, © 1995-2007, Satisfice, Inc.)
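  A minimal sketch of two of these tactics ("reverse the function" and "easy input") in Python. The system-under-test function convert() and its cheap inverse are hypothetical stand-ins, not from the slides:

      import random

      def convert(x):
          # Stand-in for the system under test (hypothetical).
          return x * 2

      def invert(y):
          # Cheap inverse used as the oracle.
          return y / 2

      def reverse_function_oracle(trials=1000):
          """Feed each output back through the inverse and flag any
          input that does not round-trip within tolerance."""
          failures = []
          for _ in range(trials):
              x = random.uniform(-1e6, 1e6)
              if abs(invert(convert(x)) - x) > 1e-9:
                  failures.append(x)
          return failures

      # "Easy input": values whose correct output is known by inspection.
      assert convert(0) == 0
      assert convert(1) == 2
      print("reverse-function failures:", len(reverse_function_oracle()))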
  • Instructions: Reassemble into your group. Identify at least 3 performance criteria of each type (goals, requirements, thresholds, and constraints) for your project. Based on your criteria, identify the top 5 performance testing objectives for your project. Identify your oracles for those top 5 objectives. Be prepared to brief the class on your criteria, objectives, and oracles.
  • "Enterprise-grade load generation tools are designed to look easy in sales demos. Don't be fooled." --Scott Barber
  • To help me decide what tests to design, I use... (An acronym of guideword heuristics)
  • Do I need this test to: Investigate or Validate, End-to-end or Component, response Time and/or Resources utilized, under Anticipated or Stressful conditions? (IVECTRAS)
  • Communicating Design
  • When Building Usage Models, I... (Yet another mnemonic of guideword heuristics)
  • FIBLOTS:
    - Frequent: Common activities (get from logs).
    - Intensive: Resource hogs (get from developers/admins).
    - Business-critical: Even if these activities are both rare and not risky.
    - Legal: SLAs, contracts, and other stuff that will get you sued.
    - Obvious: What the users will see and are most likely to complain about; what is likely to earn you bad press.
    - Technically risky: New technologies, old technologies, places where it's failed before, previously under-tested areas.
    - Stakeholder-mandated: Don't argue with the boss (too much).
  • Communicating System Usage
  • Intent of Investigation: Collect configuration data for tuning. Collect data to assist in validating the existing network.
    Prerequisites: Static prototype deployed on future production hardware.
    Tasks: Determine network bandwidth; validate firewalls & load balancer; evaluate web server settings.
    Tools & Scripts: Load generation tool; HTTP scripts to request objects of various sizes from a pool of IP addresses.
    External Resources Needed: Firewall, load balancer, and network admins; network monitors; 20 IP addresses for spoofing.
    Risks: Schedule delay; availability of administrators; configuration of the load generation tool for IP spoofing.
    Data of Special Interest: Network bandwidth & latency; load balancer effectiveness; resource consumption; response times.
    Areas of Concern: No internal expertise on load balancer configuration.
    Pass/Fail Criteria: Adequate available bandwidth; architectural assumptions validated.
    Completion Criteria: Critical data collected and assumptions validated.
    Planned Variants: 1 to 20 IPs; volume of 1 to 500; size from 1 KB to 1 MB; configuration settings.
    Execution Duration: 6 days (2 days each for network & bandwidth, firewall and load balancer, and web server configuration).
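  One way to turn the "Planned Variants" above into an executable test matrix. The endpoints (1-20 IPs, volume 1-500, 1 KB-1 MB) come from the slide; the intermediate sample points and the enumeration itself are a hypothetical sketch:

      from itertools import product

      ip_counts = [1, 5, 10, 20]              # spanning "1 to 20 IPs"
      volumes = [1, 50, 100, 250, 500]        # spanning "volume of 1 to 500"
      payload_sizes_kb = [1, 10, 100, 1024]   # spanning "1 KB to 1 MB"

      test_matrix = [
          {"ips": ips, "volume": vol, "size_kb": size}
          for ips, vol, size in product(ip_counts, volumes, payload_sizes_kb)
      ]
      print(len(test_matrix), "planned variants")   # 80 combinations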
  • Instructions: Reassemble into your group. Referencing your previous work, use IVECTRAS and FIBLOTS to design the top-priority performance tests for your team's project. Revisit your oracles. Will they work as planned? If not, choose new oracles. Be prepared to brief the class on your tests and your oracles.
  • (Diagram; only its labels survived transcription: "Test Harness", "Simulated User Load")
  • Preparing for execution:
    - SUT, load generator, monitors, tools, utilities, processes, & team.
    - Installations, data, hooks, stubs, harnesses, & schedule.
    - Configurations, assumptions, design, models, integrations, gaps, & support.
    - Validate everything so far; prepare for execution.
  • "Only performance testing at the conclusion of system or functional testing is like ordering a diagnostic blood test after the patient is dead." --Scott Barber
  • Instructions: Reassemble into your group. Use the 9 core principles of successful performance testing projects to create an approach or process for your team's project. Consider how this approach or process will mesh with the overall project approach or process. Be prepared to brief the class on your process.
  • "MacGyver is a super-hero, *not* a career path." --Scott Barber
  • When creating scripts, I try to: (Yet another mnemonic of guideword heuristics)
  • Ensure obvious functional errors are detected. Consider "pre-script" validation and/or transformation. If you think there are 3 ways to do something, users will find 5; the other two could be performance killers. Vary data to avoid unintentional caching and to determine the difference between "speed" and "volume" effects. "Super-users" yield "stinky scripts." Not logging out can leave resource-consuming session artifacts. The step model is not just unrealistic; it can invalidate results. If you don't design your scripts to be maintainable, they won't be maintained.
  • Load generation tools: do not interact with client-side portions of the application; do not natively evaluate the correctness of returned pages; often don't handle conditional navigation; do not handle abandonment well. Scripting concepts: record, EDIT, playback; add data variance; add delays; add conditional logic; add code to evaluate correctness of key pages; add abandonment functions (sketched below).
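  A minimal sketch of what those post-recording edits can look like, in Python with only the standard library. The URL, the "Welcome" content check, and the 8-second abandonment threshold are all assumptions for illustration, not from the slides:

      import random
      import time
      import urllib.request

      BASE_URL = "http://test-env.example.com"   # hypothetical system under test

      def fetch(path):
          """Request a page and time it; raw recordings rarely do either check."""
          start = time.monotonic()
          with urllib.request.urlopen(BASE_URL + path, timeout=30) as resp:
              body = resp.read()
          return body, time.monotonic() - start

      def virtual_user(user_id):
          # Data variance: unique credentials per simulated user.
          username = f"loadtest_user_{user_id}"
          body, elapsed = fetch("/login?user=" + username)

          # Correctness check: an HTTP 200 wrapping an error page still fails.
          if b"Welcome" not in body:
              raise AssertionError(f"{username}: login page content looks wrong")

          # Conditional logic + abandonment: real users give up on slow pages.
          if elapsed > 8.0:
              return "abandoned"

          time.sleep(random.uniform(2, 10))   # think-time delay before next step
          return "completed"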
  • Real Users React: Ensure your tests represent the fact that real users react to the application.
    - Vary Data: Make sure that data being entered is unique for each simulated user, and that each simulated user is unique (this may mean more than just separate IDs and passwords).
    - Vary Navigation Paths: If there is more than one way for a user to accomplish a task in the application, your test must represent that. Different paths through the system often stress different parts of the system. (See the sketch below.)
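  A sketch of both kinds of variance; the paths, weights, and user attributes are illustrative assumptions, not from the slides:

      import random

      # Weighted navigation paths: different routes stress different components.
      PATHS = [
          (["/search", "/product", "/cart", "/checkout"], 0.5),
          (["/category", "/product", "/cart", "/checkout"], 0.3),
          (["/deep-link/product", "/cart", "/checkout"], 0.2),
      ]

      def choose_path():
          routes, weights = zip(*PATHS)
          return random.choices(routes, weights=weights, k=1)[0]

      def unique_user_data(user_id):
          """More than separate IDs and passwords: vary the attributes that
          drive caching, session size, and data distribution."""
          return {
              "username": f"vu_{user_id}",
              "zip_code": random.choice(["32043", "98101", "10001"]),
              "cart_items": random.randint(1, 7),
          }

      for uid in range(3):
          print(unique_user_data(uid), choose_path())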
  • Users Think... and Type: Guess what? They all do it at different speeds! Guess what else? It's your job to figure out how to model and script those varying speeds. To determine how long they think, use log files, industry research, observation, and educated guesses/intuition; combinations are best.
  • Abandonment: If a page takes too long to display, users will eventually abandon your site, thus lessening the load and changing the overall performance. Not simulating abandonment makes your test unintentionally more stressful than real life. (A sketch follows below.)
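  A minimal sketch of one way to model this; the 5- and 30-second thresholds and the linear ramp between them are assumptions, not from the slides:

      import random

      def should_abandon(elapsed_seconds):
          """Probabilistic abandonment: the longer the wait, the more
          simulated users leave instead of issuing the next request."""
          if elapsed_seconds <= 5:
              return False              # assume nearly everyone waits 5 s
          if elapsed_seconds >= 30:
              return True               # assume nearly no one waits 30 s
          # Linear ramp in abandonment probability between the thresholds.
          return random.random() < (elapsed_seconds - 5) / 25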
  • Delays: Every page has a think time. After you determine the think time for a page, document it. These think times should cause your script to pace like real users. For example (procedure: Initial Navigation; values presumably in seconds):

      Timer name              Type      Min  Max  Std  Req't  Goal
      tmr_home_page           negexp    4    N/A  N/A  8      5
      tmr_login               normdist  2    18   4.5  8      5
      tmr_page1               linear    5    35   N/A  8      5
      tmr_data_entry          negexp    8    N/A  N/A  8      5
      tmr_page2               normdist  3    9    3    5      3
      tmr_submit_transaction  linear    2    4    N/A  5      3
      tmr_signout             N/A       N/A  N/A  N/A  8      5
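  A sketch of sampling delays from the table's three distribution types. Treating the Min column as the mean for the negative-exponential timers is my assumption, since the slide does not say how that type is parameterized:

      import random

      def think_time(dist, min_s, max_s=None, std=None):
          """Sample one think-time delay, per the distribution types above."""
          if dist == "negexp":        # negative exponential; Min treated as mean
              return random.expovariate(1.0 / min_s)
          if dist == "normdist":      # normal, clipped into [min, max]
              mean = (min_s + max_s) / 2.0
              return min(max(random.gauss(mean, std), min_s), max_s)
          if dist == "linear":        # uniform across the range
              return random.uniform(min_s, max_s)
          raise ValueError(f"unknown distribution: {dist}")

      print(round(think_time("negexp", 4), 1))             # tmr_home_page
      print(round(think_time("normdist", 2, 18, 4.5), 1))  # tmr_login
      print(round(think_time("linear", 5, 35), 1))         # tmr_page1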
  • (Two charts; only their panel titles survived transcription: "Not Frightening" vs. "Frightening")
  • Instructions: Reassemble into your group. Discuss the most challenging scripting problems you anticipate based on your test design. Consider alternatives that achieve the same degree of realism while minimizing the scripting challenge. Be prepared to brief the class on your scripting challenges.
  • To remind me that execution doesn't simply mean "break it", I recall: (A mnemonic of guideword heuristics. Not to mention an oddly placed Shakespeare reference.)
  • Determine what the scripts actually do (accuracy). Check scripts, data, etc. for consistency (precision). If it's worth checking more than once, it's likely worth trending. Vary between alternate extremes over a definable period. Establish an understood, reliable point of reference. Increase the load systematically until learning stops. Usage won't be exactly what you think; ask "what if...?" Work as a collaborative, cross-functional team. When something looks odd, "beat on it to see if it breaks."
  • Execution Heuristics: "1, 3, 7, 11, More" (sketched below). "Best, Expected, Worst." "Marching & Resonance." "What if Greenspan sneezes?" "First user on Monday." "UAT under load." "If I can't break it, I don't understand it."
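  A sketch of what the "1, 3, 7, 11, More" ramp might look like in practice. The run function and its timings are stand-ins, and stopping "when learning stops" is reduced here to a simple plateau check; a real run would involve human judgment:

      def run_at(users):
          # Stand-in for one timed test run at the given concurrency;
          # a real run would drive the load tool and collect metrics.
          return {"users": users, "p90_response_s": 0.5 * users ** 1.3}

      def ramp(levels=(1, 3, 7, 11, 15, 20)):
          """Step load up in small, uneven increments ("1, 3, 7, 11, More"),
          continuing only while each step still teaches us something."""
          previous = None
          for users in levels:
              result = run_at(users)
              print(result)
              if previous and result["p90_response_s"] < previous * 1.05:
                  break    # response time plateaued; learning stopped here
              previous = result["p90_response_s"]
          return users

      ramp()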
  • Instructions: Reassemble into your group. Spend a few minutes jotting down execution heuristics that may be valuable for your team's project. Be prepared to describe your heuristics to the class.
  • "With an order of magnitude fewer variables, performance testing could be a science. But for now, performance testing is at best a scientific art." --Scott Barber
  • When I'm analyzing, I remind myself to: (A mnemonic of guideword heuristics)
  • Results are meaningless without technical context. Don't over-trust results until you can repeat them: within the test run, across tests, across data, etc. If you can repeat it, or it's more than 1% of observations, it's not an outlier. Graph, blink, overlay, compare, and contrast. If it can get you sued, check it every time. Ask how well the results represent reality. This is where users care and symptoms are found. If it's broken, performance doesn't matter.
  • Methods: Blink. De-focus & re-focus. Overlay plots. Bucket. Look for odd. Be derivative. Ditch the digits. Un-average averages. Manual inspection.
  • Facts: Analysis is a team sport. We cannot prove anything. Focus on patterns, trends, and feelings. Numbers are meaningless out of context. Qualitative feedback is at least as relevant as quantitative feedback.
  • All three have an average of 4. Which has the "best" performance? How do you know?
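  The three data sets on this slide were graphical, so here is an invented stand-in that makes the same point: three response-time samples that all average 4 yet behave very differently, which the mean alone cannot show:

      import statistics

      a = [4, 4, 4, 4, 4, 4, 4, 4]      # steady
      b = [1, 1, 1, 1, 7, 7, 7, 7]      # bimodal
      c = [3, 3, 3, 3, 3, 3, 3, 11]     # one painful outlier

      for name, series in (("a", a), ("b", b), ("c", c)):
          print(name,
                "mean:", statistics.mean(series),
                "stdev:", round(statistics.stdev(series), 2),
                "max:", max(series))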
  • Instructions: Reassemble into your group. Pay attention; I'm going to explain this group of exercises orally. Be prepared to describe your findings to the class.
  • "Linear extrapolation of performance test results is, at best, black magic. Don't do it (unless your name is Connie Smith, Ph.D., or Daniel Menasce, Ph.D.)." --Scott Barber
  • I name good reports: (A mnemonic of guideword heuristics)
  • Stakeholders need data to make decisions, and many decisions can't wait until tomorrow. Reports are only interesting if they contain data that is useful. A great report for developers is probably a lousy report for executives. Try to use pictures over numbers, and numbers over words; save words for recommendations. Strive to make reports compelling without explanation. Unless you are hiding something, make the supporting data available to the team.
  • Facts: Most people will never read performance test results documents. Most people don't really understand the components underlying performance. It is our job to make it easy for them to understand, and to understand quickly. Being skilled at graphical presentation of technical information is critical for helping others understand the message we are delivering. Confusing charts and tables lead to wrong decisions, costing money and ruining reputations.
  • What consumers of reports want: Answers... NOW! (They might not even know the question.) To understand information intuitively. Simple explanations of technical information. To be able to make decisions quickly, and to have the information to support those decisions. "Trigger phrases" to use with others. Concise summaries and conclusions. Recommendations and options.
  • What consumers of reports usually get:
  • Strive for something better: Concise verbal descriptions. Well-formed, informative charts (pretty pictures). Focus on requirements and business issues. Don't be afraid to make recommendations or draw conclusions! Make all supporting data available to everyone, all the time (don't sit on data because "they won't understand it"). Report ≠ Document. Report *AT LEAST* every 48 hours during execution.
  • Inspired by "ET": Edward Tufte, Ph.D., Professor Emeritus of political science, computer science and statistics, and graphic design at Yale. According to ET: Power Corrupts...
  • PowerPoint Corrupts Absolutely.
  • Relative Performance
  • Graphs Make Some Things Obvious
  • Trends, Trends, Trends!!!
  • Instructions: Reassemble into your group. Spend a few minutes discussing and sketching 1 graphic that would convey to stakeholders the key information you don't believe they are getting now. Be prepared to let the class assess your graphic.
  • Don't confuse "delivery" with "done." You will never have enough data (statistically), even if you already have too much (to parse effectively). Ask "rut or groove?"; don't let complacency be your guide. If you run out of new ideas, take old ideas to new extremes. Above all else, ask: "What test that I can do right now will add the most informational value to the project?"
  • Performance Testing Principles:
    - Project context is central to successful performance testing.
    - Business, project, system, & user success criteria.
    - Identify system usage and key metrics; plan and design tests.
    - Install and prepare the environment, tools, & resource monitors.
    - Script the performance tests as designed.
    - Run and monitor tests.
    - Validate tests, test data, and results.
    - Analyze the data individually and as a cross-functional team.
    - Consolidate and share results, customized by audience.
    - "Lather, rinse, repeat" as necessary.
  • Questions
  • Contact Info: Scott Barber, Chief Technologist, PerfTestPlus, Inc. E-mail: sbarber@perftestplus.com. Web site: www.PerfTestPlus.com