Top Ten Secret Weapons For Agile Performance Testing
  • In a conventional project, we focus on the functionality that needs to be delivered. Performance might be important, but performance requirements are considered quite separately from functional requirements. One approach is to attach “conditions” to story cards, i.e. this functionality must handle a certain load. In our experience, where performance is of critical concern, pull out the performance requirement as its own story…
  • Calling out performance requirements as their own stories allows you to: validate the benefit you expect from delivering the performance; prioritise performance work against other requirements; and know when you’re done.
  • Not sure if you like this picture; I was really looking for a good shot looking out over no-man’s land at the Berlin Wall. I want the idea of divisions along skill lines breeding hostility and non-cooperation.
  • Everything should be based on some foreseeable scenario, and on who benefits from it. Harder to do without repetition (involvement and feedback) [not sure if this makes sense anymore]. Extremely important to keep people focused, as it’s easy to drift. Capture different profiles. Separate simulation from optimisation -> problem identification vs problem resolution (or, broken down further, solution brainstorm -> solution investigation). Linking back to why is even more essential -> map to existing problems or fears. Latency vs throughput -> determine which is the most useful metric and define service level agreements (see the sketch after this note).
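A minimal sketch of turning those two metrics into something checkable, assuming a hypothetical results.csv of timestamp,latency rows and illustrative SLA numbers (not figures from the project):

    # Compute throughput and p95 latency from a results log, then compare them
    # against an agreed service level. All numbers here are illustrative.
    import csv

    SLA_P95_SECONDS = 0.2        # hypothetical latency target
    SLA_MIN_REQ_PER_SEC = 100    # hypothetical throughput target

    def load_results(path):
        """Each row is assumed to be: timestamp_seconds,latency_seconds."""
        with open(path) as f:
            return [(float(t), float(l)) for t, l in csv.reader(f)]

    def percentile(values, fraction):
        ordered = sorted(values)
        return ordered[int(fraction * (len(ordered) - 1))]

    def summarise(results):
        timestamps = [t for t, _ in results]
        latencies = [l for _, l in results]
        duration = (max(timestamps) - min(timestamps)) or 1.0
        return len(results) / duration, percentile(latencies, 0.95)

    if __name__ == "__main__":
        throughput, p95 = summarise(load_results("results.csv"))
        print("throughput: %.1f req/s, p95 latency: %.3f s" % (throughput, p95))
        print("meets SLA:", throughput >= SLA_MIN_REQ_PER_SEC and p95 <= SLA_P95_SECONDS)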
  • http://www.flickr.com/photos/denniskatinas/2183690848/ Not sure which one you like better.
  • Here’s an example... (in the style of Feature Injection) “What’s our upper limit?”
  • Here’s another example... (in the style of Feature Injection): “Can we handle peaks in traffic again?” So that we have confidence in meeting our SLA, as the Operations Manager, I want to ensure that a sustained peak load does not take out our service.
  • It helps to be clear about who is going to benefit from any performance testing (tuning and optimisation) that is going to take place. Ensure that they get a stake in prioritisation; that will help with the next point...
  • Evidence-based decision-making. Don’t commit to a code change until you know it’s the right thing to do.
  • Evidence-based decision-making. Don’t commit to a code change until you know it’s the right thing to do.
  • It helps to have the customer (mentioned on the previous slide) be a key stakeholder in prioritisation.
  • Playing performance early makes the application easier to performance test, just as TDD changes the design/architecture of a system (need to find a reference for this). Measuring it early helps show which changes contribute to slowness. Performance work takes longer: lead times are potentially large and sequential – think of where a Gantt chart may actually be useful – so run it as a parallel track of work alongside normal functionality (not sequentially). Environment availability is minimal (expensive, non-concurrent use). You need minimal functionality, or at least clearly defined interfaces, to operate against. You want to have some time to respond to feedback -> work that into the process as early as possible and potentially change the architecture/design.
  • Start with the simplest performance test scenarios -> sanity test/smoke test -> hit all aspects -> use it to drive out automated deployment (environment limitations, configuration issues, a minimal set of reporting needs – green/red) -> hit integration boundaries, but with a small problem rather than everything (a minimal smoke-test sketch follows this note). The next story might be a more complex script or something that drives out more of the infrastructure. Performance stories should not be: -> build-out tasks -> things that do not enhance anything without other stories. Log files -> define their contents early; consumer driven; contracts for analysis; keep them around; keep notes on what was varied. INVEST stories. Avoid the large “performance test” story. Separate types of stories: optimise vs measure. Optimise involves riskier, less known components; “done” is difficult to estimate. Measure is clearer and allows you to make better informed choices. Know when to stop – when enough is enough.
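As a sketch of the “simplest scenario first” idea, here is a sanity/smoke test that hits one endpoint a handful of times and fails loudly; the URL, request count and threshold are all hypothetical placeholders:

    # The thinnest possible performance story: a smoke test that exercises the
    # whole path end to end and goes red if anything is slow or broken.
    import time
    import urllib.request

    URL = "http://localhost:8080/portfolio"   # hypothetical endpoint
    REQUESTS = 10                             # deliberately tiny load
    MAX_SECONDS = 0.5                         # illustrative threshold

    def smoke_test():
        for _ in range(REQUESTS):
            start = time.time()
            with urllib.request.urlopen(URL) as response:
                assert response.status == 200, "unexpected status %s" % response.status
            elapsed = time.time() - start
            assert elapsed <= MAX_SECONDS, "too slow: %.3f s" % elapsed
        print("smoke test passed: %d requests, each under %.1f s" % (REQUESTS, MAX_SECONDS))

    if __name__ == "__main__":
        smoke_test()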
  • The best lessons are learned from iterating, not from incrementing. Iterate over your performance test harness, framework and test fixtures. Make it easier to increment into new areas by incrementing in a different direction each time. - Start with simple performance test scenarios - Don’t build too much infrastructure at once - Refine the test harness and the things used to create more tests - You should always be delivering value - Identify useful features in performance testing and involve the stakeholder(s) to help prioritise them. Prioritise and schedule in analysis stories (metrics and graphs). Some of this work will still be big.
  • Sashimi is nice and bite sized. You don’t eat the entire fish at once. You’re eating a part of it. Sashimi slices are nice and thin. There are a couple of different strategies linking this in. Think of sashimi as the thinnest possible slice.
  • Number of requests over time
  • Latency over time
  • “I don’t want to click through to each graph”
  • “I don’t want to click through to each graph”
  • “I don’t want to click through to each graph”
  • An automated build is a key XP practice. The first stage of automating a build is often to automate compilation. However, for a typical project, we go on after compilation to run tests, as another automated step. In fact, we may have a whole series of automated steps that chain one after another, automating many aspects of the development process, all the way from compiling source to deploying a complete application into the production environment.
  • Automation is a powerful lever in software projects because: it gives us reproducible, consistent processes; we get faster feedback when something goes wrong; and we get overall higher productivity – we can repeat an automated build much more often than we could if it were manual.
  • In performance testing we can automate many of the common tasks, in a similar way to how we automate a software build. For any performance test, there is a linear series of activities that can be automated (first row of slide). In our recent projects we’ve been using the build tool Ant for most of our performance scripting. You could use any scripting language, but here are some very basic scripts to show you the kind of thing we mean (a rough sketch follows this note)… [possibly animate transitions to the 4 following slides] Once we’ve automated the running of a single test, we can move on to even more aspects of automation, such as scheduling and result archiving, which lead us into… continuous performance testing.
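The project’s actual scripts were Ant targets; as a rough, language-neutral illustration of the same linear chain, here is a Python sketch in which every command, path and scenario name is a placeholder rather than the real tooling:

    # One repeatable performance test run: deploy, generate load, analyse,
    # archive. Any failing step stops the run, just like a broken build step.
    import os
    import shutil
    import subprocess
    import time

    def deploy_application():
        subprocess.run(["./deploy.sh", "performance-environment"], check=True)

    def generate_load():
        subprocess.run(["./run-load.sh", "--scenario", "sustained-peak"], check=True)

    def analyse_results():
        subprocess.run(["./analyse.sh", "results.csv"], check=True)

    def archive_results():
        os.makedirs("archive", exist_ok=True)
        stamp = time.strftime("%Y%m%d-%H%M%S")
        shutil.copy("results.csv", "archive/results-%s.csv" % stamp)

    if __name__ == "__main__":
        for step in (deploy_application, generate_load, analyse_results, archive_results):
            step()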
  • For faster feedback, set up your CI server so that performance tests are always running against the latest version of the application (one way of turning those runs into a pass/fail signal is sketched below).
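One way a scheduled CI run can become a pass/fail signal is to compare the latest summary against an archived baseline and fail on a regression; the file names, summary format and 10% tolerance below are assumptions, not the project’s configuration:

    # Fail the build if p95 latency regressed more than a tolerance against the
    # archived baseline. Summary files are assumed to be small JSON documents.
    import json
    import sys

    TOLERANCE = 1.10   # allow up to 10% slower than baseline

    def p95_from(path):
        with open(path) as f:
            return json.load(f)["p95_latency_seconds"]

    if __name__ == "__main__":
        baseline = p95_from("archive/baseline-summary.json")
        latest = p95_from("latest-summary.json")
        print("baseline p95: %.3f s, latest p95: %.3f s" % (baseline, latest))
        if latest > baseline * TOLERANCE:
            sys.exit("performance regression detected: failing the build")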

Top Ten Secret Weapons For Agile Performance Testing: Presentation Transcript

  • Top ten secret weapons for agile performance testing
    by Patrick Kua
    patrick.kua@thoughtworks.com
    http://www.thekua.com/atwork/presentations-and-papers/
    © ThoughtWorks2010
  • Make Performance Explicit
    © ThoughtWorks2010
    1
  • So that I can make better investment decisions
    As an investor
    I want to see the value of my portfolio presented on a single web page
    must have “good” performance, less than 0.2s page load for about 10,000 concurrent users
    © ThoughtWorks2010
  • © ThoughtWorks2010
    So that investors have a high-quality experience as the business grows
    As the Operations Manager
    I want the portfolio value page to render within 0.2s when 10,000 users are logged in
  • One Team
    © ThoughtWorks2010
    2
  • Team Dynamics
    © ThoughtWorks2010
  • Performance Testers Part of Team
    © ThoughtWorks2010
  • © ThoughtWorks2010
  • Performance Testers Part of Team
    © ThoughtWorks2010
  • Pair on Performance Test Stories
    © ThoughtWorks2010
  • Rotate Pairs
    © ThoughtWorks2010
  • Customer Driven
    © ThoughtWorks2010
    3
  • What was a good source of requirements?
    © ThoughtWorks2010
  • © ThoughtWorks2010
    Existing Pain Points
  • An example...
    © ThoughtWorks2010
  • So that we can budget for future hardware needs as we grow
    As the data centre manager
    I want to know how much traffic we can handle now
    © ThoughtWorks2010
  • Another example
    © ThoughtWorks2010
  • © ThoughtWorks2010
    So that we have confidence in meeting our SLA
    As the Operations Manager
    I want to ensure that a sustained peak load does not take out our service
  • Personas
    © ThoughtWorks2010
  • Who is the customer?
    © ThoughtWorks2010
    Investors
    Marketing
    End Users
    Power Users
    Operations
  • Discipline
    © ThoughtWorks2010
    4
  • © ThoughtWorks2010
    Observe test results
    What do you see?
    Formulate an hypothesis
    Why is it doing that?
    Design an experiment
    How can I prove that’s what’s happening?
    Run the experiment
    Take the time to gather the evidence.
    Is the hypothesis valid?
    Change the application code
    Safe in the knowledge that I’m making it faster
  • ??????????
    © ThoughtWorks2010
  • © ThoughtWorks2010
    Observe test results
    Sawtooth pattern (1-minute intervals)
    Formulate an hypothesis
    Directory structure of (yyyy/mm/minuteofday)? A slowdown due to the # of files in the directory?
    Design an experiment
    1 directory should result in even worse performance...
    Run the experiment
    We ran the test…
    Is the hypothesis valid?
    Change the application code
  • One Directory
    © ThoughtWorks2010
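A small Python sketch of the kind of experiment described on the previous slides: time file writes into one flat directory versus time-bucketed subdirectories. The file counts and bucket size are illustrative, not the project’s real figures:

    # Micro-benchmark: does piling every file into one directory slow writes
    # down compared with bucketing them? Counts are illustrative only.
    import os
    import tempfile
    import time

    FILES = 5000

    def write_files(base, bucketed):
        start = time.time()
        for i in range(FILES):
            directory = os.path.join(base, str(i // 100)) if bucketed else base
            os.makedirs(directory, exist_ok=True)
            with open(os.path.join(directory, "msg-%d.txt" % i), "w") as f:
                f.write("payload")
        return time.time() - start

    if __name__ == "__main__":
        with tempfile.TemporaryDirectory() as flat, tempfile.TemporaryDirectory() as split:
            print("one directory:   %.2f s" % write_files(flat, bucketed=False))
            print("bucketed layout: %.2f s" % write_files(split, bucketed=True))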
  • Play Performance Early
    © ThoughtWorks2010
    5
  • © ThoughtWorks2010
    End
    Start
    Other projects start performance testing here
    End
    Start
    Agile projects start performance testing as early as possible
  • Iterate Don’t (Just) Increment
    © ThoughtWorks2010
    6
  • © ThoughtWorks2010
  • We Sashimi
    © ThoughtWorks2010
  • Sashimi Slice By... Presentation
    © ThoughtWorks2010
  • © ThoughtWorks2010
    So that I can better see trends in performance
    As the Operations Manager
    I want a graph of requests per second
  • © ThoughtWorks2010
    So that I can better see trends in performance
    As the Operations Manager
    I want a graph of average latency per second
  • © ThoughtWorks2010
    So that I can easily scan results at a single glance
    As the Operations Manager
    I want a single page showing all results
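A hedged sketch of the thinnest slice of those three reporting stories: read a results log and put requests per second and average latency onto one page. The timestamp,latency CSV layout is an assumption, and matplotlib is just one plotting option:

    # Requests per second and average latency per second, both on one page,
    # from an assumed timestamp,latency CSV log.
    import csv
    from collections import defaultdict
    import matplotlib.pyplot as plt

    def per_second_stats(path):
        counts, latencies = defaultdict(int), defaultdict(list)
        with open(path) as f:
            for timestamp, latency in csv.reader(f):
                second = int(float(timestamp))
                counts[second] += 1
                latencies[second].append(float(latency))
        seconds = sorted(counts)
        return (seconds,
                [counts[s] for s in seconds],
                [sum(latencies[s]) / len(latencies[s]) for s in seconds])

    if __name__ == "__main__":
        seconds, requests, avg_latency = per_second_stats("results.csv")
        figure, (top, bottom) = plt.subplots(2, 1, sharex=True)
        top.plot(seconds, requests)
        top.set_ylabel("requests / second")
        bottom.plot(seconds, avg_latency)
        bottom.set_ylabel("average latency (s)")
        bottom.set_xlabel("time (seconds)")
        figure.savefig("results-overview.png")   # the single page to scan at a glance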
  • Sashimi Slice By... Scenario
    © ThoughtWorks2010
  • © ThoughtWorks2010
    So that we never have a day like “October 10”
    As the Operations Manager
    I want to ensure that a sustained peak load does not take out our service
  • © ThoughtWorks2010
    So that we never have a day like “November 12”
    As the Operations Manager
    I want to ensure that an escalating load up to xxx requests/second does not take out our service
  • Automate, Automate, Automate
    © ThoughtWorks2010
    7
  • © ThoughtWorks2010
    Automated
    Compilation
    Automated
    Tests
    Automated
    Packaging
    Automated
    Deployment
  • Automation => Reproducible and Consistent
    Automation => Faster Feedback
    Automation => Higher Productivity
    Why Automation?
    © ThoughtWorks2010
  • © ThoughtWorks2010
    Automated
    Test Orchestration
    Automated
    Analysis
    Automated Scheduling
    Automated
    Load Generation
    Automated
    Application Deployment
    Automated Result Archiving
  • Continuous Performance Testing
    © ThoughtWorks2010
    8
  • Application
    Build Pipelines
    © ThoughtWorks2010
    Performance
  • © ThoughtWorks2010
  • Test Drive Your Performance Test Code
    © ThoughtWorks2010
    9
  • V Model Testing
    © ThoughtWorks2010
    Slower + Longer
    Performance Testing
    Speed
    Fast
    http://en.wikipedia.org/wiki/V-Model_(software_development)
  • We make mistakes
    © ThoughtWorks2010
  • V Model Testing
    © ThoughtWorks2010
    Slower + Longer
    Performance Testing
    Speed
    Unit test performance code to fail faster
    Fast
    http://en.wikipedia.org/wiki/V-Model_(software_development)
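In the spirit of unit testing performance code to fail faster, here is a sketch that tests a piece of the harness itself, a hypothetical results-line parser, so a mistake surfaces in seconds rather than at the end of a long performance run:

    # Unit-test the harness code (here a hypothetical log-line parser) so that
    # parsing bugs fail in seconds, not after hours of load generation.
    import unittest

    def parse_result_line(line):
        """Parse a 'timestamp,latency' line into (int seconds, float latency)."""
        timestamp, latency = line.strip().split(",")
        return int(float(timestamp)), float(latency)

    class ParseResultLineTest(unittest.TestCase):
        def test_parses_timestamp_and_latency(self):
            self.assertEqual(parse_result_line("1280000000.25,0.184\n"),
                             (1280000000, 0.184))

        def test_rejects_malformed_lines(self):
            with self.assertRaises(ValueError):
                parse_result_line("not a result line")

    if __name__ == "__main__":
        unittest.main()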
  • Classic Performance Areas to Test
    © ThoughtWorks2010
    Analysis
    Information Collection
    Presentation
    Publishing
    Visualisation
  • Get Feedback
    © ThoughtWorks2010
    10
  • Frequently (Weekly) Showcase
    © ThoughtWorks2010
    Here is what we learned this week....
  • Frequently (Weekly) Showcase
    © ThoughtWorks2010
    And based on this... We changed our directory structure.
  • Frequently (Weekly) Showcase
    © ThoughtWorks2010
    Should we do something different knowing this new information?
  • List of All Secret Weapons
    Make Performance Explicit
    One Team
    Customer Driven
    Discipline
    Play Performance Early
    Iterate Don't (Just) Increment
    Automate, Automate, Automate
    Test Drive Your Performance Code
    Continuous Performance Testing
    Get Feedback
    © ThoughtWorks2010
  • Photo Credits (Creative Commons licence)
    Barbed wire picture: http://www.flickr.com/photos/lapideo/446201948/
    Eternal clock: http://www.flickr.com/photos/robbie73/3387189144/
    Sashimi from http://www.flickr.com/photos/mac-ash/3719114621/
    Questions?
    © ThoughtWorks2010