Making Sense of the Numbers - DOs and DON'Ts of Quality Performance Testing

A quality performance test is broken into three stages: (1) creating representative user scenarios; (2) scripting the actions of the simulated users; and (3) analyzing the results of the test. Unfortunately, most tests are plagued by shortcuts taken at one of these stages. Whether the mistake is skipping a common user path, choosing the wrong tool, or trusting bad data, a breakdown anywhere in the process leads to unnecessary fallout and likely little improvement in the site's performance. This presentation is a collection of lessons learned from a number of high-concurrency load tests, both effective and failed. Through these lessons, we'll walk through each stage in depth to ensure that every load test is valid and beneficial the very first time.

While this session is highly technical, the lessons themselves address the concerns of everyone in the process, including sysadmins, developers, and even project managers. Performance testing is a critical component of any site launch, and it involves the entire project team. Sysadmins will benefit from knowing how to correlate data across the different layers of the hosting stack. Developers will learn how to write precise tests and analyze their results. Project managers will gain the knowledge required to ensure the tests are accurate and indicative of real site performance.

Transcript

  1. Building Bridges, Connecting Communities
     Erik Webb, Jeff Beeman
     Sr. Technical Consultants, Acquia
     Making Sense of the Numbers: DOs and DON'Ts of Quality Performance Testing
  2. About Erik
     Senior Technical Consultant
     Focus on Performance, Infrastructure, and Scalability
     Joined Acquia in early 2010
     5+ years with Drupal
     10+ years with LAMP
     Red Hat Certified Engineer
  3. About Jeff
     Senior Technical Consultant
     Focus on Architecture, Deployment, Delivery
     Joined Acquia in late 2010
     8+ years with Drupal
     Lead architect on several large-scale Acquia PS projects
  4. Everyone needs performance testing.
  5. Do have a plan.
     Creating representative user scenarios
     Scripting the actions of the simulated users
     Analyzing the results of the test
  6. Do involve everyone.
     Not just for developers
     Involve sysadmins, developers, and project managers
     Project stakeholders
     Everyone is the QA team!
     Align tests with business goals
  7. Let's get on the same page.
  8. Performance
     [diagram: a single GET request to drupal.org]
  9. Scalability
     [diagram: response time holds at ~1s as load increases]
  10. Scaling
      Vertical
      Horizontal
  11. Load Testing
      [diagram: 10 users :-), 100 users :-|]
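For a concrete feel for the load-testing step, here is a minimal Python sketch of the core measurement loop, assuming a hypothetical local endpoint; real tools such as JMeter add ramp-up, think time, and richer reporting.

    # Minimal sketch of a virtual-user load test; TARGET is a
    # hypothetical endpoint, not one named in the slides.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TARGET = "http://localhost/latest-news"
    USERS = 10  # try 10, then 100, and compare the numbers

    def one_request(_):
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET, timeout=30) as resp:
            resp.read()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=USERS) as pool:
        timings = sorted(pool.map(one_request, range(USERS * 10)))

    print(f"{USERS} users: median {timings[len(timings) // 2]:.3f}s, "
          f"max {timings[-1]:.3f}s")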
  12. Stress Testing
      [diagram: ??? users, pushed until the site falls over (x x)]
  13. Contention
      [diagram: many simultaneous requests for /latest-news]
  14. Your numbers are only as good as your tests.
  15. Do set technical goals.
      Monitor changes and usage side-by-side
      Ensure new development does not affect existing functionality
      Reduce infrastructure complexity through early testing
      Hold programmers accountable for performance, not just sysadmins
  16. Do set specific success metrics.
      Backend performance:
        X concurrent authenticated sessions
        X page views per minute
        X seconds maximum per request
        X MB memory usage per request
        Maximum % CPU or GB RAM used on server
      Frontend performance:
        X seconds until initial render
        X seconds until full page load
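Metrics like these are only useful if they are checked mechanically after each run. Below is a minimal Python sketch of such a pass/fail gate; the metric names, values, and thresholds are made-up placeholders for the X values on the slide.

    # Hypothetical results a load-test run might produce.
    results = {
        "concurrent_sessions": 500,
        "page_views_per_min": 1200,
        "max_request_seconds": 1.8,
        "peak_mb_per_request": 96,
    }

    # Placeholder thresholds standing in for the slide's X values.
    thresholds = {
        "concurrent_sessions": (">=", 400),
        "page_views_per_min": (">=", 1000),
        "max_request_seconds": ("<=", 2.0),
        "peak_mb_per_request": ("<=", 128),
    }

    for metric, (op, limit) in thresholds.items():
        value = results[metric]
        ok = value >= limit if op == ">=" else value <= limit
        print(f"{'PASS' if ok else 'FAIL'}: {metric} = {value} ({op} {limit})")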
  17. Do gather all the data.
      Never trust the client.
      Find the most value for the client.
      Understand the full client use cases.
      Historical data is a requirement.
      We're too smart to ignore our intuition.
  18. Don't give up until it's done.
      Get your hands dirty.
      Don't get lost in Drupal.
      Assume no one has checked the easy stuff.
      Don't accept anything less than perfect.
  19. Your numbers are only as consistent as your infrastructure.
  20. Don't use a dev environment for testing.
      Developer overhead - XDebug, XHProf
      Verbose logging
      Congested network
  21. Don't extrapolate results.
      [diagram: the same 300 connections split across backends in different
      proportions (3x50, 3x100, 150+3x50); the heavier splits end in
      CONTENTION!!!]
  22. Don't extrapolate results.
      [diagram: queries queued behind 20-30 ms locks, each waiter also
      absorbing the lock time of every query ahead of it, ending in
      CONTENTION!!!]
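The lock-queue effect on this slide is easy to reproduce. The following Python sketch serializes 20 ms "queries" behind one shared lock; the timings are illustrative, not from the slides.

    # Sketch of the contention diagram: every "query" holds one shared
    # lock for 20 ms, so each waiter also absorbs the lock time of every
    # holder ahead of it.
    import threading
    import time

    lock = threading.Lock()

    def locked_query():
        with lock:            # all queries serialize here
            time.sleep(0.02)  # 20 ms inside the lock, as on the slide

    def run(n_threads):
        threads = [threading.Thread(target=locked_query)
                   for _ in range(n_threads)]
        start = time.perf_counter()
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return time.perf_counter() - start

    for n in (2, 6):
        print(f"{n} concurrent queries: {run(n):.3f}s total")
    # Without contention, 6 concurrent queries would finish in roughly
    # the same wall time as 2; behind one lock they take about 3x as
    # long, which is why results at 50 connections say little about 150.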
  23. Your numbers are only as current as your last test.
  24. Do write smart tests.
      User scenarios created by the business
      Use real data from analytics
      Ensure variability
      Measure test coverage
      New tests for each feature
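One way to honor "use real data from analytics" and "ensure variability" is to draw each virtual user's next path from the measured traffic mix. A minimal Python sketch, with hypothetical paths and weights standing in for a real analytics export:

    import random

    # Share of real traffic per path, e.g. exported from analytics.
    # These paths and weights are illustrative placeholders.
    traffic_mix = {
        "/": 0.50,
        "/latest-news": 0.30,
        "/user/login": 0.15,
        "/search": 0.05,
    }

    def next_path():
        """Pick a virtual user's next path in proportion to real usage."""
        paths = list(traffic_mix)
        weights = list(traffic_mix.values())
        return random.choices(paths, weights=weights, k=1)[0]

    print([next_path() for _ in range(10)])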
  25. Don't write your tests for launch.
      Launch is a milestone, not the end.
      New features added all the time.
      It's just QA!
  26. Do use CI for performance too.
      Ongoing content scaling complicates long-term reliability
      Integrate with Jenkins (or another CI tool) for performance regression testing
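A minimal Python sketch of a regression gate that a Jenkins job (or any CI tool) could run after each build; the file names and baseline format are assumptions, not from the slides.

    import json
    import sys

    REGRESSION_TOLERANCE = 1.10  # fail the build if >10% slower

    # Hypothetical files written by the load-test job,
    # e.g. {"p95_seconds": 0.80}.
    with open("baseline.json") as f:
        baseline = json.load(f)
    with open("current.json") as f:
        current = json.load(f)

    if current["p95_seconds"] > baseline["p95_seconds"] * REGRESSION_TOLERANCE:
        print(f"FAIL: p95 {current['p95_seconds']:.2f}s vs "
              f"baseline {baseline['p95_seconds']:.2f}s")
        sys.exit(1)  # nonzero exit marks the build as failed
    print("PASS: no performance regression")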
  27. Your numbers are only as good as your tools.
  28. Do multiple types of testing.
      Request profiling
      Service testing
      Simple HTTP response testing
      Load testing
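The "simple HTTP response testing" tier can be as small as one timed request per key page. A minimal Python sketch with hypothetical URLs:

    import time
    import urllib.request

    # Placeholder pages; substitute the site's key paths.
    PAGES = ["http://localhost/", "http://localhost/latest-news"]

    for url in PAGES:
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.status
        elapsed = time.perf_counter() - start
        assert status == 200, f"{url} returned {status}"
        print(f"{url}: {status} in {elapsed:.3f}s")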
  29. Virtual vs. Real Load Testing
      Virtual:
        HTTP client
        Designed for efficiency
        Limited client functionality
        JMeter cloud service - $500/month up to 4800 concurrent
      Real:
        Browser client
        Estimates real user experience
        Supports AJAX natively
        Selenium-based service - $499/week up to 100 real users
  30. Virtual Users
      //div[@id="content"]//h2/a/@href
      //ul[@id="main-menu-links"]//a/@href
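Expressions like the ones above are how a virtual user decides what to request next: extract hrefs from the fetched HTML instead of hard-coding URLs. A minimal Python sketch using lxml (an assumption; the slides do not name a library) against a hypothetical local site:

    import urllib.request
    from lxml import html

    page = urllib.request.urlopen("http://localhost/").read()
    doc = html.fromstring(page)

    # Same shape as the slide's expressions: teaser and menu link hrefs.
    teaser_links = doc.xpath('//div[@id="content"]//h2/a/@href')
    menu_links = doc.xpath('//ul[@id="main-menu-links"]//a/@href')
    print(teaser_links, menu_links)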
  31. Real Users
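By contrast, a real-user test drives an actual browser, so JavaScript, AJAX, and asset loading are all included in the measurement. A minimal Python sketch using the Selenium bindings, assuming a locally installed browser driver and a hypothetical URL:

    import time
    from selenium import webdriver

    driver = webdriver.Chrome()
    start = time.perf_counter()
    driver.get("http://localhost/")  # blocks until the page has loaded
    elapsed = time.perf_counter() - start
    print(f"Full page load (with JS and assets): {elapsed:.3f}s")
    driver.quit()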
  32. Questions?
  33. We're Hiring!
  34. @erikwebb
      @doogiemac
  35. Building Bridges, Connecting Communities
      Evaluate this session at: portland2013.drupal.org/schedule
      Thank you! What did you think?
