Quantifying Drupal (DrupalCamp Asheville 2012)
Presented at DrupalCamp Asheville 2012!

Update: This session will focus primarily on performance testing due to session topic overlap. There will be a JMeter demo that includes some of the best tricks I've learned when load testing and optimizing Drupal.

Testing for Drupal (and for most web applications) can be broken down into four main categories; a minimal unit-test sketch follows the list:

Unit testing
Simple response testing
Automated browser testing
Load testing (virtual vs. "real")
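
To make the first category concrete: Drupal's own unit tests use the SimpleTest framework (PHP), but the shape is the same in any xUnit-style framework. A minimal sketch in Python's unittest, with a hypothetical slug helper standing in for real module code:

    import unittest

    def make_slug(title):
        """Hypothetical helper: turn a node title into a URL slug."""
        return title.strip().lower().replace(" ", "-")

    class SlugTest(unittest.TestCase):
        def test_spaces_become_hyphens(self):
            self.assertEqual(make_slug("Hello World"), "hello-world")

        def test_surrounding_whitespace_is_trimmed(self):
            self.assertEqual(make_slug("  Drupal  "), "drupal")

    if __name__ == "__main__":
        unittest.main()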
Combining testing with a continuous integration tool provides developers with ongoing, automatic notifications of regressions in both functionality and performance. Best of all, this testing scales to multiple projects, because most tests can be reused.
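
Jenkins (like most CI tools) treats a nonzero exit code as a failed build, so a performance gate can be as small as a script that fails when a budget is exceeded. A minimal sketch, with a hypothetical URL and a made-up 2-second budget:

    import sys
    import time
    import urllib.request

    URL = "http://localhost/"   # hypothetical page under test
    BUDGET_SECONDS = 2.0        # hypothetical performance budget

    start = time.perf_counter()
    urllib.request.urlopen(URL).read()
    elapsed = time.perf_counter() - start

    print("Request took %.3fs (budget %.1fs)" % (elapsed, BUDGET_SECONDS))
    if elapsed > BUDGET_SECONDS:
        sys.exit(1)  # nonzero exit marks the CI build as failed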

How does this impact human QA? As a project grows, it becomes increasingly difficult to test new features in addition to all existing features. Writing test cases allows QA testers to focus on process and efficiency over needless pointing and clicking.

In this session we'll cover a full range of testing tools, including SimpleTest, Selenium, Apache Bench, JMeter, and Jenkins, and briefly touch on some commercial testing tools as well.
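
As a taste of the automated-browser category, here is a minimal Selenium check using its Python bindings; the URL and expected title are hypothetical stand-ins:

    from selenium import webdriver

    driver = webdriver.Firefox()         # launches a real browser
    try:
        driver.get("http://localhost/")  # hypothetical site under test
        # A real test would assert on page elements; the title is the simplest check.
        assert "Home" in driver.title, "unexpected page title: %r" % driver.title
    finally:
        driver.quit()                    # always release the browser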

Speaker notes: the per•for•mance definitions are from Merriam-Webster; use the 90th-95th percentile for any request timing (lots of outliers).
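
That percentile note is worth making concrete: with outlier-heavy timings, the mean misleads while a high percentile stays representative. A minimal nearest-rank sketch in Python (sample numbers are made up):

    def percentile(samples, pct):
        """Nearest-rank percentile of a list of timings (pct in 0-100)."""
        ordered = sorted(samples)
        rank = max(1, int(round(pct / 100.0 * len(ordered))))
        return ordered[rank - 1]

    timings = [0.21, 0.25, 0.24, 0.22, 1.90, 0.23]        # one slow outlier
    print("mean: %.2fs" % (sum(timings) / len(timings)))  # 0.51s, skewed upward
    print("p90:  %.2fs" % percentile(timings, 90))        # 0.25s, more typical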
  • Transcript

    • 1. Quantifying Drupal: Keeping Sanity Through Testing. Erik Webb (@erikwebb; d.o: erikwebb), Senior Technical Consultant, Acquia.
    • 2. Agenda: Introduction; Defining Performance; Setting Goals; Creating a Test Plan; Performance Testing Tools; JMeter Demo.
    • 3. About Me: Senior Technical Consultant; focus on performance, infrastructure, and scalability; 5+ years with Drupal; 10+ years with LAMP; Red Hat Certified Engineer.
    • 4. We’re hiring!
    • 5. "My site feels slower! What did you do!?" (a client you might know)
    • 6. Defining Performance
    • 7. per•for•mance, noun.
    • 8. per•for•mance, noun. the execution of an action
    • 9. per•for•mance, noun. the execution of an action; the fulfillment of a claim, promise, or request
    • 10. per•for•mance, noun. the execution of an action; the fulfillment of a claim, promise, or request; the manner in which a mechanism performs
    • 11. Technical Goals: monitor changes and usage side-by-side; ensure new development does not affect existing functionality; reduce infrastructure complexity through early testing; hold programmers accountable for performance, not just sysadmins.
    • 12. Business Goals: ensure a consistent user experience; reduce overall cost and improve capacity planning; improve accountability for change-level performance; and of course, make more $$$* (*requires a significant increase in QA budget).
    • 13. Setting Goals
    • 14. Goals or Gates? Both. Definition of Done; D8 performance core initiative; continuous integration or gates?; ongoing monitoring; unpredictable site scaling; concurrency issues.
    • 15. Success Metrics. Backend performance: X concurrent authenticated sessions; X page views per minute; X seconds maximum per backend request; maximum X% CPU or X GB RAM used. Frontend performance: X seconds until initial render; X seconds until full page load; maximum X MB payload per request.
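
Those X placeholders become concrete numbers per project, and a test run either meets them or it doesn't. A sketch of checking measured values against such targets (all numbers here are made up):

    # Hypothetical targets from the success-metrics exercise.
    targets = {
        "p95_backend_seconds": 1.5,
        "page_views_per_minute": 600,
        "full_page_load_seconds": 4.0,
    }

    # Hypothetical measurements from a test run.
    measured = {
        "p95_backend_seconds": 1.2,
        "page_views_per_minute": 540,
        "full_page_load_seconds": 3.1,
    }

    # Throughput must meet or exceed its target; timings must stay under theirs.
    higher_is_better = {"page_views_per_minute"}

    for name, target in targets.items():
        value = measured[name]
        ok = value >= target if name in higher_is_better else value <= target
        print("%-25s %8.1f (target %8.1f) %s" % (name, value, target, "PASS" if ok else "FAIL"))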
    • 16. Creating a Test Plan
    • 17. What to Test? First impressions: main landing page, popular campaign pages. User experience: typical user scenario, both anonymous and authenticated. Expected problems: highly complex pages.
    • 18. Writing Tests. User scenario created by the business, augmented with known possible issues by developers. Ensure variability: randomize selections from listings; test as multiple user roles (see the sketch below). Measure test coverage: both page components and total page count; treat the same as code coverage. New tests for each feature: define as part of individual requirements.
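
One way to get that variability, as referenced above: pick the target node and user role at random on each run rather than hard-coding them (paths and roles here are hypothetical):

    import random

    # In a real test these would be scraped from a listing page or view.
    listing = ["/node/12", "/node/48", "/node/103", "/node/250"]
    roles = ["anonymous", "authenticated", "editor"]

    target = random.choice(listing)  # a different node on each run
    role = random.choice(roles)      # exercise multiple user roles
    print("Requesting %s as %s" % (target, role))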
    • 19. Continuous Integration. Ongoing content scaling prevents long-term reliability; integrate with Jenkins (or another CI tool) for performance regression testing.
    • 20. What Is Hard to Test? Maximum real concurrency; all page sequences; CDNs (Akamai only allows load testing with partners; requires spread-out test clients); dumb users!!!
    • 21. Performance Testing Tools
    • 22. Testing Types: request profiling; service testing; simple HTTP response testing; load testing.
    • 23. Request Profiling. Goal: diagnose specific application bottlenecks; gets rid of "guessing" (*I never start troubleshooting until profiling is done). Local: XHProf, XDebug (Webgrind). Aggregation: XHProf, New Relic (SaaS).
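
XHProf and XDebug are PHP profilers, so they don't translate to a short snippet here, but the idea carries to any language. A sketch of the same workflow with Python's built-in cProfile, using a stand-in function:

    import cProfile
    import pstats

    def render_page():
        # Stand-in for the code path being profiled.
        return sum(i * i for i in range(100000))

    profiler = cProfile.Profile()
    profiler.enable()
    render_page()
    profiler.disable()

    # Rank call sites by cumulative time instead of guessing at bottlenecks.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)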
    • 24. Service Testing. Goal: ensure proper performance of each layer of the stack; each layer should perform well independently of Drupal; repeatable for Puppet- or Chef-based configurations. Examples: static files through Apache, sysbench for MySQL, memslap for Memcached, JMeter for many.
    • 25. Simple HTTP Response Testing. Goal: measure high-level performance of individual pages. Local tools: Apache Bench, Siege. 3rd-party services: Yottaa, Pingdom.
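
An Apache Bench run like "ab -n 100 http://localhost/" boils down to timing repeated requests. A minimal Python equivalent of that idea (the URL is hypothetical):

    import time
    import urllib.request

    URL = "http://localhost/"   # hypothetical page under test
    timings = []
    for _ in range(100):
        start = time.perf_counter()
        urllib.request.urlopen(URL).read()
        timings.append(time.perf_counter() - start)

    timings.sort()
    print("mean: %.3fs" % (sum(timings) / len(timings)))
    print("p95:  %.3fs" % timings[94])   # 95th of 100 sorted samples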
    • 26. Load Testing. An ongoing process, not just an item on the site-launch checklist. Virtual vs. real: virtual does not effectively represent real-world measurements; real is very resource-intensive (i.e. expensive). Only real load testing should be trusted for sign-off; use virtual only for comparisons and improvements.
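
Virtual load testing is essentially many such clients in parallel. A toy sketch using a thread pool; real tools like JMeter add ramp-up, think times, and assertions on top of this (URL and user counts are hypothetical):

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost/"   # hypothetical page under test

    def one_request(_):
        start = time.perf_counter()
        urllib.request.urlopen(URL).read()
        return time.perf_counter() - start

    # 25 "virtual users" issuing 500 requests total.
    with ThreadPoolExecutor(max_workers=25) as pool:
        timings = sorted(pool.map(one_request, range(500)))

    print("p95: %.3fs" % timings[int(0.95 * len(timings)) - 1])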
    • 27. Virtual vs. Real Load Testing
    • 28-30. Virtual vs. Real Load Testing, compared. Virtual: HTTP client; overly efficient; AJAX testing is out-of-band; Blazemeter (JMeter Cloud), $500/month for up to 4,800 concurrent users. Real: browser client, written with rich in-page APIs (e.g. Selenium); supports AJAX natively; BrowserMob, $499/week for up to 100 real users.
    • 31. JMeter Demo: github.com/erikwebb/drupal-jmeter-tricks
    • 32. Questions? Where to find me? erikwebb.net; erikwebb on Drupal.org; @erikwebb on Twitter; erikwebb on LinkedIn; erikwebb on SlideShare.