BlazeMeter: Effective Performance Reporting


This presentation focuses on effective reporting and its associated challenges when using JMeter. It covers the importance of metrics and KPIs for effective performance reporting, followed by a brief overview of JMeter's built-in listeners (reporting elements), such as the Aggregate Report and Graph Results.

The third and final part covers the shortcomings of these listeners and the use of third-party/external reporting tools that provide enhanced reporting (Ant + XSLT).

The new BlazeMeter reporting plugin is introduced as a quick, ready-to-use solution for JMeter reporting.

Sub-topics:
* Importance of effective performance test reporting
* Typical performance testing metrics
* JMeter reporting entities (Listeners)
* Shortcomings of existing JMeter reporting elements
* Generating advanced JMeter reports using Ant + XSLT
* Building reporting tools and frameworks
* How the BlazeMeter reporting plugin can alleviate the challenges of JMeter reporting
* Details of the BlazeMeter reporting plugin


Transcript

  1. Effective Performance Reporting Using Apache JMeter (July 31, 2012)
  2. The Load Testing Cloud: a dev-test cloud service, 100% compatible with the open-source Apache JMeter
  3. Agenda
     * Performance attributes
     * Understanding performance KPIs
     * Creating load test reports
     * JMeter reporting elements
     * Generating advanced JMeter reports
     * BlazeMeter Reporting Plugin
  4. Performance Attributes
     * Speed / responsiveness: How fast does the page load? How quickly can the system process a transaction?
     * Scalability: Can the application handle the expected end-user load? Does the application throughput degrade as the user load increases?
  5. Performance Attributes (continued)
     * Efficiency and capacity planning: Are you using the right resources? Can your infrastructure carry the load?
     * Reliability / availability / recoverability: What is the mean time between failures (MTBF)? Does the application recover after a crash? Does it lose user data after a crash?
  6. Understanding Performance KPIs
     (Diagram: user load from the end user's browser drives requests per second across the Internet to the servers; response time is measured back along the same path.)
     * System metrics: CPU, memory, disk / IO, network
     * Server platform metrics: DB, app server, application
     * Application metrics: response time, throughput, error rate
     * Browser rendering metrics: total rendering time, heavy images/CSS/JS, DNS lookup
  7. Understanding Performance KPIs (continued)
     * Response time: Total response time = network latency + application latency + browser rendering time. Measured from the end-user perspective as the time taken to completely respond to a request (TTFB / TTLB).
     * Throughput: Throughput = transactions per second. Transactions are specific to applications; in its simplest form, throughput is requests per second.
     * Error: defined in terms of the success of the request, whether an error at the HTTP level (404, 501) or an application-level error.
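The definitions above reduce to a few lines of code. Here is a minimal sketch (not part of the deck) that computes average response time, throughput, and error rate; the sample tuples are hypothetical (timestamp_ms, elapsed_ms, success) triples, mirroring fields JMeter records for each request:

```python
samples = [
    (1_343_700_000_000, 220, True),
    (1_343_700_000_450, 310, True),
    (1_343_700_001_200, 1250, False),  # a failed request (e.g. HTTP 404/501)
    (1_343_700_002_100, 280, True),
]

elapsed = [e for _, e, _ in samples]
avg_response_ms = sum(elapsed) / len(elapsed)

# Throughput in its simplest form: requests per second over the test window.
window_s = (samples[-1][0] - samples[0][0]) / 1000.0
throughput_rps = len(samples) / window_s if window_s else float("nan")

# Error rate: the share of samples whose success flag is False.
error_rate = sum(1 for _, _, ok in samples if not ok) / len(samples)

print(f"avg response: {avg_response_ms:.0f} ms, "
      f"throughput: {throughput_rps:.2f} req/s, errors: {error_rate:.0%}")
```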
  8. Creating Load Test Reports (a six-step flow)
     1. Capture: application metrics (response time, throughput, errors) and server metrics (CPU / memory / disk / IO, network, application, platform).
     2. Correlate: application metrics against user load (user load vs. response time, throughput, errors) and system metrics against user load (user load vs. server metrics, network, platform).
     3. Plot / tabulate: tables (response time avg/min/max/percentile/stddev, average throughput, errors by success % and type) and graphs / charts (scatter / line, overlay).
     4. Trends / thresholds: response time trends, throughput trends, threshold violations, utilization (server metrics) trends.
     5. Customize / summarize: overall performance, important trends, threshold violations.
     6. Compare.
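As a rough sketch of steps 1-3 (capture, correlate, plot/tabulate), the snippet below reads a CSV-format JTL and tabulates response-time statistics per user-load level. It assumes the JTL was saved with a header row (jmeter.save.saveservice.print_field_names=true); "results.jtl" is a placeholder path, and the "allThreads" column stands in for user load:

```python
import csv
import statistics
from collections import defaultdict

by_load = defaultdict(list)  # active user load -> elapsed times (ms)

with open("results.jtl", newline="") as f:
    for row in csv.DictReader(f):
        by_load[int(row["allThreads"])].append(int(row["elapsed"]))

print(f"{'users':>5} {'avg':>8} {'min':>6} {'max':>6} {'90%':>6} {'stddev':>8}")
for users in sorted(by_load):
    e = sorted(by_load[users])
    p90 = e[int(0.9 * (len(e) - 1))]  # crude 90th-percentile pick
    sd = statistics.stdev(e) if len(e) > 1 else 0.0
    print(f"{users:>5} {statistics.mean(e):>8.1f} {e[0]:>6} {e[-1]:>6} "
          f"{p90:>6} {sd:>8.1f}")
```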
  9. Sample Report Elements (snapshots)
     Photo credits: http://msdn.microsoft.com/en-us/library/bb924371.aspx and sanitized past projects
  10. JMeter Reporting Elements (Listeners)
     * JMeter elements that display performance test metrics / output
     * Various types of listeners (raw / aggregated / graphical)
     * No inherent capability to measure system metrics
     * Useful for basic analysis
  11. Generating Advanced JMeter Reports
     JMeter report using an XSLT stylesheet:
     * Stylesheets ship under JMeter's 'extras' folder
     * The .jtl output must be in XML format (jmeter.save.saveservice.output_format=xml)
     * Integrate using Ant
     Other reporting options:
     * JMeter CSV results + Excel
     * Process results programmatically (Perl / Python, etc.)
     * BlazeMeter Reporting Plugin
     Photo credits: http://www.programmerplanet.org/pages/projects/jmeter-ant-task.php
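The Ant integration can be as small as a single target. The fragment below is an illustrative sketch, not the sample build file JMeter ships under extras/; the JMeter home path and file names are assumptions:

```xml
<!-- Minimal Ant build that turns an XML-format .jtl into an HTML report
     using one of the stylesheets bundled under JMeter's extras/ folder.
     Adjust jmeter.home and the file names to your own setup. -->
<project name="jmeter-report" default="report">
  <property name="jmeter.home" location="/opt/apache-jmeter"/>
  <target name="report">
    <xslt in="results.jtl"
          out="report.html"
          style="${jmeter.home}/extras/jmeter-results-detail-report_21.xsl"/>
  </target>
</project>
```

Running `ant report` then renders results.jtl (saved in XML format, per the property above) into report.html.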
  12. What happened to label A and KPI B at time C?
  13. BlazeMeter Reporting Plugin Benefits
     * Store a report per test run, including the script that was used to run the test and the logs & JTL file
     * Compare results of two test runs
     * See an improvement trend
     * Compare the current run with previous runs in real time
     * Share with co-workers
  14. KPIs Available in a JMeter Test
     Response time: the time it takes a request to fully load.
     * Indicates the performance level of the entire system under test (web server + DB).
     * Represents the average response time during a specific minute of the test.
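Per-minute averaging like this is simple to reproduce. A small sketch, assuming hypothetical (timestamp_ms, elapsed_ms) pairs in place of real plugin data:

```python
from collections import defaultdict

# Hypothetical (timestamp_ms, elapsed_ms) samples from a test run.
samples = [(1_343_700_005_000, 240), (1_343_700_042_000, 260),
           (1_343_700_071_000, 900)]

per_minute = defaultdict(list)
for ts_ms, elapsed_ms in samples:
    per_minute[ts_ms // 60_000].append(elapsed_ms)  # bucket by whole minute

for minute in sorted(per_minute):
    vals = per_minute[minute]
    print(f"minute {minute}: avg {sum(vals) / len(vals):.0f} ms")
```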
  15. BlazeMeter Reporting Plugin: Compare Two Reports
  16. http://blazemeter.com/
     Press coverage: 'BlazeMeter - Startup Offers JMeter Cloud Load Testing at Scale'; 'BlazeMeter - Code probing, not Angry Birds, will define cloud's success'; 'BlazeMeter - Changing the Economics of Load Testing via the Cloud'.
     Thank you!
