BlazeMeter Effective Performance Reporting
 

This topic focuses on effective reporting and its associated challenges when using JMeter. It delves into the importance of metrics and KPIs for effective performance reporting, followed by a brief overview of JMeter's built-in listeners (reporting elements) such as the Aggregate Report and Graph Results listeners.

The third and final part covers the inadequacies of these listeners and the use of third-party/external reporting tools that provide enhanced reporting (ant + XSLT).

The new BlazeMeter reporting plugin is introduced as a quick, ready-to-use solution for JMeter reporting.
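The application KPIs this talk is built around (average response time, throughput, error rate) can be computed directly from a JMeter results file. A minimal sketch, assuming JMeter's default CSV .jtl column names (timeStamp, elapsed, label, responseCode, success) and inline sample data:

```python
import csv
from io import StringIO

# Inline sample in JMeter's default CSV .jtl layout (assumed column names).
SAMPLE_JTL = """timeStamp,elapsed,label,responseCode,success
1343779200000,120,Login,200,true
1343779201000,340,Search,200,true
1343779202000,95,Login,500,false
1343779203000,210,Search,200,true
"""

def summarize(jtl_csv):
    """Compute the basic KPIs an Aggregate Report would show."""
    rows = list(csv.DictReader(jtl_csv))
    elapsed = [int(r["elapsed"]) for r in rows]
    errors = sum(1 for r in rows if r["success"] != "true")
    # Test duration: first request start to last response end, in seconds.
    first = min(int(r["timeStamp"]) for r in rows)
    last = max(int(r["timeStamp"]) + int(r["elapsed"]) for r in rows)
    duration_s = (last - first) / 1000.0
    return {
        "samples": len(rows),
        "avg_ms": sum(elapsed) / len(rows),
        "max_ms": max(elapsed),
        "throughput_rps": len(rows) / duration_s,
        "error_rate": errors / len(rows),
    }

stats = summarize(StringIO(SAMPLE_JTL))
print(stats)
```

In practice you would open the real .jtl file instead of the inline sample; this is only meant to show what the listeners compute under the hood.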

Sub-topics:
* Importance of effective performance test reporting
* Typical performance testing metrics
* JMeter reporting entities (listeners)
* Shortcomings of existing JMeter reporting elements
* Generating advanced JMeter reports using ant + XSLT
* Building reporting tool frameworks
* How the BlazeMeter reporting plugin can alleviate the challenges in JMeter reporting
* Details on the BlazeMeter reporting plugin
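The ant + XSLT approach listed above can be sketched as a minimal Ant target. The `${jmeter.home}` path and the stylesheet filename are assumptions; check the `extras/` folder of your JMeter install for the stylesheets it actually ships, and note that the .jtl must be saved as XML (e.g. `jmeter.save.saveservice.output_format=xml` in jmeter.properties):

```xml
<!-- Minimal sketch: transform a JMeter XML results file into an HTML report
     using Ant's built-in <xslt> task. Paths and stylesheet name are
     assumptions, not a definitive setup. -->
<project name="jmeter-report" default="report" basedir=".">
  <target name="report">
    <xslt in="results.jtl"
          out="report.html"
          style="${jmeter.home}/extras/jmeter-results-detail-report_21.xsl"/>
  </target>
</project>
```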


  • Thank you. It’s good to be here. That sounds great, thanks.
  • Hi. My name is Alon Girmonsky and I am the CEO and Founder of BlazeMeter. BlazeMeter is a load testing cloud (or should I say, platform as a service) that is 100% compatible with Apache JMeter. I am excited to open the second JMeter webinar, and this time we will be discussing reporting. It’s a great privilege for me to introduce Budhaditya Das, a highly experienced JMeter professional who was kind enough to share his experience and discuss JMeter reporting in this webinar.
  • Importance of various performance attributes, with a quick explanation of each.
  • Performance KPIs can be bifurcated into various silos. Application metrics (or KPIs) are typically captured by the testing tools and include response time, throughput, error rate etc. Browser rendering metrics are typically ignored in end-user performance testing (unless the applications are heavy on Ajax or RIA). Server-side metrics are around resource utilization, and most tools capture these using additional plugins. JMeter doesn’t have an inherent capability of monitoring system metrics (this is achieved using a third-party plugin).
  • Response Time: measured from the end-user perspective; typically important for end-user-facing applications. Throughput: typically important for transactional systems. Error Rate: important to capture different classes/types of errors.
  • Typical Reporting Steps
  • Most common reporting elements in JMeter (Aggregate Report and Graph Results). They capture the application KPIs in tabular and graphical formats.
  • Through the years and close to 15,000 JMeter tests, BlazeMeter has developed its own JMeter reporting tools. Until recently, the only way to enjoy these reporting tools was to run the test using our cloud infrastructure. Lately we’ve decided to allow any JMeter user to enjoy BlazeMeter’s reporting tools free of charge. As a result, we’ve released the BlazeMeter plugin. The BlazeMeter plugin enables everyone to enjoy BlazeMeter’s reporting capabilities free of charge and as part of their JMeter software.
  • In general, BlazeMeter stores all of the data, logs, reports and other artifacts resulting from each test run. This is not only a way to manage all of the test configurations, reports etc.; it also allows you to compare results and do a deep dive into a certain set of results.
  • Using the BlazeMeter plugin, not only can you view all of the KPIs, you can see the KPIs of each label and over time. You can dive in, according to your interest, down to a certain minute, a certain KPI, a certain label, and you can also compare between KPIs and between labels.
  • Being able to compare the results of two test runs is very important! It allows you to evaluate whether the performance level has improved or deteriorated since the last run. You may also want to see how the current test results compare to a previous run in real time. This can only be achieved if you store all the data of the previous run and are able to compare between two sets of results.
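The run-to-run comparison described in the notes above can be sketched as a tiny helper. The function name and the 10% tolerance are assumptions for illustration, not the plugin's actual logic:

```python
# Minimal sketch (hypothetical helper, not part of the BlazeMeter plugin):
# flag a regression when the current run's average response time exceeds
# the baseline's by more than a relative tolerance.
def compare_runs(baseline_avg_ms, current_avg_ms, tolerance=0.10):
    """Return (relative change, True if it breaches the tolerance)."""
    change = (current_avg_ms - baseline_avg_ms) / baseline_avg_ms
    return change, change > tolerance

change, regressed = compare_runs(baseline_avg_ms=200.0, current_avg_ms=260.0)
print(f"change={change:+.0%} regressed={regressed}")
```

The same idea extends to throughput and error rate; comparing per-label averages rather than a single overall number gives a much clearer picture of where a regression lives.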

BlazeMeter Effective Performance Reporting: Presentation Transcript

  • EFFECTIVE PERFORMANCE REPORTING USING APACHE JMETER JULY 31, 2012
  • THE LOAD TESTING CLOUD A DEV-TEST CLOUD SERVICE 100% COMPATIBLE WITH THE OPEN-SOURCE APACHE JMETER
  • AGENDA: Performance Attributes • Understanding Performance KPIs • Creating Load Test Reports • JMeter Reporting Elements • Generating Advanced JMeter Reports • BlazeMeter Reporting Plugin
  • PERFORMANCE ATTRIBUTES • Speed / Responsiveness: How fast does the page load? How quickly can the system process a transaction? • Scalability: Can the application handle the expected end user load? Does the application throughput degrade as the user load increases?
  • PERFORMANCE ATTRIBUTES… • Efficiency and Capacity Planning: Are you using the right resources? Can your infrastructure carry the load? • Reliability / Availability / Recoverability: What is the mean time between failures (MTBF)? Does the application recover after a crash? Does it lose user data after a crash?
  • UNDERSTANDING PERFORMANCE KPIS • System Metrics: CPU, Memory, Disk / IO, Network • Server Platform Metrics: DB, App-server, Application • Application Metrics: Response Time, Throughput, Error Rate • Browser Rendering Metrics*: Total Rendering Time, Heavy Images/CSS/JS, DNS Lookup (diagram: end user → Internet → servers, annotated with Response Time, Requests / sec and User Load)
  • UNDERSTANDING PERFORMANCE KPIS… • Response Time: Total Response Time = Network latency + Application latency + Browser Rendering Time; measured from the end-user perspective; time taken to completely respond to a request; TTFB / TTLB • Throughput = [TRANSACTIONS] / second; transactions are specific to applications; in its simplest form, it is requests / sec • Error: defined in terms of the success of the request; errors at the HTTP level (404, 501); application-level errors (diagram: Web Server → App Server → DB Server across the Internet)
  • CREATING LOAD TEST REPORTS 1. Capture: application metrics (response time, throughput, errors) and server metrics (CPU / memory / disk / IO, network, application, platform) 2. Correlate: application metrics (user load vs. response time / throughput / errors) and system metrics (user load vs. server metrics / network / platform) 3. Plot / Tabulate: tables (response time avg/min/max/percentile/stddev, average throughput, errors as success % / types) and graphs / charts (scatter / line, overlay) 4. Trends / Thresholds: response time trends, throughput trends, threshold violations, utilization (server metrics) trends 5. Customize / Summarize: overall performance, important trends, threshold violations 6. Compare
  • SAMPLE REPORT ELEMENTS (SNAPSHOTS) Photo Credits: • http://msdn.microsoft.com/en-us/library/bb924371.aspx • Sanitized past projects
  • JMETER REPORTING ELEMENTS (LISTENERS) • JMeter Listeners: JMeter elements that display performance test metrics / output • Various types of listeners (raw / aggregated / graphical) • No inherent capability to measure system metrics* • Useful for basic analysis
  • GENERATING ADVANCED JMETER REPORTS • JMeter report using XSLT stylesheet: stylesheet under the ‘extras’ folder; .jtl output must be in XML format (jmeter.save.saveservice.output_format=xml); integrate using ant • Other reporting options: JMeter CSV results + Excel; process results programmatically (perl / python etc.); BlazeMeter Reporting Plug-in • Photo Credits: http://www.programmerplanet.org/pages/projects/jmeter-ant-task.php
  • WHAT HAPPENED TO LABEL A AND KPI B AT TIME C?
  • BLAZEMETER REPORTING PLUGIN BENEFITS• Store a report per test run, including • Script that was used to run the test • Logs & JTL file• Compare results of two test runs• See an improvement trend• Compare current with previous in real time• Share with co-workers
  • KPIS AVAILABLE IN A JMETER TEST • RESPONSE TIME: the time it takes a request to fully load • Indicates the performance level of the entire system under test (web server + DB). • Represents the average response time during a specific minute of the test.
  • BLAZEMETER REPORTING PLUGIN: COMPARE TWO REPORTS
  • HTTP://BLAZEMETER.COM/ ‘BlazeMeter - Startup Offers JMeter Cloud Load Testing at Scale’ • ‘BlazeMeter - Code probing, not Angry Birds, will define cloud’s success’ • ‘BlazeMeter - Changing the Economics of Load Testing via the Cloud’ THANK YOU!