Lesson 14.3: e-Load Module 4 - Reports & Graphs

This is part of the R12 Testing Suite for Oracle Applications (E-Business Suite).

  • Oracle Application Testing Suite: Introduction 14 - Reports & Graphs: Reporting Customization – e-Load provides options for customizing graphs and reports. Users can set the range of the x-axis (min, max) to zoom in on a particular time segment and can switch between linear and log-based y-axis scaling. There is no limit on the number of data series plotted on a single graph. Users can choose whether to include or exclude think times in their profile timers (scripts, user-defined profiles) for the real-time performance statistics view and for post-run session reports (a rough offline sketch of these options follows these notes).
  • Oracle Application Testing Suite: Introduction 14 - Reports & Graphs (continued): Reporting Interface – e-Load includes an interface for creating real-time and post-run custom graphs. Users launch graphs directly by double-clicking individual counters, or by selecting multiple counters in the data series tree, to analyze individual load test sessions or to compare results from multiple sessions.
  • Oracle Application Testing Suite: Introduction 14 - Create Graphs: In the Filters section at the bottom of the screen, enter a name for the new graph; the name is displayed in the corresponding tab at the top of the screen. Select the data you want to view in the Available Data Series field and click Add Data Series. Select how to plot the graph in the Plot Data Series vs. field.
  • Oracle Application Testing Suite: Introduction 14 - Create Graphs (continued): Each custom graph is displayed under a separate tab so that you can select the one you want to view. Click the Remove Graph button on the graph's tab to delete the graph and its tab. Click Export to CSV to save the graph data as a CSV file, Export to Excel to save it as an Excel file, and Export to PDF to save it as a PDF file.
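The customization options described in the notes above (x-axis range, linear vs. log-based y-axis scaling) are applied inside the e-Load user interface. As a rough, tool-independent illustration of the same ideas, the sketch below re-plots an exported data series with a zoomed x-range and a log-scaled y-axis; the file name and the column names (elapsed_secs, completion_secs) are hypothetical, not an actual e-Load export format.

```python
# Minimal sketch, not e-Load itself: re-plot an exported data series with a
# custom x-axis range and a log-based y-axis. The CSV file name and the
# columns "elapsed_secs" / "completion_secs" are assumptions for illustration.
import csv

import matplotlib.pyplot as plt

elapsed, completion = [], []
with open("session_performance_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        elapsed.append(float(row["elapsed_secs"]))
        completion.append(float(row["completion_secs"]))

fig, ax = plt.subplots()
ax.plot(elapsed, completion, label="Script completion time")
ax.set_xlim(300, 600)    # zoom in on one time segment (min, max of the x-axis)
ax.set_yscale("log")     # switch from linear to log-based y-axis scaling
ax.set_xlabel("Elapsed test time (s)")
ax.set_ylabel("Completion time (s)")
ax.legend()
plt.show()
```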

Transcript

  • 1. Reports & Graphs - Introduction to Oracle Application Testing Suite: e-Load
  • 2. Reports & Graphs
    • Generates Graphs and Charts to help analyze potential bottlenecks
    • Allows you to chart historical test data alongside recently executed test data
  • 3. Reports & Graphs
  • 4. Session Report
    • Session report is an overview of the test
    • May be exported to a CSV file for portability and distribution of results
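Because the session report can be exported to CSV, the results can also be post-processed with an ordinary script before distribution. The sketch below is a minimal example; the file name and the columns profile, executions, and avg_secs are assumptions, not guaranteed export fields.

```python
# Minimal sketch: summarize an exported e-Load session report outside the tool.
# The file name and the columns "profile", "executions", "avg_secs" are
# hypothetical; check the real export for the actual column names.
import csv
from collections import defaultdict

totals = defaultdict(lambda: {"secs": 0.0, "runs": 0})
with open("session_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        runs = int(row["executions"])
        totals[row["profile"]]["runs"] += runs
        totals[row["profile"]]["secs"] += float(row["avg_secs"]) * runs

for profile, t in sorted(totals.items()):
    print(f"{profile}: {t['runs']} executions, "
          f"{t['secs'] / t['runs']:.2f}s weighted average completion time")
```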
  • 5. Create Graphs
    • Captures data from the last executed test for later reference
    • Provides statistics for Script Execution Times
    • Provides metrics such as the following (see the arithmetic sketch after this slide):
      • Total Virtual Users
      • Transactions (Script Executions) per Second
      • Pages & Hits per Second
      • Total Transactions Executed
    • Contains last executed Virtual User configuration
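The throughput metrics listed above reduce to simple arithmetic over the test window. A quick sketch with made-up numbers (not results from any real run):

```python
# Minimal sketch of how the listed throughput metrics relate to the test
# duration. All values are invented for illustration; real figures come from
# the e-Load session report.
test_duration_secs = 1800     # a 30-minute run
total_transactions = 5400     # total script executions
total_pages = 27000
total_hits = 216000

print(f"Transactions/sec: {total_transactions / test_duration_secs:.2f}")  # 3.00
print(f"Pages/sec:        {total_pages / test_duration_secs:.2f}")         # 15.00
print(f"Hits/sec:         {total_hits / test_duration_secs:.2f}")          # 120.00
```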
  • 6. Create Graphs
    • Click on New Graph to generate a new graph
  • 7. Create Graphs
    • Filters let you select which Data Series to plot on the graph
  • 8. Create Graphs
    • You can save the graph in various formats
  • 9. Performance vs. Time
    • Performance vs. Time:
      • Gives script completion times during the total test execution.
      • This trend should remain flat unless a defect occurs within the web application.
      • A bottleneck will cause performance time (script completion times) to increase
      • Invalid server responses may cause decreases in performance times.
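The "flat trend" expectation can also be checked mechanically: if completion times drift upward as the run progresses, that is the bottleneck signature described above. A rough sketch over synthetic (elapsed time, completion time) samples:

```python
# Minimal sketch: flag an upward drift in script completion times, the
# bottleneck signature described above. The sample data is synthetic.
samples = [(t, 2.0 + 0.002 * t) for t in range(0, 1800, 30)]  # (elapsed s, completion s)

midpoint = len(samples) // 2
first_half = sum(c for _, c in samples[:midpoint]) / midpoint
second_half = sum(c for _, c in samples[midpoint:]) / (len(samples) - midpoint)

if second_half > first_half * 1.2:   # more than 20% slower in the second half
    print("Completion times are trending up - possible bottleneck")
else:
    print("Completion times look flat")
```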
  • 10. Performance vs. Time
  • 11. Errors vs. Time
    • Errors vs. Time:
      • Shows the number of errors encountered over the course of execution
      • Should always remain at zero.
      • Occasional spikes mean occasional failures.
      • Increasing spike frequency over time may indicate that the server cannot withstand the existing load without throwing an error.
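The "increasing spike frequency" pattern becomes obvious when error timestamps are bucketed by minute. A small sketch over synthetic timestamps:

```python
# Minimal sketch: bucket error timestamps into one-minute intervals to see
# whether errors become more frequent as the run progresses. The timestamps
# (seconds from the start of the test) are synthetic.
from collections import Counter

error_times = [75, 410, 900, 1150, 1300, 1390, 1450, 1500, 1540, 1570]

per_minute = Counter(int(t // 60) for t in error_times)
for minute in sorted(per_minute):
    print(f"minute {minute:2d}: {'#' * per_minute[minute]}")
```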
  • 12. Errors vs. Time
  • 13. Errors vs. Users
    • Errors vs. Users:
      • Same as “Errors vs. Time” except that the graph is plotted over the total number of users run during the test.
  • 14. Errors vs. Users
  • 15. Performance vs. Users
    • Performance vs. Users:
      • Shows script completion times plotted against the total number of active users.
      • Graph should remain flat unless a defect occurs within the web application.
      • A bottleneck will cause performance time to increase.
      • Invalid server responses will cause the performance time to decrease.
  • 16. Performance vs. Users
  • 17. Statistics vs. Time
    • Statistics vs. Time:
      • Shows Kilobytes/Second, Transactions/Second, Pages/Second, and Hits/Second.
      • As users continue to increase, KB/Sec should increase.
      • If KB/Sec does not increase as users increase, the application may not be keeping up with the applied load.
      • If KB/Sec decreases given increasing users, a bottleneck may exist.
      • Heavy KB/Sec fluctuations may indicate unstable servers given the current load.
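The KB/Sec expectations above amount to comparing throughput growth against user growth. A rough sketch over synthetic (active users, KB/sec) samples, where each step doubles the user count:

```python
# Minimal sketch: check whether KB/sec keeps growing as the active-user count
# doubles, as described above. The (users, kb_per_sec) samples are synthetic.
samples = [(10, 150), (20, 290), (40, 560), (80, 610), (160, 600)]

for (u1, kb1), (u2, kb2) in zip(samples, samples[1:]):
    growth = kb2 / kb1
    if growth < 1.1:   # users doubled but throughput barely moved (or fell)
        print(f"{u1}->{u2} users: {kb1}->{kb2} KB/sec  <- possible bottleneck")
    else:
        print(f"{u1}->{u2} users: {kb1}->{kb2} KB/sec  scaling as expected")
```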
  • 18. Statistics vs. Time
  • 19. Statistics vs. Users
    • Statistics vs. Users:
      • Same as Statistics vs. Time but plotted against the actual number of users run during the test.
  • 20. Statistics vs. Users
  • 21. Support
    • http://metalink.oracle.com