Storage Performance Testing



  1. Storage Performance Testing
     Woody Hutsell, Texas Memory Systems
  2. Abstract
     • Conducting storage performance tests is essential to selecting storage for tiered storage environments. Some applications require endless hours of constant data acquisition, while others experience peak bursts of small-block I/O. The best storage device for one application is almost never the right storage device for another.
     • This session provides an in-depth technical discussion of storage performance testing.
  3. Agenda Topics
     • Why test storage?
     • Types of storage performance testing:
       – Benchmarking.
       – Application simulation.
       – Application testing.
       – Production testing.
     • Common mistakes.
     • Live performance tests showing how to configure, test, and analyze results.
  4. Why Test Storage?
     • Because it matters to your users and customers.
       – Slow storage performance means slow response times and long-running queries.
     • Because it affects your batch window.
       – Slow storage can lengthen batch or backup windows, reducing application availability or shrinking maintenance windows.
     • Because it matters to your company's profitability.
       – Slow storage can frustrate your customers and waste the investment you have made in your server infrastructure.
       – Inappropriate use of fast storage means dollars wasted on performance you do not need.
     • Because storage vendors do not publicize every metric relevant to your application and environment.
  5. Types of Storage Performance Testing
     • Benchmarking.
       – Review published and audited industry benchmarks.
       – Conduct tests with industry-standard software.
       – Conduct tests for data corruption.
     • Application simulation.
       – Use industry-standard software to test with conditions similar to a target application.
     • Application testing.
       – Test an application with sample queries or scripts in a production-like environment.
     • Production testing.
  6. Published Benchmarks
     • Storage Performance Council
       – www.StoragePerformance.Org
       – SPC-1 simulates an on-line transaction processing (OLTP) environment.
       – SPC-2 simulates large-block sequential processing.
     • SPEC SFS
       – www.Spec.Org/sfs97r1
       – A good test for measuring the performance of file servers and network-attached storage.
     • TPC
       – TPC-C for testing OLTP, TPC-H for decision support, and TPC-W for web e-commerce.
  7. Sample SPC-1 Result
     • Shows peak SPC-1 IOPS.
     • Shows the response-time curve.
  8. Benchmarking Software
     • IOMeter
       – The most popular tool among storage vendors.
       – Freely available.
       – Primarily a Windows-based tool.
     • IOZone
       – Broad OS support.
       – Freely available.
     • Benchmark Factory for Databases by Quest Software
       – TPC-B, TPC-C, and TPC-D workloads (not for publishing results).
     • Vendor tools.
  9. IOMeter – Disk Targets Tab
     • Heuristics:
       – One manager per server.
       – One worker per processor.
     • Note: if the disk-size field is left at "0", IOMeter uses all available disk space.
     • These settings can play a significant role in observed performance.
  10. IOMeter – Example Effect of Varied Outstanding I/Os
      [Chart: IOPS vs. block size of transfer, comparing 1 outstanding I/O with 25 outstanding I/Os.]
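The effect the chart illustrates can be reproduced without IOMeter. The sketch below (a rough stand-in, not IOMeter itself) approximates outstanding I/Os with concurrent reader threads against a scratch file; file size, duration, and the 1-vs-25 depths are illustrative, it assumes a POSIX system for `os.pread`, and with a file this small the reads will largely hit the OS page cache rather than the device.

```python
# Minimal sketch: random-read "IOPS" at two queue depths, where queue
# depth is approximated by concurrent worker threads (each thread keeps
# one read outstanding at a time). Illustrative only, not a benchmark.
import os
import random
import tempfile
import threading
import time

BLOCK = 4096                  # 4 KiB transfer request size
FILE_SIZE = 4 * 1024 * 1024   # tiny scratch file; a real test needs far more

def random_read_iops(path, workers, seconds=1.0):
    """Total random reads/sec completed by `workers` concurrent threads."""
    counts = [0] * workers
    deadline = time.monotonic() + seconds
    def worker(i):
        fd = os.open(path, os.O_RDONLY)   # os.pread is POSIX-only
        blocks = FILE_SIZE // BLOCK
        while time.monotonic() < deadline:
            os.pread(fd, BLOCK, random.randrange(blocks) * BLOCK)
            counts[i] += 1
        os.close(fd)
    threads = [threading.Thread(target=worker, args=(i,)) for i in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(counts) / seconds

if __name__ == "__main__":
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(os.urandom(FILE_SIZE))
    for depth in (1, 25):     # mirrors the 1 vs. 25 outstanding-I/O chart
        print(f"{depth:>2} outstanding: {random_read_iops(tmp.name, depth):,.0f} IOPS")
    os.unlink(tmp.name)
```

Against a real device (and with caches defeated), deeper queues let the storage overlap and reorder requests, which is why the 25-outstanding curve sits above the 1-outstanding curve.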
  11. IOMeter – Setting Access Specifications
      • Test storage with small and large transfer request sizes.
      • Try different sequential vs. random mixes.
      • Try different read/write mixtures.
      • Other settings can usually be left at their defaults, but can be changed to match application behavior.
  12. IOMeter – Example Effect of Varied Block Sizes
      [Chart: IOPS (0–400,000) vs. block size of transfer (2K–256K).]
      • Small block size = high IOPS but relatively low bandwidth.
      • Large block size = low IOPS but relatively high bandwidth.
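The IOPS/bandwidth tradeoff in the chart is simple arithmetic: bandwidth equals IOPS multiplied by the transfer size. A quick sketch, using invented IOPS figures purely for illustration:

```python
# bandwidth = IOPS x block size; the sample IOPS numbers are hypothetical.
def bandwidth_mbps(iops, block_bytes):
    """Throughput in MB/s implied by an IOPS figure at a given block size."""
    return iops * block_bytes / (1024 * 1024)

for block_kb, iops in [(2, 350_000), (256, 4_000)]:   # illustrative values
    mbps = bandwidth_mbps(iops, block_kb * 1024)
    print(f"{block_kb:>3}K blocks: {iops:>7,} IOPS = {mbps:,.0f} MB/s")
```

So a device posting very high small-block IOPS can still move fewer bytes per second than the same device running far fewer, much larger transfers, which is exactly the shape of the two curves above.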
  13. IOMeter – Example Effect of Random vs. Sequential
      [Chart: IOPS vs. block size of transfer (2K–256K), comparing sequential against random transfers; labeled data points include 1,413 and 4,626 IOPS.]
  14. Testing for Data Corruption
      • Storage devices and storage-network components are almost always reliable within predictable performance ranges; the question is how they handle extreme requirements.
      • Most benchmarking tools do not automatically verify data.
      • Testing for data corruption usually means testing with data patterns that challenge components:
        – Test the extremes of performance.
        – Test extreme data patterns.
  15. Example of Data Testing Patterns
      • Data-pattern testing should use a write/read/compare process to verify that data is not corrupt.
      • Some vendors can provide software that performs this kind of testing.
      • Examples of challenging data patterns:
        – A's and 5's (alternating bits): 0101 0110 1010
        – Rolling 0's and rolling 1's: 0001 0010 0100 1000
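A minimal sketch of the write/read/compare idea with the two pattern families named on the slide. This is an illustration, not vendor test software: the file path and block size are arbitrary, `os.fsync` only pushes data past the OS buffers, and a real corruption test would also bypass or defeat device caches and run at performance extremes.

```python
# Write a challenging bit pattern, read it back, and compare.
import itertools
import os
import tempfile

def pattern_block(kind, size):
    """Build `size` bytes of a test pattern."""
    if kind == "a5":          # 0xAA / 0x55: alternating 10101010 01010101
        return bytes(itertools.islice(itertools.cycle((0xAA, 0x55)), size))
    if kind == "rolling1":    # a single set bit walking through each byte:
        return bytes(1 << (i % 8) for i in range(size))  # 0001 0010 0100 1000 ...
    raise ValueError(kind)

def write_read_compare(path, kind, size=4096):
    """Write a pattern, read it back, and return True if it survived intact."""
    block = pattern_block(kind, size)
    with open(path, "wb") as f:
        f.write(block)
        f.flush()
        os.fsync(f.fileno())  # push past OS buffers (device cache may remain)
    with open(path, "rb") as f:
        return f.read() == block

if __name__ == "__main__":
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        path = tmp.name
    for kind in ("a5", "rolling1"):
        print(kind, "ok" if write_read_compare(path, kind) else "CORRUPT")
    os.unlink(path)
```

Alternating and walking-bit patterns stress adjacent signal lines and cells differently than random data, which is why they show up in vendor qualification suites.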
  16. Application Simulation Testing
      • One type of test does not represent all applications.
      • One type of application does not represent all uses of a storage product.
      • Common types of application simulation testing:
        – Test storage latency for messaging or other single-threaded applications.
        – Test peak storage bandwidth for data-acquisition or data-streaming environments.
        – Test peak storage IOPS for databases.
  17. IOMeter – Simulating Single-Threaded Applications
      • Setting outstanding I/Os to 1 simulates a single-threaded application.
      • Note: single-threaded applications are extremely sensitive to latency (server, HBA, switch, and storage device).
  18. IOMeter – Simulating Multi-Threaded Applications
      • Setting outstanding I/Os to 25 simulates a multi-threaded application.
  19. IOMeter – Simulating Database Environments
      • A small transfer request size simulates database transfers.
      • Match the application's read/write distribution.
      • Database activity is mostly random.
  20. IOMeter – Simulating Data Streaming
      • A large transfer request size tests peak bandwidth.
      • Match the application's read/write distribution.
      • A mostly sequential setting is best.
  21. Application Testing
      • Testing with the actual application is the best way to measure storage performance.
        – A production-like environment that can stress storage limits is desirable.
        – Measure the performance of different solutions:
          · Compare OLTP response times.
          · Compare batch run times.
          · Compare sustained streaming rates.
        – Operating-system and application tools can help monitor storage performance.
  22. Monitoring Storage Performance With Windows
      • Windows Performance Monitor can be used to monitor storage performance.
      • Capture the following key counters over the duration of a peak processing period or test run:
        – Processor: % processor time (total and by processor).
        – Physical disk: average disk queue (total, read, and write, by disk/array).
        – Physical disk: disk bytes/second (total, read, and write, by disk/array).
  23. Monitoring Storage Performance With Windows
      • Tips for analyzing Windows Performance Monitor results:
        – Use the following scaling to ease visual analysis:
          · Scale disk queues using a 1:1 ratio (default is 100:1).
          · Keep processor utilization at its default 1:1 ratio.
          · Scale disk bytes per second using a 0.000001:1 ratio (default is 0.0001:1).
        – Start with graphs of the "total" counters to identify big issues, then drill down into the read/write, per-disk, and per-processor counters.
        – If graphs are hard to read, increase the line thickness.
        – Use the slider bars to zoom into trouble spots for better detail.
  24. Monitoring Storage Performance With Windows
      • More tips for analyzing Windows Performance Monitor results:
        – Remember that disk bytes per second must be divided by 1024 to get disk KB, and by 1024 again to get disk MB.
        – The point where physical disk queues increase is likely the point where you have hit a storage performance limitation.
        – A system with consistently high processor utilization does not have a storage performance bottleneck.
        – Microsoft guidance treats a physical disk queue greater than 3 as a sign of an I/O bottleneck.
        – If processor utilization levels off at the same points where physical disk queues build, that is an indication that faster storage will improve application performance.
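The unit conversion and the disk-queue rule of thumb above can be captured in a few lines. The threshold of 3 follows the Microsoft guidance cited on the slide; the sample counter values are invented for illustration.

```python
# Perfmon's "disk bytes/sec" counter in MB/s, and the queue-depth check.
def bytes_to_mb(disk_bytes_per_sec):
    """Divide by 1024 for KB, and by 1024 again for MB."""
    return disk_bytes_per_sec / 1024 / 1024

def has_io_bottleneck(avg_disk_queue, threshold=3):
    """Queue depth above ~3 pending requests suggests an I/O bottleneck."""
    return avg_disk_queue > threshold

samples = [(52_428_800, 1.2), (157_286_400, 8.5)]   # (bytes/sec, disk queue)
for rate, queue in samples:
    flag = "I/O bottleneck" if has_io_bottleneck(queue) else "ok"
    print(f"{bytes_to_mb(rate):6.1f} MB/s, queue {queue:4.1f}: {flag}")
```

Applying checks like this across a whole capture makes it easy to find the intervals where queues build and throughput flattens, which are the spots worth zooming into.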
  25. Monitoring Storage Performance With Windows
      [Screenshot: Performance Monitor graph.]
      • Disk queues show pending requests to storage.
      • Disk bytes per second helps reveal storage limitations.
      • Processor time shows the percentage of processor utilized.
  26. Monitoring Storage Performance With Windows
      [Screenshot: Performance Monitor graph showing no processor bottleneck, no disk queues, and low disk activity.]
  27. Monitoring Storage Performance With UNIX
      • iostat results show read and write kilobytes per second:

        Device:     r/s       w/s   rkB/s     wkB/s  avgrq-sz  avgqu-sz  await  svctm   %util
        /dev/sdb   0.00  10619.39    0.00  85570.91     16.12   4636.79  43.52   0.10  101.21
        /dev/sdc   0.00  10678.79    0.00  85570.91     16.07   2438.06  22.75   0.10  107.27

        avg-cpu:  %user  %nice   %sys  %idle
                  13.04   0.33  68.15  18.48

      • top shows CPU utilization, including I/O wait:

        load averages: 0.09, 0.04, 0.03                          16:31:09
        66 processes: 65 sleeping, 1 on cpu
        CPU states: 69.2% idle, 18.9% user, 11.9% kernel, 0.0% iowait, 0.0% swap
        Memory: 128M real, 4976K free, 53M swap in use, 542M swap free
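When captures like this pile up, it helps to pull out just the storage-relevant columns (avgqu-sz, await, %util) programmatically. A sketch, assuming the column layout shown in the sample above; other iostat versions order and name their fields differently, so treat the header as the source of truth.

```python
# Parse one iostat -x device line using its header row to locate columns.
def parse_iostat_line(line, header):
    """Return the queue depth, wait time, and utilization as floats."""
    fields = dict(zip(header.split()[1:], line.split()[1:]))
    return {k: float(fields[k].rstrip("%")) for k in ("avgqu-sz", "await", "%util")}

header = "Device: r/s w/s rkB/s wkB/s avgrq-sz avgqu-sz await svctm %util"
line   = "/dev/sdb 0.00 10619.39 0.00 85570.91 16.12 4636.79 43.52 0.10 101.21"
stats = parse_iostat_line(line, header)
print(stats)   # deep queues plus ~100% utilization point at a saturated device
```

For /dev/sdb above, the queue depth and a %util pinned at (or, due to sampling artifacts, slightly above) 100% say the device, not the host, is the limit.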
  28. Monitoring Storage Performance With Oracle
      • These wait events are heavily influenced by storage:

        Elapsed: 68.87 (mins)
        Top 5 Wait Events
        ~~~~~~~~~~~~~~~~~
                                                          Wait     % Total
        Event                              Waits     Time (cs)     Wt Time
        ----------------------------- ------------  ----------  ----------
        db file sequential read         18,073,422     581,168       59.36
        db file scattered read             933,001     267,364       27.31
        db file parallel write              25,990      35,898        3.67
        SQL*Net message from dblink        181,872      20,372        2.08
        latch free                          11,936      17,879        1.83

      • Tablespace metrics are a good way to monitor storage performance:

        Tablespace     Av Reads  Av Reads/s  Av Rd(ms)  Av Blks/Rd    Writes  Writes/s  Buffer Waits  Av Buf Wt(ms)
        -------------  --------  ----------  ---------  ----------  --------  --------  ------------  -------------
        SESSION_DATA         61           0       20.2         1.2   190,606        94       128,753           56.8
        UNDOTBS1             32           0       14.1         1.0    16,517         8         6,083            2.3
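The "% Total Wt Time" column in the excerpt is each event's wait time (in centiseconds) as a share of total wait time, which also includes events outside the top five. A quick consistency check, recovering the implied total from the first row and testing the second row against it:

```python
# Recompute the percentage column from the Statspack figures on the slide.
seq_read_cs, seq_read_pct = 581_168, 59.36          # db file sequential read
total_cs = seq_read_cs / (seq_read_pct / 100)       # implied total wait time

scattered_cs = 267_364                              # db file scattered read
print(f"implied total wait time: {total_cs:,.0f} cs")
print(f"scattered read share: {scattered_cs / total_cs:.2%}")
```

The recomputed share matches the report's 27.31%, and the two `db file` read events together account for well over 80% of all waits, which is why this instance is a storage-tuning candidate.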
  29. Production Testing
      • Risk vs. reward.
        – Risk: taking an unsupported, well-traveled evaluation unit and putting it into a production environment could compromise application availability and expose unexpected system problems.
        – Reward: sometimes this is the only way to know for certain that storage performance is acceptable for an application.
  30. Typical Mistakes
      • Testing storage performance with file-copy commands.
      • Comparing storage devices back to back without clearing server cache.
      • Testing with a data set so small that the benchmark rarely goes beyond server cache.
      • Testing peak device performance with one outstanding I/O.
      • Testing peak storage bandwidth through a 32-bit/33 MHz PCI slot.
      • Forgetting to monitor processor utilization during testing.
      • Monitoring the wrong server's performance.
  31. Demonstration
      • IOMeter demonstration (time allowing):
        – Peak IOPS.
        – Peak bandwidth.
        – Simulating a messaging application.
        – Simulating a database application.
        – Simulating a data-acquisition application.
  32. SNIA Legal Notice
      • The material contained in this tutorial is copyrighted by the SNIA.
      • Member companies and individuals may use this material in presentations and literature under the following conditions:
        – Any slide or slides used must be reproduced without modification.
        – The SNIA must be acknowledged as the source of any material used in the body of any document containing material from these presentations.
      • This presentation is a project of the SNIA Education Committee.
  33. Q&A / Feedback
      • Please send any questions or comments on this presentation to SNIA: track-[email_address]
      • Many thanks to the following individuals for their contributions to this tutorial:
        – SNIA Education Committee
        – Sarah Worthy
        – Jamon Bowen
        – Storage Performance Council
        – Elaine Silber