Going for Speed: Testing for Performance

Unit Testing has settled into the mainstream. As developers, we write code that checks code, ensuring that the outcome matches some expected result. But are we really? As end-users (which includes each one of us from time to time), when we ask a question, we don't just expect our answer to be right; we expect it right now. So as developers, why are we only validating for accuracy? Why aren't we going for speed? During this session we'll discuss meeting the performance needs of an application, including developing a performance specification, measuring application performance from stand-alone testing through unit testing, using tools ranging from Team Foundation Server to the command line, and asserting on these measurements to ensure that all expectations are met. Your application does "right." Let's focus on "right now."

Notes
  • Jay Harris is a .NET developer in Southeast Michigan and an independent software consultant at Arana Software (http://www.aranasoft.com). He has been developing on the web for 15 years, since he abandoned VB3 for JavaScript because he didn't have to wait for a compile. With a career focus on end-user experience, he is a strong advocate of practices and processes that improve quality through code, ranging from automated testing, continuous integration, and performance analysis to designing applications from the perspective of the user instead of the database. Jay is also active in the developer community beyond speaking, including serving as President of Ann Arbor .Net Developers (http://www.aadnd.org) and as an organizer for Lansing Give Camp. When not coding, he is usually blogging at http://www.cptloadtest.com or playing games on his Xbox 360.
  • Presentation agenda: Justifying Performance Testing, Determining Performance Goals, Identifying Existing Issues, Testing a Unit of Code for Performance, and Testing an Application for Performance.

    1. going for speed: testing against performance expectations
    2. why does performance matter?
    3. poor performance = ?
    4. 60 hours per year stuck in traffic in Detroit tinyurl.com/3c3df9g
    5. 26% of flights are delayed with American Airlines tinyurl.com/3j9hmje
    6. auto fuel efficiency: 20% of power makes it to the wheels www.fueleconomy.gov/feg/atv.shtml
    7. 66%: an annual loss of $1,285 www.fueleconomy.gov/feg/atv.shtml
    8. poor performance = wasted time & money
    9. why does performance matter to developers?
    10. what can we throw at it?
    11. let’s throw more hardware!
    12. let’s throw more resources!
    13. let’s throw your overtime!
    14. poor performance = a miserable life
    15. does my app suffer?
    16. sustain under 75%
    17. advanced metrics
    18. but all that really matters... perception
    19. establishing goals
    20. what is the target?
    21. avoid buzzwords: 99.999% of the time
    22. target vs. threshold
    23. constraints and budgets
    24. validate assumptions
    25. plan.execute()
    26. unit analysis
            // MSTest Example
            [TestMethod]
            public void ShouldDoThis()
            {
                // Test Functionality
            }
    27. unit analysis
            // MSTest Example
            [TestMethod]
            public void ShouldDoThisInUnder100ms()
            {
                // Test Functionality
                // TODO: Write code to test for speed
            }
    28. unit analysis
            // MSTest Example
            [TestMethod]
            public void ShouldDoThisInUnder100ms()
            {
                var stopwatch = Stopwatch.StartNew();
                // Test Functionality
                stopwatch.Stop();
                long elapsedMs = stopwatch.ElapsedMilliseconds;
                Assert.IsTrue(elapsedMs <= 100);
            }
    29. unit analysis
            // NUnit Example
            [Test]
            public void ShouldDoThisInUnder100ms()
            {
                // Test Functionality
                // TODO: Write code to test for speed
            }
    30. unit analysis
            // NUnit Example
            [Test, MaxTime(100)]
            public void ShouldDoThisInUnder100ms()
            {
                // Test Functionality
                // Done. Keep being Awesome.
            }
        (An expanded timing sketch appears after the transcript.)
    31. memory usage
            using System.Diagnostics;
            Process.GetCurrentProcess().WorkingSet64

            using System;
            GC.GetTotalMemory(false)
        (A memory-assertion sketch appears after the transcript.)
    32. demo: profiling with Visual Studio Performance Explorer
    33. web analysis
    34. demo: Visual Studio web test
    35. traffic flow patterns (ORAN VIRIYINCY | FLICKR.COM/PEOPLE/VIRIYINCY)
    36. demo: Visual Studio performance test
    37. rule #10: run everything simultaneously
    38. rule #9: isolate data at all times
    39. rule #8: test like a real user
    40. rule #7: test on a clone of production
    41. rule #6: test on production
    42. rule #5: tests can be turned off
    43. rule #4: validate your assumptions
    44. rule #3: stick to your budget
    45. rule #2: performance starts at design
    46. want more information?
    47. Microsoft Debug Diagnostics Tool: tinyurl.com/DebugDiag
        Whitepaper: tinyurl.com/DebugDiagWP
        Web Capacity Analysis Tool (WCAT): bit.ly/ffq6dD
        JetBrains dotTrace: jetbrains.com
        Red Gate ANTS Performance and Memory Profilers: red-gate.com
        HP LoadRunner: tinyurl.com/LoadRunner
    48. Performance Testing Guidelines for Web Applications
        Microsoft Patterns & Practices: perftestingguide.codeplex.com
        J.D. Meier, Carlos Farre, Prashant Bansode, Scott Barber, Dennis Rea
    49. rule #1: quality is everyone’s job!
    50. go for speed
    51. #goingForSpeed jay@aranasoft.com
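
A possible extension of the Stopwatch pattern from slide 28: a single-shot timing is noisy (JIT compilation, cold caches, GC pauses), so this sketch warms the code up and asserts on the median of several runs. It is a minimal sketch only; the test class name, the 10-sample count, and the DoWork placeholder are illustrative, not from the deck.

        // MSTest; a minimal sketch, assuming MSTest and LINQ are available
        using System.Diagnostics;
        using System.Linq;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class PerformanceTests
        {
            [TestMethod]
            public void ShouldDoThisInUnder100msMedian()
            {
                DoWork(); // warm-up: trigger JIT compilation before timing

                var samples = new long[10];
                for (var i = 0; i < samples.Length; i++)
                {
                    var stopwatch = Stopwatch.StartNew();
                    DoWork();
                    stopwatch.Stop();
                    samples[i] = stopwatch.ElapsedMilliseconds;
                }

                // The median is more stable than a single run or the mean.
                var median = samples.OrderBy(ms => ms).ElementAt(samples.Length / 2);
                Assert.IsTrue(median <= 100, "Median was " + median + "ms; budget is 100ms.");
            }

            // Placeholder for the code under test (hypothetical).
            private static void DoWork() { }
        }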

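Building on slide 31, a sketch of asserting on managed-memory growth in a unit test. GC.GetTotalMemory(true) forces a collection first, so the before/after delta mostly reflects objects the code under test keeps alive; note it measures only the managed heap, not the working set. The 1 MB budget and the BuildCache placeholder are invented for illustration.

        // MSTest; a minimal sketch, assuming the allocation stays reachable
        using System;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class MemoryTests
        {
            [TestMethod]
            public void ShouldAllocateUnderOneMegabyte()
            {
                long before = GC.GetTotalMemory(true); // force a full collection first

                var cache = BuildCache(); // hypothetical code under test

                long after = GC.GetTotalMemory(true);
                long growth = after - before;

                GC.KeepAlive(cache); // keep the result reachable until measured
                Assert.IsTrue(growth <= 1048576, "Grew " + growth + " bytes; budget is 1 MB.");
            }

            // Placeholder allocation standing in for the real work (hypothetical).
            private static byte[] BuildCache()
            {
                return new byte[512 * 1024];
            }
        }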