
Natural Laws of Software Performance


Just as you can't defeat the laws of physics, there are natural laws that ultimately decide software performance. Even the latest technology beta is still bound by Newton's laws, and you can't change the speed of light, even in the cloud!

Published in: Technology


  1. Natural Laws of Software Performance
     • The changing face of performance optimization
  2. Who Am I?
     • Kendall Miller
     • One of the founders of Gibraltar Software
     • Small independent software vendor founded in 2008
     • Developers of VistaDB and Gibraltar
     • Engineers, not sales people
     • Enterprise systems architect & developer since 1995
     • BSE in Computer Engineering, University of Illinois Urbana-Champaign (UIUC)
     • @KendallMiller
  3. Traditional Performance Optimization
     • Run suspect use cases and find hotspots
     • Very linear
     • Finds unexpected framework performance issues
     • Final polishing step
  4. (image slide)
  5. Algorithms and Asymptotics
     • Asymptotic (or ‘Big Oh’) notation describes the growth rate of functions
     • Answers the question: does execution time of A grow faster or slower than B?
     • The rules of asymptotic notation say a term of n^3 will tend to dominate a term of n^2
     • Therefore we can discount coefficients and lower-order terms
     • And so f(n) = n^3 + 6n^2 + 2n + 3 can be expressed as f(n) = O(n^3)
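To make the slide's point concrete, here is a small sketch (not from the deck; the function name and loop bounds are illustrative) showing why the coefficients and lower-order terms of f(n) = n^3 + 6n^2 + 2n + 3 wash out: the ratio of f(n) to its leading term tends to 1 as n grows.

```java
// Sketch: compare f(n) = n^3 + 6n^2 + 2n + 3 with its leading term n^3.
public class BigOhDemo {
    static double f(long n) {
        return Math.pow(n, 3) + 6 * Math.pow(n, 2) + 2 * n + 3;
    }

    // Tends toward 1.0 as n grows: the lower-order terms stop mattering.
    public static double ratioToLeadingTerm(long n) {
        return f(n) / Math.pow(n, 3);
    }

    public static void main(String[] args) {
        for (long n : new long[] {10, 100, 10_000}) {
            System.out.printf("n=%d  f(n)/n^3 = %.4f%n", n, ratioToLeadingTerm(n));
        }
    }
}
```

At n = 10 the ratio is about 1.62; at n = 10,000 it is already within a tenth of a percent of 1, which is exactly why O(n^3) is the only part worth keeping.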
  6. You Can’t Optimize out of Trouble
  7. (image slide)
  8. So Where Are We?
  9. (image slide)
  10. (image slide)
  11. (image slide)
  12. Moore’s Law
     • “The number of components in integrated circuits doubles every year”
     • Components = transistors
  13. Processor Iron Triangle
     • Clock speed
     • Power
     • Speed of light
     • Size
     • Complexity
     • Manufacturing process
  14. A Core Explosion
  15. (image slide)
  16. Before You Leap into Optimizing…
     • Algorithms are your first step
     • Cores are a constant multiplier; a better algorithm changes the growth rate itself
     • Everything we talk about today disappears into the constants that O(n) notation ignores
     • Parallel processing across cores trades cost for a quick but modest boost
     • Other tricks can get you more (and get more out of parallelism)
  17. Fork / Join Parallel Processing
     • Split a problem into a number of independent problems
     • Process each partition independently (potentially in parallel)
     • Merge the results back together to get the final outcome (if necessary)
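The deck's examples are .NET APIs; as a stand-in sketch, Java's `ForkJoinPool` implements the same split / process-in-parallel / merge shape named on this slide (the class name, threshold, and summing task here are illustrative, not from the talk):

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Fork/join sketch: recursively split a range, sum halves (possibly in
// parallel), and merge the partial sums on the way back up.
public class ParallelSum extends RecursiveTask<Long> {
    private static final int THRESHOLD = 1_000; // "large enough" chunk size
    private final long[] data;
    private final int lo, hi;

    ParallelSum(long[] data, int lo, int hi) {
        this.data = data; this.lo = lo; this.hi = hi;
    }

    @Override protected Long compute() {
        if (hi - lo <= THRESHOLD) {             // small enough: solve directly
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += data[i];
            return sum;
        }
        int mid = (lo + hi) / 2;
        ParallelSum left = new ParallelSum(data, lo, mid);
        ParallelSum right = new ParallelSum(data, mid, hi);
        left.fork();                            // fork: left half runs asynchronously
        return right.compute() + left.join();   // join: merge the two partial sums
    }

    public static long sum(long[] data) {
        return ForkJoinPool.commonPool().invoke(new ParallelSum(data, 0, data.length));
    }

    public static void main(String[] args) {
        long[] xs = {1, 2, 3, 4, 5};
        System.out.println(sum(xs)); // 15
    }
}
```

The threshold matters: below it the join cost would exceed the parallel gain, which is the "large enough chunks" caveat the deck returns to on the Fork / Join Usage slide.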
  18. Fork / Join Examples
     • Multicore processors
     • Server farm
     • Web server: the original HTTP servers literally forked a process for each request
  19. Fork / Join in .NET
     • System.Threading.ThreadPool
     • Parallel.ForEach
     • PLINQ
     • Parallel.Invoke
  20. Try it Now - PLINQ
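The live PLINQ demo itself was not captured in the transcript. As a hedged analog, Java's parallel streams express the same declarative style PLINQ offers in .NET: you write the query, and the runtime handles the fork/join plumbing (the prime-counting query here is an invented example, not the talk's demo):

```java
import java.util.stream.LongStream;

// Parallel-stream analog of a PLINQ query: count primes below n,
// checking candidates across cores.
public class PlinqAnalog {
    public static long countPrimes(long n) {
        return LongStream.range(2, n)
                .parallel()                  // analogous to PLINQ's .AsParallel()
                .filter(PlinqAnalog::isPrime)
                .count();
    }

    static boolean isPrime(long k) {
        for (long d = 2; d * d <= k; d++) {
            if (k % d == 0) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(countPrimes(100)); // 25 primes below 100
    }
}
```

Note that, as with PLINQ, the one-line opt-in to parallelism only pays off when each element's work is independent and non-trivial, per the usage slide that follows.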
  21. Fork / Join Usage
     • Tasks that can be broken into “large enough” chunks that are independent of each other
     • Little shared state required to process
     • Tasks with a low join cost
  22. Pipelines
     • Partition a task based on stages of processing instead of data for processing
     • Each stage of the pipeline processes independently (and typically concurrently)
     • Stages are typically connected by queues: producer (previous stage) & consumer (next stage)
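The queue-connected stages described above can be sketched in Java with a `BlockingQueue` standing in for .NET's `BlockingCollection<T>` (the two stages, the sentinel value, and the trim/format work are all illustrative assumptions):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Two-stage pipeline sketch: stage 1 normalizes raw input, stage 2 formats it.
// The stages run concurrently, connected by a bounded producer/consumer queue.
public class Pipeline {
    private static final String POISON = "\u0000EOF"; // sentinel: "no more work"

    public static List<String> run(List<String> rawInputs) {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(16);
        List<String> results = Collections.synchronizedList(new ArrayList<>());

        Thread stage2 = new Thread(() -> {       // consumer stage
            try {
                for (String s = queue.take(); !s.equals(POISON); s = queue.take()) {
                    results.add("item:" + s);
                }
            } catch (InterruptedException ignored) { }
        });
        stage2.start();

        try {
            for (String raw : rawInputs) {       // producer stage
                queue.put(raw.trim().toLowerCase());
            }
            queue.put(POISON);                   // signal the downstream stage to finish
            stage2.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return results;
    }

    public static void main(String[] args) {
        System.out.println(run(List.of(" A ", "B")));
    }
}
```

The bounded queue is doing real work here: it provides back-pressure, so a fast producer stalls rather than flooding memory when the consumer stage falls behind.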
  23. Pipeline Examples
     • Order entry & order processing
     • Classic microprocessor design: break instruction processing into stages and process one stage per clock cycle
     • GPU design: combines fork/join with pipelining
  24. Pipeline Examples in .NET
     • Not the ASP.NET processing pipeline (no parallelism/multithreading/queueing)
     • Stream processing
     • MapReduce
     • BlockingCollection<T>
     • Gibraltar Agent
  25. Pipeline Usage
     • Significant shared state between data elements prevents decoupling them
     • Linear processing requirements within parts of the workflow
  26. It’s the Law
     • Speed of light: 3×10^8 m/s
     • About 0.24 seconds to geosynchronous orbit and back
     • About 1 foot per nanosecond
     • 3 GHz: a 1/3 ns clock period = about 4 inches of light travel
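The slide's figures check out with back-of-the-envelope arithmetic; here is a small sketch verifying them (rounded vacuum speed of light, and an approximate 35,786 km geosynchronous altitude, both my assumptions rather than the slide's stated inputs):

```java
// Back-of-the-envelope check of the speed-of-light numbers on the slide.
public class LightLatency {
    static final double C = 3.0e8; // speed of light, m/s (rounded)

    // Distance light covers in one clock period at the given frequency.
    public static double metersPerCycle(double hertz) {
        return C / hertz;
    }

    // Round-trip time over a given one-way distance, in seconds.
    public static double roundTripSeconds(double meters) {
        return 2 * meters / C;
    }

    public static void main(String[] args) {
        System.out.println(metersPerCycle(3.0e9));          // 0.1 m, roughly 4 inches per 3 GHz cycle
        System.out.println(roundTripSeconds(35_786_000.0)); // roughly 0.24 s to geosync orbit and back
    }
}
```

The 4-inches-per-cycle figure is the punchline: at 3 GHz, a signal physically cannot cross a server room, let alone an ocean, within one clock tick.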
  27. Latency – The Silent Killer
     • The time for the first bit to get from here to there
     • Typical LAN: 0.4 ms
  28. London ↔ New York
     • Distance: about 5,500 km
     • One-way latency: about 18 ms
     • TCP socket establish: about 54 ms (the three segments of the handshake: 3 × 18 ms)
  29. (image slide)
  30. Caching
     • Save results of earlier work nearby where they are handy to use again later
     • Cheat: don’t make the call
     • Cheat more: apply in front of anything that’s time consuming
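A minimal sketch of the "don't make the call" cheat, assuming a generic expensive lookup (the class name, the call counter, and the use of `ConcurrentHashMap` are my illustration, not anything named in the deck):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Function;

// Read-through cache sketch: answer from the cache when we can, and fall
// back to the expensive authoritative source only on a miss.
public class ReadThroughCache<K, V> {
    private final Map<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> source;                    // the expensive call we avoid
    final AtomicInteger sourceCalls = new AtomicInteger();  // exposed so the demo can count misses

    public ReadThroughCache(Function<K, V> source) {
        this.source = source;
    }

    public V get(K key) {
        return cache.computeIfAbsent(key, k -> {
            sourceCalls.incrementAndGet();                  // only runs on a cache miss
            return source.apply(k);
        });
    }

    public static void main(String[] args) {
        ReadThroughCache<String, Integer> c = new ReadThroughCache<>(String::length);
        c.get("hello");
        c.get("hello");
        System.out.println("source calls: " + c.sourceCalls.get()); // second lookup never hit the source
    }
}
```

Everything a real cache adds on top of this (expiry, invalidation, size limits) exists because of the one assumption this sketch makes: that answers don't change often.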
  31. Why Caching?
     • Apps ask a lot of repeating questions; stateless applications even more so
     • Answers don’t change often
     • Authoritative information is expensive
     • Loading the world is impractical
  32. Caching in Hardware
     • Processor L1 cache (typically same core)
     • Processor L2 cache (shared by cores)
     • Processor L3 cache (between processor & main RAM)
     • Disk controllers
     • Disk drives
     • …
  33. .NET Caching Examples
     • ASP.NET Output Cache
     • System.Web.Caching.Cache (ASP.NET only)
     • AppFabric Cache
  34. Go Asynchronous
     • Delegate the latency to something that will notify you when it’s complete
     • Do other useful stuff while waiting; otherwise you’re just being efficient, not faster
     • Maximize throughput by scheduling more work than could be done if there weren’t stalls
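The deck's async examples are .NET's `Task<T>`; Java's `CompletableFuture` plays the same role, so here is a hedged sketch of overlapping useful work with a slow call (the method names and the simulated 50 ms delay are invented for illustration):

```java
import java.util.concurrent.CompletableFuture;

// "Go asynchronous" sketch: kick off the slow call, do other useful work
// while it is in flight, then join for the result.
public class AsyncDemo {
    static CompletableFuture<Integer> slowLookup(int x) {
        return CompletableFuture.supplyAsync(() -> {
            try { Thread.sleep(50); } catch (InterruptedException ignored) { } // simulated latency
            return x * 2;
        });
    }

    public static int overlappedWork() {
        CompletableFuture<Integer> pending = slowLookup(21); // latency delegated, not waited on
        int local = 1 + 1;                                   // useful work while the call is in flight
        return pending.join() + local;                       // 42 + 2
    }

    public static void main(String[] args) {
        System.out.println(overlappedWork());
    }
}
```

If `overlappedWork` had called `join()` immediately, the total time would be identical to a synchronous call; the win only appears because something useful happens during the stall, which is the slide's "efficient, not faster" warning.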
  35. .NET Async Examples
     • Standard async IO pattern
     • .NET 4 Task<T>
     • Combine with queuing to maximize throughput even without parallelization
  36. Visual Studio Async CTP
     • async methods will compile to run asynchronously
     • await suspends the method’s execution until the async call completes before proceeding
  37. Batching
     • Get your money’s worth out of every latency hit
     • Trade off storage for duration
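A minimal batching sketch in Java (the `Batcher` class, its batch size, and the round-trip counter are all assumptions for illustration): buffer items locally, then pay one latency hit per batch instead of one per item.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Batching sketch: trade a little buffering (storage) for fewer
// latency-bearing round trips, sending up to batchSize items per call.
public class Batcher<T> {
    private final int batchSize;
    private final Consumer<List<T>> send;      // models one latency-bearing round trip
    private final List<T> buffer = new ArrayList<>();

    public Batcher(int batchSize, Consumer<List<T>> send) {
        this.batchSize = batchSize;
        this.send = send;
    }

    public void add(T item) {
        buffer.add(item);
        if (buffer.size() >= batchSize) flush();
    }

    public void flush() {                      // remember to flush the partial last batch
        if (!buffer.isEmpty()) {
            send.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        Batcher<Integer> b = new Batcher<>(10, batch -> System.out.println("sent " + batch.size()));
        for (int i = 0; i < 25; i++) b.add(i);
        b.flush(); // 3 sends (10 + 10 + 5) instead of 25
    }
}
```

With 18 ms per round trip, 25 unbatched sends cost about 450 ms of latency; three batched sends cost about 54 ms, which is the whole argument of the slide.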
  38. General Batching Examples
     • SQL connection pooling
     • HTTP keep-alive
  39. .NET Batching
     • DataSet / Entity Sets
  40. Optimistic Messaging
     • Assume it’s all going to work out and just keep sending
  41. Side Points
     • Stateful interaction generally increases the cost of latency
     • Minimize copying: it takes blocking time to copy data, introducing latency
     • Your mileage may vary: latency on a LAN can be dramatically affected by hardware and configuration
  42. Critical Lessons Learned
     • Algorithms, algorithms, algorithms
     • Plan for latency & failure
     • Explicitly design for parallelism
  43. Additional Information
     • Websites
     • Follow up
     • Twitter: kendallmiller