Carey Schwaber Analyst Forrester Research


  • In Wal-Mart’s case, that’s 107 million credit card validations in less than one second each
  • Several surveys have shown that over one third (35% and 39% respectively) of companies surveyed estimate the cost of one hour of downtime at over $100K
  • As development shops mature their performance practices, they pay more attention to performance earlier in the life cycle
  • A US health insurance company has begun heading in this direction after discovering that architects, service providers, and DBAs weren’t coordinating their component-level performance targets to ensure that they all added up. This necessitated significant rework late in the project life cycle, ultimately causing missed release dates and cost overruns. In a performance-driven development organization, the design team maps out a strategy for meeting performance requirements.

    1. Software Performance And The Development Life Cycle. Carey Schwaber, Analyst, Forrester Research. February 14, 2005. Call in at 12:55 p.m. Eastern Time.
    2. Software performance
      • The speed at which software functions (sub-second response time versus three-second response time)
      • How often software is available (99% uptime versus scheduled weekly downtime)
      • How these two factors change as usage levels increase (from dozens of concurrent users to thousands)
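The first of these dimensions, response time, can be checked with a simple measurement harness. A minimal sketch in Python; the stub operation, sample count, and percentile choice are all illustrative assumptions, not from the slides:

```python
# Hedged sketch: measuring response time for a stand-in operation.
# A real load test would drive many concurrent users; this only shows
# the basic measurement. All names and figures are illustrative.
import time

def get_quote():
    # Stand-in for a real operation, e.g. computing an insurance quote.
    time.sleep(0.01)
    return 42.0

samples = []
for _ in range(20):
    start = time.perf_counter()
    get_quote()
    samples.append(time.perf_counter() - start)

samples.sort()
p95 = samples[int(0.95 * len(samples)) - 1]
print(f"p95 response time: {p95*1000:.1f} ms")  # sub-second for this stub
```

The same harness, run at increasing concurrency, would surface the third dimension: how response time changes as usage levels increase.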
    3. Load testing is necessary but not sufficient [Diagram: load testing shown as only one known piece among several unknowns]
    4. Theme: A performance-driven development life cycle helps IT shops more effectively and efficiently meet business performance requirements
    5. Agenda
      • Software performance matters
      • Performance verification ensures software performance meets business needs
      • But performance-driven development drives down the cost of meeting these needs
      • How to move to performance-driven development
    6. Agenda
      • Software performance matters
      • Performance verification ensures software performance meets business needs
      • But performance-driven development drives down the cost of meeting these needs
      • How to move to performance-driven development
    7. Software performance matters to the business
      • Software performance is a limiting factor of business performance
          • A financial services company can’t provide a life insurance quote to a customer any faster than its quote engine can compute the appropriate rate
          • An automotive supplier can’t start building parts if the software that delivers orders from its OEM customers has been taken down by unanticipated usage levels
          • A retailer can’t process orders any faster than it can process credit card validations
    8. But development shops neglect performance, and firefighting wastes time, money, and goodwill
      • IT loses credibility when performance problems disrupt business operations
          • Problems with in-store app performance earn a retail CIO the ire of his fellow executives
      • Postponing problem resolution until late in the life cycle is less cost-effective by at least an order of magnitude
          • A US health insurance company calculates several million dollars of avoidable costs every year
      • Millions of dollars are wasted on unnecessary hardware
          • A global pharmaceutical company purchases several times the hardware it needs
      • Funds are diverted from strategic initiatives to ongoing maintenance and operations
          • A telecom spent $29 million on support calls in six months when its EAI infrastructure experienced 15-second timeouts
    9. The answer? A performance-driven development life cycle [Chart: maturity progresses from firefighting through performance verification to performance-driven development, pushing attention earlier across the stages of the software development life cycle: requirements, design, development, testing, production]
    10. Agenda
      • Software performance matters
      • Performance verification ensures software performance meets business needs
      • But performance-driven development drives down the cost of meeting these needs
      • How to move to performance-driven development
    11. Why performance verification? The view from ops
      • Lost revenue: a function of the % of irrecoverable business, the duration of the outage, and average revenue per hour
      • IT costs: problem identification, analysis and resolution, testing, cost of data recovery, cost of external support
      • Productivity costs: a function of the number of people affected, average % of lost productivity, average cost per employee, duration of outage, and average overtime costs
      • Lost revenue + IT costs + productivity costs = real costs
      • Compare: costs if resolved in development versus costs if resolved in production
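The factor lists above can be combined into a back-of-the-envelope outage-cost estimate. A hedged sketch in Python; the multiplicative structure and every input figure are assumptions for illustration, not Forrester's actual model:

```python
# Hedged sketch: one plausible way to combine the slide's outage-cost factors.
# The formulas and all figures below are illustrative assumptions.

def outage_cost(duration_hours, revenue_per_hour, pct_irrecoverable,
                people_affected, pct_lost_productivity, loaded_cost_per_hour,
                it_resolution_cost):
    """Estimate the real cost of a production outage."""
    lost_revenue = duration_hours * revenue_per_hour * pct_irrecoverable
    productivity = (people_affected * pct_lost_productivity
                    * loaded_cost_per_hour * duration_hours)
    return lost_revenue + productivity + it_resolution_cost

# Example: a 2-hour outage at a mid-size shop (all inputs are made up).
cost = outage_cost(duration_hours=2, revenue_per_hour=50_000,
                   pct_irrecoverable=0.35, people_affected=200,
                   pct_lost_productivity=0.5, loaded_cost_per_hour=60,
                   it_resolution_cost=25_000)
print(f"${cost:,.0f}")  # $72,000
```

Running the same inputs with the (much smaller) cost of fixing the defect in development makes the slide's comparison concrete.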
    12. Business and IT partner to set performance requirements
      • Look first at high-level business needs
          • New functionality on an office supply eCommerce site can’t degrade page load time by more than 10%
      • Right-size expectations by talking tradeoffs and chargebacks
          • A UK life insurance company has found that building service levels into chargebacks results in more accurate performance requirements
          • Pitney Bowes helps business stakeholders understand necessary tradeoffs by asking questions like, “Are you willing to pay an additional $10,000 for a third failover site?”
      • The production environment is itself a set of requirements
          • A US grocery chain traces the majority of its performance problems back to improperly understood production constraints
    13. Get more bang for your performance testing buck
      • Cut the cost of replication by centralizing the performance test lab
          • One $30B+ firm spent $11 million on testing and staging environments
          • Adding virtualization to the test lab’s tool kit drives this cost down
          • A team at CSFB spent approximately $300,000 on performance modeling tools instead
      • Improve and share performance testing services with a test center of excellence
          • Experienced performance testers command salaries ranging from $75,000 to $150,000
          • Software necessary for testing 3,000 concurrent users costs $300,000
      • Offshore performance test script creation to keep staff focused on optimization
          • Performance testing is too iterative to offshore entirely
          • A Midwest retailer has TCS conduct capacity planning, design for performance, performance optimization, and tuning on-site but creates and executes test scripts offshore
      • Enable data and asset sharing by using testing tools that integrate with other life-cycle tools
          • Testing with monitoring, or testing and monitoring with development
    14. Performance verification isn’t enough
      • Resolving performance problems during testing still takes more time and money than preventing them from being inserted in the first place
          • Resolving problems earlier in the life cycle is less expensive by several orders of magnitude, though your mileage will vary
      • For most companies, the majority of performance problems are rooted in design decisions
          • Once performance testing gets underway, it’s too late in the game to make significant changes to an application’s architecture
      • The time and money performance testing takes varies widely and is difficult to predict
          • Problems found in performance testing are a common cause of missed release dates and budget overruns
          • Some outsourcers avoid performance testing engagements because their indeterminate length leads to tension with clients
      • Shops that stop treating design and development as a black box, and adopt practices that prevent performance defects from ever being inserted, increase both the efficiency and the predictability of their development efforts
    15. Agenda
      • Software performance matters
      • Performance verification ensures software performance meets business needs
      • But performance-driven development drives down the cost of meeting these needs
      • How to move to performance-driven development
    16. Design maps out a strategy for meeting performance requirements
      • Define performance objectives for application components the team owns
          • Component-level performance objectives serve as pass/fail criteria for developers’ early performance testing efforts
      • Secure performance contracts for components maintained by other groups
          • A major grocery chain has an “ITIL guy” to help IT better define performance contracts for internal services and get more “givens”
      • Model application performance to validate design decisions
          • A financial services firm estimates that using HyPerformix to model app performance and identify potential bottlenecks during design has saved it $15 million, far more than its $300,000 investment
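Design-time performance modeling need not start with a commercial tool; even a toy queueing model can flag an infeasible design early. A minimal sketch using the standard M/M/1 mean response time formula W = 1/(mu - lambda); the rates and the target are illustrative assumptions, not anything from the slides:

```python
# Hedged sketch: a toy M/M/1 queueing model for design-time sanity checks.
# Commercial modeling tools (e.g. HyPerformix, per the slide) are far richer;
# this only illustrates validating a design against a response-time target.

def mm1_response_time(arrival_rate, service_rate):
    """Mean response time (waiting + service) of an M/M/1 queue, in seconds."""
    if arrival_rate >= service_rate:
        raise ValueError("system is unstable: arrivals exceed capacity")
    return 1.0 / (service_rate - arrival_rate)

# Can a server with capacity for 80 req/s meet a 100 ms target at 70 req/s
# of expected load? (All rates here are made up.)
w = mm1_response_time(arrival_rate=70.0, service_rate=80.0)
print(f"mean response time: {w*1000:.0f} ms")  # prints 100 ms
```

The same check at, say, 79 req/s shows response time ballooning toward a full second, the kind of bottleneck a design review wants to catch before a line of code is written.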
    17. Developers anticipate test and monitoring
      • Check their work by testing early, often, and automatically
          • A $15 billion retailer finds that when developers test component-level performance, there are fewer problems during final performance testing (what remains are mostly integration issues)
          • ThoughtWorks, a systems integrator, performs continuous performance testing
      • Minimize the time to problem resolution by instrumenting code for test and monitoring
          • A multi-channel retailer expects that adding JMX instrumentation to its corporate applications will dramatically decrease the time it takes to resolve performance problems
          • Instrumentation plugs into management consoles from vendors like HP
          • IDE plug-ins make it easier for developers to instrument their code
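The instrumentation idea generalizes beyond JMX. As a language-neutral illustration, here is a Python decorator that records per-call latency for later inspection by tests or a monitor; the function name and all details are hypothetical, not from the slides:

```python
# Hedged sketch: lightweight timing instrumentation a developer might add so
# that tests and monitoring can see per-operation latency. JMX (mentioned in
# the slide) is the Java equivalent; this decorator is only an illustration.
import functools
import time
from collections import defaultdict

LATENCIES = defaultdict(list)  # operation name -> list of durations (seconds)

def instrumented(fn):
    """Record the wall-clock duration of every call under the function's name."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            LATENCIES[fn.__name__].append(time.perf_counter() - start)
    return wrapper

@instrumented
def validate_card(number):
    # Stand-in for real work, e.g. a credit card validation call.
    return len(number) == 16

validate_card("4111111111111111")
print(len(LATENCIES["validate_card"]))  # 1 sample recorded
```

A monitoring agent or a performance test can then read `LATENCIES` (or, in production, export it) instead of reverse-engineering timings from logs, which is what shortens time to problem resolution.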
    18. Sample economics of a move to performance-driven development (100 defects; base cost x = $100; cost of problem resolution by life-cycle stage: requirements 1x, design 2x, development 10x, test 50x, production 100x)
      • Firefighting: 0% resolved in requirements, design, development, and test; 100% resolved in production ($1,000,000). Total: $1,000,000
      • Performance verification: 10% resolved in requirements ($1,000), 0% in design and development, 60% in test ($300,000), 30% in production ($300,000). Total: $601,000
      • Performance-driven development: 10% resolved in requirements ($1,000), 40% in design ($8,000), 25% in development ($25,000), 20% in test ($100,000), 5% in production ($50,000). Total: $184,000
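The slide's arithmetic is easy to reproduce. A short Python sketch of the stated model (100 defects, x = $100, the listed multipliers and resolution mixes):

```python
# Reproducing the slide's defect-economics arithmetic:
# 100 defects, base cost x = $100, per-stage cost multipliers as stated.
STAGES = ["requirements", "design", "development", "test", "production"]
MULTIPLIER = {"requirements": 1, "design": 2, "development": 10,
              "test": 50, "production": 100}
X = 100  # dollars to resolve one defect at the requirements stage

def total_cost(mix, defects=100):
    """Total resolution cost given the fraction of defects caught per stage."""
    return round(sum(defects * mix[s] * MULTIPLIER[s] * X for s in STAGES))

firefighting = {"requirements": 0, "design": 0, "development": 0,
                "test": 0, "production": 1.0}
verification = {"requirements": 0.10, "design": 0, "development": 0,
                "test": 0.60, "production": 0.30}
driven = {"requirements": 0.10, "design": 0.40, "development": 0.25,
          "test": 0.20, "production": 0.05}

for name, mix in [("firefighting", firefighting),
                  ("performance verification", verification),
                  ("performance-driven development", driven)]:
    print(f"{name}: ${total_cost(mix):,}")
# firefighting: $1,000,000
# performance verification: $601,000
# performance-driven development: $184,000
```

Note that moving from verification to performance-driven development cuts total cost by more than two thirds even though both mixes catch 10% of defects in requirements; the savings come from shifting resolution out of test and production.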
    19. Agenda
      • Software performance matters
      • Performance verification ensures software performance meets business needs
      • But performance-driven development drives down the cost of meeting these needs
      • How to get to performance-driven development
    20. Three ways to get to performance-driven development
      • Management decree
          • Motivated by pressure from the business or pressure on the bottom line
      • IT operations imposition of release criteria
          • Production refuses to accept any more apps that haven’t been subjected to rigorous performance testing
          • The development organization suffers from cost overruns and missed release dates until it begins to adopt performance practices earlier in the life cycle
      • Shared service model
          • A team with expertise in some or all of the activities in performance-driven development grows in proportion to its ability to help internal customers cut costs, cut time-to-market, and improve their own customer satisfaction levels
      • These paths are not mutually exclusive
      • The move to performance-driven development is incremental
    21. Where should performance live on the org chart?
      • Pros and cons to every arrangement
      • For performance verification:
          • Quality assurance
          • IT operations
      • For performance-driven development:
          • Architecture
          • Application-specific, cross-functional team
    22. Vendor offerings [Chart: vendors mapped across life-cycle stages (requirements, design, development, testing, monitoring), showing tool integration that enables data or asset sharing; vendors shown include Avicode, Borland, Compuware, Empirix, HP, HyPerformix, IBM, Mercury, Microsoft, and OpNet]
      • * Vendors with support for performance-driven development activities within their design and development products
      • † Through its pending purchase of Segue Software, Borland gains performance testing and monitoring tools
      • ‡ Mercury OEMs HyPerformix technology for sale as part of Mercury Performance Center
    23. Thank you. Carey Schwaber, +1 617/613-6260, [email_address]
    24. Selected bibliography
      • In production: March 2006, Best Practices “Performance-Driven Development”
      • July 12, 2005, Trends “Applying Process Control Principles To Application Performance Management”
      • May 16, 2005, Trends “Software Quality Is Everybody’s Business”
      • February 11, 2005, Best Practices “Performance Management And The Application Life Cycle”