
Lightning Talks by Globant - One step into the future of performance testing


In the last 10 years, the technologies and processes used to develop applications have changed tremendously. How have performance tests evolved to address those changes?


  1. Evolving Load and Performance Testing
  2. AGENDA: QE Studio; Load & Performance Practice; Some history; The big changes and how performance testing was affected; Early testing & CI-CD; User experience; JMeter; Frontend (WebPageTest, Sitespeed.io, YSlow, Google tools); APMs
  3. QUALITY ENGINEERING STUDIO (Globant Proprietary | Confidential Information). We provide comprehensive testing services with proven experience on distributed teams. Our flexible working model easily adapts to different customers' methodologies and engagements. We offer the most effective and diversified testing strategies. FACTS: 1200+ engineers, 35+ development centers, 400+ active projects. PRACTICES: Load & Performance Testing, Functional Testing, Mobile Testing, Game Testing, Accessibility Testing, Test Automation. RELEVANT CLIENTS.
  4. LOAD & PERFORMANCE PRACTICE. We are a small team of 25 PTEs (Performance Test Engineers) located in 7 offices across 6 countries: Argentina, Colombia, Uruguay, Mexico, India and Perú. A strong team with great talent to improve software performance. We have the ability and experience to test, validate, suggest and help development teams evolve applications into better-performing solutions, looking to reduce costs and increase reliability during operation.
  5. SOME HISTORY (2003-2017): SPA, new web architectures; Node JS, new HTTP servers; Less, CSS framework; mobile devices, slower networks and limited resources; Web 2.0, the user as a content creator; HTML5, the new HTML standard; Angular JS, frontend framework; React, frontend framework; Vue JS, frontend framework; APMs, deep monitoring; Cloud (AWS), new infrastructure; REST APIs, new architectures; skyrocketing social networks; AJAX, asynchronous web applications; Sass, CSS framework; SCRUM, methodologies with small increments.
  6. BIG CHANGES: deeper monitoring and new infrastructures (Cloud); big frontend frameworks; agile methodologies (Scrum); users with limited connections and resources; more interactive pages. Here is the current challenge. What will come next?
  7. EARLY TESTING. Methodologies: Agile (Scrum/Kanban/Scrumban). Architectures: microservices, REST APIs, lambdas.
  8. CI-CD (JENKINS/BAMBOO/PIPELINE). CI servers bring risk mitigation, increased coverage, automated reports, automated results comparison, reduced overhead and consistency of the build process. A minimal sketch of such a CI step follows this item.
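A minimal sketch of running a performance test from a CI job, assuming JMeter is installed on the build agent and a test plan named perf_smoke.jmx is checked into the repository (the file and folder names are illustrative, not from the slides):

    # Shell step in a Jenkins/Bamboo job: run JMeter in non-GUI mode
    jmeter -n -t perf_smoke.jmx -l results.jtl -e -o perf-report/
    # -n non-GUI mode, -t test plan, -l results file
    # -e/-o generate the HTML dashboard into perf-report/ so it can be archived as a build artifact

Archiving results.jtl on every build is what makes automated results comparison between runs possible.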
  9. CONCURRENCY TESTING. Types: load, stress, soak, spike. Concurrency: response times, throughput, workload model (usage patterns). Server metrics: memory, CPU, hard drives, network, connection pool, threads. Tools: JMeter, LoadRunner. Demo time: example with beerbook (JMeter); a command-line sketch follows this item.
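The beerbook demo itself is not reproduced here; as a hedged illustration, a JMeter plan like it can be parameterized so one script covers load, stress, soak and spike runs. The property names below are assumptions defined inside the test plan, not JMeter built-ins:

    # Drive concurrency from the command line via user-defined properties
    jmeter -n -t beerbook.jmx -Jthreads=100 -Jrampup=60 -Jduration=1800 -l beerbook_results.jtl
    # The Thread Group reads them with ${__P(threads)}, ${__P(rampup)} and ${__P(duration)},
    # so a spike test is just the same plan with a short ramp-up and a high thread count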
  10. USER EXPERIENCE: WHY? IMPACT. Before: response time, concurrent users, throughput, server resources. Now: all of those, plus user experience, usability (time to interact), render time and auto-scaling. "1 second of load lag time would cost Amazon $1.6 billion in sales per year" - Amazon. "A broker could lose $4 million in revenues per millisecond if their electronic trading platform is 5 milliseconds behind the competition." "When load times jump from 1 second to 4 seconds, conversions decline sharply. For every 1 second of improvement, we experience a 2% conversion increase" - Walmart. "In 2000, research by Microsoft placed the average human attention span at 12 seconds. By 2015, the same study found that number had fallen to just 8 seconds" - Microsoft. "Google found an extra 0.5 secs in search page generation time dropped traffic by 20%." - Google. In 1993, Nielsen defined 3 limits: 0.1 sec, reacting instantaneously; 1.0 sec, the limit for the user's flow of thought; 10 secs, the limit for keeping the user's attention - Nielsen Norman Group.
  11. USER EXPERIENCE: MOBILE AND LOAD PHASES (same Before/Now comparison as the previous slide). Mobile: JavaScript processing time for cnn.com is 2.061 sec on desktop, 2.891 sec on a laptop, 3.967 sec on a high-end phone, 13.355 sec on an average phone and 36.284 sec on cheap phones - Addy Osmani (Google engineer). Load phases: First Byte, the first server response; Start Render, the first visual change; Load Time, the beginning of the window load event (onload); Fully Loaded, no network activity after Document Complete.
  12. FRONTEND PERFORMANCE. Tools: WebPageTest: http://www.webpagetest.org/ ; Sitespeed.io: https://www.sitespeed.io/ ; GTmetrix: http://gtmetrix.com ; YSlow: http://yslow.org/ ; Google PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights/ ; Google Lighthouse: https://developers.google.com/web/tools/lighthouse/ . A hedged Lighthouse command-line example follows this item.
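Most of these tools run from a browser, but several also offer a command line that fits into CI. A minimal sketch for Google Lighthouse, assuming Node.js is available on the machine (the output path is illustrative):

    # Install the Lighthouse CLI and audit a page, writing an HTML report
    npm install -g lighthouse
    lighthouse https://www.globant.com/ --output html --output-path ./lighthouse-report.html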
  13. FRONTEND PERFORMANCE: SITESPEED.IO (demo time). Base command: docker run --shm-size=1g --rm -v <dest_folder>:/sitespeed.io sitespeedio/sitespeed.io:7.4.0 https://www.globant.com/ . Some configurations: browsers (firefox, chrome); network profiles (3g, 3gfast, 3gslow, 3gem, 2g, cable, native, custom); number of runs; latency; authentication; proxy. A configured example follows this item.
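A hedged example of applying a few of those configurations to the same command; the short flags (-b browser, -n number of runs, -c connectivity profile) are an assumption to verify against the sitespeed.io 7.4.0 documentation:

    # Five runs in Chrome over an emulated 3g connection, results written to the current folder
    docker run --shm-size=1g --rm -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:7.4.0 \
      -b chrome -n 5 -c 3g https://www.globant.com/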
  14. MONITORING WITH APMs. An APM shows server resource usage, application errors and logs, and internal application response times. Example APM: New Relic, https://newrelic.com/products/application-monitoring . A sketch of attaching an agent follows this item.
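As a sketch of how an APM agent is typically attached, assuming a Java service and the New Relic Java agent downloaded from the vendor (the jar path and service name are illustrative):

    # Start the service with the APM agent attached so internal response times,
    # errors and resource usage are reported to the APM dashboard
    java -javaagent:/opt/newrelic/newrelic.jar -jar my-service.jar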
  15. Thank you! Nov 2018
