SCQAA-SF Meeting on May 21 2014
Transcript of "SCQAA-SF Meeting on May 21 2014 "

  1. May 22, 2014
  2. Attendee Photo
  3. • The SCQAA-SF (www.scqaa.net) chapter sponsors the sharing of information to promote and encourage the improvement of information technology quality practices and principles through networking, training and professional development. • Networking: we meet every two months in the San Fernando Valley. • Check us out on LinkedIn (SCQAA-SF). • Contact Sujit at sujit58@gmail.com or call 818-878-0834.
  4. • Excellent speaker presentations on advancements in technology and methodology • Networking opportunities • PDU, CSTE and CSQA credits • Regular meetings are free for members and include dinner
  5. • We recently revised our membership dues policy to better accommodate member needs and current economic conditions. • Annual membership is $50, or $35 for those who are between jobs. • Please check your renewal status with Cheryl Leoni. If you have recently joined or renewed, please check before renewing again.
  6. Prabhu Meruga, Director - Solution Engineering. 21st May, SCQAA – San Fernando, CA
  7. Agenda: • Basic complexity • Why performance? • Performance failure statistics • Myths of performance testing • Span of performance testing • Application performance factors • End-user experience • Cost comparison analysis • Process improvements in the performance life cycle • Performance metrics: prioritize what's needed • Case study
  8. Data centers. Firewall operations. Network. Enterprise technology partners. Multiple channels of access. Enterprise integration. Statistics: 1 billion smartphones shipped in 2013; 50% of internet users are on mobile; 80% of mobile time is spent in apps; mobile web adoption is growing 8 times faster. Statistics source: Digitalbu
  9. Infrastructure evolution: NFV. SDN. Cloud. 3G/4G data transmission. User experience: customer first. Ease of use. Multiple channels. Responsiveness. Transactions. Global. Speed. Technology evolution: front end. Back end. Middleware.
  10. • UK businesses could lose up to £36.7 billion in revenue per year. Source: Microfocus • Annual loss of 1.6 million hours of downtime each year across North America. Source: CA Technologies • A single outage can cost up to USD 300,000 an hour, certainly not an amount to be taken lightly. Source: Emerson Network Power • 60% of enterprises overestimate their site's capacity to handle user traffic; 98% of online retailers thought a 2-second response time was desirable. Source: news.cison.com • Application performance failures account for 73% of all failures in IT infrastructure today. Source: eginnovations • Comair airlines cancelled over 1,000 flights on Christmas Day after its computer system for reservations crashed. Source: internetnews.com
  11. 11. Technical • Load tests equal performance, scalability and sizing tests • Load tests provide reliable performance information • The right load test tool will do everything for me • User experience is driven by server response time Process and Commercials • Performance/Load Testing needs complex planning and scheduling • Performance/Load Testing is limited to applications and not infrastructure • Performance testing tools are license based and implementation is costly • Open Source performance testing tools are not scalable and robust
  12. Traditional performance/load testing scope versus the current trend: effective performance/load testing built on end-user experience. Why is network performance important? What is the role of the end user and their experience of the application? Does this mean increased effort, scope and complexity?
  13. End-user experience: • Client-side processing (platform, browser) • Network variants (LAN, WAN, WiFi) • Heterogeneous channels. Workload growth: • User population • Database changes • Application population • Transaction complexity. Hardware resource consumption: • CPU consumption • Memory allocation • Disk I/O subsystem • Network hardware. Architectural design strategies: • Component allocation • Logical packaging • Physical deployment • Component instancing • Optimized database access.
  14. End User Performance Testing & Monitoring elements. Physical, virtual & mobile device performance: • Storage & event log • Hung processes • App crashes • Operating system • Login profiling • Geographical origin. Application performance: • Latency • Response time • Throughput • Broken links • Successful transactions • Failed transactions. User productivity: • Application/module-wise usage statistics • Usage trail from login to logout • Transaction execution time • Time spent on web page.
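The application-performance metrics listed above (latency, response time, throughput) can all be derived from timing a single request and its payload size. A minimal Python sketch, assuming a reachable URL; the `derive_metrics` helper and its field names are illustrative, not from the deck:

```python
import time
import urllib.request

def derive_metrics(bytes_received, elapsed_seconds):
    """Turn raw timing data into response time and throughput (KB/s)."""
    return {
        "response_time_s": elapsed_seconds,
        "throughput_kbps": (bytes_received / 1024) / elapsed_seconds,
    }

def measure_request(url):
    """Time one HTTP GET; return the status code plus the derived metrics."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return resp.status, derive_metrics(len(body), time.perf_counter() - start)
```

Real monitoring agents would add the failure counters and broken-link checks from the slide; this only covers the timing side.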
  15. 15. Source: Compuware
  16. Client-side statistics: application statistics, location origin, source hygiene check (PC, laptop, mobile device), configuration pre-checks, transactions/second. Network statistics: latency, firewall hops, data transfer rate, data center hops, network infrastructure performance (switches, routers etc.), bandwidth, connections per second, maximum concurrent connections. Server-side statistics: transactions/second, active sessions, log archive, open vs. ended sessions, memory leaks vs. usage, DB/app server performance. Meaningful analysis of metrics is "analytics".
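Turning the raw server-side numbers above into the "analytics" the slide mentions is mostly aggregation. A hedged sketch of two common computations, transactions per second and a nearest-rank percentile of response times (function names are my own, not from the deck):

```python
def transactions_per_second(timestamps):
    """Group transaction completion times (epoch seconds) into per-second counts."""
    counts = {}
    for t in timestamps:
        counts[int(t)] = counts.get(int(t), 0) + 1
    return counts

def percentile(latencies, p):
    """Nearest-rank percentile, a common way to summarise response times."""
    ordered = sorted(latencies)
    rank = max(0, round(p / 100 * len(ordered)) - 1)
    return ordered[rank]
```

In practice these would run over parsed access-log entries; percentiles (p95, p99) are usually more informative than averages for response times.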
  17. Performance bottlenecks survey by Oracle. Key elements to focus on in network performance testing: • Routers • Switches • NFV (network function virtualization) • Firewalls • Load balancers
  18. Planning, tool selection, test infrastructure setup, scripting & execution. • Planning is dependent on skill set – this can be an optimal exercise and can be controlled. • Tool selection drives the entire test scripting, execution and end-result reporting; options are available, but the one-time selection is important. • Paradigm shift in test infrastructure setup – options are available here too, including open source. • Scripting & execution depend on prerequisites such as tool selection and ease of use.
  19. Potential benefits
  20. Traditional performance testing cycle and activities: performance test planning (1–2 weeks), dedicated performance test environment setup (4–6 weeks), test scripts creation (2–4 weeks), test scripts execution & results baseline (1–2 weeks). Cloud-based performance testing cycle model: performance test planning (1–2 weeks), performance test environment setup on cloud (1 week), test scripts creation (2–3 weeks), test scripts execution & results baseline (1 week). 3–5 week effort savings realized through the cloud-based performance testing infrastructure model.
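The effort saving claimed on this slide follows from summing the phase ranges: 8–14 weeks for the traditional cycle versus 5–7 weeks on cloud. A small sketch that totals them (phase names are abbreviated from the slide):

```python
# Phase durations in weeks as (low, high), taken from the slide.
traditional = {"planning": (1, 2), "environment": (4, 6),
               "scripting": (2, 4), "execution": (1, 2)}
cloud = {"planning": (1, 2), "environment": (1, 1),
         "scripting": (2, 3), "execution": (1, 1)}

def total_weeks(phases):
    """Sum the low and high ends of every phase's duration range."""
    low = sum(lo for lo, hi in phases.values())
    high = sum(hi for lo, hi in phases.values())
    return low, high
```

The environment-setup phase accounts for most of the difference, which is the slide's point: cloud infrastructure removes the long dedicated-hardware procurement step.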
  21. Load Generation over the cloud
  22. Performance Engineering: passive monitoring, active monitoring, performance testing, performance results, tuning recommendations, network simulation, predictive analytics.
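Active monitoring, as distinct from the passive variety listed above, means issuing synthetic checks on a schedule rather than observing real traffic. A minimal sketch of that idea; the URL, interval and check count are placeholders, and real monitors would also alert and persist results:

```python
import time
import urllib.request

def probe(url, timeout=5):
    """One synthetic check: returns (is_up, latency_seconds)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            ok = 200 <= resp.status < 400
    except OSError:
        return False, None
    return ok, time.perf_counter() - start

def monitor(url, interval=60, checks=5):
    """Active monitoring loop: probe on a fixed schedule, collect results."""
    results = []
    for _ in range(checks):
        results.append(probe(url))
        time.sleep(interval)
    return results
```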
  23. For application performance testing: • JMeter • OpenSTA • WebLOAD • The Grinder • Multi-Mechanize • Selenium • Capybara • Pylot • Webrat • Windmill • www.apicasystems.com • Locust.io. For network simulation testing: • ns (open source) • OPNET (proprietary software) • NetSim (proprietary software) • Shunra (proprietary). For end-user experience testing: • Open Web Analytics • PIWIK • Google Page Speed Module • Site Speed • CSS Corp PROBLR • New Relic Lite
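At their core, load tools like JMeter and Locust run concurrent virtual users that issue timed requests. A toy sketch of that loop in Python; the URL and user counts are placeholders, and real tools add ramp-up, think time, assertions and reporting on top:

```python
import threading
import time
import urllib.request

def load_test(url, users=5, requests_per_user=3):
    """Tiny load generator: one thread per virtual user, timing each request."""
    latencies, lock = [], threading.Lock()

    def virtual_user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            try:
                urllib.request.urlopen(url, timeout=10).read()
            except OSError:
                continue  # only successful transactions are timed
            with lock:
                latencies.append(time.perf_counter() - start)

    threads = [threading.Thread(target=virtual_user) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return latencies

def summarize(latencies):
    """Collapse raw latencies into the headline numbers a report would show."""
    if not latencies:
        return {"count": 0, "avg_s": 0.0}
    return {"count": len(latencies), "avg_s": sum(latencies) / len(latencies)}
```

Threads are enough for a sketch; at real scale, tools use event loops or distributed load generators, which is where the cloud model from the earlier slides comes in.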
  24. • Background: pre-release performance testing for a portal toolkit that expedites and standardizes the process of developing customized Internet portals. Developed for several geographical regions including Central and Eastern Europe, Middle East and Africa, and hosted in the UK. Developed by one of the top 5 outsourcing vendors; single-instance application running in multiple locations. • Challenges: 100% availability and scalability requirements; improve service uptime and QoS; optimize application availability & performance. • Value addition: scaled the system from 20 to 500 users; reduced CPU utilization to allow for growth; system architecture for growth planning. • Performance engineering results – recommendations: regular expression mismatch – rewrite; fix serialization; implement bind variables. Tuning activities: created function-based indexes; tuned resource-crunching SQL queries; reconfigured instance-level parameters; addressed wait events. (Runs 1, 2 and 3 follow.)
  25. ABC Bank Online: Run 1 – app server observations. • Scalable to 20 users • CPU utilization • Application profiled • Regular expressions consuming the most CPU time. [Chart: Load Size and Throughput (KBps) vs. Elapsed Time (hh:mm), with CPU Utilization (App) and CPU Utilization (DB) overlaid.] Analysis after Run 1.
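Run 1's finding, regular expressions dominating CPU time, is a common profiling result. One cheap mitigation is compiling a pattern once instead of rebuilding it per call; the account-number pattern below is invented for illustration, and the case study's actual fix was rewriting mismatching patterns rather than just caching them:

```python
import re

# Naive: re-states the pattern on every call; in many languages this means
# recompiling it each time, burning CPU under load.
def is_valid_account_naive(value):
    return re.match(r"^[A-Z]{3}-\d{6}$", value) is not None

# Better: compile once at module load and reuse the pattern object.
ACCOUNT_RE = re.compile(r"^[A-Z]{3}-\d{6}$")

def is_valid_account(value):
    return ACCOUNT_RE.match(value) is not None
```

Python's `re` module does cache recently compiled patterns internally, so the bigger wins usually come from simplifying patterns prone to backtracking or replacing regexes with plain string checks, which is closer to the "rewrite" recommendation in this case study.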
  26. ABC Bank Online: Run 2 – DB server observations. • Scalable to 140 users • CPU utilization trends • DB I/O • Query costs. [Chart: Load Size and Throughput (KBps) vs. Elapsed Time (hh:mm), with CPU Utilization (App) and CPU Utilization (DB) overlaid.] Bottlenecks identified after Run 2: • Bind variable issues – CPU usage (parsing), memory (SQL area) • Indexing – full table scans on indexed columns due to functions • Errant queries with huge buffer gets • Instance-level parameters – DB_Block_Buffers, Shared_Pool, Sort_Area_Size not optimized • High wait events – DB scattered read & DB sequential read.
  27. Run 3 & engagement summary. Recommendations: • Eliminate or reduce the use of regular expressions to free up CPU time • Fix serialization • Tune the database by implementing bind variables and reconfiguring instance-level parameters • Code profiling – Java. Tuning: • Created function-based indexes • Tuned resource-crunching SQL queries • Reconfigured instance-level parameters • Addressed wait events. Benefits: • Scaled the system to 1,000 users • Reduced CPU to allow for growth • Achieved better than the target SLA of 400 KBps throughput.
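The "implementing bind variables" recommendation means parameterized statements: the SQL text stays constant across calls, so the database parses it once and reuses the cached plan instead of hard-parsing a new literal-laden statement every time (the Run 2 symptom of CPU spent on parsing and SQL-area memory pressure). A sketch with Python's sqlite3; the schema and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO accounts (name) VALUES (?)",
                 [("alice",), ("bob",)])

# Bind variable (?): the statement text is identical for every lookup,
# so the engine can reuse the prepared statement. String-formatting the
# value into the SQL would defeat that (and invite SQL injection).
def lookup(name):
    row = conn.execute("SELECT id FROM accounts WHERE name = ?",
                       (name,)).fetchone()
    return row[0] if row else None
```

In Oracle, the database this case study's instance parameters suggest, the same idea applies via `:name` placeholders; the performance benefit there is avoiding hard parses and shared-pool churn.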
  28. 28. ©2014 CSS Corp The information contained herein is subject to change without notice. All other trademarks mentioned herein are the property of their respective owners. Thank You! Want to be invited by SCQAA-SF? Please contact sujit.ghosh@3sgbs.com