PAC 2020 Santorini - Ankur Jain
1. PERFORMANCE IS NOT A MYTH
P E R F O R M A N C E A D V I S O R Y C O U N C I L
SANTORINI, GREECE
FEBRUARY 26 - 27, 2020
Measure User Perceived Time In Production Without APM Tool
Ankur Jain
2.
Speaker Intro
I am Ankur Jain, based in Pune, India.
5+ years of experience in performance testing and engineering. Currently working in the storage and backup domain for a SaaS (AWS-based) company, taking care of backup performance, capacity planning, and Python/Go code-level performance bottleneck identification and tuning.
Keen interest in tuning client-side/UI performance of web pages.
Besides doing nerdy things, I like playing badminton and cricket and love travelling.
Email: ankur.jain9292@gmail.com
Linkedin: https://www.linkedin.com/in/ankur9292/
3.
Problem Statement
• Getting the user-perceived time of reactive applications / single-page applications (SPAs)
• Monitoring real users in production
• Automated
• Standard across browsers
• Cost-effective
• In-house tool
4.
Performance Is Not A Myth
• How important is speed to users?
• UI performance is as significant as server-side performance
• User experience is directly proportional to revenue and customer loyalty

Source: Speed Matters, Vol. 3
5.
The Number Game
• Consumers that will pay more for a better customer experience
• Consumers that began doing business with a competitor following a poor customer experience
• Increase in customer loyalty
• Increase in revenue

Sources: Oracle, 2011 Customer Experience Impact Report; Dimension Data, 2017 Global CX Benchmark Report
6.
Ways of Monitoring UI Performance
Synthetic Monitoring
• Controlled pre-production environment
• Run tests at scheduled intervals
• Early detection of performance issues
• Benchmark against competitors
• Monitor third-party services

Real User Monitoring (RUM)
• Real-world environments
• Gather real data from different devices, browsers, geo-locations, etc.
• Correlate gathered data with business key performance indicators (KPIs) and user engagement

• Both types of monitoring are required for a 360-degree view of the application.
7.
Why Is It Difficult To Monitor SPAs?
• SPAs rely heavily on JS, CSS, DHTML and AJAX
• Single onload event with multiple page-rewrite events
• Traditional metrics like window.onload, first paint, time to DOM content ready, etc. are not accurate and are sometimes misleading
• Speed Index and above-the-fold render time are not good user-perceived indicators and are not feasible for RUM
8.
What Can Be Done? - The Solution!
• The solution should be:
  • Simple
  • Accurate
  • Easy to implement for RUM
• The answer is building custom metrics with the User Timing API
• Load the webpage with one-line JS calls to capture key UI execution milestones
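For RUM, the measured value also has to leave the browser. A minimal sketch of the idea: bracket the user-perceived load with marks, measure, and beacon the result out. The mark names and the "/rum" collection endpoint are illustrative assumptions, not from the deck.

```javascript
// Sketch: record a custom user-perceived metric and ship it to a RUM
// backend. Metric names and the "/rum" endpoint are hypothetical.
performance.mark("appStart");      // e.g. placed at the top of the bundle
// ... app boots, key content renders ...
performance.mark("heroVisible");   // placed where key content appears

performance.measure("userPerceivedLoad", "appStart", "heroVisible");

const [metric] = performance.getEntriesByName("userPerceivedLoad");
const payload = JSON.stringify({ name: metric.name, duration: metric.duration });

// In a browser, beacon the payload out without blocking navigation.
// Guarded so the sketch also runs outside a browser (e.g. Node.js).
if (typeof navigator !== "undefined" && navigator.sendBeacon) {
  navigator.sendBeacon("/rum", payload); // hypothetical endpoint
} else {
  console.log("would send:", payload);
}
```

`sendBeacon` is used here because it queues the POST even during page unload, which a plain `fetch` may not survive.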
9.
User Timing API - Methods
• performance.mark("MarkName")
• performance.measure("MeasureName", "StartMarkName", "EndMarkName")
• performance.clearMarks("MarkName")
• performance.clearMeasures("MeasureName")
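A minimal sketch of the four methods in sequence (mark names are illustrative). It runs in any modern browser console; in Node.js 16+ `performance` is a global as well.

```javascript
// Drop named timestamps on the performance timeline.
performance.mark("searchStart");
// ... application work happens here (e.g. rendering search results) ...
performance.mark("searchRendered");

// Create a named duration between the two marks.
performance.measure("searchTime", "searchStart", "searchRendered");

const [entry] = performance.getEntriesByName("searchTime");
console.log(entry.name, entry.duration); // duration is in milliseconds

// Clean up so repeated user actions don't accumulate stale entries.
performance.clearMarks("searchStart");
performance.clearMarks("searchRendered");
performance.clearMeasures("searchTime");
```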
10.
User Timing Implementations
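The original slide shows implementation screenshots. As a hedged sketch of the same idea, this is how the marks might bracket a SPA route change; all function and mark names are assumptions, not from the deck.

```javascript
// Hypothetical SPA instrumentation: user-perceived time for a route
// change, from the triggering click to the new view being rendered.
function renderNewView() {
  // Placeholder for real fetch + render work.
  for (let i = 0; i < 1e6; i++);
}

function onNavClick() {
  performance.mark("routeChangeStart");  // user intent begins here
  renderNewView();                       // framework renders the view
  performance.mark("routeChangeDone");   // key content is visible
  performance.measure("routeChange", "routeChangeStart", "routeChangeDone");
}

onNavClick();
const [routeEntry] = performance.getEntriesByName("routeChange");
console.log(`route change took ${routeEntry.duration.toFixed(1)} ms`);
```

In a real SPA the closing mark would fire asynchronously, e.g. from a mounted/useEffect hook on the last critical component; it is synchronous here for brevity.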
11.
Retrieving Results
• performance.getEntriesByType()
  performance.getEntriesByType("mark")
  performance.getEntriesByType("measure")
• performance.getEntriesByName()
  performance.getEntriesByName("MarkName")
  performance.getEntriesByName("MeasureName")
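A short sketch of both retrieval styles (entry names are illustrative); note the entry-type strings passed to getEntriesByType are lowercase.

```javascript
// Record a couple of entries, then read them back both ways.
performance.mark("step1");
performance.mark("step2");
performance.measure("step1to2", "step1", "step2");

// By type: all marks, or all measures, recorded so far.
const marks = performance.getEntriesByType("mark");
const measures = performance.getEntriesByType("measure");

// By name: returns an array, since a name can be recorded repeatedly.
const [step] = performance.getEntriesByName("step1to2");

console.log(marks.map(m => m.name));
console.log(step.duration);
```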
12.
User Timing - Supported Browsers

Method        | Chrome | Edge | Firefox | IE | Opera | Safari | Android WebView | Chrome for Android | Firefox for Android | Opera for Android | Safari on iOS | Samsung Internet
Mark          | 28     | 12   | 41      | 10 | 33    | 11     | Yes             | 28                 | 42                  | 33                | 11            | 1.5
Measure       | 28     | 12   | 41      | 10 | 33    | 11     | 46              | 28                 | 42                  | 33                | 11            | 1.5
clearMarks    | 29     | 12   | 41      | 10 | 33    | 11     | Yes             | 29                 | 42                  | 33                | 11            | 2.0
clearMeasures | 29     | 12   | 41      | 10 | 33    | 11     | Yes             | 29                 | 42                  | 33                | 11            | 2.0
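Given the uneven support on older browsers in the table above, one common pattern (a sketch, not from the deck) is to guard each call so instrumentation degrades silently where the API is missing:

```javascript
// Feature-detect User Timing once, then wrap the calls.
const hasUserTiming =
  typeof performance !== "undefined" &&
  typeof performance.mark === "function" &&
  typeof performance.measure === "function";

function markSafe(name) {
  if (hasUserTiming) performance.mark(name);
}

function measureSafe(name, start, end) {
  if (hasUserTiming) performance.measure(name, start, end);
}

markSafe("guardStart");
markSafe("guardEnd");
measureSafe("guarded", "guardStart", "guardEnd");
console.log("User Timing available:", hasUserTiming);
```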
13.
Limitations
• Webpage code changes are required
• Thorough knowledge of the app from the user's perspective is required
14.
Live Examples
15.
Demo