What do we mean when we talk about "web performance"? Why should you care about it? How can you measure it? How do you get other people in your organization to care? In this workshop at the 2021 Chrome Dev Summit, I covered these questions – including an overview of the history of performance metrics, up to Core Web Vitals.
13. People in rural areas
People in Indigenous communities
People with lower incomes
Children
Seniors
People with accessibility challenges
People in developing countries
22. “I grew up with our community being under constant boil
water advisories,
and I wasn’t able to safely drink water
out of the tap until just a few years ago.
High-speed internet feels equally life changing.”
Chief Willie Sellars, 2020
23. We can’t fix our networks, but
we can fix our pages.
30. Page jank affects people with
motor skill challenges (esp. on mobile).
Assistive technology (e.g., screenreaders)
may not work until the DOM fully loads.
JavaScript can block assistive tech.
@marcysutton
31. Users aged 65 and older
are 43% slower at using websites than
users aged 21-55.
nngroup.com/articles/usability-for-senior-citizens/
@tameverts
37.
“We want you to be able to flick from one
page to another as quickly as you can flick a
page on a book.
So, we’re really aiming very, very high
here… at something like
100 milliseconds.”
Urs Hölzle
SVP Engineering, Google
39. “web stress”
When apps or sites are slow,
we have to concentrate
up to 50% harder to stay on task.
@tameverts
58. Every 1 second of load time improvement equaled a 2%
conversion rate increase for Walmart
Staples shaved 1 second from median load time, improved
conversion rate by 10%
Fanatics cut median load times by 2 seconds, almost
doubled mobile conversions
73. A threshold YOU create for metrics
that are meaningful for YOUR site
Milestone timings (e.g. Start Render)
Quantity-based (e.g. image weight)
Rules-based (e.g. Lighthouse scores)
74. A good performance budget
should show you…
What your budget is
When you go out of bounds
How long you’re out of bounds
When you’re back within budget
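The budget types and "out of bounds" checks above can be sketched as a small function. The metric names and thresholds here are hypothetical examples, not values from the talk; the point is that milestone, quantity-based, and rules-based budgets all reduce to comparing a measurement against a threshold you chose:

```javascript
// Hypothetical budgets for one page (thresholds are illustrative only).
const budgets = [
  { metric: "startRender", budget: 1500 },                 // milestone timing (ms)
  { metric: "imageWeightKB", budget: 300 },                // quantity-based
  { metric: "lighthousePerf", budget: 80, atLeast: true }, // rules-based score
];

// Return the budgets a set of measurements violates.
// "atLeast" budgets (scores) fail when the value drops BELOW the threshold;
// everything else fails when the value goes ABOVE it.
function outOfBounds(measurements, budgets) {
  return budgets.filter(({ metric, budget, atLeast }) => {
    const value = measurements[metric];
    if (value === undefined) return false;
    return atLeast ? value < budget : value > budget;
  });
}

console.log(
  outOfBounds(
    { startRender: 2100, imageWeightKB: 250, lighthousePerf: 72 },
    budgets
  ).map((b) => b.metric)
); // → ["startRender", "lighthousePerf"]
```

Running a check like this on every build (and again when the numbers recover) gives you the four things a good budget should show: the threshold, when you cross it, for how long, and when you're back.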
77. 2009: Improved average load time from 6s to 1.2s
7-12% increase in conversion rate + 25% increase in PVs
2010: Average load time degraded to 5s
User feedback: “I will not come back to this site again.”
2011: Re-focused on performance
0.4% increase in conversion rate
@tameverts
78. 1. No front-end measurement
2. Constant feature development
3. Badly implemented third-parties
4. Waited too long to tackle problems
5. Relied on performance sprints
6. No way to track regressions
79. 1. Which metrics should I focus on?
2. What should my budget thresholds be?
3. How do I stay on top of them?
84. What tools can we use?
Synthetic (lab)
Consistent baseline
Mimics network & browser conditions
No installation
Compare any sites
Detailed analysis
Waterfall charts
Filmstrips and videos
Limited URLs
Real user monitoring (field)
Requires JavaScript installation
Large sample size (up to 100%)
Real network & browser conditions
Geographic spread
Correlation with other metrics (bounce rate)
No detailed analysis
Only measure your own site
103. Cumulative Layout Shift (CLS)
Score that reflects how much page elements shift during rendering.
Available in Chrome and Chromium-based browsers.
Synthetic & RUM
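In the browser, individual shifts surface as `layout-shift` performance entries (with `startTime`, `value`, and `hadRecentInput` fields), and the page's CLS is aggregated from them using "session windows". A rough, simplified sketch of that aggregation, runnable on hypothetical entry data:

```javascript
// Simplified CLS aggregation sketch (session-window rule):
// shifts less than 1s apart, within a window capped at 5s, are summed;
// the page's CLS is the largest window sum. Shifts flagged with
// hadRecentInput (i.e. right after user input) are excluded.
// Times are in ms, mirroring LayoutShift entry fields.
function computeCLS(entries) {
  let maxScore = 0;
  let windowScore = 0;
  let windowStart = 0;
  let prevTime = -Infinity;
  for (const { startTime, value, hadRecentInput } of entries) {
    if (hadRecentInput) continue; // expected shifts after input don't count
    if (startTime - prevTime >= 1000 || startTime - windowStart >= 5000) {
      windowScore = 0; // a gap of 1s+ (or a 5s-old window) starts a new window
      windowStart = startTime;
    }
    windowScore += value;
    prevTime = startTime;
    maxScore = Math.max(maxScore, windowScore);
  }
  return maxScore;
}
```

This is why a single late shift can dominate the score even when early rendering was stable: each window is scored independently, and only the worst one counts.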
105. Size of the shifting element matters
speedcurve.com/blog/visualising-cls-layout-shifts/
106. Image carousels can generate false positives
speedcurve.com/blog/visualising-cls-layout-shifts/
107. Web fonts & opacity changes can cause issues
speedcurve.com/blog/visualising-cls-layout-shifts/
108. First Input Delay (FID)
Amount of time it takes for the page to respond to user input
(e.g. click, tap, key)
Only measurable via RUM
109. FID can seem fast because user interactions
take place later in the page’s rendering cycle...
after CPU-hogging long tasks have completed.
speedcurve.com/blog/first-input-delay-google-core-web-vitals/
110. No correlation when looking at all sessions
speedcurve.com/blog/first-input-delay-google-core-web-vitals/
111. Stronger correlation at 75th percentile
speedcurve.com/blog/first-input-delay-google-core-web-vitals/
112. Long Tasks
Measures JavaScript functions that take 50ms or longer.
Long or excessive JS tasks can delay rendering,
as well as cause page “jank”.
Measurable across browser types.
Synthetic & RUM
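Long Tasks are simply entries whose duration is 50ms or more. One common way to roll them up into a single number (this is how the related Total Blocking Time metric works, shown here as an illustration with made-up durations) is to sum only the portion of each task beyond the 50ms threshold:

```javascript
// Sum the "blocking" portion of each long task: anything under 50ms is
// considered responsive enough and ignored; longer tasks contribute
// only their excess over 50ms. Durations are in ms.
function totalBlockingTime(taskDurations) {
  return taskDurations
    .filter((d) => d >= 50)
    .reduce((sum, d) => sum + (d - 50), 0);
}

console.log(totalBlockingTime([30, 60, 250])); // → 210 (10 + 200)
```

A page with many 40ms tasks scores zero here, while one 250ms task scores 200, which matches the intuition on the slide: it's the long, main-thread-hogging tasks that delay rendering and cause jank.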
114. Long Tasks have a high correlation to conversions
speedcurve.com/blog/first-input-delay-google-core-web-vitals/
116. Custom metrics
Measure performance with high-precision timestamps
Synthetic & RUM
https://www.w3.org/TR/user-timing/
https://speedcurve.com/blog/user-timing-and-custom-metrics/
117. How long does it
take to display the
main product image
on my site?
118. Time to First Tweet
The time from clicking the link to viewing
the first tweet on each page’s timeline
Pinner Wait Time (PWT)
The time from initiating an action (e.g., tapping a pin) until the
action is complete (pin close-up view is loaded)
Time to Interact (TTI)
120. Lighthouse
Scores based on audits run on synthetic tests.
Checks your page against “rules” for Performance, PWA, Best
Practices, and SEO.
For each category, you get a score out of 100 and
recommendations for what to fix.
developers.google.com/web/tools/lighthouse
132. Goals are aspirational.
How fast do I want to be eventually?
Budgets are pragmatic.
How can I keep my site from getting slower
while I work toward my goals?
141. “The largest hurdle to creating and
maintaining stellar site performance
is the culture
of your organization.”
Lara Hogan
designingforperformance.com
142. “No matter the size or type of team,
it can be a challenge to educate,
incentivize, and empower those around you.
“Performance more often comes down to
a cultural challenge, rather than simply
a technical one.”
Lara Hogan
designingforperformance.com
149. Embrace performance from the ground up.
Embed engineers into other teams.
Enlist performance ambassadors.
Teach people how to use (or at least understand) the
monitoring tools you use.
152. “We first went
to the engineering leaders,
and then we went to
our product leader.
Our pitch was
totally different...”
Reefath Rajali // PayPal
chasingwaterfalls.io/episodes/episode-two-with-reefath-rajali/
153. “When we went to our product leaders,
we spoke more about the business numbers
and the business benefits.
“When we spoke to our engineering leaders,
it was more about our consumer delight.”
Reefath Rajali // PayPal
chasingwaterfalls.io/episodes/episode-two-with-reefath-rajali/
169. Who they are / What they care about / What to show them

Executives
Care about: Competition; Business impact
Show them: Benchmarks (filmstrips and videos); Correlation charts (perf + KPIs)

Marketing
Care about: Third parties; Traffic + engagement; SEO; Content
Show them: Third-party performance; Correlation charts (perf + bounce rate); Lighthouse SEO audits; Image size

Devs / engineers
Care about: Well, lots of stuff, probably
Show them: Consult with perf team

@tameverts
173. “One of the original directives of the
performance team was we weren’t going
to set ourselves up to be performance cops.”
Dan Chilton, Vox Media
responsivewebdesign.com/podcast/vox-media-performance/
174. “We weren’t going to go around slapping people on the
wrist, saying, ‘You built an article that broke the page
size budget! You have to take that down or change that
immediately!’
“Our goal setting out was to set up best practices, make
recommendations, and be a resource within the company
that people can turn to when they have to make
performance-related decisions.”
Dan Chilton, Vox Media
responsivewebdesign.com/podcast/vox-media-performance/
177. “We, as engineers, should
learn how
to show the impact
on anything we do.”
Malek Hakim // Priceline
chasingwaterfalls.io/episodes/episode-one-with-malek-hakim/