My talk from #BrightonSEO 2019, the twentieth edition. Building on my talk from TechSEO Boost 2018, my talk at Brighton explores the changes in #EdgeSEO and the future possibilities given the advent of Akamai EdgeWorkers, AWS Lambda@Edge capabilities and the prospect of Fastly's WASM solution.
Edge SEO means using edge computing technologies to
create new SEO implementation methods, testing, and
research processes outside of the current parameters
in which we operate.
OOTB CDN Benefits
• Speed optimization of content delivery
• Including payload optimization & compressions
• Bandwidth savings
• Content accessibility / uptime reliability
• Security benefits, i.e. WAF / DDoS Mitigation
• Reverse proxying other platforms to subfolders, etc.
• “Edge SEO” stuff
Why is this needed?
• Congested development queues // long lead times
• Lack of “business buy in” to action SEO fixes as a priority
• Platforms with restrictions
• Random Google support changes
• Builds not scoped properly
• Code freezes
Robots.txt Modding    Yes    Yes*    Yes
Redirects             Yes    Yes*    Yes
AB Testing            Yes    Yes*    Yes
Hreflang Injection    Yes    Yes*    Limited
Security Headers      Yes    Yes*    Yes
“Logging”             Yes    Yes*    No Need
                      Yes    Yes*    Yes
Meta Data             Yes    Yes*    Yes
Example worker code / uses
• Implement redirects
• AB testing
• Overwrite hardcoded meta data / HTML elements
• Implement Hreflang
• Pseudo log file collection
• Dynamic meta data
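As an illustration of the first use case, the redirect lookup a worker's fetch handler would perform can be sketched like this (the map contents and function name are mine, not from the talk; a real deployment might keep the map in Workers KV):

```javascript
// Hypothetical redirect map; illustrative paths only.
const REDIRECTS = {
  "/old-page": "/new-page",
  "/old-category/": "/new-category/",
};

// Return the redirect target for a request path, or null when none matches.
// A Cloudflare Worker's fetch handler would call this and respond with
// Response.redirect(target, 301) on a match, otherwise pass the request through.
function getRedirectTarget(pathname, redirects = REDIRECTS) {
  return pathname in redirects ? redirects[pathname] : null;
}
```

The pass-through-on-null shape matters: the worker should only intervene for mapped paths and let everything else reach the origin untouched.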
Pseudo log file collection
Not all platforms allow log file collection…
Sometimes getting hold of logs isn’t easy due to gatekeepers.
• Access to edit robots.txt
• Access to pull pseudo log files
• Customise server headers
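Pseudo log collection amounts to the worker serialising what it can see about each request. A minimal sketch (the field names are illustrative, not a standard schema; a real worker would POST this JSON to a logging endpoint without blocking the response):

```javascript
// Build a pseudo log entry from the request details visible to an edge worker.
// cfConnectingIp stands in for Cloudflare's CF-Connecting-IP header value.
function buildLogEntry({ url, method, userAgent, cfConnectingIp }, now = new Date()) {
  return {
    timestamp: now.toISOString(),
    method,
    url,
    userAgent,
    clientIp: cfConnectingIp,
  };
}
```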
A grey cloud in Cloudflare means your web traffic is not running through Cloudflare’s proxy (DNS only).
• ABCD test URLs by directing traffic via the CDN
• Monitor page level metrics via Google Analytics / Google Data Studio reports
• Requires pages to “be live”
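One way a worker could split ABCD traffic is deterministic hash-based bucketing, so the same visitor always lands in the same variant across requests. A sketch (the identifier source, hash, and bucket names are my assumptions, not from the talk):

```javascript
const BUCKETS = ["A", "B", "C", "D"];

// Assign a visitor to a bucket by hashing a stable identifier,
// e.g. a visitor cookie the worker sets on first request.
function assignBucket(visitorId, buckets = BUCKETS) {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return buckets[hash % buckets.length];
}
```

Because the assignment is a pure function of the identifier, the split needs no server-side state and stays consistent even across edge locations.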
Pages are pre-rendered periodically, rather than on a per request basis.
Can reduce costs for prerendering JS server side, and complies
with Google’s recommended implementation of dynamic
rendering, by switching between client-side and pre-rendered
content for specific user-agents.
Step 1 – Halt, who goes there?
Request Comes In
Step 2 – =IF(”Googlebot|Bingbot…”, P1, P2)
Check Cache For
Client Side Render
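The user-agent branch in Step 2 could be sketched as below (the bot pattern is illustrative and far from exhaustive; production lists are longer and maintained):

```javascript
// Crawler user agents that should receive the prerendered copy.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

// Decide which rendering path a request takes: prerendered for crawlers,
// client-side for everyone else (Google's dynamic rendering model).
function renderingPath(userAgent) {
  return BOT_PATTERN.test(userAgent || "") ? "prerendered" : "client-side";
}
```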
Step 3 – What Happens If No Cache?
1. Wait and return the prerendered page – but monitor how long it
takes; if it takes 7 seconds, that has a negative impact.
2. We could return the page after 1 second of prerendering,
but it would likely be useless.
Worker Solution When Caught Short
We return a 503 (https://httpstatuses.com/503) with a Retry-
After header; this has not been tested in the real world AFAIK, but
the idea would be to tell the crawler to attempt that page again
in 10 seconds.
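A minimal sketch of that (untested) idea, expressed as a plain object so the logic is checkable outside a worker runtime; inside a worker you would build `new Response(null, {...})` from it:

```javascript
// Describe the 503 a worker could return when no prerendered copy is cached:
// ask the crawler to retry the same URL after a short delay.
// The 10-second default mirrors the talk's suggestion.
function retryLaterResponse(retryAfterSeconds = 10) {
  return {
    status: 503,
    headers: { "Retry-After": String(retryAfterSeconds) },
  };
}
```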
Cold Cache Issues
To solve the issue of cold caches, we run a batch job that
checks the cache for stale entries and prerenders them;
to seed the cache we can either use a sitemap or crawl
our own site.
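The batch job’s stale-entry check might look like this (the cache-index shape and the freshness window are assumptions; a real job would re-prerender each returned URL):

```javascript
// Given a cache index mapping URL -> last-prerendered timestamp (ms),
// return the URLs whose copies are older than the freshness window.
function staleUrls(cacheIndex, maxAgeMs, now = Date.now()) {
  return Object.entries(cacheIndex)
    .filter(([, renderedAt]) => now - renderedAt > maxAgeMs)
    .map(([url]) => url);
}
```

Seeding is then just running the same prerender step over every sitemap URL not yet present in the index.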
• Next to zero DevOps required
• Can be verified and monitored through existing…
• Potential to affect all requests between client and origin
• Potential to add latency and slow page load times, depending
on implementation (our testing has shown between 10ms
and 50ms latency)
• Potential to introduce front-end bugs that are difficult to
debug when it is unclear what is being modified/injected
through stream transformation
Internal Processes & Challenges
• Responsibility and accountability
• Development and release management
• Compliance (legal, privacy, GDPR)
Restricting access to your CDN
Access to your Cloudflare, Akamai, or Incapsula account
needs to be locked down within the organisation
because, even without Cloudflare Workers being
enabled, you can still do a lot of damage through the
Cloudflare dashboard if you don’t know what you’re doing.