3. HTTP Caching
A technique that stores a copy of a frequently accessed resource and serves it back when requested.
4. Why Caching?
Reduce redundant data transfer
Reuse content
Increase application performance
Reduce the load on the database
5. What Can be Cached?
Cache-friendly content vs. content to cache carefully
6. What Can be Cached? (cont.)
Image files
HTML files
CSS files
JavaScript files
7. Cache Hit vs. Cache Miss
Cache Hit: when the data requested by an application is found in the cache and is still fresh.
Cache Miss: when the data requested by an application is not found in the cache, or the cached data is no longer fresh.
8. Figure 1: Processing a fresh cache hit
[Diagram: Client → Cache → Server. 1. Receive HTTP request message; 2. Parse message; 3. In cache?; 4. Is fresh?; 5. Create response headers (new headers + cached body); 6. Send response.]
9. Figure 2: Processing a cache miss
[Diagram: Client → Cache → Server. 1. Receive HTTP request message; 2. Parse message; 3. In cache? (no: fetch from the server and store a copy); 4. Create response headers (new headers + body); 5. Send response.]
10. Figure 3: Processing a cache miss when the cache is stale
[Diagram: Client → Cache → Server. 1. Receive HTTP request message; 2. Parse message; 3. In cache?; 4. Is fresh? (no: ask the server for updates); 5. Create response headers (new headers + body); 6. Send response.]
11. Types of Caches (General Viewpoint)
Private Cache
Public Cache
12. Private Cache
A private cache is used by only one client; it caches content only for the user it was created for.
[Diagram: content cached only for User: rakib3004]
13. Public Cache
A public cache is used by more than one client. Any public content that does not require authentication or personalized information can be stored in a public cache.
[Diagram: content cached once in the public cache and served to both Client 1 and Client 2]
14. Some Common Approaches for Caching
Read Strategies: Cache Aside, Read Through
Write Strategies: Write Through, Write Back
15. Cache Aside
The application first checks the cache. If the data is not found there, the application queries the database, returns the data to the client, and stores it in the cache.
[Diagram: Application ↔ Cache, Application ↔ Database]
Faster in read-heavy applications
Best for data that is not frequently updated
Use a TTL to keep the cache and database consistent
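A minimal cache-aside sketch in Python, using plain dicts as hypothetical stand-ins for the database and the cache (the key names and TTL value are illustrative only):

```python
import time

db = {"user:1": "Alice"}   # stand-in for the database
cache = {}                 # the cache: key -> (value, expires_at)
TTL = 60                   # seconds a cached entry stays fresh

def get(key):
    """Cache-aside read: the application checks the cache first,
    then falls back to the database and populates the cache itself."""
    entry = cache.get(key)
    if entry is not None:
        value, expires_at = entry
        if time.time() < expires_at:  # cache hit, still fresh
            return value
    value = db.get(key)               # miss or expired: query the database
    if value is not None:
        cache[key] = (value, time.time() + TTL)  # store for next time
    return value

print(get("user:1"))  # first call: miss, loads from db and fills cache
print(get("user:1"))  # second call: hit, served from the cache
```

Note that the cache and the database never talk to each other directly; the application is responsible for both lookups, which is what distinguishes cache-aside from read-through.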
16. Read Through
The application first checks the cache. If the data is not found there, the cache itself queries the database, populates itself, and returns the data to the application.
[Diagram: Application → Cache → Database]
Cache and database data always remain consistent
The cache sits in-line with the database
The application treats the cache as the main data store
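A read-through sketch under the same toy assumptions: the application talks only to the cache object, and the cache holds the database handle (all class and key names here are invented for illustration):

```python
class ReadThroughCache:
    """The cache sits in-line with the database: on a miss it loads the
    value itself, rather than making the application do it."""

    def __init__(self, database):
        self._db = database   # the cache, not the app, holds the db handle
        self._store = {}

    def get(self, key):
        if key in self._store:            # cache hit
            return self._store[key]
        value = self._db.get(key)         # miss: the cache queries the db
        if value is not None:
            self._store[key] = value      # ...and populates itself
        return value

db = {"product:7": "keyboard"}
cache = ReadThroughCache(db)
print(cache.get("product:7"))  # miss: loaded through the cache
print(cache.get("product:7"))  # hit: served from the cache's own store
```

The application code is simpler than with cache-aside because the fallback logic lives in the cache layer, which is why the cache is treated as the main data store.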
17. Write Through
Data is first written to the cache and then to the database. The cache sits in-line with the database, and writes always go through the cache to the main database.
[Diagram: Application → Cache → Database]
The cache maintains consistency with the main database
Used in write-heavy applications
Adds load on the database due to the redundant write operation
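A write-through sketch, again with dicts as hypothetical stand-ins: every write hits the cache and is synchronously forwarded to the database, so the two always agree:

```python
class WriteThroughCache:
    """Writes go to the cache, which synchronously writes to the database;
    the extra database write is the cost of the consistency guarantee."""

    def __init__(self, database):
        self._db = database
        self._store = {}

    def set(self, key, value):
        self._store[key] = value  # 1. write to the cache
        self._db[key] = value     # 2. immediately write through to the db

    def get(self, key):
        return self._store.get(key, self._db.get(key))

db = {}
cache = WriteThroughCache(db)
cache.set("score:42", 100)
print(db["score:42"])  # 100: the database saw the write synchronously
```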
18. Write Back
The application writes data to the cache, which stores it and acknowledges to the application immediately. A modified cache block is written back to the database just before it is replaced.
[Diagram: Application → Cache → Database]
Reduces write operations in write-heavy applications
Asynchronous between cache and database
Writes to the database with a delay
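A write-back sketch under the same toy assumptions. Real caches flush dirty entries asynchronously, on eviction or on a timer; here the flush is an explicit call so the delayed write is visible:

```python
class WriteBackCache:
    """Writes are acknowledged from the cache immediately; dirty entries
    are persisted to the database later, in a single flush."""

    def __init__(self, database):
        self._db = database
        self._store = {}
        self._dirty = set()        # keys written but not yet persisted

    def set(self, key, value):
        self._store[key] = value   # fast: only the cache is touched
        self._dirty.add(key)

    def flush(self):
        # In practice this runs asynchronously, before a block is replaced;
        # batching many writes is what reduces database load.
        for key in self._dirty:
            self._db[key] = self._store[key]
        self._dirty.clear()

db = {}
cache = WriteBackCache(db)
cache.set("hits", 1)
print("hits" in db)  # False: the database has not seen the write yet
cache.flush()
print(db["hits"])    # 1: written back with a delay
```

The trade-off is durability: if the cache machine dies before the flush, the dirty writes are lost, which is why write-back suits workloads that can tolerate that risk.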
19. Some Caching Types in Web Development
Web Caching
Application Caching
Data Caching
Distributed Caching
20. Web Caching
Web caching is of two types: Browser Caching and Proxy Caching.
[Diagram: Browser ↔ Cache ↔ Server. If the content is cached and fresh, the cache responds directly; otherwise the request goes on to the server.]
Browser caching helps individual users quickly navigate pages they have recently visited.
The browser cache is a temporary storage area in memory or on disk.
The server can instruct the user’s browser to cache certain files.
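That instruction is carried in HTTP response headers, chiefly Cache-Control. A minimal sketch of how a server might pick a Cache-Control value per asset type (the helper name and the specific max-age values are illustrative choices, not a standard):

```python
def caching_headers(path):
    """Return the caching headers a server might attach to a response."""
    if path.endswith((".css", ".js", ".png", ".jpg")):
        # Static assets: any cache may reuse them for a day (86400 s)
        # without re-contacting the server.
        return {"Cache-Control": "public, max-age=86400"}
    if path.endswith(".html"):
        # HTML changes often: caches must revalidate on every request.
        return {"Cache-Control": "no-cache"}
    # Personalized content: only the user's own browser, and do not store.
    return {"Cache-Control": "private, no-store"}

print(caching_headers("logo.png"))    # {'Cache-Control': 'public, max-age=86400'}
print(caching_headers("index.html"))  # {'Cache-Control': 'no-cache'}
```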
21. Web Caching (cont.)
[Diagram: Client → Proxy Server Cache → Server]
Proxy caching allows cached information to be shared across larger groups of users.
A proxy cache delivers content to customers faster.
Proxy cache data does not change frequently.
22. Application Caching
Application caching caches database queries, HTML fragments, or the output of heavy computations. It can drastically reduce your website load time and reduce server overhead.
[Diagram: Client → Application ↔ Cache, Application ↔ Database]
Decreased network costs
Access content without an internet connection
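One common form of application caching in Python is memoizing a heavy computation with the standard library's functools.lru_cache (the fragment-rendering function below is a made-up stand-in for the expensive work):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def render_fragment(user_count):
    """Stand-in for a heavy computation or HTML-fragment render;
    repeated calls with the same argument are served from memory."""
    return f"<p>{user_count} users online</p>"

print(render_fragment(10))                # computed on the first call
print(render_fragment(10))                # served from the cache
print(render_fragment.cache_info().hits)  # 1
```

Real applications apply the same idea with an external store (e.g. Redis or Memcached) so the cached fragments survive process restarts and are shared between workers.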
23. Data Caching
Data caching stores data in local memory on the server and helps avoid extra trips to the database for data that has not changed.
[Diagram: Client → Data Cache → Database]
Reduces the turnaround time of database queries
Used in database-driven applications
Serves as a backup in case of any cache failure
24. Distributed Caching
A distributed cache is made up of a cluster of machines that only serve up memory. Web servers pull from and store to the distributed servers’ memory.
[Diagram: Client → Server → Distributed Cache]
Used in high-volume applications like Google, Microsoft, Amazon
Increases scalability
Application acceleration
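A toy sketch of how a client library might decide which machine in the cluster holds a given key, by hashing the key across the node list (node names are invented; production clients typically use consistent hashing rather than plain modulo, so that adding a node remaps only a fraction of the keys):

```python
import hashlib

# Hypothetical cluster of machines serving up memory.
NODES = ["cache-a", "cache-b", "cache-c"]

def node_for(key):
    """Map a key deterministically to one node in the cluster."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return NODES[int(digest, 16) % len(NODES)]

# The same key always maps to the same node, so every web server in the
# fleet looks up "session:42" in the same machine's memory.
print(node_for("session:42") == node_for("session:42"))  # True
```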
“A technique that stores a copy of a frequently accessed resource and serves it back when requested”
Definition: https://www.digitalocean.com/community/tutorials/web-caching-basics-terminology-http-headers-and-caching-strategies
Advantages of Caching : https://www.digitalocean.com/community/tutorials/web-caching-basics-terminology-http-headers-and-caching-strategies
What Can be Cached? : https://www.digitalocean.com/community/tutorials/web-caching-basics-terminology-http-headers-and-caching-strategies
Cache-friendly content: logos and brand images, style sheets, general JavaScript files, media files. Cache carefully: HTML pages, frequently modified JavaScript and CSS, content requested with an authentication cookie.
Cache Hit: When the data that is being requested by an application is found in the cache and data is still fresh.
Cache Miss: When the data that is being requested by an application isn’t found in the cache or data is not fresh.
Processing a fresh cache hit: https://www.oreilly.com/library/view/http-the-definitive/1565925092/ch07s07.html
Cache Processing Steps:
Receiving—Cache reads the arriving request message from the network.
Parsing—Cache parses the message, extracting the URL and headers.
Lookup—Cache checks if a local copy is available and, if not, fetches a copy (and stores it locally).
Freshness check—Cache checks if cached copy is fresh enough and, if not, asks server for any updates.
Response creation—Cache makes a response message with the new headers and cached body.
Sending—Cache sends the response back to the client over the network.
Logging—Optionally, cache creates a log file entry describing the transaction.
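The steps above can be sketched as a single lookup function; `store`, `handle`, and `fetch_from_origin` are illustrative names for a toy in-memory cache, not a real library:

```python
import time

store = {}  # URL -> (body, headers, expires_at)

def handle(url, fetch_from_origin, ttl=60):
    now = time.time()
    entry = store.get(url)                       # 3. lookup
    hit = entry is not None and now < entry[2]   # 4. freshness check
    if hit:
        body, headers = entry[0], entry[1]
    else:
        body, headers = fetch_from_origin(url)   # miss/stale: ask the server
        store[url] = (body, headers, now + ttl)  # ...and store a local copy
    # 5. response creation: cached body plus new headers
    return {**headers, "X-Cache": "HIT" if hit else "MISS"}, body

def origin(url):  # stand-in for the origin server
    return b"<html>hello</html>", {"Content-Type": "text/html"}

print(handle("/index.html", origin)[0]["X-Cache"])  # MISS
print(handle("/index.html", origin)[0]["X-Cache"])  # HIT
```

Receiving, parsing, sending, and logging are left out here; they belong to the network layer around this function.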
Generally this applies only to a cache maintained by that client itself, though if you had a proxy that was only being used by one client it would be possible to configure it to act as a private cache.
Example: Github Own Profile
Private: Default value. Sets Cache-Control: private to specify that the response is cacheable only on the client and not by shared (proxy server) caches.
Private Cache: The first link on the home page takes the user to another page, which is uploaded to the same S3 bucket but additionally has metadata configured.
A public, or “shared” cache is used by more than one client. As such, it gives a greater performance gain and a much greater scalability gain, as a user may receive cached copies of representations without ever having obtained a copy directly from the origin server.
Public Cache : From the response header captured in Chrome, notice that the browser serves page from Cache after first access. Multiple refresh of this page would return 304 Not Modified
https://software-factotum.medium.com/effective-http-caching-part-iii-public-private-and-no-store-b64f0452325#:~:text=The%20%E2%80%9Cpublic%E2%80%9D%20response%20directive%20indicates,stored%20by%20a%20shared%20cache.
Example of public cache:
Squid
Akamai
Croxy Proxy
Varnish Cache
https://dev.to/tharindu/a-quick-introduction-to-distributed-caching-48e3
https://medium.com/system-design-blog/what-is-caching-1492abb92143
https://www.prisma.io/dataguide/managing-databases/introduction-database-caching
Note: Read Strategies and Write Strategies should be named separately.
In-memory cache, fast. There is no direct connection between the cache and the database.
TTL: set an expiry on each key; when it expires, the key is not found, so the database is queried for the key and the cache is updated.
Best suited for:
Read-heavy workloads
Data that is not frequently updated:
https://medium.com/bliblidotcom-techblog/redis-cache-aside-pattern-72fff2e4f927
Auto Refresh in cache aside alachisoft
Read Through and Write Through work together.
The application treats cache as the main data store:
https://www.alachisoft.com/resources/articles/readthru-writethru-writebehind.html#:~:text=Read%2Dthrough%2FWrite%2Dthrough,the%20application%20of%20this%20responsibility
Cache sits in-line with the database: Read Through Prisma Blog
Cache and database data always remain consistent:
https://dev.to/tharindu/a-quick-introduction-to-distributed-caching-48e3
It is common for developers to mitigate this delay by ‘warming’ the cache by issuing likely to happen queries manually.
Cache sits in-line with the database.
When there is a cache miss, it loads missing data from the database, populates the cache and returns it to the application.
Cache always stays consistent with the database.
Best suits for: Read-heavy workloads.
Read-through and write-through caching can be seen in the same application.
Auto Refresh in read through alachisoft
Data is first written to the cache and then to the database.
The cache sits in-line with the database and writes always go through the cache to the main database.
Maintains high consistency between the cache and the database (but, adds a little latency during the write operations as data is to be updated in the cache additionally.)
Best suits for: write-heavy workloads like online multiplayer games
Reference: https://dev.to/tharindu/a-quick-introduction-to-distributed-caching-48e3
https://www.prisma.io/dataguide/managing-databases/introduction-database-caching
Write Back Reference: https://dev.to/tharindu/a-quick-introduction-to-distributed-caching-48e3
Write-through puts more load on the database, so write-back is used to reduce server load.
Application writes data to the cache which acknowledges immediately and after some delay, it writes the data back to the database.
Asynchronous between cache and database
Some Caching Types in Web Development main reference:
https://bootcamp.uxdesign.cc/caching-techniques-one-should-know-603e09d2b298
A browser or Web cache stores copies of program and website assets. When you visit a website, your browser takes pieces of the page and stores them on your computer's hard drive. Some of the assets your browser will store are:
Images - logos, pictures, backgrounds, etc.
HTML
CSS
JavaScript
In short, browsers typically cache what are known as "static assets" - parts of a website that do not change from visit to visit.
Link: https://www.bigcommerce.com/ecommerce-answers/what-browser-cache-and-why-it-important/
Cache Provide: https://docs.servicestack.net/caching
Proxy Cache provides content to be delivered to customers faster.
https://www.stackpath.com/edge-academy/what-is-proxy-caching/#:~:text=Proxy%20caching%20is%20a%20feature,files%2C%20images%20and%20web%20pages.
Proxy Cache data does not change frequently:https://bootcamp.uxdesign.cc/caching-techniques-one-should-know-603e09d2b298
Implements in Database Driven Applications: https://bootcamp.uxdesign.cc/caching-techniques-one-should-know-603e09d2b298
Reduces turnover time of database queries: https://bootcamp.uxdesign.cc/caching-techniques-one-should-know-603e09d2b298
Backup for any cache failure: https://www.ironistic.com/insights/four-major-caching-types-and-their-differences/
It stores the data in local memory on the server which is the fastest way to retrieve information on a web server: https://www.ironistic.com/insights/four-major-caching-types-and-their-differences/
Used in high volume applications like google, microsoft, amazon : https://bootcamp.uxdesign.cc/caching-techniques-one-should-know-603e09d2b298
Increase scalability: https://www.ironistic.com/insights/four-major-caching-types-and-their-differences/
Application acceleration : https://www.linkedin.com/pulse/application-caching-strategies-konstantinos-kalafatis-1f?trk=pulse-article_more-articles_related-content-card
Application acceleration: Most applications rely on disk-based databases, either directly or indirectly, and can't always meet today's increasingly demanding requirements. By caching the most frequently accessed data in a distributed cache, we can dramatically reduce the bottleneck of disk-based systems.