Caching Techniques for Content Delivery



  1. Caching Techniques
     Caching:
     - Speeds up user access to Web content
     - Reduces network load
     - Reduces origin server load
     Sanjoy Sanyal
  2. Basic Operations of a Web Cache
     [Diagram: the client sends an HTTP GET to the cache. If the object is stored and still fresh, the cache returns the response directly. Otherwise the cache forwards the GET to the origin server, receives the response (or learns that the file was modified), stores a copy if permitted, and returns the response to the client.]
     Caching works on the principle that the client request can be served from the cache.
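The request flow above can be sketched in a few lines of code. This is a minimal illustration rather than a real HTTP cache; the `fetch_from_origin` stand-in and the fixed freshness window are assumptions for the sketch.

```python
import time

ORIGIN = {"/index.html": "<html>hello</html>"}  # stand-in for the origin server
FRESHNESS_WINDOW = 60.0  # seconds an entry is considered fresh (assumed value)

cache = {}  # url -> (response_body, time_stored)

def fetch_from_origin(url):
    """Simulate forwarding the HTTP GET to the origin server."""
    return ORIGIN[url]

def get(url, now=None):
    """Serve from the cache when stored and fresh; otherwise go to origin and store a copy."""
    now = time.time() if now is None else now
    if url in cache:
        body, stored_at = cache[url]
        if now - stored_at < FRESHNESS_WINDOW:
            return body, "cache hit"       # stored? yes, fresh? yes
    body = fetch_from_origin(url)           # not stored, or stale
    cache[url] = (body, now)                # store a copy for later requests
    return body, "cache miss"
```

The first request for an object misses and populates the cache; a second request within the freshness window is served without contacting the origin.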
  3. Replacement Rules for Items in the Cache
     - Least Recently Used (LRU)
     - First In First Out (FIFO)
     - Least Frequently Used (LFU)
     - Next to Expire (NTE)
     - Largest File First (LFF)
     These rules decide which files will be replaced as new items are stored in the cache.
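As a concrete example of one of these rules, LRU can be implemented with an ordered dictionary: each access moves an entry to the "most recent" end, and the entry at the "least recent" end is evicted when the cache is full. A minimal sketch:

```python
from collections import OrderedDict

class LRUCache:
    """Fixed-capacity cache that evicts the least recently used entry."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # least recently used entries sit at the front

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)      # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used entry
```

FIFO differs only in that `get` does not reorder entries; LFU would track an access count per entry instead of recency.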
  4. Decision to Store or Serve from the Cache
     Specified in the HTTP protocol via the Cache-Control header field.
     Cache request directives (from the client):
     - Whether to use a cached object without revalidating
     - Whether to use only cached objects
     - Whether to store any part of the response
     - Whether the client will accept responses within given age/freshness limits
     Cache response directives (from the server):
     - Whether to store the response or not
     - Whether to revalidate future requests
     - Whether to revalidate user credentials before responding
     - When an object expires
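A cache applies these directives by parsing the Cache-Control header and answering two questions: may this response be stored, and is a stored copy still fresh? A sketch covering a small subset of the directives (`no-store`, `private`, `no-cache`, `max-age`); real caches honor many more:

```python
def parse_cache_control(header):
    """Parse a Cache-Control header value into a {directive: value} dict."""
    directives = {}
    for part in header.split(","):
        part = part.strip().lower()
        if not part:
            continue
        if "=" in part:
            name, _, value = part.partition("=")
            directives[name.strip()] = value.strip('" ')
        else:
            directives[part] = True
    return directives

def may_store(directives):
    """A shared cache must not store 'no-store' or 'private' responses."""
    return "no-store" not in directives and "private" not in directives

def is_fresh(directives, age_seconds):
    """Fresh while the response's age is below max-age; 'no-cache' forces revalidation."""
    if "no-cache" in directives:
        return False
    max_age = directives.get("max-age")
    return max_age is not None and age_seconds < int(max_age)
```

For example, `Cache-Control: public, max-age=3600` allows storage and is fresh for an hour, while `no-store` forbids caching entirely.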
  5. Placing a Cache in a Network: Forward Proxy
     [Diagram: web clients on a workgroup LAN → forward proxy (web cache) → Internet → web server]
     - A forward proxy acts on behalf of content consumers.
     - The request is first sent over the LAN to the forward proxy.
     - Network administrators set up forward proxies to help speed up web access for users.
  6. Try This!
     - Find out the forward proxy address of your workgroup
     - Open your browser
     - (For IE) Tools > Internet Options > Connections > LAN settings
     - You can see your proxy address
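Besides configuring the browser, HTTP clients can be pointed at a forward proxy programmatically. A sketch using Python's standard library; the proxy address is a placeholder to be replaced with your workgroup's actual forward proxy:

```python
import urllib.request

# Hypothetical address -- substitute your workgroup's forward proxy here.
PROXY = "http://proxy.example.local:3128"

# Route both HTTP and HTTPS requests through the forward proxy.
proxy_handler = urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
opener = urllib.request.build_opener(proxy_handler)

# Requests made through this opener are sent to the proxy first, which can
# answer from its cache or fetch from the origin server on the client's behalf:
# response = opener.open("http://example.com/")
```

Many tools respect the `http_proxy`/`https_proxy` environment variables instead, which achieves the same routing without code changes.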
  7. Placing a Cache in a Network: Reverse Proxy
     [Diagram: web clients → Internet → reverse proxy (web cache) → web servers on the server LAN]
     - Also called a server accelerator.
     - A reverse proxy acts on behalf of the origin server.
     - The request is sent first to the reverse proxy cache.
     - Web farms set up reverse proxies to improve performance and scalability; reverse proxies can also be located remotely, near customers.
  8. Placing a Cache in a Network: Interception Proxy
     [Diagram: web clients → ISP network → interception proxy (web cache) → Internet → web server]
     - An interception proxy acts on behalf of the network carrying the traffic (the ISP).
     - The request is first sent over the ISP network to the interception proxy.
     - ISPs set up interception proxies to help speed up web access for customers and reduce wide-area bandwidth costs.
  9. Cache Techniques for Streaming Media
     Buffering smooths multimedia playback:
     - The buffer acts as an elastic store: the renderer takes frames from the front at a constant rate,
     - while the network fills the back of the queue at a variable rate.
     - The buffer stores several seconds of the stream, trading an initial delay for smooth rendering.
     - If the stream arriving at the buffer is extremely variable, this may lead to buffer underflow or overflow.
     [Diagram: variable-bit-rate inflow → buffer storing several seconds of data → constant-bit-rate outflow]
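The elastic-store behavior can be simulated: a variable-rate inflow fills the back of a queue while the renderer drains the front at a constant rate, and an underflow occurs whenever a frame is due but the buffer is empty. The rates and the prebuffering delay below are illustrative assumptions:

```python
from collections import deque

def simulate(inflow_per_tick, outflow_per_tick=2, prebuffer_ticks=3):
    """Return the number of ticks on which the renderer underflowed.

    inflow_per_tick: frames arriving from the network each tick (variable).
    outflow_per_tick: frames the renderer consumes each tick (constant).
    prebuffer_ticks: initial delay before playback starts, traded for smoothness.
    """
    buffer = deque()
    underflows = 0
    for tick, arriving in enumerate(inflow_per_tick):
        buffer.extend(range(arriving))          # network fills the back of the queue
        if tick >= prebuffer_ticks:             # playback begins after the initial delay
            for _ in range(outflow_per_tick):   # renderer drains the front at a constant rate
                if buffer:
                    buffer.popleft()
                else:
                    underflows += 1             # frame due but buffer empty
    return underflows
```

A bursty but sufficient inflow plays back smoothly thanks to the seconds of prebuffered data, while a too-short initial delay over a starved network leads to underflows.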
  10. Delays in Multimedia Streaming
     [Diagram: the client request travels to the server; the connection delay elapses before the stream arrives, then the buffer delay elapses before playback begins]
     Total Delay = Connection Delay + Buffer Delay
  11. Fast Prefix Caching
     [Diagram: the client request goes to the cache; playback begins from the stored prefix while the cache buffer fills from the server, so only the connection delay to the cache plus the buffer delay precede playback]
     - Prefix caching reduces the connection delay.
     - The client request is served from previously stored multimedia data.
     - Locating the cache close to the client and employing a fast connection reduces the connection delay.
     - Other advantage: the cache can serve as a splitter for multiple clients, with only one connection to the server.
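The benefit can be quantified with the delay formula from the previous slide: with a prefix cache near the client, playback needs only the short connection to the cache (the prefix is already stored there) while the cache fetches the remainder from the server in parallel. The timings below are illustrative assumptions, not measurements:

```python
def startup_delay(connection_delay, buffer_delay):
    """Total delay before playback = connection delay + buffer delay."""
    return connection_delay + buffer_delay

# Assumed illustrative timings (seconds)
server_rtt = 2.0      # connection delay to a distant origin server
cache_rtt = 0.1       # connection delay to a nearby prefix cache
buffer_delay = 3.0    # seconds of stream buffered before rendering begins

without_prefix = startup_delay(server_rtt, buffer_delay)  # served from origin
with_prefix = startup_delay(cache_rtt, buffer_delay)      # prefix served locally
```

With these numbers the startup delay drops from 5.0 s to 3.1 s; the buffer delay is unchanged, since it is the connection delay that prefix caching attacks.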
  12. Dynamic Caching
     [Diagram: ring buffer between the server and two clients; Client 1's stream starts at time t and Client 2's Δ seconds later; caching starts when Client 2 arrives, after which Client 2 is served from the cache]
     How it works:
     - Client 2 requests the same streaming object as Client 1.
     - When Client 2 receives the start of the stream, Client 1 has already received Δ seconds of it; the temporal distance between the clients is Δ.
     - The data streamed to Client 1 is cached in the ring buffer to be served to Client 2.
     - Client 2 is served the initial Δ seconds of the stream by a patch from the server or a prefix.
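The steps above can be sketched as a simulation. The model assumes one frame per time unit: Client 1 plays frame t at time t, Client 2 (arriving Δ later) plays frame t−Δ at time t, and the ring buffer starts caching Client 1's frames only when Client 2's request arrives, so the first Δ frames must be patched from the server:

```python
from collections import deque

def dynamic_cache_run(n_frames, delta):
    """Simulate dynamic caching for two clients `delta` frames apart.

    Returns (patch, from_cache): the frame indices Client 2 receives as a
    server patch and the indices it reads from the ring buffer.
    """
    ring = deque(maxlen=delta)     # holds up to delta frames of Client 1's stream
    patch, from_cache = [], []
    for t in range(delta, n_frames + delta):
        k = t - delta              # frame Client 2 needs at time t
        if k in ring:
            from_cache.append(k)   # served from the ring buffer
        else:
            patch.append(k)        # never cached: patch it from the server
        if t < n_frames:
            ring.append(t)         # cache the frame Client 1 receives at time t
    return patch, from_cache
```

Each frame sits in the buffer exactly Δ time units before Client 2 consumes it, which is why a ring buffer of Δ seconds suffices and only the initial Δ seconds need the patch.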
  13. Summary
     - Caching speeds up access to content.
     - Caching helps in particular in delivering streaming media.
     - Content delivery networks use caches at various locations.