Caching Techniques for Content Delivery


Transcript

  • 1. Caching Techniques
    • Caching:
    • Speeds up user access to Web content
    • Reduces network load
    • Reduces origin server load
    Sanjoy Sanyal: http://www.itforintelligentfolks.blogspot.com/
  • 2. Basic Operations of a Web Cache
    • On an HTTP GET, the cache first checks whether the object is stored and, if so, whether it is still fresh.
    • A fresh stored copy is returned to the client directly; otherwise the GET is forwarded to the origin server, a copy of the response may be stored, and the response is returned to the client. If the file has been modified, the cache fetches it again.
    Caching works on the principle that the client request can be served from the cache.
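A minimal sketch of this lookup flow, assuming an in-memory dictionary, a fixed freshness lifetime, and a hypothetical fetch_from_origin helper (none of which are specified on the slide):

```python
import time
from urllib.request import urlopen

# Illustrative in-memory cache: URL -> (body, time stored, freshness lifetime).
cache = {}
DEFAULT_MAX_AGE = 300  # assumed freshness lifetime in seconds

def fetch_from_origin(url):
    """Forward the HTTP GET to the origin server and return the response body."""
    with urlopen(url) as resp:
        return resp.read()

def get(url):
    """Serve from the cache when a stored copy is still fresh; otherwise refetch and store a copy."""
    entry = cache.get(url)
    if entry is not None:                          # Stored?
        body, stored_at, max_age = entry
        if time.time() - stored_at < max_age:      # Fresh?
            return body                            # hit: respond without contacting the origin
    body = fetch_from_origin(url)                  # miss or stale: GET from the origin server
    cache[url] = (body, time.time(), DEFAULT_MAX_AGE)  # store a copy for later requests
    return body
```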
  • 3. Replacement Rules for Items in the Cache
    • Least Recently Used (LRU)
    • First In First Out (FIFO)
    • Least Frequently Used (LFU)
    • Next to Expire (NTE)
    • Largest File First (LFF)
    These rules decide which files are evicted as new items are stored in the cache; a minimal LRU sketch follows.
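As a concrete example of one of these policies, here is a minimal LRU cache sketched on Python's collections.OrderedDict; the capacity and the keys are illustrative:

```python
from collections import OrderedDict

class LRUCache:
    """Least Recently Used: when the cache is full, evict the item untouched the longest."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self.items = OrderedDict()       # order of keys tracks recency of use

    def get(self, key):
        if key not in self.items:
            return None                  # miss
        self.items.move_to_end(key)      # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict the least recently used entry

cache = LRUCache(capacity=2)
cache.put("/a.html", "<a>")
cache.put("/b.html", "<b>")
cache.get("/a.html")            # touching /a.html makes /b.html the LRU entry
cache.put("/c.html", "<c>")     # /b.html is evicted
```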
  • 4. Decision to store or serve from the cache
    • Cache-Control header field
    • Cache request directives :
      • Whether to use a cached object without revalidating
      • Whether to use only cached objects
      • Whether to store any part of the response
      • Whether the user will accept responses within age/freshness limits
    • Cache response directives
      • Whether to store or not
      • Whether to revalidate future requests
      • Whether to revalidate user credentials before response
      • When an Object expires
    These directives are specified in the HTTP protocol: request directives come from the client, response directives from the server. An example follows.
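A small illustration of request and response directives using Python's standard library; the URL and the particular directive values are assumptions for demonstration:

```python
from urllib.request import Request, urlopen

# Request directive (from the client): only accept a cached response
# whose age is at most 60 seconds; older copies must be revalidated.
req = Request(
    "http://example.com/",                       # placeholder URL
    headers={"Cache-Control": "max-age=60"},
)

with urlopen(req) as resp:
    # Response directives (from the server), e.g. "public, max-age=3600",
    # tell caches whether they may store the object and when it expires.
    print(resp.headers.get("Cache-Control"))
    print(resp.headers.get("Expires"))
```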
  • 5. Placing a Cache in the Network: Forward Proxy
    • A forward proxy acts on behalf of content consumers.
    • The request is first sent over the workgroup LAN to the forward proxy, which sits between the web clients and the Internet.
    • Network administrators set up a forward proxy to help speed up web access for their users.
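From the client's side, sending requests through a forward proxy might look like the sketch below; the proxy address is a placeholder assumption:

```python
import urllib.request

# Hypothetical forward proxy address on the workgroup LAN.
proxy = urllib.request.ProxyHandler({"http": "http://10.0.0.1:3128"})
opener = urllib.request.build_opener(proxy)

# The GET goes to the forward proxy first; the proxy answers from its
# cache or forwards the request to the origin server on the Internet.
with opener.open("http://example.com/") as resp:
    print(resp.status, len(resp.read()), "bytes")
```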
  • 6. Try This!
    • Find out the forward proxy address of your workgroup
    • Open your Browser
    • (For Internet Explorer) Tools > Internet Options > Connections > LAN settings
    • The proxy address, if any, is shown there; a standard-library sketch for checking this follows.
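Outside the browser, the standard library can report the proxies it would use itself; this reads environment variables (and, on some platforms, system settings), so it is a quick check rather than the definitive answer:

```python
import urllib.request

# Reads proxy settings from the environment (HTTP_PROXY and friends),
# or from the system configuration on Windows and macOS.
proxies = urllib.request.getproxies()
print(proxies.get("http", "no HTTP proxy configured"))
print(proxies.get("https", "no HTTPS proxy configured"))
```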
  • 7. Placing a Cache in the Network: Reverse Proxy
    • A reverse proxy, also called a server accelerator, acts on behalf of the origin servers.
    • The request is sent first to the reverse proxy cache, which sits in front of the web servers on their LAN.
    • Web farms set up reverse proxies to improve performance and scalability; reverse proxies can also be located remotely, near customers.
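To make the idea concrete, here is a toy caching reverse proxy built only on the Python standard library; the ORIGIN address, the port, and the unconditional in-memory cache are illustrative assumptions rather than a production design:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

ORIGIN = "http://localhost:8000"     # assumed origin server behind the proxy
cache = {}                           # path -> cached response body

class ReverseProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = cache.get(self.path)
        if body is None:                             # miss: fetch from the origin
            with urlopen(ORIGIN + self.path) as resp:
                body = resp.read()
            cache[self.path] = body                  # store a copy (server acceleration)
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Clients connect to the proxy on port 8080; cached paths never reach the origin.
    HTTPServer(("", 8080), ReverseProxyHandler).serve_forever()
```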
  • 8. Placing a Cache in the Network: Interception Proxy
    • An interception proxy acts on behalf of the network carrying the content traffic (the ISP).
    • The request is first sent over the ISP network to the interception proxy.
    • ISPs set up interception proxies to help speed up web access for customers and to reduce wide-area bandwidth costs.
  • 9. Cache Techniques for Streaming Media
    • Buffering allows for smoothing of multimedia playback:
      • It acts as an elastic store with the renderer taking frames from the front of the buffer at a constant rate
      • And the network fills the back of the queue at a variable rate
      • The buffer stores several seconds of the stream, trading an initial delay for smooth rendering
      • If the incoming stream is extremely variable, this may still lead to buffer underflow or overflow
    Diagram: the buffer stores several seconds of data, with a variable bit rate inflow from the network and a constant bit rate outflow to the renderer.
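A rough simulation of the elastic store, with an assumed frame rate and a randomized network; the numbers are illustrative, not taken from the slides:

```python
import random
from collections import deque

buffer = deque()           # elastic store between the network and the renderer
TARGET_PREFILL = 90        # e.g. 3 seconds at 30 frames per second (assumed)

def network_tick():
    """Variable bit rate inflow: 0-2 frames arrive per tick (simulated)."""
    for _ in range(random.randint(0, 2)):
        buffer.append("frame")

def render_tick():
    """Constant bit rate outflow: the renderer takes one frame per tick."""
    if buffer:
        buffer.popleft()
    else:
        print("buffer underflow: playback stalls")

# Trade an initial delay for smooth rendering: prefill before playback starts.
while len(buffer) < TARGET_PREFILL:
    network_tick()

for _ in range(300):       # then playback runs while the network keeps filling
    network_tick()
    render_tick()
```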
  • 10. Delays in Multimedia Streaming
    • Between the client request and the start of playback there are two delays: the connection delay (from the request until stream data begins to arrive) and the buffer delay (filling the playout buffer).
    • Total Delay = Connection Delay + Buffer Delay
  • 11. Fast Prefix Caching
    • Prefix caching reduces the connection delay: the client request is served immediately from previously stored multimedia data (the prefix) held in the cache buffer, while the rest of the stream arrives from the server into the buffer.
    • Locating the cache close to the client and employing a fast connection further reduces the connection delay.
    • Other advantage: the cache can serve as a splitter for multiple clients with only one connection to the server.
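A sketch of the prefix-caching idea, assuming the prefix is already stored at the cache and the remainder is fetched in parallel; the function names are hypothetical:

```python
import threading
import queue

def stream_with_prefix(prefix_frames, fetch_remainder, play):
    """Play the stored prefix immediately while the rest is fetched from the server.

    prefix_frames:    frames already cached near the client (the prefix)
    fetch_remainder:  callable yielding the remaining frames from the origin server
    play:             callable invoked for each frame, in order
    """
    remainder = queue.Queue()

    def fetcher():
        for frame in fetch_remainder():   # connection to the origin happens in parallel
            remainder.put(frame)
        remainder.put(None)               # end-of-stream marker

    threading.Thread(target=fetcher, daemon=True).start()

    for frame in prefix_frames:           # playback begins at once, hiding the connection delay
        play(frame)
    while (frame := remainder.get()) is not None:
        play(frame)

# Example: two prefix frames play immediately; the rest arrives from the "server".
stream_with_prefix(["p1", "p2"], lambda: iter(["r1", "r2", "r3"]), print)
```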
  • 12. Dynamic Caching with a Ring Buffer
    • How it works: Client 2 requests the same streaming object as Client 1.
    • When Client 2 receives the start of the stream, Client 1 has already received Δ seconds of it; the temporal distance between the clients is Δ.
    • The data streamed to Client 1 is cached in the ring buffer, to be served to Client 2 Δ seconds later.
    • Client 2 is served the initial Δ seconds of the stream by a patch from the server or from a stored prefix.
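A minimal model of the ring buffer, where delta_seconds stands for the temporal distance Δ; the frame rate and the patch handling are assumptions for illustration:

```python
from collections import deque

FRAMES_PER_SECOND = 30      # illustrative frame rate

class DynamicCache:
    """Ring buffer spanning the temporal distance (delta) between the two clients."""

    def __init__(self, delta_seconds):
        self.ring = deque(maxlen=delta_seconds * FRAMES_PER_SECOND)

    def on_frame_for_client1(self, frame):
        """Every frame streamed to Client 1 is also written into the ring buffer.

        Returns the frame that is now exactly delta seconds old, i.e. the one
        due to be sent to Client 2 (None while the buffer is still filling;
        during that period Client 2 is served by a patch or a stored prefix).
        """
        due_for_client2 = self.ring[0] if len(self.ring) == self.ring.maxlen else None
        self.ring.append(frame)
        return due_for_client2

cache = DynamicCache(delta_seconds=2)
for t in range(4 * FRAMES_PER_SECOND):            # 4 seconds of Client 1's stream
    frame = cache.on_frame_for_client1(f"frame{t}")
    if frame is not None:
        pass                                      # forward this frame to Client 2
```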
  • 13. Summary
    • Caching speeds up access to Web content
    • Caching is particularly helpful for delivering streaming media
    • Content delivery networks use caches at various locations