A Personal Assistant for Web Database Caching

Presentation given at CAiSE 2000, 12th International Conference on Advanced Information Systems Engineering, Stockholm, Sweden, June 2000

ABSTRACT: To improve the performance of web database access for regular users, we have developed a client caching agent, referred to as a personal assistant. In addition to caching strategies based on data characteristics and user specification, the personal assistant dynamically prefetches information based on previously monitored user access patterns. It is part of an overall multi-layered caching scheme where cache coherency is ensured through cooperation with a server-side database caching agent. The personal assistant has been implemented in Java and integrated into the web architecture for the OMS Pro database management system.


  1. A Personal Assistant for Web Database Caching
     Beat Signer, signer@inf.ethz.ch, Institute for Information Systems, ETH Zurich
  2. Overview
     • Motivation
     • Internet OMS architecture
     • Personal Assistant
     • Cache architecture
     • Performance measurements
     • Conclusion
  3. Motivation
     • Most browsers fail to cache dynamically generated web pages
     • Cooperating client, middle-layer and server caches to improve performance
     • Reduction of query response times through better use of the available resources
  4. Internet OMS Architecture
     Web Browser ↔ Front-End Agent (Session Cache) ↔ Personal Assistant (Active Cache)
     ↔ Internet ↔ HTTP Server ↔ Database Agent (Global Cache) ↔ OMS Database
  5. Cache Requirements
     • Reduction of response times (latency)
     • Reduction of idle times (e.g. modem connection)
     • Cache consistency
     • Persistence
     • Flexibility (different user skills)
     • Active caching component (local prefetching)
     • Small overhead to maintain the cache
     • Optimal usage of available cache resources
  6. Personal Assistant Cache Structure
     The personal assistant cache is a persistent client cache consisting of three disjoint parts:
     • Personal Cache (explicit user-specified objects): query results and images; permanent; passive; priority 1
     • Prefetching Cache (long-term cache): query results and images; driven by the user profile; active; priority 2
     • Session Cache (short-term cache): query results and images; passive; priority 3
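The priority ordering of the three disjoint caches suggests a simple tiered lookup. A minimal sketch of that idea (class and method names are ours, not taken from the Java implementation):

```python
class PersonalAssistantCache:
    """Illustrative three-tier cache: personal (priority 1),
    prefetching (priority 2), session (priority 3)."""

    def __init__(self):
        self.personal = {}     # explicit user-specified objects, permanent
        self.prefetching = {}  # long-term cache driven by the user profile
        self.session = {}      # short-term cache for the current session

    def get(self, query):
        # Probe the tiers in priority order; the tiers hold disjoint entries.
        for tier in (self.personal, self.prefetching, self.session):
            if query in tier:
                return tier[query]
        return None  # cache miss: forward the query to the database agent
```

On a miss, the real system would forward the query through the front-end agent to the database agent rather than return None.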
  7. Prefetching Cache Statistics
     An outer LRUB cache of fixed size holds one entry per observed query (query 1 … query n).
     Each entry carries its own statistic: an inner LRUB cache of fixed size recording the
     queries that followed it (next query 1 … next query n).
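The nested statistics can be sketched as follows, assuming each query's entry counts its observed successors and both levels are size-bounded (all names, sizes and the LRU-based eviction details here are illustrative, not from the paper):

```python
from collections import Counter, OrderedDict

class PrefetchStatistics:
    """Sketch of per-query successor statistics: for each query we keep a
    fixed-size, LRU-evicted table of the queries that followed it."""

    def __init__(self, max_queries=100, max_successors=10):
        self.max_queries = max_queries        # outer cache bound
        self.max_successors = max_successors  # inner cache bound per query
        self.stats = OrderedDict()            # query -> Counter of next queries
        self.last_query = None

    def record(self, query):
        # Count the transition last_query -> query.
        if self.last_query is not None:
            counter = self.stats.setdefault(self.last_query, Counter())
            self.stats.move_to_end(self.last_query)  # mark as recently used
            counter[query] += 1
            if len(counter) > self.max_successors:
                # drop the rarest successor to keep the inner table bounded
                rarest = min(counter, key=counter.get)
                del counter[rarest]
            if len(self.stats) > self.max_queries:
                self.stats.popitem(last=False)  # evict least recently used query
        self.last_query = query

    def prefetch_candidate(self, query):
        # The most frequently observed successor is the prefetch candidate.
        counter = self.stats.get(query)
        if not counter:
            return None
        return counter.most_common(1)[0][0]
```

After a user repeatedly follows q1 with q2, `prefetch_candidate("q1")` returns "q2", so the prefetching cache can fetch q2's result while the user still reads q1's.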
  8. Cache Replacement Strategy
     • Least recently used (LRU)
     • Introduce a bonus for entries often used in the past but not accessed recently → LRUB
     • Weight for cache entry i defined as follows:
       weight_i = (α + (1 − α) · H_i) / T_i,  0 ≤ α ≤ 1
       H_i: number of hits for cache entry i
       T_i: time since last access of cache entry i
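Reading the slide's weight as weight_i = (α + (1 − α) · H_i) / T_i, eviction removes the entry with the smallest weight, and α = 1 degenerates to pure LRU (weight 1 / T_i). A hedged sketch of that rule (parameter names and the example values below are ours):

```python
def lrub_weight(hits, time_since_access, alpha):
    """LRUB weight: alpha = 1 gives pure LRU (1 / T_i); smaller alpha adds
    a bonus proportional to the entry's hit count H_i."""
    assert 0 <= alpha <= 1 and time_since_access > 0
    return (alpha + (1 - alpha) * hits) / time_since_access

def select_victim(entries, alpha):
    """Evict the entry with the smallest LRUB weight."""
    return min(entries, key=lambda e: lrub_weight(e["hits"], e["age"], alpha))
```

For example, with entry a (20 hits, age 50) and entry b (1 hit, age 40), pure LRU (α = 1) evicts the older a, while the full bonus (α = 0) keeps the frequently hit a and evicts b instead.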
  9. Advantages of Adaptive Prefetching
     • Assumption: a global cache of size 4 for the most frequently used queries
       → which of the 6 "hot queries" should be cached?
     • Adaptive prefetching → sliding window mechanism, i.e. a cache of size 2 is sufficient!
     [Diagram: query sequence q1 … q9 illustrating the sliding window]
  10. Cache Administration Tools
  11. Average Response Times
      [Bar charts: average time in the worst case and in the best case, broken down into
      total time, query time and image time (in ms), each without and with the personal assistant]
  12. Image Response Times
      [Bar charts: average image time in the worst case and in the best case (in ms)
      for users 1–4, each without and with the personal assistant]
  13. Cache Hit Rates
      • Increased hit rate even in the worst-case scenario (due to larger caches)
      • The front-end agent profits from the locality of queries in the best-case scenario

                               Worst Case          Best Case
                             queries  images     queries  images
      Front-End Agent          13%      4%         33%     29%
      Personal Assistant       29%     28%         75%     68%
  14. Conclusion
      • Advantages
        - increased cache hit rate
        - reduced response times
        - adaptation to individual users
        - easy to use (transparent)
        - prefetching statistics always up to date
      • Costs
        - the database agent may have to handle an increased number of queries (prefetching),
          i.e. increased server load
        - overhead to maintain cache consistency
  15. Future Work
      • Use similar prefetching techniques to build an HTTP caching proxy
      • First prototype running at gordon.inf.ethz.ch:9090
