In 1995, when the Internet was still in its infancy, there were about 16 million users worldwide, compared to about 2 billion today [1]. Over the last fifteen years, not only has the number of people using the Internet grown exponentially, but we have also witnessed an evolution of technology standards, protocols, and information consumption patterns. The Internet is no longer limited to desktop and laptop computers; an increasing number of people on the go use handheld devices to access their preferred websites. This ease of access has resulted in a significant increase in Web traffic.
Today, when designing a Web application or website that is expected to generate a lot of interest, one has to ensure that it has the right design and infrastructure to handle the extra load; otherwise it is likely to experience difficulties. For instance, the highly popular micro-blogging website twitter.com faced stability issues for a long time after its launch because it was not designed to handle large amounts of traffic.
The performance of a Web application is determined by multiple factors, such as design and application architecture, quality of code, and hardware infrastructure. Performance needs to be built in at every layer of the technology stack to get a solid finished product.
This paper focuses on the Web content caching aspect of website performance.
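To make the idea concrete before diving in, the core mechanism behind content caching can be sketched as a small in-memory store where each entry carries a time-to-live (TTL): a request for cached content is served without regenerating it until the entry expires. This is an illustrative sketch only, not code from the paper; the class and parameter names (`TTLCache`, `ttl_seconds`) are invented for the example.

```python
import time


class TTLCache:
    """Minimal in-memory cache with a per-entry time-to-live (TTL)."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        """Return the cached value, or None on a miss or an expired entry."""
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # stale: evict and report a miss
            return None
        return value

    def set(self, key, value):
        """Store a value that stays fresh for ttl_seconds."""
        self._store[key] = (value, time.monotonic() + self.ttl)


# Example: cache a rendered page for a short window.
cache = TTLCache(ttl_seconds=0.05)
cache.set("/index.html", "<html>home</html>")
print(cache.get("/index.html"))  # fresh: served from the cache
time.sleep(0.06)
print(cache.get("/index.html"))  # expired: None, so the page must be rebuilt
```

Real Web caches layer the same freshness idea across browsers, proxies, and servers (for example via HTTP `Cache-Control` headers) rather than a single dictionary, but the hit/miss/expiry cycle shown here is the pattern the rest of the paper builds on.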