Cache memory presentation

© All Rights Reserved
    Presentation Transcript

    • CACHE MEMORY (07/07/12): How Caching Works
    • What is a Cache?
      The cache is a very high-speed, expensive piece of memory used to speed up the memory retrieval process. Because of its higher cost, the CPU comes with a relatively small amount of cache compared with main memory. Without cache memory, every time the CPU requested data, the request would go to main memory and the data would then be sent back across the system bus to the CPU, which is a slow process. The idea behind introducing a cache is that this extremely fast memory stores the data that is frequently accessed and, if possible, the data around it, in order to achieve the quickest possible response time for the CPU.
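The flow described above (check the fast cache first, fall back to main memory on a miss) can be sketched in a few lines of Python. The dicts, the `read` function, and `CACHE_SIZE` are illustrative stand-ins, not real hardware behaviour:

```python
# Minimal sketch of a cache in front of a slow backing store.
main_memory = {addr: addr * 2 for addr in range(1000)}  # pretend backing store
cache = {}       # small, fast store
CACHE_SIZE = 8   # deliberately tiny, like a real cache vs. main memory

def read(addr):
    """Return the value at addr, filling the cache on a miss."""
    if addr in cache:                 # cache hit: fast path
        return cache[addr]
    value = main_memory[addr]         # cache miss: slow main-memory access
    if len(cache) >= CACHE_SIZE:      # make room by evicting an entry
        cache.pop(next(iter(cache)))
    cache[addr] = value               # keep the value for next time
    return value
```

A second `read(5)` after the first is served from `cache` without touching `main_memory`, which is the whole point of the mechanism.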
    • Role of Cache in Computers
      In early PCs, the various components had one thing in common: they were all really slow. The processor was running at 8 MHz or less, and took many clock cycles to get anything done. In fact, on some machines the memory was faster than the processor.
      With the advancement of technology, the speed of every component has increased drastically. Processors now run much faster than everything else in the computer, so one of the key goals in modern system design is to ensure that, to whatever extent possible, the processor is not slowed down by the storage devices it works with. Slowdowns mean wasted processor cycles, where the CPU can't do anything because it is sitting and waiting for information it needs.
      The best way to keep the processor from having to wait is to make everything it uses as fast as it is, but that would be very expensive. There is a good compromise, however. Instead of trying to make the whole 64 MB of main memory out of this faster, expensive memory, you make a smaller piece, say 256 KB. Then you find a smart algorithm (process) that lets you use this 256 KB in such a way that you get almost as much benefit from it as you would if the whole 64 MB were made from the faster memory. How? By using this small 256 KB cache to hold the information most recently used by the processor. Computer science shows that, in general, a processor is much more likely to need information it has recently used again than a random piece of information in memory. This is the principle behind caching.
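The "smart algorithm" the slide alludes to is typically some variant of least-recently-used (LRU) replacement. A minimal sketch, with an invented `LRUCache` class and a `load` callback standing in for the slow memory access:

```python
from collections import OrderedDict

class LRUCache:
    """Keeps only the most recently used entries, evicting the oldest."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # insertion order tracks recency

    def get(self, key, load):
        if key in self.data:
            self.data.move_to_end(key)      # hit: mark as most recently used
            return self.data[key]
        value = load(key)                   # miss: slow fetch from the backing store
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used entry
        return value
```

Because recently used data is very likely to be used again, even a tiny capacity lets most `get` calls skip the slow `load` path.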
    • Types of Cache Memory
      Memory Cache: A memory cache, sometimes called a cache store or RAM cache, is a portion of memory made of high-speed static RAM (SRAM) instead of the slower and cheaper dynamic RAM (DRAM) used for main memory. Memory caching is effective because most programs access the same data or instructions over and over. By keeping as much of this information as possible in SRAM, the computer avoids accessing the slower DRAM.
      Disk Cache: Disk caching works on the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory. The most recently accessed data from the disk (as well as adjacent sectors) is stored in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see if the data is there. Disk caching can dramatically improve the performance of applications, because accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a hard disk.
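The disk-cache behaviour described above (check an in-memory buffer before touching the disk) might be sketched like this; `block_cache`, `BLOCK_SIZE`, and `read_block` are names invented for the example:

```python
block_cache = {}    # (path, block number) -> bytes, held in ordinary main memory
BLOCK_SIZE = 4096   # one disk block per cache entry

def read_block(path, block_no):
    """Return one block of the file, consulting the in-memory cache first."""
    key = (path, block_no)
    if key in block_cache:            # served from RAM: fast
        return block_cache[key]
    with open(path, "rb") as f:       # served from disk: slow
        f.seek(block_no * BLOCK_SIZE)
        data = f.read(BLOCK_SIZE)
    block_cache[key] = data           # remember the block for next time
    return data
```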
    • Levels of Cache
      Cache memory is categorized in levels based on its closeness and accessibility to the microprocessor. There are three levels of cache.
      Level 1 (L1) Cache: This cache is built into the processor and is made of SRAM (static RAM). Each time the processor requests information from memory, the cache controller on the chip uses special circuitry to first check whether the data is already in the cache. If it is present, the system is spared a time-consuming access to main memory. In a typical CPU, primary cache ranges in size from 8 to 64 KB, with larger amounts on newer processors. This type of cache memory is very fast because it runs at the speed of the processor, since it is integrated into it.
      Level 2 (L2) Cache: The L2 cache is larger but slower than the L1 cache. It is used to store recent accesses that are not caught by the L1 cache and is usually 64 KB to 2 MB in size. An L2 cache is also found on the CPU. When L1 and L2 caches are used together, information missing from the L1 cache can be retrieved quickly from the L2 cache. Like L1 caches, L2 caches are composed of SRAM, but they are much larger. L2 is usually a separate SRAM chip placed between the CPU and the DRAM (main memory).
      Level 3 (L3) Cache: L3 cache memory is an enhanced form of memory present on the motherboard of the computer. It is an extra cache built in between the processor and main memory to speed up processing operations. It reduces the time gap between a request and the retrieval of data and instructions far more than main memory can. L3 caches are used with modern processors and hold more than 3 MB of storage.
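The L1-then-L2-then-L3-then-main-memory lookup order described above can be sketched as a chain of checks. The dicts here are stand-ins for the hardware levels, and real caches also evict entries, which this sketch omits:

```python
# Illustrative multi-level lookup; sizes and contents are invented.
l1, l2, l3 = {}, {}, {}
main_memory = {addr: addr for addr in range(10000)}  # pretend DRAM

def load(addr):
    """Check each cache level, fastest first, before falling back to DRAM."""
    for level in (l1, l2, l3):   # L1 is checked first, then L2, then L3
        if addr in level:
            return level[addr]
    value = main_memory[addr]    # miss in every level: slow DRAM access
    for level in (l1, l2, l3):   # fill every level on the way back
        level[addr] = value
    return value
```

After the first `load(addr)` misses everywhere, the same address is found in L1 on every later access.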
    • Diagram showing different types of cache and their position in the computer system
      [Diagram omitted from transcript.]
    • Principle behind Cache Memory
      Cache is a really amazing technology. A 512 KB level 2 cache, caching 64 MB of system memory, can supply the information that the processor requests 90-95% of the time. The level 2 cache is less than 1% of the size of the memory it is caching, but it is able to register a hit on over 90% of requests. That's pretty efficient, and it is the reason why caching is so important.
      The reason this happens is a computer science principle called locality of reference. It states, basically, that even within very large programs with several megabytes of instructions, only small portions of the code generally get used at once. Programs tend to spend long periods working in one small area of the code, often performing the same work many times over with slightly different data, and then move to another area. This occurs because of "loops", which are what programs use to do work many times in rapid succession.
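The payoff of a 90-95% hit rate can be shown with a quick weighted-average calculation; the latency figures below are assumed for illustration, not taken from the slide:

```python
CACHE_NS, MEMORY_NS = 1.0, 60.0   # assumed latencies in nanoseconds

def avg_access_ns(hit_rate):
    """Hit-rate-weighted average of cache latency and main-memory latency."""
    return hit_rate * CACHE_NS + (1 - hit_rate) * MEMORY_NS

for rate in (0.90, 0.95):
    print(f"hit rate {rate:.0%}: average access {avg_access_ns(rate):.2f} ns")
```

With these assumed numbers, a 90% hit rate brings the average access time from 60 ns down to 6.9 ns, and a 95% hit rate brings it under 4 ns, even though 99% of the memory is still the slow kind.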
    • Locality of Reference
      Let's take a look at the following pseudo-code to see how locality of reference works:

      Output to screen "Enter a number between 1 and 100"
      Read input from user
      Put value from user in variable X
      Put value 100 in variable Y
      Put value 1 in variable Z
      Loop Y number of times
      Divide Z by X
      If the remainder of the division = 0 then output "Z is a multiple of X"
      Add 1 to Z
      Return to loop
      End

      This small program asks the user to enter a number between 1 and 100 and reads the value entered. Then the program divides every number between 1 and 100 by the number entered by the user and checks whether the remainder is 0. If so, it outputs "Z is a multiple of X" for every such number. Then the program ends.
      It is easy to see that of the 11 lines of this program, the loop body (lines 7 to 9) is executed 100 times, while all of the other lines are executed only once. Lines 7 to 9 will run significantly faster because of caching. This program is very small and can easily fit entirely in the smallest of L1 caches, but even if the program were huge the result would remain the same: when you program, a lot of the action takes place inside loops.
      This approximately 95%-to-5% ratio is what we call locality of reference, and it is why a cache works so efficiently. It is also why such a small cache can efficiently cache such a large memory system, and why it is not worth it to build a computer with the fastest memory everywhere: we can deliver 95 percent of the effectiveness for a fraction of the cost.
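The pseudo-code above translates almost line for line into runnable Python. The `multiples` function name and the hard-coded input are additions for the example, but X, Y, and Z play the same roles:

```python
def multiples(x, y=100):
    """Collect every multiple of x between 1 and y, as the pseudo-code does."""
    z = 1
    results = []
    for _ in range(y):        # "Loop Y number of times": the hot loop
        if z % x == 0:        # "If the remainder of the division = 0"
            results.append(z) # "output Z is a multiple of X"
        z += 1                # "Add 1 to Z"
    return results

print(multiples(25))  # X = 25 -> [25, 50, 75, 100]
```

As in the pseudo-code, the three lines inside the loop execute 100 times while everything else runs once, so they are the part a cache keeps hot.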
    • Importance of Cache
      Cache is responsible for a great deal of the system performance improvement in today's PCs. The cache is a buffer of sorts between the very fast processor and the relatively slow memory that serves it. The presence of the cache allows the processor to do its work while waiting for memory far less often than it otherwise would. Without a cache the computer would be very slow and all our work would get delayed, so the cache is a very important part of our computer system.
    • Resources:
      www.pcguide.com
      www.howstuffworks.com
      www.webopedia.com

      Thank You