This presentation introduces the basic concept of cache memory, providing the necessary background along with details of the different mapping techniques used inside cache memory.
Cache memory
1. PRESENTED BY
AHSAN ASHFAQ
BCS (HONS) FROM UOP (GOLD MEDALIST) (2008-2011)
MS SCHOLAR IN IM|SCIENCES PESHAWAR (2013-2015)
CACHE MEMORY
STRUCTURE, ORGANIZATION AND MAPPING
2. MAJOR ACHIEVEMENTS
• Gold Medalist from University of Peshawar.
• Course : BCS (Hons)
• Session : 2008 to 2011
• CGPA : 3.9/4.0
• 3rd Position in Computer Science Group in Board of
Intermediate and Secondary Education Peshawar.
• Course : FSc Computer Science
• Session : 2006-2007
• Marks : 901/1100
3. CONTENTS
• Memory Hierarchy
• Background
• Working of Cache Memory
• Structure and Organization
• Locality of Reference
• Mapping Techniques
• Direct
• Associative
• Set Associative
• Write Policies
• Cache Misses and their Types
• Solutions for Cache Misses
Cache Memory By Ahsan Ashfaq, MS Scholar, IM|Sciences Peshawar
5. BACKGROUND
• Main Memory provides data to the Processor for processing.
• There is a large gap between the speed of the Processor and that of Main Memory.
• The Processor, being the main resource, cannot be utilized fully.
6. CACHE MEMORY
• Cache Memory is used to achieve higher CPU performance by allowing the CPU to access data at a faster speed.
7. HOW DOES CACHE WORK?
8. CACHE OPERATION
• CPU requests the contents of a memory location
• The cache is checked for this data
• If present, the data is fetched from the cache (fast)
• If not present, the required block is read from main memory into the cache
• Question: why is a whole block transferred?
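The hit/miss flow above can be sketched as a toy model in Python (the block size and memory contents are hypothetical, not any real hardware):

```python
# Toy sketch of the cache lookup flow: check the cache first; on a miss,
# transfer a whole block from main memory. All sizes are hypothetical.
BLOCK_SIZE = 4                         # words per block

main_memory = list(range(64))          # 64 words of "main memory"
cache = {}                             # block number -> list of words

def read(address):
    block = address // BLOCK_SIZE
    if block in cache:                 # hit: serve from the fast cache
        return cache[block][address % BLOCK_SIZE]
    # miss: copy the whole block from main memory into the cache
    start = block * BLOCK_SIZE
    cache[block] = main_memory[start:start + BLOCK_SIZE]
    return cache[block][address % BLOCK_SIZE]

print(read(10))  # miss: block 2 is loaded, returns 10
print(read(11))  # hit: the same block is already cached, returns 11
```

Note that `read(11)` is fast precisely because `read(10)` pulled in the whole block, which is what the question on this slide is pointing at.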
9. LOCALITY OF REFERENCE
• Spatial vs. Temporal
Temporal: recently referenced items are likely to be referenced again in the near future.
Spatial: items with nearby addresses tend to be referenced close together in time.
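A simple array-summing loop (a hypothetical illustration) exhibits both kinds of locality at once:

```python
# Hypothetical illustration of the two kinds of locality in an ordinary loop.
data = list(range(100))

total = 0
for i in range(len(data)):   # 'total' and 'i' are reused every iteration: temporal locality
    total += data[i]         # data[0], data[1], ... sit at adjacent addresses: spatial locality

print(total)  # 4950
```

Because of spatial locality, transferring a whole block on a miss means subsequent accesses to `data[i+1]`, `data[i+2]`, ... are likely to hit.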
12. CACHE ADDRESSING
• Where does the cache sit?
• Between the processor and the memory management unit (MMU)
• Between the MMU and main memory
• A logical (virtual) cache stores data using virtual
addresses
• The processor accesses the cache directly, not through the
MMU
• Cache access is faster, since it occurs before MMU address translation
• Different applications use the same virtual address
space
• So the cache must be flushed on each context switch
• A physical cache stores data using main memory physical
addresses
13. MAPPING FUNCTION
• The number of blocks (M) in main memory is much
larger than the number of lines (m) in the cache
memory, i.e. m < M.
• So we need a mapping function:
• Direct Mapping
• Associative Mapping
• Set Associative Mapping
14. DIRECT MAPPING
• Each block of main memory maps to one specific line
of the cache.
• To find a block, we only have to check that one
specific line.
• The mapping function is
• i = j mod m
• i = cache line number
• j = main memory block number
• m = total lines in cache
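The mapping function i = j mod m can be sketched directly (the cache size here is hypothetical):

```python
# Direct mapping sketch: main memory block j maps to cache line i = j mod m.
m = 8                      # total lines in the cache (hypothetical size)

def cache_line(block):
    return block % m

# Blocks 3, 11, and 19 all compete for the same line:
print(cache_line(3), cache_line(11), cache_line(19))  # 3 3 3
```

This collision between blocks 3, 11, and 19 is exactly the weakness noted on the pros & cons slide.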
16. DIRECT MAPPING ADDRESS FORMAT
2^w = words in a block
2^s = total blocks in main memory
2^r = total lines in cache
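Decoding a direct-mapped address is just slicing it into tag | line | word fields. A sketch with hypothetical field widths (w = 2 word bits, r = 3 line bits):

```python
# Splitting a main-memory address into tag | line | word fields for
# direct mapping. Field widths are hypothetical: w=2, r=3.
w, r = 2, 3
address = 0b10110_101_10   # tag=0b10110, line=0b101, word=0b10

word = address & ((1 << w) - 1)          # lowest w bits
line = (address >> w) & ((1 << r) - 1)   # next r bits
tag  = address >> (w + r)                # remaining bits

print(tag, line, word)  # 22 5 2
```

The line field is the hardware's way of computing j mod m: since m = 2^r, taking the low r bits of the block number is the same as the modulo.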
17. DIRECT MAPPING PROS & CONS
• Simple
• Inexpensive
• Fixed location for given block
• If a program accesses 2 blocks that map to the same line
repeatedly, cache misses are very high
18. ASSOCIATIVE MAPPING
• A main memory block can load into any line of
cache
• Memory address is interpreted as tag and word
• Tag uniquely identifies block of memory
• Every line’s tag is examined for a match
• Cache searching gets expensive
20. ASSOCIATIVE MAPPING ADDRESS
FORMAT
| Tag (s bits, e.g. 22) | Word (w bits, e.g. 2) |
Address length = (s + w) bits
2^w = words in a block/line
2^s = total blocks in main memory
Number of lines in cache = not determined by the address
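With associative mapping the address splits into just tag and word. A sketch using the slide's field widths (22-bit tag, 2-bit word; the example address is arbitrary):

```python
# Associative mapping: the address is simply tag | word.
# Field widths from the slide: s=22 tag bits, w=2 word bits (24-bit address).
s, w = 22, 2
address = 0xFFFF9C          # an arbitrary example address

word = address & ((1 << w) - 1)
tag  = address >> w
# In hardware, this tag is compared against every line's tag in parallel.
print(hex(tag), word)       # 0x3fffe7 0
```

Note there is no line field at all, which is why every line's tag must be examined.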
21. ASSOCIATIVE MAPPING PROS & CONS
• Associative mapping results in high cache
utilization.
• The search criterion is less efficient: every line's tag must be examined.
• The search is expensive in terms of hardware cost and performance.
22. SET ASSOCIATIVE MAPPING
• The cache is divided into a number of sets
• Each set contains a number of lines
• Mapping between a main memory block and a cache set is
direct; within a set it is associative.
• Mapping function
• Set # = Block # mod Total Sets
• e.g. 2 lines per set
• 2-way associative mapping
• A given block can be in either of 2 lines in only one set
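The set-selection rule above can be sketched as follows (set count and associativity are hypothetical):

```python
# Set-associative sketch: set number = block number mod total sets; inside
# a set, the block may occupy any of its k lines (here k=2, i.e. 2-way).
num_sets = 4               # hypothetical cache with 4 sets
k = 2                      # 2 lines per set

def target_set(block):
    return block % num_sets

# Blocks 1, 5, and 9 all map to set 1, but any two of them can coexist
# there at the same time, unlike in direct mapping:
print(target_set(1), target_set(5), target_set(9))  # 1 1 1
```

This is the compromise the slide describes: direct mapping chooses the set, associative search chooses the line within it.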
23. SET-ASSOCIATIVE MAPPING
ADDRESS FORMAT
Address length = (s + w) bits
2^w = words in a block/line
2^r = total sets in cache
| Tag (s - r bits) | Set (r bits) | Word (w bits) |
25. REPLACEMENT ALGORITHMS
ASSOCIATIVE & SET ASSOCIATIVE
• Least Recently Used (LRU)
• e.g. in 2-way set associative
• Which of the 2 blocks is least recently used?
• First In First Out (FIFO)
• replace the block that has been in the cache longest
• Least Frequently Used (LFU)
• replace the block which has had the fewest hits
• Random
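LRU for one 2-way set can be sketched with an ordered dictionary (a minimal model, not a hardware implementation):

```python
from collections import OrderedDict

# Minimal LRU replacement sketch for one cache set (hypothetical, 2-way).
class LRUSet:
    def __init__(self, ways=2):
        self.ways = ways
        self.lines = OrderedDict()          # tag -> data, oldest first

    def access(self, tag, data=None):
        if tag in self.lines:
            self.lines.move_to_end(tag)     # hit: mark most recently used
            return self.lines[tag]
        if len(self.lines) >= self.ways:
            self.lines.popitem(last=False)  # evict the least recently used
        self.lines[tag] = data              # load the new block
        return data

s = LRUSet()
s.access('A', 1); s.access('B', 2)
s.access('A')          # touch A, so B becomes least recently used
s.access('C', 3)       # evicts B, not A
print(list(s.lines))   # ['A', 'C']
```

Replacing `move_to_end`/`popitem(last=False)` with a plain insertion-order queue would turn this same sketch into FIFO.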
26. WRITE POLICY
• Must not overwrite a cache block unless main
memory is up to date
• Multiple CPUs may have individual caches
• I/O may address main memory directly
27. WRITE THROUGH
• All writes go to main memory as well as cache
• Multiple CPUs can monitor main memory traffic to
keep local (to CPU) cache up to date
• Lots of traffic
• Slows down writes
28. WRITE BACK
• Updates are initially made in the cache only
• The update (dirty) bit for the cache slot is set when an update occurs
• When a block is to be replaced, it is written to main memory
only if the update bit is set
• I/O must access main memory through the cache
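The two write policies can be contrasted in a toy one-block model (all names and sizes hypothetical):

```python
# Sketch contrasting write-through and write-back on a single cached block.
main_memory = {0: 0}
cache = {0: 0}
dirty = False                      # the "update bit" used by write-back

def write_through(addr, value):
    cache[addr] = value
    main_memory[addr] = value      # every write also goes to main memory

def write_back(addr, value):
    global dirty
    cache[addr] = value            # update the cache only...
    dirty = True                   # ...and remember the block is modified

def evict(addr):                   # on replacement under write-back
    global dirty
    if dirty:
        main_memory[addr] = cache[addr]
        dirty = False

write_through(0, 7)
print(main_memory[0])  # 7   (memory updated immediately, at the cost of traffic)
write_back(0, 42)
print(main_memory[0])  # 7   (memory is stale until the block is evicted)
evict(0)
print(main_memory[0])  # 42
```

The stale value between `write_back` and `evict` is why, under write-back, I/O must go through the cache rather than read main memory directly.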
29. CACHE MISSES
• Compulsory Misses
• Capacity Misses
• Conflict Misses
Solutions
• Miss Cache
• Victim Cache
• Prefetching