Levels of Memory
• Level 1 or Registers: Registers are small storage
locations inside the CPU that hold the data and
instructions currently being processed. Commonly
used registers include the Accumulator, Program
Counter, and Address Register.
• Level 2 or Cache Memory: It is a very fast memory
with a short access time, where frequently used
data is temporarily stored for faster access.
• Level 3 or Main Memory: It is the memory the
computer is currently working on. It is small in size,
and once power is off, data no longer stays in this
memory.
• Level 4 or Secondary Memory: It is external memory
that is not as fast as the main memory, but data stays
in it permanently.
Cache Memory
• Cache Memory is a special very high-speed
memory.
• The cache is a smaller and faster memory
that stores copies of the data from frequently
used main memory locations.
• There are several independent caches in a
CPU, which store instructions and data
separately.
• The most important use of cache memory is
to reduce the average time to access data
from the main memory.
Characteristics of Cache Memory
• Cache memory is an extremely fast memory type
that acts as a buffer between RAM and the CPU.
• Cache Memory holds frequently requested data
and instructions so that they are immediately
available to the CPU when needed.
• Cache memory is costlier than main memory or
disk memory but more economical than CPU
registers.
• Cache Memory is used to speed up and
synchronize with a high-speed CPU.
Cache Performance
• When the processor needs to read or write a location in the
main memory, it first checks for a corresponding entry in the
cache.
• If the processor finds that the memory location is in the cache,
a Cache Hit has occurred and data is read from the cache.
• If the processor does not find the memory location in the
cache, a cache miss has occurred. For a cache miss, the
cache allocates a new entry and copies in data from the main
memory, then the request is fulfilled from the contents of the
cache.
• The performance of cache memory is frequently measured in
terms of a quantity called Hit ratio.
• Hit Ratio (H) = hits / (hits + misses) = number of hits / total accesses
• Miss Ratio = misses / (hits + misses) = number of misses / total accesses = 1 − Hit Ratio (H)
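As a small worked example, the two ratios can be computed from an access log. The log values below are invented purely for illustration:

```python
# Hypothetical access log: True = cache hit, False = cache miss.
accesses = [True, True, False, True, False, True, True, True, False, True]

hits = sum(accesses)                   # 7 hits
misses = len(accesses) - hits          # 3 misses

hit_ratio = hits / (hits + misses)     # H = hits / total accesses
miss_ratio = misses / (hits + misses)  # = 1 - H

print(f"Hit ratio:  {hit_ratio:.2f}")  # 0.70
print(f"Miss ratio: {miss_ratio:.2f}") # 0.30
```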
Cache Mapping
• Three different mapping techniques are used
for cache memory, as follows:
• Direct Mapping
• Associative Mapping
• Set-Associative Mapping
1. Direct Mapping
• The direct mapping technique is simple and inexpensive to
implement.
• When the CPU wants to access data from memory, it
places an address on the address bus. The index field of
the CPU address is used to access the cache.
• The tag field of the CPU address is compared with the
associated tag in the word read from the cache.
• If the tag bits of the CPU address match the tag bits in
the cache, there is a hit and the required data word is
read from the cache.
• If there is no match, there is a miss and the required
data word must be read from main memory. It is then
transferred from main memory to cache memory along
with the new tag.
1. Direct Mapping
• Associative memories are expensive compared to random-
access memories because of the added logic associated with
each cell.
• Direct mapping uses a random-access memory for the cache.
• The CPU address of 15 bits is divided into two fields.
• The nine least significant bits constitute the index field and the
remaining six bits form the tag field.
• The number of bits in the index field is equal to the number of
address bits required to access the cache memory.
• The n-bit memory address is divided into two fields: k bits for
the index field and n − k bits for the tag field.
• The direct mapping cache organization uses the n-bit address
to access the main memory and the k-bit index to access
cache.
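The index/tag split described above can be sketched in Python. This is an illustrative software model, not hardware: the 15-bit word address, 9-bit index, and 6-bit tag follow the example in the text, while the function names and the one-word main memory are made up for the sketch:

```python
INDEX_BITS = 9
CACHE_WORDS = 1 << INDEX_BITS            # 512-word cache

cache = [None] * CACHE_WORDS             # each entry holds (tag, data) or None

def split(address):
    """Split a 15-bit word address into (tag, index)."""
    index = address & (CACHE_WORDS - 1)  # nine least significant bits
    tag = address >> INDEX_BITS          # remaining six bits
    return tag, index

def read(address, main_memory):
    tag, index = split(address)
    entry = cache[index]
    if entry is not None and entry[0] == tag:
        return entry[1], "hit"           # tag bits match: cache hit
    data = main_memory[address]          # miss: read the word from main memory
    cache[index] = (tag, data)           # store it in the cache with the new tag
    return data, "miss"

main_memory = {5349: 99}                 # hypothetical one-word main memory
print(read(5349, main_memory))           # first access misses and fills the cache
print(read(5349, main_memory))           # the repeat access hits
```

Note that two addresses sharing the same 9-bit index always evict each other here, which is the key limitation direct mapping trades for its simplicity.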
2. Associative Mapping
● An associative mapping uses an associative
memory.
● This memory is accessed using its contents.
● Each line of cache memory accommodates the
address (in main memory) and the contents of
that address from the main memory.
● That is why this memory is also called Content
Addressable Memory (CAM). It allows each block
of main memory to be stored in the cache.
• The fastest and most flexible cache organization uses an
associative memory.
• The associative memory stores both the address and
content of the memory word.
• This permits any location in cache to store any word from
main memory.
• A CPU address of 15 bits is placed in the argument
register and the associative memory is searched for a
matching address.
• If the address is found, the corresponding 12-bit data is
read and sent to the CPU.
• If no match occurs, the main memory is accessed for the
word. The address-data pair is then transferred to the
associative cache memory.
• If the cache is full, an address-data pair must be displaced
to make room for a pair that is needed and not presently in
the cache.
• Replacement algorithms such as FIFO are used for this.
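The lookup-by-address behavior and FIFO displacement can be modeled with a dictionary and a queue. This is only a sketch of the idea; the tiny capacity and the sample main memory are invented for the example:

```python
from collections import deque

CACHE_SIZE = 4                    # tiny illustrative capacity

cache = {}                        # address -> data: looked up by address, CAM-style
order = deque()                   # insertion order, for FIFO replacement

def read(address, main_memory):
    if address in cache:          # associative search over all stored addresses
        return cache[address], "hit"
    data = main_memory[address]   # miss: access main memory for the word
    if len(cache) == CACHE_SIZE:  # cache full: displace the oldest pair (FIFO)
        oldest = order.popleft()
        del cache[oldest]
    cache[address] = data         # transfer the address-data pair into the cache
    order.append(address)
    return data, "miss"

main_memory = {a: a * 10 for a in range(8)}
for a in [0, 1, 2, 3, 0, 4, 0]:
    print(a, read(a, main_memory))
```

Because any pair can go anywhere, only capacity (not address conflicts) forces an eviction, which is why this organization is the most flexible.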
3. Set Associative Mapping
● Set-associative mapping combines the easy control
of the direct mapping cache with the more flexible
mapping of the fully associative cache.
● In set-associative mapping, each cache location
can hold more than one pair of tag + data items.
● That is, more than one pair of tag and data
reside at the same location of cache memory. If
one cache location holds two pairs of tag + data
items, that is called 2-way set-associative mapping.
3. Set Associative Mapping
• Set-associative mapping is an improvement over the direct mapping
organization in that each word of cache can store two or more words
of memory under the same index address.
• Each data word is stored together with its tag and the number of tag-
data items in one word of cache is said to form a set.
• An example of a set-associative cache organization for a set size
of two is shown.
• Each index address refers to two data words and their associated
tags.
• When the CPU generates a memory request, the index value of the
address is used to access the cache.
• The tag field of the CPU address is then compared with both tags in
the cache to determine if a match occurs.
• The comparison logic is done by an associative search of the tags in
the set, similar to an associative memory search: thus the name
"set-associative".
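A 2-way set of tag + data pairs can be sketched as follows. The set count, the FIFO choice within a set, and the sample main memory are assumptions made for this illustration:

```python
SET_BITS = 2
NUM_SETS = 1 << SET_BITS              # four sets
WAYS = 2                              # two tag-data pairs per set: 2-way

sets = [[] for _ in range(NUM_SETS)]  # each set: a list of (tag, data) pairs

def read(address, main_memory):
    index = address % NUM_SETS        # index field selects one set
    tag = address // NUM_SETS         # tag field identifies the word within it
    for stored_tag, data in sets[index]:  # compare against both tags in the set
        if stored_tag == tag:
            return data, "hit"
    data = main_memory[address]       # miss: fetch the word from main memory
    if len(sets[index]) == WAYS:      # set full: displace the oldest pair
        sets[index].pop(0)
    sets[index].append((tag, data))
    return data, "miss"

main_memory = {a: a * 10 for a in range(16)}
print(read(1, main_memory))           # addresses 1 and 5 map to the same set...
print(read(5, main_memory))
print(read(1, main_memory))           # ...yet both stay resident: (10, 'hit')
```

In a direct-mapped cache, addresses 1 and 5 would evict each other on every alternation; the two ways let both stay resident at the same index.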
