Dynamic Cache Management
Transcript

  • 1. A Seminar Presentation On “Dynamic Cache Management Technique” Presented By: Ajay Singh Lamba (IT, Final Year)
  • 2. Content
    • Introduction to cache memory
    • How stored data is transferred to the CPU
    • Mapping functions
    • Dynamic Cache Management
    • Dynamic Techniques For L0-cache Management
  • 3. Introduction to cache memory
    • A cache, in computer terms, is a place for storing information that is faster to access than the place where the information is usually stored
    • Cache memory is fast memory that is used to hold the most recently accessed data
    • Only frequently accessed data will stay in cache, which allows the CPU to access it more quickly
    • It is placed on the processor chip, which allows it to 'talk' to the processor directly at a much higher speed than standard RAM.
  • 4. How is stored data transferred to the CPU?
  • 5. Mapping functions
    • Since the number of main-memory blocks (M) is much larger than the number of cache lines (C), how are blocks mapped to specific lines in the cache? (A direct-mapping sketch follows this slide's bullets.)
    • Direct mapping
    • Associative mapping
    • Set associative mapping
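
To make direct mapping concrete, here is a small C sketch (an illustration added for this write-up, not part of the original slides) showing how an address is split into tag, index and offset fields; the cache geometry of 64 lines of 32 bytes is an assumed example.

    /* Minimal, illustrative sketch: splitting an address for a
     * direct-mapped cache. The geometry (64 lines of 32 bytes) is an
     * assumed example. */
    #include <stdio.h>
    #include <stdint.h>

    #define LINE_SIZE   32u   /* bytes per cache line   */
    #define NUM_LINES   64u   /* lines in the cache     */
    #define OFFSET_BITS 5u    /* log2(LINE_SIZE)        */
    #define INDEX_BITS  6u    /* log2(NUM_LINES)        */

    int main(void) {
        uint32_t addr   = 0x12345678u;                                /* example address              */
        uint32_t offset = addr & (LINE_SIZE - 1u);                    /* byte within the line         */
        uint32_t index  = (addr >> OFFSET_BITS) & (NUM_LINES - 1u);   /* which line the block maps to */
        uint32_t tag    = addr >> (OFFSET_BITS + INDEX_BITS);         /* identifies the block held in that line */
        printf("addr=0x%08x -> tag=0x%x index=%u offset=%u\n",
               (unsigned)addr, (unsigned)tag, (unsigned)index, (unsigned)offset);
        return 0;
    }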
  • 6. Dynamic Cache Management
    • It is a strategy for dynamically resizing the cache memory
    • Dynamic caching allows the cache to be resized both across applications and within a single application's execution.
    • The basic idea is that only the most frequently executed portion of the code should be stored in the L0-cache
  • 7. POWER TRENDS FOR CURRENT MICROPROCESSORS
  • 8. DYNAMIC TECHNIQUES FOR L0-CACHE MANAGEMENT
    • 1. Simple Method.
    • 2. Static Method.
    • 3. Dynamic Confidence Estimation Method.
    • 4. Restrictive Dynamic Confidence Estimation Method.
    • 5. Dynamic Distance Estimation Method.
  • 9. SIMPLE METHOD
    • If a branch is mispredicted, the machine will access the I-cache to fetch the instructions.
    • If a branch is predicted correctly, the machine will access the L0-cache.
    • On a misprediction, the machine starts fetching the instructions from the correct address by accessing the I-cache (see the sketch after this list).
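
A minimal C sketch of the Simple Method's decision, assuming a two-way choice between the L0-cache and the I-cache; the function and type names are illustrative, not the authors' code.

    #include <stdbool.h>
    #include <stdio.h>

    typedef enum { FETCH_FROM_L0, FETCH_FROM_ICACHE } fetch_source;

    /* Simple Method: the fetch source depends only on the prediction outcome. */
    fetch_source simple_method(bool predicted_correctly) {
        return predicted_correctly ? FETCH_FROM_L0      /* keep fetching from the small L0-cache    */
                                   : FETCH_FROM_ICACHE; /* misprediction: refetch from the I-cache  */
    }

    int main(void) {
        printf("correct prediction -> %s\n",
               simple_method(true)  == FETCH_FROM_L0 ? "L0-cache" : "I-cache");
        printf("misprediction      -> %s\n",
               simple_method(false) == FETCH_FROM_L0 ? "L0-cache" : "I-cache");
        return 0;
    }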
  • 10. STATIC METHOD
    • If a ‘high confidence’ branch was predicted incorrectly, the I-cache is accessed for the subsequent basic blocks.
    • If more than n ‘low confidence’ branches have been decoded in a row, the I-cache is accessed. Therefore the L0-cache is bypassed when either of these two conditions is satisfied (the policy is sketched after this list).
    • In any other case the machine will access the L0-cache.
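
A sketch of the Static Method's two bypass conditions; the threshold value N and the bookkeeping are assumed for illustration, and how a branch receives its static ‘high/low confidence’ tag (e.g. by profiling) is outside this fragment.

    #include <stdbool.h>
    #include <stdio.h>

    #define N 2   /* assumed threshold of successive low-confidence branches */

    typedef enum { FETCH_FROM_L0, FETCH_FROM_ICACHE } fetch_source;

    static int low_conf_run = 0;   /* successive low-confidence branches decoded so far */

    fetch_source static_method(bool high_confidence, bool predicted_correctly) {
        if (high_confidence) {
            low_conf_run = 0;                 /* a high-confidence branch breaks the run              */
            if (!predicted_correctly)
                return FETCH_FROM_ICACHE;     /* condition 1: high-confidence branch mispredicted     */
        } else if (++low_conf_run > N) {
            return FETCH_FROM_ICACHE;         /* condition 2: more than N low-confidence branches in a row */
        }
        return FETCH_FROM_L0;                 /* otherwise keep fetching from the L0-cache            */
    }

    int main(void) {
        for (int i = 1; i <= 3; i++)          /* with N = 2, the third low-confidence branch bypasses the L0-cache */
            printf("low-confidence branch %d -> %s\n", i,
                   static_method(false, true) == FETCH_FROM_L0 ? "L0-cache" : "I-cache");
        return 0;
    }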
  • 11. DYNAMIC CONFIDENCE ESTIMATION METHOD
    • It is a dynamic version of the static method.
    • The I-cache is accessed if:
    • 1. A high confidence branch is mispredicted.
    • 2. More than n successive ‘low confidence’ branches are encountered.
    • A dynamic estimator is more accurate in characterizing a branch and, hence, in regulating access to the L0-cache (see the sketch after this list).
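
One common way to realize a dynamic confidence estimator is a table of saturating counters indexed by the branch address. The sketch below follows that assumption and is not necessarily the estimator used in the original work; the L0/I-cache decision itself then reuses the two conditions of the static method, but with these dynamically maintained tags.

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    #define TABLE_SIZE 1024u   /* entries in the confidence table (assumed)              */
    #define CONF_MAX   15u     /* saturating-counter ceiling (assumed)                   */
    #define CONF_HIGH  8u      /* counter value treated as 'high confidence' (assumed)   */

    static uint8_t conf_table[TABLE_SIZE];

    /* Query the estimator before the branch resolves. */
    bool is_high_confidence(uint32_t branch_pc) {
        return conf_table[(branch_pc >> 2) % TABLE_SIZE] >= CONF_HIGH;
    }

    /* Update the estimator once the prediction outcome is known. */
    void update_confidence(uint32_t branch_pc, bool predicted_correctly) {
        uint8_t *c = &conf_table[(branch_pc >> 2) % TABLE_SIZE];
        if (predicted_correctly) {
            if (*c < CONF_MAX) (*c)++;   /* correct predictions build up confidence   */
        } else {
            *c = 0;                      /* any misprediction clears the confidence   */
        }
    }

    int main(void) {
        uint32_t pc = 0x400123u;                  /* example branch address */
        for (unsigned i = 0; i < CONF_HIGH; i++)  /* enough correct predictions to become high confidence */
            update_confidence(pc, true);
        printf("after %u correct predictions: %s confidence\n",
               CONF_HIGH, is_high_confidence(pc) ? "high" : "low");
        update_confidence(pc, false);
        printf("after one misprediction:      %s confidence\n",
               is_high_confidence(pc) ? "high" : "low");
        return 0;
    }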
  • 12. RESTRICTIVE DYNAMIC CONFIDENCE ESTIMATION METHOD
    • The restrictive dynamic scheme is a more selective scheme in which only the really important basic blocks are selected for the L0-cache (its single admission condition is sketched after this list).
    • The L0-cache is accessed only if a ‘high confidence’ branch is predicted correctly. The I-cache is accessed in any other case.
    • This method selects some of the most frequently executed basic blocks, yet it misses some others.
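
In code terms the restrictive scheme replaces the two bypass conditions of the static method with a single admission condition; the sketch below is illustrative, not the authors' implementation.

    #include <stdbool.h>
    #include <stdio.h>

    typedef enum { FETCH_FROM_L0, FETCH_FROM_ICACHE } fetch_source;

    /* Restrictive scheme: the L0-cache is used only after a high-confidence
     * branch that was predicted correctly; every other case goes to the I-cache. */
    fetch_source restrictive_method(bool high_confidence, bool predicted_correctly) {
        return (high_confidence && predicted_correctly) ? FETCH_FROM_L0
                                                        : FETCH_FROM_ICACHE;
    }

    int main(void) {
        printf("high confidence + correct -> %s\n",
               restrictive_method(true, true)  == FETCH_FROM_L0 ? "L0-cache" : "I-cache");
        printf("any other case            -> %s\n",
               restrictive_method(false, true) == FETCH_FROM_L0 ? "L0-cache" : "I-cache");
        return 0;
    }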
  • 13. Dynamic Distance Estimation Method
    • The n branches after a mispredicted branch are tagged as ‘low confidence’; all others are tagged as ‘high confidence’.
    • The basic blocks after a ‘low confidence’ branch are fetched from the I-cache; those after a ‘high confidence’ branch are fetched from the L0-cache.
    • The net effect is that a branch misprediction causes a series of fetches from the I-cache.
    • A counter is used to measure the distance of a branch from the previous mispredicted branch (see the sketch after this list).
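
A sketch of the distance-based rule, with the threshold N and the bookkeeping as illustrative assumptions: a counter tracks how many branches have resolved since the last misprediction, and a branch closer than N to that misprediction is treated as ‘low confidence’, so its basic blocks are fetched from the I-cache.

    #include <stdbool.h>
    #include <stdio.h>

    #define N 5   /* assumed distance threshold */

    typedef enum { FETCH_FROM_L0, FETCH_FROM_ICACHE } fetch_source;

    static int distance = N;   /* branches since the last misprediction (start 'far away') */

    fetch_source distance_method(bool predicted_correctly) {
        bool low_confidence = (distance < N);   /* current branch is within N of a misprediction */

        if (!predicted_correctly)
            distance = 0;                       /* reset the counter on a misprediction */
        else if (distance < N)
            distance++;

        /* A misprediction itself, or a low-confidence branch, sends fetch to the I-cache. */
        return (!predicted_correctly || low_confidence) ? FETCH_FROM_ICACHE : FETCH_FROM_L0;
    }

    int main(void) {
        printf("mispredicted branch -> %s\n",
               distance_method(false) == FETCH_FROM_L0 ? "L0-cache" : "I-cache");
        for (int i = 1; i <= N + 1; i++)        /* the first N branches after it stay on the I-cache */
            printf("branch %d after      -> %s\n", i,
                   distance_method(true) == FETCH_FROM_L0 ? "L0-cache" : "I-cache");
        return 0;
    }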
  • 14. Thank you. Any queries?
