Cache memory

October 16, 2007 By: Tatsiana Gomova

The memory hierarchy


Cache memory has the fastest access time after registers.

Why is cache memory needed?


When a program references a memory location, it is likely to reference that same memory location again soon (temporal locality). A memory location that is near a recently referenced location is more likely to be referenced than a memory location that is farther away (spatial locality).

A small but fast cache memory, in which the contents of the most commonly accessed locations are maintained, can be placed between the CPU and the main memory. When a program executes, the cache memory is searched first.
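The cache-first lookup flow can be sketched as follows; this is a minimal illustration, not the original text's material, and the dict-based cache and the `read` function are assumed names for the example.

```python
# Minimal sketch of the cache-first lookup flow: search the cache
# first, and fall back to (slower) main memory only on a miss.
# Both memories are modeled as plain dicts for illustration.

def read(address, cache, main_memory):
    """Return the word at `address`, searching the cache first."""
    if address in cache:              # hit: fast path
        return cache[address]
    value = main_memory[address]      # miss: slow path
    cache[address] = value            # keep a copy, exploiting locality
    return value

memory = {0x1000: 42}
cache = {}
print(read(0x1000, cache, memory))   # first access misses, fetches 42
print(read(0x1000, cache, memory))   # second access hits in the cache
```

A real cache of course works on blocks of words and is implemented in hardware; this only shows the search order.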

Why is cache memory fast?


Faster electronics are used. A cache memory has fewer locations than a main memory, which reduces the access time. The cache is placed both physically closer and logically closer to the CPU than the main memory.

A computer without a cache usually needs a few bus cycles to synchronize the CPU with the bus on each memory access. A cache memory can be positioned closer to the CPU, avoiding this overhead.

Cache mapping
Commonly used methods: associative mapped cache, direct-mapped cache, and set-associative mapped cache.

Associative Mapped Cache


Any main memory block can be mapped into any cache slot. To keep track of which one of the 2^27 possible blocks is in each slot, a 27-bit tag field is added to each slot.

A valid bit is needed to indicate whether or not the slot holds a line that belongs to the program being executed. A dirty bit keeps track of whether or not a line has been modified while it is in the cache.
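The per-slot bookkeeping can be sketched as a small record; the class and field names below are illustrative, not from the original text.

```python
# Hypothetical sketch of one cache slot's bookkeeping: a valid bit,
# a dirty bit, the tag identifying which block occupies the slot,
# and the cached line of words.

from dataclasses import dataclass, field

@dataclass
class CacheSlot:
    valid: bool = False   # does the slot hold a line of the running program?
    dirty: bool = False   # has the line been modified while in the cache?
    tag: int = 0          # which main memory block is in the slot
    line: list = field(default_factory=list)  # the cached words

slot = CacheSlot()
print(slot.valid)   # a freshly created slot holds no usable line
```

On a write hit, the hardware would set `dirty` so the modified line can be written back to main memory before the slot is reused.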

The mapping from main memory blocks to cache slots is performed by partitioning an address into fields. For each slot, if the valid bit is 1, then the tag field of the referenced address is compared with the tag field of the slot.

Consider how an access to memory location (A035F014)16 is mapped to the cache. If the addressed word is in the cache, it will be found in word (14)16 of a slot that has a tag of (501AF80)16, which is made up of the 27 most significant bits of the address.
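The associative partition above can be checked with a few shifts; the function name is assumed, and the field widths follow the running example (27-bit tag, 5-bit word).

```python
# Sketch of the associative-mapped address partition: a 32-bit
# address splits into a 27-bit tag and a 5-bit word field.

WORD_BITS = 5

def split_associative(address):
    tag = address >> WORD_BITS                # upper 27 bits
    word = address & ((1 << WORD_BITS) - 1)   # lower 5 bits
    return tag, word

tag, word = split_associative(0xA035F014)
print(hex(tag), hex(word))   # 0x501af80 0x14, matching the example
</p>```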

Advantages
Any main memory block can be placed into any cache slot. Regardless of how irregular the data and program references are, if a slot is available for the block, it can be stored in the cache.

Disadvantages
Considerable hardware overhead is needed for cache bookkeeping: there must be a mechanism for searching the tag memory in parallel.

Direct-Mapped Cache
Each cache slot corresponds to a specific set of main memory blocks. In our example we have 2^27 memory blocks and 2^14 cache slots, so a total of 2^27 / 2^14 = 2^13 main memory blocks can be mapped onto each cache slot.

The 32-bit main memory address is partitioned into a 13-bit tag field, followed by a 14-bit slot field, followed by a five-bit word field.

When a reference is made to a main memory address, the slot field identifies in which of the 2^14 slots the block will be found. If the valid bit is 1, then the tag field of the referenced address is compared with the tag field of that slot.

Consider how an access to memory location (A035F014)16 is mapped to the cache. If the addressed word is in the cache, it will be found in word (14)16 of slot (2F80)16, which will have a tag of (1406)16.
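The direct-mapped partition can be checked the same way; the function name is assumed, with the 13/14/5-bit field widths taken from the text.

```python
# Sketch of the direct-mapped address partition: 13-bit tag,
# 14-bit slot, 5-bit word, following the running example.

WORD_BITS, SLOT_BITS = 5, 14

def split_direct(address):
    word = address & ((1 << WORD_BITS) - 1)
    slot = (address >> WORD_BITS) & ((1 << SLOT_BITS) - 1)
    tag = address >> (WORD_BITS + SLOT_BITS)
    return tag, slot, word

tag, slot, word = split_direct(0xA035F014)
print(hex(tag), hex(slot), hex(word))   # 0x1406 0x2f80 0x14
```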

Advantages
The tag memory is much smaller than in an associative mapped cache. There is no need for an associative search, since the slot field directs the comparison to a single slot.

Disadvantages
Consider what happens when a program references locations that are 2^19 words apart, which is the size of the cache. Every memory reference will result in a miss, which will cause an entire block to be read into the cache even though only a single word is used.
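This pathology can be demonstrated with a tiny direct-mapped simulator; the structure and names below are an assumed sketch, using the example's 14-bit slot and 5-bit word fields (so the cache holds 2^19 words).

```python
# Small direct-mapped cache simulator: two addresses exactly 2^19
# words apart map to the same slot with different tags, so
# alternating between them misses on every reference.

SLOT_BITS, WORD_BITS = 14, 5
NUM_SLOTS = 1 << SLOT_BITS
CACHE_WORDS = 1 << (SLOT_BITS + WORD_BITS)   # 2^19 words in the cache

tags = [None] * NUM_SLOTS   # tag held in each slot (None = invalid)

def access(address):
    """Return True on a hit, False on a miss (which fills the slot)."""
    slot = (address >> WORD_BITS) % NUM_SLOTS
    tag = address >> (WORD_BITS + SLOT_BITS)
    if tags[slot] == tag:
        return True
    tags[slot] = tag        # miss: read the block into the cache
    return False

a, b = 0x00000000, CACHE_WORDS   # 2^19 words apart -> same slot, different tags
hits = sum(access(addr) for addr in [a, b] * 4)
print(hits)   # 0: every reference evicts the other block and misses
```

The two blocks keep evicting each other even though the rest of the cache sits empty, which is exactly what set-associative mapping relieves.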

Set-Associative Mapped Cache


Combines the simplicity of direct mapping with the flexibility of associative mapping. For this example, two slots make up a set. Since there are 2^14 slots in the cache, there are 2^14 / 2 = 2^13 sets.

When an address is mapped to a set, the direct mapping scheme is used, and then associative mapping is used within a set.

The format for an address has 13 bits in the set field, which identifies the set in which the addressed word will be found. Five bits are used for the word field, and 14 bits for the tag field.

Consider again the address (A035F014)16. With this partition, the addressed word, if cached, will be found in word (14)16 of set (0F80)16, under a tag of (280D)16.
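The set-associative partition of the same address can be checked with the same shifting technique; the function name is assumed, and the field values printed below are derived here rather than quoted from the text.

```python
# Sketch of the set-associative address partition: 14-bit tag,
# 13-bit set, 5-bit word (two slots per set, as in the example).

WORD_BITS, SET_BITS = 5, 13

def split_set_associative(address):
    word = address & ((1 << WORD_BITS) - 1)
    set_index = (address >> WORD_BITS) & ((1 << SET_BITS) - 1)
    tag = address >> (WORD_BITS + SET_BITS)
    return tag, set_index, word

tag, set_index, word = split_set_associative(0xA035F014)
print(hex(tag), hex(set_index), hex(word))   # 0x280d 0xf80 0x14
```

Only the two tags in set 0x0F80 need to be compared against 0x280D, rather than all 2^14 tags as in the fully associative case.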

Advantages
In our example the tag memory increases only slightly from the direct mapping, and only two tags need to be searched for each memory reference. The set-associative cache is widely used in today's microprocessors.

Cache Performance
Cache read and write policies determine what happens on hits and misses: common choices include write-through versus write-back for writes, and load-through forwarding on read misses.

There is no simple answer as to which cache read or write policies are best. The organization of a cache is optimized for each computer architecture and for the mix of programs that the computer executes.
