Semi-unified caches - Inria - Institut national de recherche en sciences et technologies du numérique
Report (Research Report) Year: 1993

Semi-unified caches

André Seznec

Abstract

Since the gap between main memory access time and processor cycle time is continuously increasing, processor performance dramatically depends on the behavior of caches, and particularly on the behavior of small on-chip caches. In this paper, we present a new organization for on-chip caches: the semi-unified cache organization. In most microprocessors, two physically split caches are used for storing data and instructions respectively. The purpose of the semi-unified cache organization is to use the data cache (resp. instruction cache) as an on-chip second-level cache for instructions (resp. data). Thus the associativity degree of both on-chip caches is artificially increased, and the cache spaces respectively devoted to instructions and data are dynamically adjusted. The off-chip miss ratio of a semi-unified cache built with two direct-mapped caches of size S is equal to the miss ratio of a unified two-way set-associative cache of size 2S; yet the hit time of this semi-unified cache is equal to the hit time of a direct-mapped cache; moreover, both instructions and data may be accessed in parallel, as in the split data/instruction cache organization. Since the on-chip miss penalty is lower than the off-chip miss penalty, trace-driven simulations show that using a direct-mapped semi-unified cache organization leads to higher overall system performance than using the usual split instruction/data cache organization.
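The lookup policy described in the abstract can be sketched as follows. This is an illustrative model, not the paper's trace-driven simulator; the class and function names, the tag/index address split, and the swap-on-secondary-hit policy are assumptions chosen so that the pair of direct-mapped caches of size S behaves like a unified two-way set-associative cache of size 2S with LRU replacement within each set:

```python
class DirectMappedCache:
    """Direct-mapped cache modeled as one tag per line (no data payload)."""
    def __init__(self, n_lines):
        self.n_lines = n_lines
        self.tags = [None] * n_lines

def semi_unified_access(primary, secondary, addr):
    """Access `addr` through `primary`, using `secondary` as an on-chip
    second level. Returns 'hit', 'on_chip_miss', or 'off_chip_miss'."""
    index = addr % primary.n_lines
    tag = addr // primary.n_lines
    if primary.tags[index] == tag:
        return "hit"
    if secondary.tags[index] == tag:
        # On-chip second-level hit: swap the two lines, so the most
        # recently used copy sits in the primary cache.
        primary.tags[index], secondary.tags[index] = tag, primary.tags[index]
        return "on_chip_miss"
    # Off-chip miss: the evicted primary line moves to the secondary
    # cache and the fetched line fills the primary cache.
    secondary.tags[index] = primary.tags[index]
    primary.tags[index] = tag
    return "off_chip_miss"

icache = DirectMappedCache(4)
dcache = DirectMappedCache(4)

# An instruction fetch misses both caches, then hits on re-access.
print(semi_unified_access(icache, dcache, 0x10))  # off_chip_miss
print(semi_unified_access(icache, dcache, 0x10))  # hit
```

For instruction fetches the instruction cache is the primary and the data cache the secondary; for loads and stores the roles are reversed. The swap on a second-level hit is what makes the two lines sharing an index behave like one two-way set managed with LRU.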

Domains

Other [cs.OH]
Main file

RR-1841.pdf (811.91 KB)

Dates and versions

inria-00074831, version 1 (24-05-2006)

Identifiers

  • HAL Id: inria-00074831, version 1

Cite

Nathalie Drach, André Seznec. Semi-unified caches. [Research Report] RR-1841, INRIA. 1993. ⟨inria-00074831⟩