Hot-and-Cold: Using Criticality in the Design of Energy-Efficient Caches

As technology scales and processor speeds improve, power has become a first-order design constraint in all aspects of processor design. In this paper, we focus on reducing both the dynamic and the leakage energy of the data cache using architectural techniques that exploit a statically partitioned cache organization. We leverage the ability to predict whether an access is on the application’s critical path to partition the accesses into multiple streams. Accesses on the critical path are serviced by a high-performance (hot) cache bank; accesses off the critical path are serviced by a lower-energy, lower-performance (cold) cache bank. The resulting organization is a physically banked cache with a different level of energy consumption and performance in each bank. Our results show that each additional cycle of cold cache access time degrades performance by only 0.8%. If the cold cache is designed to be highly energy-efficient (consuming 20% of the dynamic and leakage energy of the hot cache), we observe L1 data cache energy savings of as much as 37%.
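The sketch below is a rough, hypothetical illustration of the steering idea described in the abstract: predicted-critical accesses go to a hot bank, all others to a cold bank that costs fewer energy units per access but more cycles. The class and parameter names, the 20% energy ratio, the latencies, and the externally supplied criticality flag are illustrative assumptions, not the paper's actual predictor or energy model.

```cpp
// Hypothetical model of criticality-based access steering between a
// high-performance "hot" cache bank and a low-energy "cold" cache bank.
#include <cstdint>
#include <cstdio>
#include <vector>

struct BankStats {
    uint64_t accesses = 0;
    double   energy   = 0.0;   // arbitrary energy units
};

class HotColdCache {
public:
    // Assumed parameters: the cold bank consumes a fraction of the hot
    // bank's per-access energy and has a longer access latency.
    HotColdCache(double hotEnergyPerAccess, double coldEnergyRatio,
                 int hotLatency, int coldLatency)
        : hotEnergy_(hotEnergyPerAccess),
          coldEnergy_(hotEnergyPerAccess * coldEnergyRatio),
          hotLatency_(hotLatency), coldLatency_(coldLatency) {}

    // 'critical' would come from a criticality predictor in a real design;
    // here the caller simply supplies it. Returns the access latency.
    int access(bool critical) {
        if (critical) {
            hot_.accesses++;
            hot_.energy += hotEnergy_;
            return hotLatency_;
        }
        cold_.accesses++;
        cold_.energy += coldEnergy_;
        return coldLatency_;
    }

    void report() const {
        double total    = hot_.energy + cold_.energy;
        double baseline = (hot_.accesses + cold_.accesses) * hotEnergy_;
        std::printf("hot: %llu accesses, cold: %llu accesses\n",
                    (unsigned long long)hot_.accesses,
                    (unsigned long long)cold_.accesses);
        std::printf("energy vs. all-hot baseline: %.1f%%\n",
                    100.0 * total / baseline);
    }

private:
    double hotEnergy_, coldEnergy_;
    int hotLatency_, coldLatency_;
    BankStats hot_, cold_;
};

int main() {
    // Illustrative numbers: cold bank at 20% energy, two extra cycles.
    HotColdCache cache(/*hotEnergyPerAccess=*/1.0, /*coldEnergyRatio=*/0.2,
                       /*hotLatency=*/2, /*coldLatency=*/4);

    // Toy access stream: roughly half the accesses predicted non-critical.
    std::vector<bool> criticality = {true, false, false, true, false,
                                     true, false, false, true, false};
    for (bool c : criticality) cache.access(c);
    cache.report();
    return 0;
}
```

With this toy stream, total energy comes out at roughly half the all-hot baseline, which shows why the achievable savings depend directly on how many accesses the predictor can safely divert to the cold bank.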

By: Rajeev Balasubramonian, Viji Srinivasan, Sandhya Dwarkadas, Alper Buyuktosunoglu

Published in: Lecture Notes in Computer Science, volume 3164, pages 180-195, 2004

Please obtain a copy of this paper from your local library. IBM cannot distribute this paper externally.

Questions about this service can be mailed to reports@us.ibm.com.