Supply and power optimization in leakage-dominant technologies

Bibliographic Details
Published in: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2005-09, Vol. 24 (9), pp. 1362-1371
Authors: Man Lung Mui, Banerjee, K., Mehrotra, A.
Format: Article
Language: English
Abstract
In this paper, we present a methodology for systematically optimizing the power-supply voltage either to maximize the performance of very large scale integration (VLSI) circuits or to minimize the power dissipation in technologies where leakage power is a significant fraction of the total power dissipation. For this purpose, we develop simplified empirical equations that describe transistor behavior as a function of power supply and temperature. We use these models to calculate the full-chip power dissipation as a function of power supply and temperature. We then solve the power and chip thermal equations simultaneously to calculate the chip temperature and power dissipation at a given power supply. By varying the power-supply voltage, we determine the optimum V_DD value that minimizes delay per unit length in global interconnects and therefore maximizes performance. Using the same framework, by again varying the supply, we find the optimum V_DD that minimizes the total power dissipation while maintaining a given delay per unit length. We show that for 90- and 65-nm technologies, where leakage power represents a significant fraction of the total power dissipation, the optimum V_DD for maximum performance is lower than the International Technology Roadmap for Semiconductors (ITRS) specified supply voltage. This is because reducing V_DD produces a large reduction in total power dissipation, and therefore in chip temperature, which improves performance; this improvement outweighs the performance penalty incurred by the reduction in V_DD. We also show that as the required delay per unit length is increased, total chip power consumption is reduced significantly if the power supply is also reduced, compared with the case where the power supply is fixed at the nominal value. This effect grows with technology scaling because leakage power, which is a very strong function of chip temperature, becomes a larger fraction of the full-chip power dissipation.
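The core of the methodology described above, solving the coupled power and thermal equations self-consistently at each candidate supply voltage and then sweeping V_DD, can be illustrated with a short sketch. The Python below is a minimal illustration under assumed models (an alpha-power-law delay, an exponential leakage-temperature dependence, and a lumped thermal resistance); all constants, function names, and model forms are hypothetical placeholders for exposition, not the paper's actual equations or parameter values.

```python
import numpy as np

# All constants below are hypothetical placeholders for illustration;
# they are not taken from the paper.
V_T = 0.3         # threshold voltage (V)
ALPHA = 1.3       # alpha-power-law velocity-saturation exponent
R_TH = 0.5        # lumped chip-to-ambient thermal resistance (K/W)
T_AMB = 300.0     # ambient temperature (K)
K_DYN = 20.0      # lumped dynamic-power coefficient (W/V^2)
P_LEAK0 = 5.0     # leakage power at V_DD = 1 V, T = 300 K (W)
E_A = 0.35        # activation-like constant for leakage vs. temperature (eV)
K_B = 8.617e-5    # Boltzmann constant (eV/K)

def total_power(vdd, temp):
    """Dynamic power (~ C V^2 f) plus leakage that grows exponentially with T."""
    p_dyn = K_DYN * vdd ** 2
    p_leak = P_LEAK0 * vdd * np.exp((E_A / K_B) * (1.0 / 300.0 - 1.0 / temp))
    return p_dyn + p_leak

def chip_temperature(vdd, iterations=50):
    """Solve T = T_amb + R_th * P(V_DD, T) by fixed-point iteration."""
    temp = T_AMB
    for _ in range(iterations):
        temp = T_AMB + R_TH * total_power(vdd, temp)
    return temp

def delay_metric(vdd, temp):
    """Alpha-power-law delay per stage; drive current degrades as T rises."""
    mobility_penalty = (temp / 300.0) ** 1.5   # carrier-mobility degradation
    return vdd * mobility_penalty / (vdd - V_T) ** ALPHA

# Sweep the supply and pick the V_DD that minimizes delay evaluated at the
# self-consistent chip temperature (the performance-optimal supply).
vdds = np.linspace(0.6, 1.2, 61)
candidates = [(vdd, chip_temperature(vdd)) for vdd in vdds]
best_vdd, best_temp = min(candidates, key=lambda c: delay_metric(*c))
print(f"optimum V_DD = {best_vdd:.2f} V at T = {best_temp:.1f} K")
```

A plain fixed-point iteration suffices here because the thermal feedback is mild for these placeholder constants; with stronger leakage the same loop can diverge (thermal runaway), in which case a damped or Newton-type update would be the safer choice. The complementary power-minimizing optimization under a delay constraint uses the same self-consistent loop with objective and constraint swapped. With realistic 90- or 65-nm leakage parameters, the delay-optimal V_DD found by such a sweep can fall below the nominal supply, which is the paper's central observation.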
ISSN: 0278-0070, 1937-4151
DOI: 10.1109/TCAD.2005.852039