High-Performance Computing-in-Memory Architecture Using STT-/SOT-Based Series Triple-Level Cell MRAM
Published in: IEEE Transactions on Magnetics, 2021-08, Vol. 57 (8), pp. 1-12
Main authors: , , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: Spin-torque-based magnetic random access memories (MRAMs) have emerged as a promising option for next-generation data-centric computing systems. The multi-level cell (MLC) configuration is an efficient method to increase storage density. In this article, we propose a series triple-level cell (sTLC) architecture based on the spin-transfer torque (STT) and spin-orbit torque (SOT) switching mechanisms. The proposed hybrid STT/SOT sTLC MRAM architecture stores 3 bits of data using at most two write steps, and most switching transitions (72%) require only a single write step. Simulation results for the sTLC MRAM show 82% and 68% savings in write energy compared with previously published STT-based and STT-/SOT-based TLC structures, respectively. A one-step parallel read operation for the sTLC is presented that enables ultra-fast reading of all 3 bits of data. Furthermore, a novel sTLC-based computing-in-memory (CiM) architecture is proposed, and high-performance AND/OR/XOR and magnetic full-adder (MFA) logic circuits are implemented. The proposed sTLC-based CiM MFA requires 33% fewer transistors with nearly equivalent energy performance compared with the recently published spin-Hall-effect (SHE)-based CiM MFA.
ISSN: 0018-9464, 1941-0069
DOI: 10.1109/TMAG.2021.3084869
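
To make the storage and readout idea summarized in the abstract concrete, the following is a minimal behavioral sketch in Python, not taken from the article: it assumes three MTJs in series with hypothetical low/high resistance values chosen so that every 3-bit pattern yields a distinct series resistance, which is what permits a one-step parallel read of all 3 bits in principle. The `full_adder` function simply composes the AND/OR/XOR gate set named in the abstract; it does not model the sense-amplifier-based CiM evaluation described in the paper.

```python
# Behavioral sketch only (assumptions, not parameters or circuits from the article):
# an sTLC is modeled as three MTJs in series, each storing one bit as a
# parallel (low) or antiparallel (high) resistance state.

# Hypothetical resistance pairs (kOhm), chosen so that every 3-bit pattern maps to
# a distinct series resistance -- the property that permits a one-step parallel read.
R_LEVELS = [
    (1.0, 2.0),   # MTJ0: (R_parallel, R_antiparallel), illustrative values
    (3.0, 6.0),   # MTJ1
    (9.0, 18.0),  # MTJ2
]


def _decode(code):
    """Expand a 3-bit integer code into a bit list [b0, b1, b2]."""
    return [(code >> i) & 1 for i in range(3)]


def write_bits(bits):
    """Return the series resistance obtained after writing the bit pattern [b0, b1, b2]."""
    assert len(bits) == 3 and all(b in (0, 1) for b in bits)
    return sum(R_LEVELS[i][b] for i, b in enumerate(bits))


def read_bits(r_series):
    """One-step read: compare the measured series resistance against the
    eight reference levels and return the closest 3-bit pattern."""
    best = min(range(8), key=lambda code: abs(write_bits(_decode(code)) - r_series))
    return _decode(best)


def full_adder(a, b, cin):
    """Full adder composed from the AND/OR/XOR gate set named in the abstract."""
    s = (a ^ b) ^ cin
    cout = (a & b) | ((a ^ b) & cin)
    return s, cout


if __name__ == "__main__":
    stored = [1, 0, 1]
    r = write_bits(stored)
    print("series resistance:", r, "kOhm ->", read_bits(r))   # recovers [1, 0, 1]
    print("full adder 1+1+0 ->", full_adder(1, 1, 0))          # (sum=0, carry=1)
```

The sketch only illustrates why eight distinguishable series-resistance levels are sufficient to read 3 bits in a single sensing step; the actual write-step scheduling (at most two steps, 72% single-step) and the SOT-assisted write paths are not modeled.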