MulTa-HDC: A Multi-Task Learning Framework For Hyperdimensional Computing
Published in: | IEEE Transactions on Computers, 2021-08, Vol. 70 (8), pp. 1269-1284 |
Main authors: | , , , |
Format: | Article |
Language: | English |
Abstract: | Brain-inspired hyperdimensional computing (HDC) has proven effective for low-power/energy designs for edge computing in the Internet of Things (IoT). Because edge devices have limited resources, multi-task learning (MTL), which accommodates multiple cognitive tasks in one model, is considered a more efficient way to deploy HDC. However, as the number of tasks increases, MTL-based HDC (MTL-HDC) suffers from huge associative memory (AM) overhead and performance degradation, which hinders its practical realization on edge devices. This article establishes an MTL framework for HDC that achieves a flexible and efficient trade-off between memory overhead and performance degradation. For the shared-AM approach, we propose Dimension Ranking for Effective AM Sharing (DREAMS) to merge multiple AMs while preserving as much information about each task as possible. For the independent-AM approach, we propose Dimension Ranking for Independent MEmory Retrieval (DRIMER) to extract and concatenate the informative components of AMs while mitigating interference among tasks. By leveraging both mechanisms, we propose a hybrid Multi-Tasking HDC framework, called MulTa-HDC. To adapt an MTL-HDC system to an edge device under a given memory budget, MulTa-HDC uses three parameters to flexibly adjust the proportions of the shared AM and the independent AMs. MulTa-HDC is evaluated extensively across three common benchmarks under two standard task protocols. Simulation results on the ISOLET, UCIHAR, and MNIST datasets demonstrate that MulTa-HDC outperforms other state-of-the-art compressed HD models, including SparseHD and CompHD, by up to 8.23% in classification accuracy. |
ISSN: | 0018-9340, 1557-9956 |
DOI: | 10.1109/TC.2021.3073409 |
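
The abstract only describes DREAMS and DRIMER at a high level; the actual dimension-ranking criterion and merging rule are not given in this record. Below is a minimal, hypothetical Python sketch of the general idea of a hybrid shared/independent AM split: per-dimension variance stands in for the informativeness score, and sign-bundling stands in for the merge rule. Neither choice, nor any of the function names, is taken from the paper.

```python
import numpy as np

def rank_dimensions(am):
    # Rank hypervector dimensions by how strongly they vary across the
    # class vectors of one task's AM; variance is only a placeholder
    # informativeness score, not the paper's ranking criterion.
    return np.argsort(am.var(axis=0))[::-1]

def build_hybrid_ams(ams, n_independent):
    # Each task privately keeps its n_independent highest-ranked dimensions;
    # the leftover slices of all tasks are bundled (summed and sign-quantized)
    # into one shared AM. Purely illustrative, not the MulTa-HDC algorithm.
    independent, leftovers = [], []
    for am in ams:
        order = rank_dimensions(am)
        independent.append((order[:n_independent], am[:, order[:n_independent]]))
        leftovers.append(am[:, order[n_independent:]])
    shared = np.sign(np.sum(leftovers, axis=0))  # assumes equal class counts per task
    return independent, shared

# Toy usage: 3 tasks, 10 classes each, 1,000-dimensional bipolar hypervectors.
rng = np.random.default_rng(0)
ams = [np.sign(rng.standard_normal((10, 1000))) for _ in range(3)]
independent, shared_am = build_hybrid_ams(ams, n_independent=200)
print(shared_am.shape)            # (10, 800) shared AM
print(independent[0][1].shape)    # (10, 200) task-private slice
```

In such a scheme, the fraction of dimensions kept independent would be tuned against the device's memory budget, which is the memory-versus-accuracy trade-off the abstract describes.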