Breaking On-device Training Memory Wall: A Systematic Survey
Format: Article
Language: English
Abstract: On-device training has become an increasingly popular approach to machine
learning, enabling models to be trained directly on mobile and edge devices.
However, a major challenge in this area is the limited memory available on
these devices, which can severely restrict the size and complexity of the
models that can be trained. In this systematic survey, we aim to explore the
current state-of-the-art techniques for breaking on-device training memory
walls, focusing on methods that can enable larger and more complex models to be
trained on resource-constrained devices.
Specifically, we first analyze the key factors that give rise to the memory wall
encountered during on-device training. We then present a comprehensive review of
the literature on on-device training techniques that address memory limitations.
Finally, we summarize the state of on-device training and highlight open problems
for future research.
By providing a comprehensive overview of these techniques and their
effectiveness in breaking memory walls, we hope to help researchers and
practitioners in this field navigate the rapidly evolving landscape of
on-device training.
DOI: 10.48550/arxiv.2306.10388
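
The abstract does not spell out which factors drive the memory wall, but as a rough, hypothetical illustration of the kind of accounting such an analysis involves (assuming fp32 tensors, SGD with momentum, and made-up layer sizes that do not come from the paper), per-step training memory can be sketched as parameters plus gradients plus optimizer state plus the activations stored for backpropagation:

```python
BYTES_FP32 = 4  # bytes per fp32 value

def training_memory_bytes(param_count, activation_counts, batch_size=32):
    """Coarse estimate: parameters + gradients + momentum buffer + stored activations."""
    weights = param_count * BYTES_FP32        # model parameters
    grads = param_count * BYTES_FP32          # one gradient value per parameter
    optimizer = param_count * BYTES_FP32      # momentum buffer for SGD with momentum
    activations = sum(activation_counts) * batch_size * BYTES_FP32  # kept for backprop
    return weights + grads + optimizer + activations

# Hypothetical 3-layer example: ~1M parameters and per-sample feature-map sizes.
params = 1_000_000
acts_per_sample = [64 * 32 * 32, 128 * 16 * 16, 256 * 8 * 8]

total = training_memory_bytes(params, acts_per_sample)
print(f"~{total / 2**20:.1f} MiB for one training step")
```

As the batch size grows, the activation term quickly dominates, which is why training, unlike inference, tends to hit the memory wall on resource-constrained devices.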