Scalable Hybrid Learning Techniques for Scientific Data Compression
Format: Article
Language: English
Abstract: Data compression is becoming critical for storing scientific data because many scientific applications need to store large amounts of data and post-process this data for scientific discovery. Unlike image and video compression algorithms that limit errors to primary data, scientists require compression techniques that accurately preserve derived quantities of interest (QoIs). This paper presents a physics-informed compression technique implemented as an end-to-end, scalable, GPU-based pipeline for data compression that addresses this requirement. Our hybrid compression technique combines machine learning techniques and standard compression methods. Specifically, we combine an autoencoder, an error-bounded lossy compressor to provide guarantees on raw data error, and a constraint satisfaction post-processing step to preserve the QoIs within a minimal error (generally less than floating point error).

The effectiveness of the data compression pipeline is demonstrated by compressing nuclear fusion simulation data generated by a large-scale fusion code, XGC, which produces hundreds of terabytes of data in a single day. Our approach works within the ADIOS framework and results in compression by a factor of more than 150 while requiring only a few percent of the computational resources necessary for generating the data, making the overall approach highly effective for practical scenarios.
DOI: 10.48550/arxiv.2212.10733
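The hybrid scheme described in the abstract can be illustrated with a minimal sketch. The actual pipeline uses a trained autoencoder, an error-bounded lossy compressor, and a GPU-based constraint satisfaction step inside ADIOS; in the stand-alone NumPy sketch below, a truncated SVD stands in for the autoencoder, uniform quantization of the residual stands in for the error-bounded compressor, and a single linear QoI (the sum of the field) is preserved by a post-processing correction. All function names, parameters, and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the hybrid compression idea, under the assumptions stated above.
import numpy as np


def encode(data, rank):
    """Autoencoder stand-in: keep the top-`rank` SVD modes of a 2D field."""
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    return u[:, :rank] * s[:rank], vt[:rank]            # reduced representation


def decode(coeffs, modes):
    """Reconstruct the field from the reduced representation."""
    return coeffs @ modes


def compress_residual(residual, error_bound):
    """Error-bounded stand-in: quantize so the pointwise error is <= error_bound."""
    return np.round(residual / (2.0 * error_bound)).astype(np.int64)


def decompress_residual(quantized, error_bound):
    return quantized * (2.0 * error_bound)


def enforce_qoi(reconstruction, qoi_target):
    """Post-processing: shift the field so the linear QoI (its sum) matches exactly."""
    correction = (qoi_target - reconstruction.sum()) / reconstruction.size
    return reconstruction + correction


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    field = rng.standard_normal((256, 128))              # toy stand-in for simulation data

    coeffs, modes = encode(field, rank=16)               # lossy, learned-style reduction
    approx = decode(coeffs, modes)

    eb = 1e-3                                            # pointwise error bound on raw data
    quantized = compress_residual(field - approx, eb)
    recon = approx + decompress_residual(quantized, eb)

    recon = enforce_qoi(recon, field.sum())              # preserve one QoI to round-off

    print("max pointwise error:", np.abs(field - recon).max())
    print("QoI (sum) error    :", abs(field.sum() - recon.sum()))
```

The sketch mirrors the three stages named in the abstract: a learned reduction, an error-bounded correction on the raw data, and a final constraint step that restores the QoI to within floating point error, at the cost of a small additional pointwise perturbation.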