Spectral Image Tokenizer
Main authors: | |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Image tokenizers map images to sequences of discrete tokens and are a crucial component of autoregressive transformer-based image generation. The tokens are typically associated with spatial locations in the input image, arranged in raster-scan order, which is not ideal for autoregressive modeling. In this paper, we propose to tokenize the image spectrum instead, obtained from a discrete wavelet transform (DWT), so that the sequence of tokens represents the image in a coarse-to-fine fashion. Our tokenizer brings several advantages: 1) it exploits the fact that natural images are more compressible at high frequencies, 2) it can ingest and reconstruct images of different resolutions without retraining, 3) it improves the conditioning for next-token prediction: instead of conditioning on a partial line-by-line reconstruction of the image, it conditions on a coarse reconstruction of the full image, 4) it enables partial decoding, where the first few generated tokens can reconstruct a coarse version of the image, and 5) it enables autoregressive models to be used for image upsampling. We evaluate the tokenizer's reconstruction metrics as well as multiscale image generation, text-guided image upsampling, and editing. |
DOI: | 10.48550/arxiv.2412.09607 |
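
The coarse-to-fine spectral ordering described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' code: it assumes PyWavelets (`pywt`) and NumPy, computes a 2-level 2D DWT, flattens the coefficients from the coarsest approximation band to the finest detail bands, and shows a partial decode from the coarse band alone. The learned tokenizer and the quantization of coefficients into discrete tokens are omitted.

```python
# Minimal sketch (assumption: PyWavelets is used in place of a learned tokenizer).
import numpy as np
import pywt

# Stand-in for a grayscale input image.
image = np.random.rand(64, 64).astype(np.float32)

# Multi-level DWT: coeffs[0] is the coarsest approximation band, followed by
# (horizontal, vertical, diagonal) detail bands ordered from coarse to fine.
coeffs = pywt.wavedec2(image, wavelet="haar", level=2)

# Coarse-to-fine flattening: the leading entries already describe a
# low-resolution version of the whole image; later entries only add detail.
sequence = [coeffs[0].ravel()]
for (h, v, d) in coeffs[1:]:
    sequence.append(np.concatenate([h.ravel(), v.ravel(), d.ravel()]))
flat = np.concatenate(sequence)

# Partial decoding: zero the detail bands and invert the DWT to obtain a
# coarse reconstruction from only the leading part of the sequence.
coarse_only = [coeffs[0]] + [tuple(np.zeros_like(b) for b in lvl) for lvl in coeffs[1:]]
coarse_image = pywt.waverec2(coarse_only, wavelet="haar")
print(flat.shape, coarse_image.shape)
```

Because every prefix of such a sequence corresponds to a full-image reconstruction at some resolution, an autoregressive model conditioned on that prefix sees a coarse version of the entire image rather than a partial raster scan, which is the property the abstract highlights.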