Neural Appearance Model for Cloth Rendering

Bibliographic Details
Published in: Computer Graphics Forum 2024-07, Vol. 43 (4), p. n/a
Main Authors: Soh, G. Y., Montazeri, Z.
Format: Article
Language: English
Abstract: The realistic rendering of woven and knitted fabrics has posed significant challenges for many years. Fiber-based micro-appearance models have previously achieved considerable success in attaining high levels of realism. However, rendering such models remains complex due to the intricate internal scattering among the hundreds of fibers within a yarn, which requires vast amounts of memory and time to render. In this paper, we introduce a new framework that captures the aggregated appearance by tracing many light paths through the underlying fiber geometry. We then employ lightweight neural networks to accurately model the aggregated BSDF, which allows precise modeling of a diverse array of materials while offering substantial improvements in speed and reductions in memory usage. Furthermore, we introduce a novel importance sampling scheme to further accelerate convergence. We validate the efficacy and versatility of our framework through comparisons with previous fiber-based shading models as well as the most recent yarn-based model.
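The abstract describes replacing per-fiber light transport inside a yarn with a neural approximation of the aggregated BSDF. As a rough, non-authoritative illustration of that idea, the sketch below fits a small MLP to reflectance samples. The input parameterization (incoming and outgoing directions in a local yarn frame), the layer sizes, and the random placeholder training targets are all assumptions for demonstration; the abstract does not specify the actual architecture or training data. In the paper's setting, the targets would come from tracing many light paths through the explicit fiber geometry.

```python
# Illustrative sketch only: a small MLP standing in for the "lightweight
# neural network" that approximates an aggregated yarn BSDF. Architecture,
# inputs, and loss are assumptions, not the paper's actual method.
import torch
import torch.nn as nn

class AggregatedBSDFNet(nn.Module):
    """Maps incoming/outgoing directions to an RGB reflectance value."""
    def __init__(self, in_dim: int = 6, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),   # RGB value of the aggregated BSDF
            nn.Softplus(),          # keep predictions non-negative
        )

    def forward(self, wi: torch.Tensor, wo: torch.Tensor) -> torch.Tensor:
        # wi, wo: (N, 3) unit direction vectors in the local yarn frame.
        return self.net(torch.cat([wi, wo], dim=-1))

# Hypothetical training loop on aggregated samples; here the targets are
# random placeholders where path-traced fiber-level data would be used.
model = AggregatedBSDFNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
wi = torch.nn.functional.normalize(torch.randn(1024, 3), dim=-1)
wo = torch.nn.functional.normalize(torch.randn(1024, 3), dim=-1)
target = torch.rand(1024, 3)        # placeholder aggregate response
for step in range(100):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(wi, wo), target)
    loss.backward()
    optimizer.step()
```

Once trained, such a compact network can be evaluated once per shading point instead of simulating scattering among hundreds of individual fibers, which is where the speed and memory savings claimed in the abstract would come from.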
ISSN: 0167-7055, 1467-8659
DOI: 10.1111/cgf.15156