Neural Representation of Shape-Dependent Laplacian Eigenfunctions
Saved in:
Main authors: | , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | The eigenfunctions of the Laplace operator are essential in mathematical physics, engineering, and geometry processing. Typically, they are computed by discretizing the domain and performing an eigendecomposition, which ties the results to a specific mesh. This approach is unsuitable for continuously parameterized shapes.

We propose a novel representation for eigenfunctions in continuously parameterized shape spaces, where eigenfunctions are spatial fields that depend continuously on the shape parameters, defined by minimal Dirichlet energy, unit norm, and mutual orthogonality. We implement this with multilayer perceptrons trained as neural fields, mapping shape parameters and domain positions to eigenfunction values.

A unique challenge is enforcing mutual orthogonality causally, where the causal ordering varies across the shape space. Our training method therefore interweaves three concepts: (1) learning $n$ eigenfunctions concurrently by minimizing Dirichlet energy under unit-norm constraints; (2) filtering gradients during backpropagation to enforce causal orthogonality, preventing earlier eigenfunctions from being influenced by later ones; (3) dynamically sorting the causal ordering by eigenvalue to track eigenvalue curve crossovers.

We demonstrate our method on problems such as shape-family analysis, predicting eigenfunctions for incomplete shapes, interactive shape manipulation, and computing higher-dimensional eigenfunctions, on all of which traditional methods fall short. |
DOI: | 10.48550/arxiv.2408.10099 |