Sound field reconstruction in rooms: Inpainting meets super-resolution
Published in: The Journal of the Acoustical Society of America, 2020-08, Vol. 148 (2), p. 649-659
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: In this paper, a deep-learning-based method for sound field reconstruction is proposed. It is shown that the magnitude of the sound pressure in the frequency band 30–300 Hz can be reconstructed for an entire room using a very small number of irregularly distributed, arbitrarily arranged microphones. Moreover, the approach is agnostic to the location of the measurements in Euclidean space. In particular, the presented approach uses a limited number of arbitrary discrete measurements of the magnitude of the sound field pressure to extrapolate this field to a higher-resolution grid of discrete points in space with low computational complexity. The method is based on a U-net-like neural network with partial convolutions, trained solely on simulated data constructed from numerical simulations of Green's functions across thousands of common rectangular rooms. Although extensible to three dimensions and to different room shapes, the method focuses on reconstructing a two-dimensional plane of a rectangular room from measurements of the three-dimensional sound field. Experiments on simulated data are presented, together with an experimental validation in a real listening room. The results suggest a performance that may exceed that of conventional reconstruction techniques for a low number of microphones and low computational requirements.
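The partial convolutions mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' network, only an illustrative single-channel NumPy implementation of the standard partial-convolution rule: each window is convolved over observed (valid) entries only, the result is renormalised by the fraction of valid pixels in the window, and an updated validity mask is propagated. The microphone/field setup below is a hypothetical toy example.

```python
import numpy as np

def partial_conv2d(x, mask, kernel, bias=0.0):
    """Single-channel 2D partial convolution: convolve over observed
    entries only, renormalise each window by its share of valid pixels,
    and return an updated validity mask alongside the output."""
    kh, kw = kernel.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    new_mask = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            m = mask[i:i + kh, j:j + kw]
            valid = m.sum()
            if valid > 0:
                # Renormalise by (window size / number of valid pixels)
                out[i, j] = (kernel * x[i:i + kh, j:j + kw] * m).sum() \
                            * (kh * kw / valid) + bias
                new_mask[i, j] = 1.0  # this window yields a valid output
    return out, new_mask

# Toy example: a constant field magnitude observed everywhere except
# one missing microphone position.
field = np.ones((5, 5))
obs_mask = np.ones((5, 5))
obs_mask[2, 2] = 0.0            # unobserved point ("hole")
kernel = np.ones((3, 3)) / 9.0  # averaging kernel
rec, rec_mask = partial_conv2d(field, obs_mask, kernel)
```

For a constant field, the renormalisation exactly compensates for the missing pixel, so `rec` is constant as well; this is the mechanism that lets such networks inpaint from sparse, irregular observations.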
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/10.0001687