A Two-Stream Deep Imaging Method for Multifrequency Capacitively Coupled Electrical Resistance Tomography
Published in: IEEE Sensors Journal, 2023-03, Vol. 23 (5), p. 4362-4372
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: In this work, a novel deep imaging method is proposed for multifrequency capacitively coupled electrical resistance tomography (MFCCERT). A two-stream network consisting of a low-frequency stream and a high-frequency stream is developed according to the frequency characteristics of the impedance of interest. Meanwhile, a cross-stream information intersection approach, which combines hyper-dense connection and gated channel transformation (GCT), is proposed to fuse the complementary information in the multifrequency impedance measurements. The multifrequency CCERT measurements in the frequency range of 0.5-5 MHz are divided into a low-frequency band and a high-frequency band, which serve as the inputs of the two streams of the network, respectively. With the proposed cross-stream information intersection approach, the useful features of the impedance within the same frequency band and across the two frequency bands are fused. Experiments were carried out with a 12-electrode CCERT sensor to obtain the multifrequency impedance measurements. Both simulation and experimental data were used to test the developed two-stream network. Imaging results indicate that the proposed deep imaging method is effective. Compared with the single-stream U-Net, the developed network has better information fusion capability and image reconstruction performance.
ISSN: | 1530-437X 1558-1748 |
DOI: | 10.1109/JSEN.2022.3200960 |
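The abstract's cross-stream fusion combines hyper-dense connection with gated channel transformation (GCT). As a rough illustration only, the following NumPy sketch implements the standard GCT recalibration (per-channel L2 embedding, channel normalization, residual tanh gate) and a simple channel-wise concatenation of the two recalibrated streams; function names, shapes, and the concatenation step are assumptions for illustration, not the authors' actual network.

```python
import numpy as np

def gct(x, alpha, gamma, beta, eps=1e-5):
    """Gated channel transformation on a feature map x of shape (C, H, W).

    alpha, gamma, beta are per-channel parameters of shape (C,).
    This follows the commonly published GCT formulation; it is a sketch,
    not the MFCCERT authors' implementation.
    """
    # Global context embedding: per-channel L2 norm, scaled by alpha
    embedding = alpha * np.sqrt((x ** 2).sum(axis=(1, 2)) + eps)          # (C,)
    # Channel normalization across the C channels
    c = embedding.shape[0]
    norm = embedding * np.sqrt(c) / np.sqrt((embedding ** 2).sum() + eps)  # (C,)
    # Gating: residual tanh gate applied channel-wise
    gate = 1.0 + np.tanh(gamma * norm + beta)                              # (C,)
    return x * gate[:, None, None]

def fuse_streams(low, high, params_low, params_high):
    """Recalibrate each stream with GCT, then concatenate along channels
    (a crude stand-in for the hyper-dense cross-stream intersection)."""
    return np.concatenate(
        [gct(low, *params_low), gct(high, *params_high)], axis=0
    )
```

With `gamma = 0` and `beta = 0` the gate reduces to `1 + tanh(0) = 1`, so GCT passes features through unchanged; training then learns which channels of each frequency stream to amplify or suppress before fusion.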