Fat-Corrected Pancreatic R2* Relaxometry from Multi-Echo Gradient-Recalled Echo Sequence Using Convolutional Neural Network
Published in: Electronics (Basel), 2022-09, Vol. 11 (18)
Format: Article
Language: English
Online access: Full text
Abstract: Fat-corrected R2* relaxometry from multi-echo gradient-recalled echo (mGRE) sequences could represent an efficient approach for iron overload evaluation, but its use is limited by computational constraints. A new method for the fast generation of R2* and fat fraction (FF) maps from mGRE images using a convolutional neural network (U-Net) and deep learning (DL) is presented. A U-Net for the calculation of pancreatic R2* and FF maps was trained with 576 mGRE abdominal images and compared to conventional fat-corrected relaxometry. The U-Net was effectively trained and provided R2* and FF maps visually comparable to those of conventional methods. Predicted pancreatic R2* and FF values correlated well with the conventional model. Estimated and ground-truth mean R2* values were not significantly different (43.65 ± 21.89 vs. 43.77 ± 19.81 ms; p = 0.692; intraclass correlation coefficient (ICC) = 0.9938; coefficient of variation (CoV) = 5.3%), while estimated FF values were slightly higher than ground-truth values (27.8 ± 16.87% vs. 25.67 ± 15.43%; p < 0.0001; ICC = 0.986; CoV = 10.1%). Deep learning with a U-Net is a feasible method for pancreatic MR fat-corrected relaxometry. A trained U-Net can be used efficiently for MR fat-corrected relaxometry, providing results comparable to conventional model-based methods.
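The conventional reference maps mentioned in the abstract come from model-based fat-corrected relaxometry. The sketch below illustrates how R2* and FF can be estimated voxel-wise from mGRE echoes; it assumes magnitude data, a single fat spectral peak at -3.4 ppm, a 1.5 T field, and SciPy least-squares fitting, none of which are taken from the paper itself.

```python
# Minimal sketch of conventional fat-corrected R2* relaxometry for one voxel.
# Assumptions (not from the paper): magnitude mGRE data, a single fat peak at
# -3.4 ppm, 1.5 T field strength, and nonlinear least-squares fitting.
import numpy as np
from scipy.optimize import curve_fit

GAMMA_HZ_PER_T = 42.577e6                        # proton gyromagnetic ratio / 2*pi
B0_T = 1.5                                       # assumed field strength
DELTA_F_HZ = -3.4e-6 * GAMMA_HZ_PER_T * B0_T     # water-fat shift (~ -217 Hz at 1.5 T)

def mgre_signal(te, water, fat, r2star):
    """Magnitude of a single-peak water-fat signal decaying with R2* (te in s)."""
    complex_sig = (water + fat * np.exp(2j * np.pi * DELTA_F_HZ * te)) * np.exp(-r2star * te)
    return np.abs(complex_sig)

def fit_voxel(te, signal):
    """Fit W, F, R2* for one voxel; return (R2* in 1/s, fat fraction in %)."""
    p0 = [signal[0], 0.1 * signal[0], 50.0]      # rough initial guess
    bounds = ([0, 0, 0], [np.inf, np.inf, 2000.0])
    (w, f, r2s), _ = curve_fit(mgre_signal, te, signal, p0=p0, bounds=bounds)
    ff = 100.0 * f / (w + f) if (w + f) > 0 else 0.0
    return r2s, ff

# Example: simulated 12-echo acquisition, first TE 1 ms, echo spacing 1 ms.
te = 1e-3 * np.arange(1, 13)
signal = mgre_signal(te, water=80.0, fat=20.0, r2star=60.0)
print(fit_voxel(te, signal))                     # ~ (60.0, 20.0) on noiseless data
```

Repeating this fit for every voxel is what makes the conventional approach computationally heavy, which is the bottleneck the DL method is meant to remove.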
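A minimal sketch of a U-Net of the kind described in the abstract is shown below, mapping stacked echo magnitudes directly to R2* and FF maps. The number of echoes, channel widths, depth, loss function, and slice size are illustrative assumptions, not the architecture or training setup used by the authors.

```python
# Minimal U-Net sketch mapping stacked mGRE echo magnitudes to R2* and FF maps.
# Assumptions (not from the paper): 12 input echo channels, a 2-channel output
# (R2*, FF), and an L1 loss against conventional fat-corrected relaxometry maps.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    def __init__(self, in_ch=12, out_ch=2, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.enc3 = conv_block(base * 2, base * 4)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv2d(base, out_ch, 1)   # channel 0: R2*, channel 1: FF

    def forward(self, x):
        e1 = self.enc1(x)                        # full resolution
        e2 = self.enc2(self.pool(e1))            # 1/2 resolution
        e3 = self.enc3(self.pool(e2))            # 1/4 resolution (bottleneck)
        d2 = self.dec2(torch.cat([self.up2(e3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

# One hypothetical training step on a batch of 192x192 abdominal slices.
model = UNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()
echoes = torch.randn(4, 12, 192, 192)            # stand-in for mGRE magnitudes
target_maps = torch.randn(4, 2, 192, 192)        # stand-in for reference R2*/FF maps
loss = loss_fn(model(echoes), target_maps)
loss.backward()
optimizer.step()
```

Once trained, a single forward pass produces both maps for a slice, which is where the reported speed advantage over per-voxel fitting comes from.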
ISSN: 2079-9292
DOI: 10.3390/electronics11182829
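The agreement statistics quoted in the abstract above (paired comparison, ICC, CoV) can be computed on paired R2* measurements along the lines sketched below. The exact ICC model, CoV definition, and paired test used by the authors are not stated in this record, so a two-way random single-measure ICC(2,1), a within-subject CoV, and a paired t-test are assumed; the input values are purely illustrative.

```python
# Sketch of agreement metrics between predicted and conventional R2* values.
# Assumptions (not from the paper): ICC(2,1), within-subject CoV, paired t-test.
import numpy as np
from scipy import stats

def icc_2_1(x, y):
    """Two-way random, single-measure ICC(2,1) for two paired measurements."""
    data = np.column_stack([x, y]).astype(float)
    n, k = data.shape
    grand = data.mean()
    ssb = k * ((data.mean(axis=1) - grand) ** 2).sum()     # between subjects
    ssc = n * ((data.mean(axis=0) - grand) ** 2).sum()     # between methods
    sst = ((data - grand) ** 2).sum()
    msb = ssb / (n - 1)
    msc = ssc / (k - 1)
    mse = (sst - ssb - ssc) / ((n - 1) * (k - 1))
    return (msb - mse) / (msb + (k - 1) * mse + k * (msc - mse) / n)

def within_subject_cov(x, y):
    """Within-subject coefficient of variation (%) for two paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    ws_sd = np.sqrt(((x - y) ** 2).mean() / 2.0)
    return 100.0 * ws_sd / np.concatenate([x, y]).mean()

# Hypothetical paired pancreatic R2* values, purely illustrative.
rng = np.random.default_rng(0)
truth = rng.normal(44.0, 20.0, size=100)
pred = truth + rng.normal(0.0, 2.0, size=100)
t_stat, p_value = stats.ttest_rel(pred, truth)             # paired comparison
print(f"ICC = {icc_2_1(pred, truth):.4f}, "
      f"CoV = {within_subject_cov(pred, truth):.1f}%, p = {p_value:.3f}")
```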