Regularizing deep neural networks for medical image analysis with augmented batch normalization
Published in: Applied Soft Computing, 2024-03, Vol. 154, p. 111337, Article 111337
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: Batch Normalization (BN) is a commonly employed regularization technique for deep neural networks. It uses normalization and an affine transformation to accelerate the training phase. The normalization step forces the distribution of layer inputs in a mini-batch toward the standard normal distribution, mitigating the internal covariate shift problem, while the affine transformation preserves the network's ability to perform non-linear feature transformations. However, the effectiveness of BN can be limited with small mini-batch sizes, since batch statistics estimated from too few samples are inaccurate and unreliable. To address this issue, we present Noise-Assisted Batch Normalization (NABN), a variant of BN. The proposed method adds random noise to the mean and variance computed from the mini-batch during normalization, enhancing the diversity of these statistics. We evaluate NABN on image classification with the CIFAR-10, retinal OCT, and chest X-ray datasets using various convolutional network architectures such as ResNet-20, ResNet-32, ResNet-44, and ResNet-50. Furthermore, experimental results demonstrate the superiority of the proposed approach over traditional BN for medical image segmentation using U-Net, as evaluated on the MSD liver dataset. Code is available at https://github.com/ROSENty/NABN.git.
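The noise-assisted normalization described in the abstract can be sketched in NumPy as follows. This is an illustrative sketch only: the Gaussian perturbation of the batch statistics and the `noise_std` hyperparameter are assumptions for clarity, not the paper's exact noise model (see the linked repository for the authors' implementation).

```python
import numpy as np

def nabn_normalize(x, gamma, beta, noise_std=0.1, eps=1e-5, rng=None):
    """Sketch of Noise-Assisted Batch Normalization in training mode.

    x: mini-batch of shape (N, C); gamma, beta: per-channel affine
    parameters. noise_std is a hypothetical hyperparameter controlling
    how strongly the batch statistics are perturbed.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Standard BN statistics estimated from the mini-batch.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # NABN: perturb mean and variance with random noise to diversify
    # the statistics (assumed additive/multiplicative Gaussian noise).
    mean = mean + rng.normal(0.0, noise_std, size=mean.shape)
    var = var * (1.0 + rng.normal(0.0, noise_std, size=var.shape))
    var = np.maximum(var, 0.0)  # keep variance non-negative
    # Normalize, then apply the learnable affine transformation.
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```

With `noise_std=0.0` the function reduces to standard batch normalization, which makes the relationship between the two methods easy to check.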
Highlights:
•We present a straightforward and efficient technique, NABN, which can serve as a more powerful substitute for BN to enhance the generalization ability and robustness of neural networks.
•NABN maintains the regularization effect of BN and enjoys the benefits of adding noise, e.g., smoothing the structure of the input space.
•Rigorous experiments on medical image classification and segmentation tasks provide compelling evidence for the effectiveness of NABN. Visualization results further show that NABN helps the network locate complicated lesions.
ISSN: 1568-4946, 1872-9681
DOI: 10.1016/j.asoc.2024.111337