Automatic Image Thresholding Based on Shannon Entropy Difference and Dynamic Synergic Entropy
Published in: IEEE Access, 2020, Vol. 8, pp. 171218-171239
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: An automatic thresholding method based on Shannon entropy difference and dynamic synergic entropy is proposed to select a reasonable threshold from gray level images with unimodal, bimodal, multimodal, or peakless gray level histograms. First, a new concept called the Shannon entropy difference is introduced, and the stopping condition of a multi-scale multiplication transformation is controlled automatically by maximizing this difference to produce edge images. Second, the gray level image is thresholded at each gray level in ascending order to generate a series of binary images, from which contour images are then extracted. Next, a series of gray level histograms that dynamically reflect both gray level distributions and pixel positions is constructed from the edge images and the contour images synergistically. Finally, the dynamic synergic Shannon entropy is calculated from this series of histograms, and the threshold corresponding to the maximum dynamic synergic entropy is taken as the final segmentation threshold. Experimental results on 40 synthetic images and 50 real-world images show that, although the proposed method does not surpass 8 automatic segmentation methods in computational efficiency, it selects thresholds more adaptively and achieves better segmentation accuracy.
ISSN: 2169-3536
DOI: | 10.1109/ACCESS.2020.3024718 |
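The final step of the method above, picking the threshold that maximizes an entropy criterion computed from a gray level histogram, follows the same pattern as classic maximum-entropy (Kapur-style) thresholding. The sketch below illustrates only that generic pattern, not the paper's dynamic synergic entropy: the function `max_entropy_threshold`, the criterion, and the synthetic test image are all assumptions for illustration.

```python
import numpy as np

def max_entropy_threshold(image, levels=256):
    """Return the gray level that maximizes the sum of the Shannon
    entropies of the foreground and background histogram segments
    (a Kapur-style criterion, NOT the paper's synergic entropy)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()                      # normalized histogram
    best_t, best_h = 0, -np.inf
    for t in range(1, levels):
        w0, w1 = p[:t].sum(), p[t:].sum()      # class probabilities
        if w0 == 0 or w1 == 0:                 # skip degenerate splits
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1        # within-class distributions
        h = (-(p0[p0 > 0] * np.log(p0[p0 > 0])).sum()
             - (p1[p1 > 0] * np.log(p1[p1 > 0])).sum())
        if h > best_h:                         # keep first maximizer
            best_t, best_h = t, h
    return best_t

# Synthetic bimodal image: dark mode near gray level 50, bright mode near 200.
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(50, 10, 5000),
                              rng.normal(200, 10, 5000)]),
              0, 255).astype(np.uint8)
t = max_entropy_threshold(img)
print("selected threshold:", t)
```

For a well-separated bimodal histogram like this one, the selected threshold falls between the two modes; the paper's contribution lies in replacing this static histogram with histograms built synergistically from edge and contour images.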