Segmentation of carbon nanotube images through an artificial neural network
Published in: Soft Computing (Berlin, Germany), 2017-02, Vol. 21 (3), p. 611-625
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Segmentation of carbon nanotube images is an important task for nanotechnology. The segmentation stage determines the accuracy of the nanotube measurement process when assessing the quality of nanomaterials. In this work, we propose two segmentation algorithms for carbon nanotube images. Each algorithm includes three stages: preprocessing, segmentation and postprocessing. The first is applied to images from scanning electron microscopy and employs a matched filter bank in the preprocessing step, followed by a neural network in the segmentation phase. The second algorithm uses the Perona–Malik filter to enhance the nanotube information; its segmentation phase combines a relaxed Otsu threshold with an artificial neural network. This algorithm is applied to images from transmission electron microscopy. The postprocessing stage of both algorithms is based on mathematical morphology. The performance of the proposed algorithms is evaluated numerically on real image databases manually segmented by an expert. The algorithm for scanning electron microscopy images achieved an overall accuracy of 92.74%, while the algorithm for transmission electron microscopy images obtained an accuracy of 73.99% when the whole image is considered. Performance improves when only the region of interest is segmented, reaching an overall accuracy of 84.19%.
ISSN: 1432-7643; 1433-7479
DOI: 10.1007/s00500-016-2426-1
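The abstract above outlines a diffusion–threshold–morphology pipeline for the transmission electron microscopy images. The sketch below is only an illustration of that general idea, not the authors' implementation: it uses a hand-rolled Perona–Malik diffusion loop, a plain Otsu threshold standing in for the paper's relaxed Otsu variant, and omits the artificial neural network stage entirely. The parameter values (number of iterations, kappa, structuring-element radius, minimum object size) and the assumption that nanotubes appear brighter than the background are guesses for demonstration purposes.

```python
# Illustrative sketch of a Perona-Malik + Otsu + morphology pipeline with a
# pixel-wise overall-accuracy score against a manual reference segmentation.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import binary_closing, remove_small_objects, disk


def perona_malik(img, n_iter=20, kappa=30.0, gamma=0.2):
    """Simple Perona-Malik anisotropic diffusion with exponential conductance."""
    u = img.astype(np.float64)
    for _ in range(n_iter):
        # Finite differences toward the four nearest neighbours.
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping conductance: diffusion is damped across strong gradients.
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u


def segment_tem(img):
    """Diffuse, threshold, and morphologically clean a grayscale TEM image."""
    smoothed = perona_malik(img)
    mask = smoothed > threshold_otsu(smoothed)      # assumes bright nanotubes
    mask = binary_closing(mask, disk(2))            # bridge small gaps
    return remove_small_objects(mask, min_size=50)  # drop isolated noise blobs


def overall_accuracy(pred, truth):
    """Fraction of pixels agreeing with the expert (manual) segmentation."""
    return float(np.mean(pred.astype(bool) == truth.astype(bool)))
```

With a grayscale image `img` and an expert mask `gt` as NumPy arrays, `overall_accuracy(segment_tem(img), gt)` returns the kind of pixel-wise overall-accuracy figure reported in the abstract (e.g., 73.99% on whole TEM images).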