StomachNet: Optimal Deep Learning Features Fusion for Stomach Abnormalities Classification


Full Description

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, p. 197969-197981
Main authors: Khan, Muhammad Attique; Sarfraz, Muhammad Shahzad; Alhaisoni, Majed; Albesher, Abdulaziz A.; Wang, Shuihua; Ashraf, Imran
Format: Article
Language: English
Online access: Full text

Description
Abstract: A fully automated design employing optimal deep learning features is proposed in this work for classifying gastrointestinal infections. Three prominent infections (ulcer, bleeding, and polyp) plus a healthy class are considered as class labels. In the initial stage, contrast is improved by fusing bi-directional histogram equalization with top-hat filtering output. The resulting fused images are passed to a pre-trained ResNet101 model, which is retrained using deep transfer learning. However, extracted deep learning features can contain irrelevant and redundant information. To mitigate this problem, two metaheuristic algorithms, Enhanced Crow Search and Differential Evolution, are run in parallel to obtain optimal feature vectors. A maximum-correlation-based fusion approach is then applied to fuse the optimal vectors from the previous step into an enhanced vector. This final vector is given as input to an Extreme Learning Machine (ELM) classifier for final classification. The proposed method, evaluated on a combined database, achieved an accuracy of 99.46%, a significant improvement over preceding techniques and other neural network architectures.
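The contrast-enhancement stage described in the abstract can be sketched as follows. This is a simplified illustration, not the authors' implementation: standard global histogram equalization stands in for the paper's bi-directional variant (whose exact formulation is not given here), and the fusion rule is a plain clipped addition chosen for illustration.

```python
import numpy as np
from scipy import ndimage

def hist_equalize(img):
    # Standard global histogram equalization on an 8-bit image
    # (stand-in for the paper's bi-directional variant).
    hist, _ = np.histogram(img.flatten(), bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * 255
    return cdf[img].astype(np.uint8)

def top_hat(img, size=15):
    # White top-hat: image minus its morphological opening,
    # which highlights small bright structures such as lesions.
    opened = ndimage.grey_opening(img, size=(size, size))
    return img.astype(np.int16) - opened.astype(np.int16)

def enhance(img):
    # Fuse the two enhancement outputs by clipped addition
    # (an assumed fusion rule for this sketch).
    eq = hist_equalize(img).astype(np.int16)
    fused = np.clip(eq + top_hat(img), 0, 255)
    return fused.astype(np.uint8)
```

The fused image would then be fed to the retrained ResNet101 for feature extraction.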
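The final classification stage uses an Extreme Learning Machine. A minimal ELM is sketched below, assuming a tanh hidden layer and a closed-form least-squares solve for the output weights; the hidden-layer size and activation are illustrative choices, not the authors' reported configuration.

```python
import numpy as np

class ELM:
    """Minimal Extreme Learning Machine: a single random hidden layer
    with output weights fitted in closed form via the pseudoinverse."""

    def __init__(self, n_hidden=64, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        # Hidden-layer weights are random and never trained.
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)      # hidden activations
        T = np.eye(n_classes)[y]              # one-hot targets
        # Least-squares solution for the output weights.
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)
```

In the paper's pipeline, `X` would be the fused optimal feature vectors and `y` the four class labels (ulcer, bleeding, polyp, healthy).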
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3034217