Exploiting Temporal Features in Calculating Automated Morphological Properties of Spiky Nanoparticles Using Deep Learning
Saved in:
Published in: | Sensors (Basel, Switzerland), 2024-10, Vol. 24 (20), p. 6541 |
---|---|
Main Author: | |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full text |
Summary: | Object segmentation in images is typically spatial and focuses on the spatial coherence of pixels. Nanoparticles in electron microscopy images are likewise segmented frame by frame, with subsequent morphological analysis. However, morphological analysis is inherently sequential, and a temporal regularity is evident in the process. In this study, we extend the spatially focused morphological analysis by incorporating a fusion of hard and soft inductive biases from sequential machine learning techniques to account for temporal relationships. Previously, spiky Au nanoparticles (Au-SNPs) in electron microscopy images were analyzed, and their morphological properties were automatically generated using an hourglass convolutional neural network architecture. In this study, recurrent layers are integrated to capture the natural, sequential growth of the particles, and the network is trained with a spike-focused loss function. Continuous segmentation of the images exploits the regressive relationships among natural growth features, generating morphological statistics of the nanoparticles. This study comprehensively evaluates the proposed approach by comparing segmentation results and morphological property analyses, demonstrating its superiority over earlier methods. |
---|---|
ISSN: | 1424-8220 |
DOI: | 10.3390/s24206541 |