Machine-learning models for analyzing TSOM images of nanostructures
Published in: Optics Express 2019-11, Vol. 27 (23), p. 33978-33998
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: Through-focus scanning optical microscopy (TSOM) is an economical and nondestructive method for measuring three-dimensional nanostructures. After obtaining a TSOM image, a library-matching method is typically used to interpret optical intensity information and determine the dimensions of a measurement target. To further improve dimensional measurement accuracy, this paper proposes a machine-learning method that extracts texture information from TSOM images. The method extracts feature vectors of TSOM images in terms of the gray-level co-occurrence matrix (GLCM), local binary pattern (LBP), and histogram of oriented gradients (HOG). We trained models with these vectors individually, in pairs, and with all three combined, giving seven candidate feature vectors. Once normalized, these feature vectors were used to train and test three machine-learning regression models: random forest, gradient-boosted decision trees (GBDT), and AdaBoost. Compared with the library-matching method, the machine-learning method achieves considerably higher measurement accuracy. When detecting dimensional features that span a wide range of sizes, the AdaBoost model used with the combined LBP and HOG feature vectors performs best. For detecting dimensional features within a narrower range of sizes, the AdaBoost model combined with the HOG feature vector alone performs best.
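To make the described pipeline concrete, the sketch below assembles the abstract's best wide-range configuration (LBP + HOG features fed to an AdaBoost regressor) using scikit-image and scikit-learn. All parameter values (LBP radius, HOG cell size, estimator count) and the synthetic training data are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of the TSOM feature-extraction + regression pipeline.
# Parameter choices and the synthetic data below are illustrative only.
import numpy as np
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern, hog
from sklearn.ensemble import AdaBoostRegressor
from sklearn.preprocessing import MinMaxScaler

def glcm_features(img):
    """GLCM texture statistics; one of the three feature types the paper tests."""
    g = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                     levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(g, p).ravel()
                      for p in ("contrast", "energy", "homogeneity", "correlation")])

def lbp_features(img, points=8, radius=1):
    """Histogram of uniform local binary patterns (P + 2 bins)."""
    lbp = local_binary_pattern(img, points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
    return hist

def hog_features(img):
    """Histogram of oriented gradients over the whole TSOM image."""
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

def feature_vector(img):
    # The abstract's best wide-range combination: LBP + HOG.
    return np.hstack([lbp_features(img), hog_features(img)])

# Hypothetical data: 64x64 uint8 TSOM images with known target dimensions (nm).
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(50, 64, 64), dtype=np.uint8)
dims_nm = rng.uniform(30.0, 60.0, size=50)

X = np.stack([feature_vector(im) for im in images])
X = MinMaxScaler().fit_transform(X)  # normalize, as the abstract specifies

model = AdaBoostRegressor(n_estimators=100, random_state=0)
model.fit(X, dims_nm)
print("predicted dimension (nm):", model.predict(X[:1])[0])
```

Swapping in `RandomForestRegressor` or `GradientBoostingRegressor`, or changing `feature_vector` to another of the seven feature combinations (e.g. GLCM alone via `glcm_features`), reproduces the other model/feature pairings the abstract compares.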
ISSN: 1094-4087
DOI: 10.1364/OE.27.033978