Machine Learning-based Multi-objective Optimisation of Tunnel Field Effect Transistors
Published in: SILICON, 2022-11, Vol. 14 (17), p. 11109-11119
Main authors: , , , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: The ever-increasing growth of the semiconductor industry, driven by the nano-scaling of modern electronic devices, intensifies the need to handle enormous amounts of data. It is therefore necessary to rely on algorithms that automatically extract useful information from data. Most real-world design data involve conflicting objectives and must be optimized to reach the demanded target. In the proposed work, a Machine Learning (ML) based framework is developed for designing optimal Tunnel FETs, replacing computationally intensive TCAD simulations. To optimize the design parameters and objectives, a Multi-Objective Optimization (MOO) technique is presented that combines Machine Learning with the natural-selection approach of the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The optimum TFET design, together with the trade-off between natural length and vertical electric field, is identified automatically. The acquired results are compared with TCAD results to demonstrate that the ML-wrapped TFET design within the MOO framework is effective and can be applied to forecast optimal solutions for TFET design.
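The abstract names the two building blocks of the framework, an ML surrogate standing in for TCAD and NSGA-II resolving the natural length versus vertical electric field trade-off, without detailing either. As a rough, hedged illustration of how such a loop can be wired together, the sketch below trains a scikit-learn random-forest surrogate on placeholder "TCAD" samples and hands it to pymoo's NSGA-II. The design variables (oxide thickness, body thickness, doping), the single-gate natural-length expression λ = √((ε_si/ε_ox)·t_si·t_ox), and the field proxy are illustrative assumptions, not the paper's actual model, design space, or data.

```python
# Minimal sketch of an ML-surrogate + NSGA-II loop for TFET design optimisation.
# Assumptions: pymoo and scikit-learn are available; the design variables
# (t_ox, t_si, N_d) and the two objectives (natural length, vertical electric
# field proxy) are illustrative stand-ins for the paper's TCAD-derived data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

rng = np.random.default_rng(0)

# Stand-in "TCAD" training set; in the paper this would come from simulations.
# Columns: oxide thickness t_ox [m], body thickness t_si [m], doping N_d [cm^-3].
X_train = rng.uniform([1e-9, 5e-9, 1e18], [3e-9, 20e-9, 1e20], size=(200, 3))
eps_si, eps_ox = 11.7, 3.9
lam = np.sqrt(eps_si / eps_ox * X_train[:, 1] * X_train[:, 0])   # single-gate natural length
e_field = 0.5 / X_train[:, 0] * np.log10(X_train[:, 2]) / 20.0   # toy vertical-field proxy
y_train = np.column_stack([lam, -e_field])  # minimise lambda, maximise field (negated)

# ML surrogate replacing the costly per-design TCAD evaluation.
surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

class TFETProblem(ElementwiseProblem):
    """Two-objective TFET design problem evaluated through the surrogate."""
    def __init__(self):
        super().__init__(n_var=3, n_obj=2,
                         xl=np.array([1e-9, 5e-9, 1e18]),
                         xu=np.array([3e-9, 20e-9, 1e20]))

    def _evaluate(self, x, out, *args, **kwargs):
        out["F"] = surrogate.predict(x.reshape(1, -1))[0]

# NSGA-II searches the design space using only surrogate predictions.
res = minimize(TFETProblem(), NSGA2(pop_size=60), ("n_gen", 50), seed=1, verbose=False)
print("Pareto front size:", len(res.F))
```

After the run, `res.X` holds the non-dominated design-parameter sets and `res.F` the corresponding objective values, i.e. the Pareto trade-off between the two objectives; in the paper's workflow these candidates would then be cross-checked against full TCAD simulations.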
ISSN: 1876-990X, 1876-9918
DOI: 10.1007/s12633-022-01841-1