Resource allocation with edge computing in IoT networks via machine learning and deep learning
Format: Conference Proceeding
Language: English
Online Access: Full text
Abstract: In this paper, we present an optimal offloading scheme for resource allocation in IoT networks with edge computing. We explore various machine learning (ML) and deep learning (DL) models for constructing this scheme; such models outperform most traditional methods of deciding the offloading strategy, and do so cost-effectively. Edge computing has improved the functionality of countless devices, and the computing ability of IoT devices has likewise grown with recent advances. To give the IoT user good quality of service, an optimal offloading scheme is needed to decide whether a task should run locally or be offloaded to the edge server. We group tasks into clusters based on priority scores assigned according to various factors, using k-means clustering to form the clusters and an SVM model to classify the tasks. The classified groups are then sent to a Deep Q-Network, where the tasks in each group are further classified by learning the optimal policy via the Q-function in Q-learning, which also improves the efficiency of the model. Our offloading scheme is cost-efficient while also ensuring quality of service.
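The pipeline the abstract describes (priority scoring, k-means clustering, SVM classification, then a Q-learning offloading policy) can be sketched roughly as follows, assuming scikit-learn and NumPy. The task features, priority weights, reward model, and the use of a small tabular Q-table in place of the paper's Deep Q-Network are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic task features: [cpu_cycles, data_size, deadline] -- illustrative only
tasks = rng.random((200, 3))

# Priority score as a weighted sum of the factors (weights are assumptions)
priority = tasks @ np.array([0.5, 0.3, 0.2])

# Step 1: cluster tasks into priority groups with k-means
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(priority.reshape(-1, 1))

# Step 2: train an SVM to classify tasks into those priority groups
svm = SVC(kernel="rbf").fit(tasks, labels)
groups = svm.predict(tasks)

# Step 3: Q-learning over the groups; state = priority group,
# action = 0 (run locally) or 1 (offload to the edge server).
# The paper uses a Deep Q-Network; a Q-table keeps this sketch short.
n_states, n_actions = 3, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma = 0.1, 0.9

def reward(state, action):
    # Placeholder cost model: offloading pays off only for group 2
    return 1.0 if (action == 1) == (state == 2) else -1.0

for _ in range(1000):
    s = rng.integers(n_states)
    a = rng.integers(n_actions)          # exploration; epsilon-greedy omitted
    r = reward(s, a)
    s_next = rng.integers(n_states)      # toy transition: next task is random
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

policy = Q.argmax(axis=1)  # offloading decision per priority group
```

Under this toy reward, the learned policy offloads only the highest-priority group; in the paper the decision would instead come from a trained Deep Q-Network driven by real task and network costs.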
ISSN: 0094-243X, 1551-7616
DOI: 10.1063/5.0148082