An optimizing method for performance and resource utilization in quantum machine learning circuits


Bibliographic details
Published in: Scientific Reports 2022-10, Vol. 12 (1), p. 16949-16, Article 16949
Authors: Salehi, Tahereh; Zomorodi, Mariam; Plawiak, Pawel; Abbaszade, Mina; Salari, Vahid
Format: Article
Language: English
Description
Abstract: Quantum computing is a new and advanced field that refers to computation based on the principles of quantum mechanics. It allows certain kinds of problems to be solved more easily than on classical computers. This advantage can be exploited to address many existing problems in different fields far more effectively. One important field in which quantum computing has shown great results is machine learning. To date, many different quantum algorithms have been presented to perform different machine learning approaches. In some special cases, the execution time of these quantum algorithms is reduced exponentially compared to the classical ones. At the same time, however, as data volume and computation time grow, protecting the system from unwanted interactions with the environment becomes a daunting task, and since these algorithms target machine learning problems, which usually involve big data, their implementation is very costly in terms of quantum resources. In this paper, we propose an approach to reduce the cost of quantum circuits and, in particular, to optimize quantum machine learning circuits. To reduce the number of resources used, the approach combines several optimization algorithms and is applied to quantum machine learning algorithms for big data. The optimized circuits run the quantum machine learning algorithms in less time than the original ones while preserving the original functionality. Our approach reduces the number of quantum gates by 10.7% and 14.9% in two different circuits, respectively. This is the reduction obtained for a single iteration of a given sub-circuit U in the main circuit; when this sub-circuit is repeated more times in the main circuit, the optimization rate increases accordingly. Therefore, by applying the proposed method to circuits with big data, both cost and performance are improved.
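The abstract reports gate-count reductions for a sub-circuit U that is repeated many times inside a larger circuit. As a rough illustration of that idea only (this is not the authors' algorithm), the following Python/Qiskit sketch builds a small repeated sub-circuit containing redundant gates and compares the total gate count before and after a generic circuit optimizer is applied; the circuit structure, the gate choices, and the use of Qiskit's transpile pass are assumptions made for demonstration.

# Hypothetical illustration: gate-count reduction of a repeated sub-circuit U.
# This uses a generic off-the-shelf optimizer, not the paper's proposed method.
from qiskit import QuantumCircuit, transpile

# A small sub-circuit U with some redundant gates an optimizer can remove.
u = QuantumCircuit(2, name="U")
u.h(0)
u.cx(0, 1)
u.cx(0, 1)      # cancels the previous CX
u.rz(0.3, 1)
u.rz(0.2, 1)    # adjacent RZ rotations that can be merged
u.h(0)

# Repeat U several times inside a larger circuit, as in the QML setting:
# the per-iteration saving accumulates with every repetition of U.
reps = 4
main = QuantumCircuit(2)
for _ in range(reps):
    main.compose(u, inplace=True)

optimized = transpile(main, optimization_level=3)

print("gates before:", sum(main.count_ops().values()))
print("gates after: ", sum(optimized.count_ops().values()))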
ISSN: 2045-2322
DOI: 10.1038/s41598-022-20375-5