Quantum Machine Learning: Fad or Future?
Format: Article
Language: English
Online access: Order full text
Abstract: For the last few decades, classical machine learning has allowed us to improve the lives of many through automation, natural language processing, predictive analytics, and much more. However, a major concern is that we are fast approaching the threshold of the maximum computational capacity available through classical computing devices, including CPUs, GPUs, and Application-Specific Integrated Circuits (ASICs). This is due to the exponential growth in model sizes, which now have parameters on the order of billions and trillions, requiring a significant amount of computing resources over a significant amount of time just to train a single model to convergence. To observe the efficacy of using quantum computing for certain machine learning tasks and to explore the potential for improved convergence, error reduction, and robustness to noisy data, this paper tests and verifies the aspects in which quantum machine learning can improve over classical machine learning approaches, while also shedding light on the likely limitations that have prevented quantum approaches from becoming mainstream. A major focus is recreating the work of Farhi et al. and conducting experiments using their approach to performing machine learning in a quantum context, with assistance from the TensorFlow Quantum documentation.
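The Farhi et al. line of work trains parameterized quantum circuits as machine learning models. As a minimal illustration only (not the paper's implementation, and without TensorFlow Quantum), the sketch below classically simulates a single-qubit model with NumPy: a trainable Y-rotation whose Pauli-Z expectation serves as the prediction, optimized with the parameter-shift gradient rule. The target label, learning rate, and step count are illustrative assumptions.

```python
import numpy as np

def ry(theta):
    # Single-qubit Y-rotation gate Ry(theta).
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expect_z(theta):
    # <Z> after applying Ry(theta) to |0>; analytically equals cos(theta).
    state = ry(theta) @ np.array([1.0, 0.0])
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ z @ state)

def grad(theta):
    # Parameter-shift rule: exact gradient of <Z> with respect to theta.
    return 0.5 * (expect_z(theta + np.pi / 2) - expect_z(theta - np.pi / 2))

# Fit theta so the circuit's output matches an (illustrative) target label of -1.
theta, target, lr = 0.1, -1.0, 0.5
for _ in range(500):
    loss_grad = 2 * (expect_z(theta) - target) * grad(theta)  # d/dtheta of squared error
    theta -= lr * loss_grad

print(expect_z(theta))  # approaches -1.0 as training proceeds
```

On a real quantum device the expectation value would be estimated from repeated measurements rather than computed exactly, but the parameter-shift rule applies unchanged, which is what makes this training loop quantum-compatible.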
DOI: 10.48550/arxiv.2106.10714