Kurusch Ebrahimi-Fard, Department of Mathematical Sciences, NTNU
Fabian Harang, Department of Economics, Norwegian Business School, Oslo
Fride Straum, Department of Mathematical Sciences, NTNU



Propelled by the demonstrated value of Artificial Intelligence (AI) applications in numerous facets of contemporary daily life, research in AI is experiencing swift and dynamic growth. Machine Learning (ML) stands as a fundamental component of modern AI. A highly practical application of ML, for instance, involves algorithms specifically designed to diagnose cancer by analyzing medical scans [10].
In a nutshell, ML algorithms harness data to make predictions without the need for explicit programming. ML can be divided into three categories: reinforcement learning, unsupervised learning, and supervised learning. We are mainly interested in the latter, where models are trained on labeled data.
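To illustrate supervised learning on labeled data, consider the following minimal sketch: a nearest-centroid classifier trained on a handful of hypothetical labeled points in the plane. The data, class labels, and the `predict` function are all invented for illustration; real applications would of course use far richer models and data.

```python
import numpy as np

# Toy labeled training data: points in the plane with class labels 0 and 1.
X_train = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])

def predict(x):
    """Nearest-centroid classifier: assign x to the class whose
    mean training point (centroid) lies closest to x."""
    centroids = {c: X_train[y_train == c].mean(axis=0)
                 for c in np.unique(y_train)}
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

print(predict(np.array([0.1, 0.0])))  # near the class-0 cluster -> 0
print(predict(np.array([1.0, 0.9])))  # near the class-1 cluster -> 1
```

The labels in `y_train` are what makes this "supervised": the algorithm never needs explicit rules for separating the classes, it infers them from the labeled examples.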
Unlike classical computing, Quantum Computing (QC) operates on the principles of quantum mechanics. While digital computers process information using bits, 0 or 1, quantum computers utilize quantum bits (qubits), which can represent both 0 and 1, as well as linear combinations of them, a state known as superposition. Consequently, quantum computers represent a fundamentally different paradigm from traditional supercomputers. Qubit operations are executed by unitary transformations, and a unitary transformation of a superposition is again a superposition. The excitement surrounding QC, a field which has been around for four decades [1], revolves around the concept of quantum supremacy.
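Concretely, a single qubit is a unit vector in a two-dimensional complex vector space, and gates are unitary matrices acting on it. The following NumPy sketch uses the standard Hadamard gate to send the basis state |0⟩ into an equal superposition of |0⟩ and |1⟩:

```python
import numpy as np

# Basis states |0> and |1> as vectors in C^2.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The Hadamard gate, a unitary transformation (H @ H^dagger = I).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Applying H to |0> yields the superposition (|0> + |1>)/sqrt(2).
psi = H @ ket0

# Measurement probabilities are the squared amplitudes; they sum to 1,
# and here each outcome (0 or 1) occurs with probability 1/2.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

Applying `H` to `psi` again returns `ket0`, illustrating that unitary gates are reversible, in contrast to many classical logic gates.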
This term denotes the ability to surpass classical computers in problem-solving speed and to address tasks that are presently unfeasible for classical devices. It is widely believed that the problems of finding the prime factors of an integer and of computing the "discrete logarithm" have no efficient solution on a classical device. However, in 1994, Peter Shor showed that these problems can indeed be solved efficiently on a quantum computer [11].
Presently, quantum devices are accessible to the public via cloud services such as Qiskit (IBM), but these so-called Noisy Intermediate-Scale Quantum (NISQ) devices are not yet capable of performing large-scale computations efficiently [7]. It should also be mentioned that critical questions persist regarding the optimization of quantum algorithms for real-world applications, as well as the rigorous demonstration of quantum supremacy. That said, it is worth noting that there are skeptics in the field of quantum computing, most prominently Gil Kalai, who presented an argument based on computational complexity that questions the practicality of quantum computers [9].
QC+ML=QML
In recent years, Quantum Machine Learning (QML) has emerged as a groundbreaking concept. In the broadest sense, QML is the synergy between quantum mechanics and ML. One line of research in this direction deals with how ML can be used to better understand quantum physics, exemplified in [5], where the authors use unsupervised learning to detect quantum entanglement. Another line of research in QML lies in the fusion of QC and ML and revolves around the development of quantum algorithms with the potential to outperform classical computers in the realm of ML tasks [3, 14, 16]. This field presents a promising area for exploration [4], even though we do not (yet) possess adequate quantum devices.

Challenges and promises
According to the recent textbook by Maria Schuld and Francesco Petruccione [14], a QML algorithm may be characterised by looking at classical data, such as text or images, from a quantum mechanical viewpoint, with the goal of creating quantum algorithms that tackle ML problems in a more efficient or novel way compared to traditional ML algorithms. Many articles on QML try to achieve so-called quantum speedups of ML algorithms, in other words, algorithms solving a problem more efficiently by using QC. In the search for high-speed algorithms for trade and payment execution in financial markets, such endeavours have, perhaps not surprisingly, received much attention from both researchers and practitioners in finance [8]. Other approaches study how QML can solve problems that a classical computer cannot. In a recent article [6], it was shown that QML can classify certain types of data in a way that is impossible for ML on classical computers; the researchers showed that a quantum kernel method can reveal patterns in data that appear to a classical computer to be completely random. Even though these results are promising, QML is still far from reaching its full potential [2], which makes it a vibrant topic of research.
Another trending subject of research in classical ML is kernel methods [15]. Recent discoveries show that certain quantum models can be viewed as kernel methods. Indeed, classical kernel methods in ML involve data being transformed into a feature space through a feature map, which aims to capture the underlying characteristics of the input data, with a kernel serving to compare features across different data points. The associated Reproducing Kernel Hilbert Space (RKHS) consists of functions which are kernel expansions of the data and is useful for understanding and working with kernel methods in ML. A quantum (ML) model can be defined as a composition of maps, where classical data is transformed into quantum states and then undergoes a unitary transformation via a quantum circuit. The expected value of a measurement is then interpreted as the output of the model [12, 14]. The first step is to translate classical data into quantum states. There are several routines to achieve this for different types of data, but these routines can all be viewed as feature maps. Hence, one can construct a quantum kernel from the desired data-encoding feature map. Schuld showed in [13] that the RKHS of a quantum kernel coincides with the space of quantum models. In other words, quantum models are in fact kernel methods.
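The idea of a quantum kernel can be sketched classically. Below, a hypothetical "angle encoding" serves as the data-encoding feature map: a real number x is mapped to the single-qubit state cos(x)|0⟩ + sin(x)|1⟩, and the kernel compares two data points via the squared overlap (fidelity) of their encoded states. The encoding is one of many possible routines and is chosen here purely for simplicity; on a quantum device the overlap would be estimated by a circuit rather than computed directly.

```python
import numpy as np

def feature_map(x):
    """Angle-encoding feature map (illustrative): a scalar x is mapped
    to the single-qubit state cos(x)|0> + sin(x)|1>."""
    return np.array([np.cos(x), np.sin(x)])

def quantum_kernel(x, y):
    """Quantum kernel: the squared overlap |<phi(x)|phi(y)>|^2
    between the encoded states of two data points."""
    return np.abs(feature_map(x) @ feature_map(y)) ** 2

print(quantum_kernel(0.3, 0.3))        # identical states: overlap 1.0
print(quantum_kernel(0.0, np.pi / 2))  # orthogonal states: overlap 0.0
```

Once such a kernel is available, any classical kernel method (e.g. a support vector machine) can be run on top of it, which is precisely the sense in which quantum models behave as kernel methods.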
Despite the elegant theoretical framework for encoding classical data into quantum states, the practical execution of this process presents a significant challenge, often referred to as the input problem. Certain algorithms experience a bottleneck when processing classical data on quantum devices, as it demands exponential computational time, leading to noticeable slowdowns [3].
Time series in QML
While much progress has been made in QML research, the area remains rather unexplored from a mathematician's point of view. Even the seemingly simple task of transforming time series data into a quantum model appears to be challenging, and, to the best of our knowledge, little progress has been made to date. The crux of this task is to understand how one should transform temporal data into quantum states. How should one bridge the gap between the temporal nature of the data and the abstract field of quantum mechanics? Progress on this research problem may lead to significant advancements in, e.g., finance or climate modelling, where vast amounts of seemingly random time series data are used to find patterns or signals.
Conclusion
Quantum computing, alone and in combination with machine learning techniques, has promising potential to solve highly complex problems arising in a variety of sciences. While much progress has already been made, several challenges remain in both the practical implementation and the understanding of the algorithms involved. This field of research will certainly evolve rapidly over the coming decades, providing new insight into long-standing mathematical problems while also contributing to solutions of important societal problems.
References
[1] 40 years of quantum computing. Nature Reviews Physics, 4(1):1–1, 2022.
[2] Seeking a quantum advantage for machine learning. Nature Machine Intelligence, 5(8):813–813, 2023.
[3] Jacob Biamonte, Peter Wittek, Nicola Pancotti, Patrick Rebentrost, Nathan Wiebe, and Seth Lloyd. Quantum machine learning. Nature, 549:195–202, September 2017.
[4] Marco Cerezo, Guillaume Verdon, Hsin-Yuan Huang, Lukasz Cincio, and Patrick J. Coles. Challenges and opportunities in quantum machine learning. Nature Computational Science, 2:567–576, September 2022.
[5] Yiwei Chen, Yu Pan, Guofeng Zhang, and Shuming Cheng. Detecting quantum entanglement with unsupervised learning. Quantum Science and Technology, 7(1):015005, November 2021.
[6] Vojtěch Havlíček et al. Supervised learning with quantum-enhanced feature spaces. Nature, 567:209–212, 2019. https://www.nature.com/articles/s41586-019-0980-2 (accessed September 9, 2023).
[7] Anne Kirsten Frederiksen. The quantum computer exists, but is not all that powerful. https://www.dtu.dk/english/newsarchive/2023/05/the-quantum-computer-exists-but-is-not-all-that-powerful, 2023 (accessed February 11, 2024).
[8] Antoine Jacquier and Oleksiy Kondratyev. Quantum Machine Learning and Optimisation in Finance. Packt, 2022.
[9] Gil Kalai. The argument against quantum computers, the quantum laws of nature, and google’s supremacy claims, 2021.
[10] Alexander Selvikvaag Lundervold and Arvid Lundervold. An overview of deep learning in medical imaging focusing on MRI. Zeitschrift für Medizinische Physik, 29(2):102–127, 2019. Special Issue: Deep Learning in Medical Physics.
[11] Michael A. Nielsen and Isaac L. Chuang. Quantum computation and quantum information. Cambridge University Press, Cambridge, 2000.
[12] Davide Pastorello. Concise guide to quantum machine learning, 2023.
[13] Maria Schuld. Supervised quantum machine learning models are kernel methods. https://arxiv.org/abs/2101.11020, 2021.
[14] Maria Schuld and Francesco Petruccione. Machine Learning with Quantum Computers. Springer Cham, 2022.
[15] Bernhard Schölkopf and Alexander J. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2002.
[16] Peter Wittek. Quantum Machine Learning. Elsevier, 2014.

