dc.description.abstract |
The development of novel algorithms that process information in classically intractable ways and achieve computational speedup is one of the prime motivations in quantum information research. Machine learning is a rapidly advancing field with broad applications in the natural sciences, where quantum-inspired algorithms may offer significant speedup. To date, several quantum algorithms for discriminative machine learning have been formulated, and quantum-enhanced generative machine learning models have recently gained considerable attention. However, the high noise levels and limited scalability of current quantum devices restrict the depth and complexity of these algorithms. In this thesis, we propose and realize a working hybrid quantum-classical algorithm for generative machine learning, termed the Quantum-enhanced Variational Autoencoder (QeVAE), suitable for noisy intermediate-scale quantum devices. We present a thorough discussion of the algorithm and its implementation, and then report results for learning distributions that are classically easy to learn as well as distributions that are classically hard. We show that in the zero-latent-size limit our algorithm reduces to the well-known generative quantum machine learning model, the quantum circuit Born machine (QCBM). For classically easy distributions, we find that our model performs on par with purely classical algorithms. For classically hard distributions, we find that our model outperforms both the pure quantum and pure classical models in certain cases, and we verify this on the IBMQ Manila quantum computer. Furthermore, we show how QeVAEs can assist in the practical task of circuit compilation. Finally, we identify crucial directions for improving the current algorithm that will be key to developing more challenging quantum-inspired algorithms for machine learning. |
en_US |