Quantum Transfer Learning for Real-World, Small, and Large-Scale Datasets

Soronzonbold Otgonbaatar, Gottfried Schwarz, Mihai Datcu, Dieter Kranzlmueller

Quantum machine learning (QML) networks promise a quantum advantage over some conventional deep learning (DL) techniques for classifying supervised datasets, owing to their expressive power as quantified by the local effective dimension. However, two main challenges remain regardless of the promised quantum advantage of QML networks: 1) currently available quantum bits (qubits) are very small in number, while real-world datasets are characterized by hundreds of large-scale elements (features), and there is no single unified approach for embedding real-world large-scale datasets in a limited number of qubits; 2) some real-world datasets are too small for training QML networks. Hence, to tackle these two challenges and to benchmark and validate QML networks on real-world, small, and large-scale datasets in one go, we employ quantum transfer learning composed of a multi-qubit QML network and a very deep convolutional network (VGG16) that extracts informative features from any small or large-scale dataset. We use real-amplitudes and strongly-entangling N-layer QML networks, with and without data re-uploading layers, as the multi-qubit QML network, and evaluate their expressive power as quantified by the local effective dimension: the lower the local effective dimension of a QML network, the better its performance on unseen data. Our numerical results show that the strongly-entangling N-layer QML network has a lower local effective dimension than the real-amplitudes QML network and outperforms both it and classical transfer learning on the hard-to-classify three-class labelling problem. In addition, quantum transfer learning helps us to tackle the two aforementioned challenges when benchmarking and validating QML networks on real-world, small, and large-scale datasets.
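To illustrate the classifier side of such a pipeline, the following is a minimal numpy sketch of a variational quantum circuit of the kind the abstract describes: classical features (in the paper, VGG16 activations compressed to a few values) are angle-embedded into qubits, followed by a parameterized layer with an entangling gate, and a Pauli-Z expectation value serves as the network output. This is a hypothetical 2-qubit, 1-layer simplification for exposition, not the authors' implementation; in practice one would use a QML framework and N strongly-entangling layers.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with control qubit 0, target qubit 1 (basis order |q0 q1>)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def apply_single(state, gate, qubit, n_qubits=2):
    """Apply a single-qubit gate to one qubit of an n-qubit state vector."""
    ops = [np.eye(2)] * n_qubits
    ops[qubit] = gate
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def circuit(features, weights):
    """Toy 2-qubit variational classifier (illustrative sketch):
    angle-embed two classical features, apply one parameterized
    rotation layer plus a CNOT entangler, and return <Z> on qubit 0."""
    state = np.zeros(4)
    state[0] = 1.0                                   # start in |00>
    state = apply_single(state, ry(features[0]), 0)  # data embedding
    state = apply_single(state, ry(features[1]), 1)
    state = apply_single(state, ry(weights[0]), 0)   # variational layer
    state = apply_single(state, ry(weights[1]), 1)
    state = CNOT @ state                             # entangling gate
    z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))    # Z on qubit 0
    return float(state @ z0 @ state)
```

In a full quantum transfer learning setup, `features` would come from a frozen pretrained VGG16, and `weights` would be trained by gradient descent on the expectation values; stacking more rotation-plus-entangler layers yields the N-layer ansatz discussed above.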
