Open Access
Binary classification of single qubits using quantum machine learning method
Author(s) - Yinglei Teng, Jie Wang, Fufang Xu
Publication year - 2021
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2006/1/012020
Subject(s) - qubit , bloch sphere , quantum computer , quantum machine learning , computer science , quantum algorithm , quantum , binary number , artificial intelligence , quantum information , theoretical computer science , algorithm , mathematics , physics , quantum mechanics , arithmetic
Machine learning can make predictions on unseen data by extracting information and constructing algorithms and statistical models from training data, which is highly desirable for speech recognition and computer vision. Quantum computing harnesses the phenomena of quantum mechanics to realize parallel computation, and it could solve certain computational problems, such as breaking RSA encryption, significantly faster than classical computers. Quantum machine learning aims to make machine learning faster than its classical counterpart by exploiting quantum operations. Binary classification is a typical quantum machine learning task applied to single qubits. Here we present a binary classifier for regions. First, we randomly sample two sets of quantum data points located in the X-Z plane of the Bloch sphere. Then we apply a single parameterized rotation gate and measure the qubits along the Z axis. We train the hybrid model, which combines the quantum model with a classical neural network, and take the cross entropy between the predictions of the classical neural network and the labels as the loss function. Finally, we present the results of the trained hybrid model classifying new qubits.
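The abstract only outlines the workflow, so the sketch below is an illustrative NumPy reconstruction rather than the authors' code. It samples single-qubit states in the X-Z plane of the Bloch sphere by their Bloch angle, applies a single parameterized RY-style rotation, uses the Z-axis expectation value as the quantum feature, feeds it through a one-neuron classical layer, and trains everything against a binary cross-entropy loss. The class regions, hyperparameters, and the finite-difference optimizer are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Quantum data: single-qubit states in the X-Z plane of the Bloch sphere.
# A Bloch angle a corresponds to the state cos(a/2)|0> + sin(a/2)|1>.
# The two class regions below are illustrative assumptions, not taken from the paper.
n_per_class = 50
angles_0 = rng.uniform(0.0, 0.4 * np.pi, n_per_class)    # class-0 region
angles_1 = rng.uniform(0.6 * np.pi, np.pi, n_per_class)  # class-1 region
angles = np.concatenate([angles_0, angles_1])
labels = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)])

def expval_z(data_angle, theta):
    """Apply RY(theta) to the data state and return <Z>.

    For states confined to the X-Z plane, RY simply shifts the Bloch angle,
    so the expectation value reduces to cos(data_angle + theta).
    """
    return np.cos(data_angle + theta)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict(data_angles, theta, w, b):
    """Hybrid model: quantum expectation value fed into a one-neuron classical layer."""
    z = expval_z(data_angles, theta)
    return sigmoid(w * z + b)

def cross_entropy(p, y, eps=1e-9):
    """Binary cross entropy between predicted probabilities and labels."""
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Train the rotation angle theta and the classical weights (w, b) jointly.
# Finite-difference gradients keep the example self-contained; a real
# implementation would use the parameter-shift rule or automatic differentiation.
theta, w, b = 0.1, 1.0, 0.0
lr, delta = 0.5, 1e-4
for step in range(500):
    params = np.array([theta, w, b])
    base = cross_entropy(predict(angles, *params), labels)
    grads = np.zeros(3)
    for i in range(3):
        shifted = params.copy()
        shifted[i] += delta
        grads[i] = (cross_entropy(predict(angles, *shifted), labels) - base) / delta
    theta, w, b = params - lr * grads

# Classify new, unseen qubits drawn from the same regions.
test_angles = np.array([0.2 * np.pi, 0.8 * np.pi])
print("predicted class probabilities:", predict(test_angles, theta, w, b))
```

Because the data states have real amplitudes, a single rotation about the Y axis followed by a Z measurement is enough to separate the two angular regions; the classical layer only rescales and thresholds the expectation value.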
