Channel coding considerations for wireless LANs
Daniel J. Costello Jr., Pierre R. Chevillat, et al.
ISIT 1997
In recent years, with the development of quantum machine learning, Quantum Neural Networks (QNNs) have gained increasing attention in the field of Natural Language Processing (NLP) and have achieved a series of promising results. However, most existing QNN models focus on the architectures of the Quantum Recurrent Neural Network (QRNN) and the Quantum Self-Attention Mechanism (QSAM). In this work, we propose a novel QNN model based on quantum convolution. We develop a quantum depthwise convolution that significantly reduces the number of parameters and lowers computational complexity. We also introduce a multi-scale feature fusion mechanism that enhances model performance by integrating word-level and sentence-level features. Additionally, we propose quantum word embedding and quantum sentence embedding, which provide embedding vectors more efficiently. Through experiments on two benchmark text classification datasets, we demonstrate that our model outperforms a wide range of state-of-the-art QNN models. Notably, our model achieves a new state-of-the-art test accuracy of 96.77% on the RP dataset. We also show the advantages of our quantum model over its classical counterparts in its ability to improve test accuracy using fewer parameters. Finally, an ablation study confirms the effectiveness of the multi-scale feature fusion mechanism and the quantum depthwise convolution in enhancing model performance.
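The parameter saving claimed for depthwise convolution can be illustrated with its classical analogue; the abstract does not specify the quantum construction, so the sketch below only counts weights for a classical 1-D convolution over hypothetical channel and kernel sizes.

```python
# Hedged illustration of why depthwise (separable) convolution shrinks the
# parameter count, using the classical 1-D case; channel/kernel sizes are
# hypothetical and the paper's quantum variant is not reproduced here.

def standard_conv_params(c_in, c_out, k):
    # A standard conv mixes every input channel into every output channel:
    # c_in * c_out * k weights.
    return c_in * c_out * k

def depthwise_separable_params(c_in, c_out, k):
    # Depthwise step: one k-tap filter per input channel (c_in * k weights),
    # followed by a 1x1 pointwise mix across channels (c_in * c_out weights).
    return c_in * k + c_in * c_out

c_in, c_out, k = 64, 64, 3  # hypothetical sizes for a text-convolution layer
print(standard_conv_params(c_in, c_out, k))        # 12288
print(depthwise_separable_params(c_in, c_out, k))  # 4288
```

For these sizes the depthwise-separable form uses roughly a third of the weights, and the gap widens as the kernel length or channel count grows.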