Quantum Machine Learning (QML) is a promising field that combines the power of quantum computing with machine learning. Variational quantum circuits, whose parameters are learned classically, have been widely used in many recent QML applications. This is an instance of a hybrid quantum-classical framework, in which both classical and quantum components are present. However, applying these techniques to applications involving massive data is challenging. One way to overcome this is classical-quantum transfer learning via the recently introduced dressed quantum circuit, in which the underlying neural architecture is pre-trained classically, but the final decision layer is a quantum circuit, followed by quantum measurements and classical post-processing, to classify images with high precision. In this paper, we apply hybrid classical-quantum transfer learning to another task involving massive data, namely Natural Language Processing (NLP). We show how to binary-classify short texts (e.g., SMS messages) with classical-quantum transfer learning, which had previously been applied only to image processing. Our quantum network is pre-trained with the Bidirectional Encoder Representations from Transformers (BERT) model, and its variational quantum circuit is fine-tuned for text processing. We evaluate the performance of our hybrid neural architecture using the Receiver Operating Characteristic (ROC) curve, which is commonly used to evaluate classifiers. The results indicate high precision as well as a low value of the loss function. To our knowledge, our work is the first application of quantum transfer learning to NLP. Finally, we present a comparison with a tool that uses learning, but not transfer learning.
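
To make the dressed quantum circuit idea concrete, the following is a minimal sketch of such a decision layer, written against a PennyLane + PyTorch stack. The qubit count, circuit depth, and the DressedQuantumHead class are illustrative assumptions for this sketch, not the exact configuration used in our experiments.

    import math
    import torch
    import torch.nn as nn
    import pennylane as qml

    n_qubits = 4   # illustrative choice, not the paper's configuration
    q_depth = 6    # illustrative number of trainable entangling layers

    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev, interface="torch")
    def quantum_circuit(inputs, weights):
        # Encode classical features as single-qubit rotation angles,
        # apply trainable entangling layers, and measure every qubit.
        qml.AngleEmbedding(inputs, wires=range(n_qubits))
        qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
        return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

    class DressedQuantumHead(nn.Module):
        # Classical pre-layer -> variational circuit -> classical post-layer,
        # intended to sit on top of a frozen, classically pre-trained network.
        def __init__(self, in_features=768):  # 768 = hidden size of BERT-base
            super().__init__()
            self.pre = nn.Linear(in_features, n_qubits)
            self.q_layer = qml.qnn.TorchLayer(
                quantum_circuit, weight_shapes={"weights": (q_depth, n_qubits)}
            )
            self.post = nn.Linear(n_qubits, 2)  # two logits for binary classification

        def forward(self, x):
            x = torch.tanh(self.pre(x)) * (math.pi / 2.0)  # squash into a valid angle range
            x = self.q_layer(x)
            return self.post(x)

    # Usage: feed BERT [CLS] embeddings (here random stand-ins) through the head.
    head = DressedQuantumHead()
    cls_embeddings = torch.randn(8, 768)  # a batch of 8 sentence embeddings
    logits = head(cls_embeddings)         # shape: (8, 2)

During fine-tuning, only the parameters of this head (the two linear layers and the circuit weights) would be updated, while the pre-trained BERT encoder stays frozen, which is what makes the scheme an instance of transfer learning.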