Natural language processing, driven by deep learning, has made remarkable strides in analyzing and understanding a vast range of linguistic documents. Building on this progress, DeepSC is a deep-learning-based system for semantic communication in text transmission. To cope with channel noise and semantic distortion, DeepSC employs joint semantic-channel coding: its transceiver comprises a semantic encoder, a channel encoder, a channel decoder, and a semantic decoder. The receiver is optimized with two loss functions, cross-entropy and mutual information, so that it recovers the semantic meaning of the transmitted text while simultaneously maximizing system capacity. Built on the Transformer architecture, DeepSC aims to maximize system capacity and minimize semantic errors by recovering the meaning of sentences, rather than by detecting or correcting symbol errors as in conventional communication systems. Transfer learning is used to speed up model training and to make DeepSC applicable to diverse communication environments. In addition, a new metric, sentence similarity, is introduced to measure performance at the semantic level and to justify the faithfulness of the recovered text. The system's robustness is evaluated by comparing it with conventional communication systems (combinations of various source and channel codes) over AWGN, Rayleigh, and Nakagami channels.
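The four-module transceiver described above can be illustrated with a minimal NumPy sketch. This is not the actual DeepSC implementation (which uses Transformer encoders/decoders and dense channel layers trained end to end); here each module is a stand-in random linear map, and the names `awgn`, `transceiver`, `d_model`, and `d_channel` are illustrative assumptions, used only to show how symbols flow from semantic encoder to semantic decoder through a noisy channel.

```python
import numpy as np

rng = np.random.default_rng(0)

def awgn(x, snr_db):
    """Add white Gaussian noise to the channel symbols at a given SNR (dB)."""
    sig_power = np.mean(x ** 2)
    noise_power = sig_power / (10 ** (snr_db / 10))
    return x + rng.normal(0.0, np.sqrt(noise_power), x.shape)

# Stand-in linear layers for the four DeepSC modules (hypothetical toy
# weights; the real system learns these jointly with the two losses).
d_model, d_channel = 16, 8
W_sem_enc = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
W_ch_enc  = rng.normal(size=(d_model, d_channel)) / np.sqrt(d_model)
W_ch_dec  = rng.normal(size=(d_channel, d_model)) / np.sqrt(d_channel)
W_sem_dec = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)

def transceiver(embeddings, snr_db=12.0):
    s = embeddings @ W_sem_enc   # semantic encoder: extract semantic features
    x = s @ W_ch_enc             # channel encoder: map features to channel symbols
    y = awgn(x, snr_db)          # physical channel (AWGN in this sketch)
    s_hat = y @ W_ch_dec         # channel decoder: recover semantic features
    return s_hat @ W_sem_dec     # semantic decoder: reconstruct token features

tokens = rng.normal(size=(5, d_model))   # 5 token embeddings for one sentence
recovered = transceiver(tokens)
print(recovered.shape)                   # (5, 16)
```

In training, cross-entropy would compare the decoded sentence with the transmitted one, while the mutual-information term would act on the channel symbols `x` and received symbols `y`; swapping `awgn` for a Rayleigh or Nakagami fading model changes only the channel line.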