Neural style transfer has achieved success in many tasks and has also been applied to text style transfer, which uses a style reference image to generate transferred text images whose textures and shapes are consistent with the reference while the semantic content is preserved. However, when the text structure is complex, existing methods suffer from problems such as stroke adhesion and blurred text edges, which degrade the aesthetics of the generated images and create substantial extra work for designers. This paper proposes an improved text style transfer network for complex multi-stroke text. We use the Shape-Matching Generative Adversarial Network (GAN) as the baseline network and make the following modifications: (1) morphological operations, erosion and dilation, are introduced in image processing; (2) residual modules and loss functions are added to the network; and (3) the AdaBelief optimizer is adopted to constrain the transfer of text structure. Further, a new dataset of traditional Chinese characters is constructed to train the model. Experimental results show that the proposed method outperforms state-of-the-art methods on both simple characters and complex multi-stroke characters, improving the readability and aesthetics of the generated text.
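As a minimal illustration of the morphological operations mentioned above, the sketch below implements binary erosion and dilation with a 3x3 structuring element in pure Python. The grid size, structuring element, and the erode-then-dilate ordering are illustrative assumptions for exposition, not the paper's actual preprocessing pipeline.

```python
# Illustrative sketch (not the paper's implementation): 3x3 binary
# morphological erosion and dilation on a small glyph bitmap.

def dilate(img):
    """3x3 binary dilation: a pixel becomes 1 if any neighbour (or itself) is 1."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(any(
                img[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
            ))
    return out

def erode(img):
    """3x3 binary erosion: a pixel stays 1 only if its whole neighbourhood is 1."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = int(all(
                img[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
            ))
    return out

# A hypothetical 5x5 bitmap of a thick stroke.
stroke = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
eroded = erode(stroke)   # thins the stroke to its central core
opened = dilate(eroded)  # re-grows it, which can separate adhered strokes
```

Erosion followed by dilation (a morphological opening) is a standard way to remove thin bridges between nearby strokes, which is the kind of stroke-adhesion artifact the paper targets.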