Deep learning technology can effectively assist physicians in diagnosing chest radiographs. Conventional domain adaptation methods suffer from inaccurate lesion-region localization, large feature-extraction errors, and large numbers of model parameters. To address these problems, we propose a novel domain adaptation method, WDDM, which jointly applies the Wasserstein distance and a discrepancy metric to identify abnormalities in chest radiographs. Specifically, our method uses BiFormer as a multi-scale feature extractor to obtain deep feature representations of the data samples; it attends to discriminative features more effectively than convolutional neural networks and the Swin Transformer do. In addition, by minimizing a loss that combines the Wasserstein distance with the contrastive domain discrepancy, the source-domain samples closest to the target domain are selected, capturing both similarity and dissimilarity across domains. Experimental results show that our method improves the mean AUC of chest radiograph classification by no less than 14.8% over non-transfer baselines that train a network directly on the source domain and then classify the target domain. In short, our method achieves higher classification accuracy and better generalization performance.
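The cross-domain sample-selection idea can be sketched in a few lines. The snippet below is an illustrative reconstruction under simplifying assumptions, not the paper's implementation: `wasserstein_1d` is the standard empirical 1-D Wasserstein-1 distance, and `select_closest_sources` is a hypothetical stand-in that ranks source samples by distance to the target-feature centroid rather than by the paper's combined loss.

```python
import numpy as np

def wasserstein_1d(u, v):
    # Empirical 1-D Wasserstein-1 distance between equal-size samples:
    # the mean absolute difference of the sorted values.
    u, v = np.sort(u), np.sort(v)
    return float(np.mean(np.abs(u - v)))

def select_closest_sources(src_feats, tgt_feats, k):
    # Hypothetical selection rule: rank source samples by Euclidean
    # distance to the target-feature centroid and keep the k nearest.
    centroid = tgt_feats.mean(axis=0)
    dists = np.linalg.norm(src_feats - centroid, axis=1)
    return src_feats[np.argsort(dists)[:k]]

rng = np.random.default_rng(0)
tgt = rng.normal(0.0, 1.0, size=(64, 8))   # target-domain features
src = rng.normal(0.5, 1.0, size=(128, 8))  # shifted source-domain features

selected = select_closest_sources(src, tgt, k=64)
print(wasserstein_1d(src[:64, 0], tgt[:, 0]))   # alignment before selection
print(wasserstein_1d(selected[:, 0], tgt[:, 0]))  # alignment after selection
```

In the full method, the features would come from the BiFormer extractor and the selection would be driven by minimizing the combined Wasserstein and contrastive-domain-discrepancy loss; the centroid heuristic here only illustrates the "keep the source samples closest to the target" step.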