The self-training algorithm can quickly train a supervised classifier from a few labeled samples and a large number of unlabeled samples. However, self-training is often affected by mislabeled samples, and local noise filters have been proposed to detect them. Nevertheless, current local noise filters suffer from two problems: (a) they ignore the spatial distribution of the nearest neighbors in different classes; (b) they perform poorly when mislabeled samples are located in the overlapping areas of different classes. To address these challenges, a new self-training algorithm based on density peaks combined with a globally adaptive multi-local noise filter (STDP-GAMNF) is proposed. Firstly, the spatial structure of the data set is revealed by density peak clustering, which guides self-training in labeling unlabeled samples. Meanwhile, after each labeling epoch, GAMLNF comprehensively judges whether a sample is mislabeled by considering multiple classes, effectively reducing the influence of edge samples. Experimental results on eighteen real-world data sets demonstrate that GAMLNF is not sensitive to the value of the neighbor parameter k and can adaptively determine the appropriate number of neighbors for each class.
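The overall pipeline (iterative pseudo-labeling with a local noise filter screening each newly labeled sample) can be sketched as follows. This is a minimal illustration, not the paper's method: Euclidean distance and nearest-labeled-neighbor pseudo-labeling stand in for the density-peak guidance, and a plain kNN majority vote stands in for GAMLNF's adaptive multi-class check.

```python
import math
from collections import Counter

def nearest(x, labeled):
    """Closest (point, label) pair in the labeled set (Euclidean distance)."""
    return min(labeled, key=lambda p: math.dist(x, p[0]))

def self_train(labeled, unlabeled, k=3):
    """Toy self-training loop with a simple local noise filter.

    Stand-ins for the paper's components (assumptions, not the real method):
    - pseudo-label = label of the single closest labeled sample
      (in place of density-peak structure propagation);
    - noise filter = majority vote among the k nearest labeled neighbors
      (in place of the globally adaptive multi-local noise filter).
    """
    labeled, unlabeled = list(labeled), list(unlabeled)
    rejected = []
    while unlabeled:
        # pick the unlabeled point closest to the current labeled set
        x = min(unlabeled, key=lambda u: math.dist(u, nearest(u, labeled)[0]))
        unlabeled.remove(x)
        y = nearest(x, labeled)[1]  # pseudo-label
        # local noise filter: keep the sample only if the kNN majority agrees
        neigh = sorted(labeled, key=lambda p: math.dist(x, p[0]))[:k]
        majority, _ = Counter(lbl for _, lbl in neigh).most_common(1)[0]
        if majority == y:
            labeled.append((x, y))
        else:
            rejected.append(x)  # treated as a likely mislabeled sample
    return labeled, rejected
```

On two well-separated toy clusters, every pseudo-label survives the filter; samples would land in `rejected` only when their neighborhood disagrees with the pseudo-label, which is exactly the overlapping-region case the abstract highlights.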