Feature selection plays a vital role in improving classifier performance. However, current methods cannot effectively identify complex interactions among the selected features. To remove these hidden negative interactions, we propose a Genetic Algorithm (GA)-like Dynamic Probability (GADP) method with mutual information (MI), which has a two-layer structure. The first layer applies MRMR to obtain a primary feature subset. Based on these candidate features, the second layer, GADP, mines additional supportive features. Unlike common approaches, which focus on improving the operators of GA to enhance search ability and reduce convergence time, we improve GA by discarding these operators altogether and instead employing a dynamic probability, determined by the performance of each chromosome, to select features in the next generation. The dynamic probability mechanism significantly reduces the number of parameters in GA and makes it easy to use. Because each gene's probability is independent, GADP maintains greater chromosome diversity than traditional GA, which gives GADP a wider search space and allows it to select relevant features effectively and accurately. To verify the superiority of our method, we evaluate it under multiple conditions on 15 datasets. The results demonstrate that the proposed method outperforms the alternatives, generally achieving the best accuracy. Furthermore, we compare the proposed model with state-of-the-art heuristic methods such as Harris Hawk Optimization (HHO), the Salp Swarm Algorithm (SSA), and the White Shark Optimizer (WSO), and our model still holds advantages over them.
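The operator-free generation step described above can be sketched as follows. This is a minimal illustration under our own assumptions (the update rule, learning rate, and all names are hypothetical, not the authors' exact formulation): each gene keeps its own selection probability, that probability is nudged toward the gene patterns of fitter chromosomes, and the next population is sampled gene-by-gene with independent Bernoulli draws, with no crossover or mutation operators.

```python
import numpy as np

rng = np.random.default_rng(0)

def gadp_generation(pop, fitness, probs, lr=0.1):
    """One hypothetical GADP-style update (illustrative sketch only).

    pop     : (n_chromosomes, n_features) binary feature masks
    fitness : (n_chromosomes,) performance of each chromosome
    probs   : (n_features,) per-gene selection probabilities
    """
    # Weight each chromosome by its normalized fitness.
    w = fitness / fitness.sum()
    # Per-gene expected selection frequency among fit chromosomes.
    target = w @ pop
    # Dynamic probability update: drift toward well-performing genes.
    probs = (1 - lr) * probs + lr * target
    # Sample the next generation independently per gene -- no
    # crossover or mutation operators are involved.
    new_pop = (rng.random(pop.shape) < probs).astype(int)
    return new_pop, probs

# Toy usage: 6 chromosomes over 8 candidate features.
pop = rng.integers(0, 2, size=(6, 8))
fitness = rng.random(6) + 0.1          # strictly positive scores
probs = np.full(8, 0.5)                # start unbiased
new_pop, probs = gadp_generation(pop, fitness, probs)
```

Because every gene is sampled independently, two chromosomes in the new generation can differ at any subset of positions, which is the source of the diversity advantage claimed over operator-based GA.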