Lightweight convolutional neural networks have developed tremendously and are now widely employed in research on lightweight single image super-resolution. However, these methods focus on the lightweight design of feature extraction blocks and neglect lightweight aggregation methods, which combine the features produced by those blocks. This paper analyzes the memory consumption of generic aggregation methods and proposes an adaptive local aggregation method. It adopts a set of trainable parameters and channel attention to implement selective aggregation, further enhancing the network's generalization ability. To further lighten the model and improve overall performance, we propose a novel re-parameterized branching block structure with spatial and channel diversity. While re-parameterizable structures have been employed in several prior works, these works overlook the fact that too many branches add training-time overhead and are unfriendly to devices with strong parallel computing power, such as GPUs. We therefore redesign the branch structure, retaining the branches from prior work that yield significant performance gains and adding a lightweight branch, successfully balancing the expressive power and training-time cost of the block. Comprehensive experiments demonstrate the necessity and superiority of our method over state-of-the-art single image super-resolution methods.
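To make the selective aggregation idea concrete, the following is a minimal PyTorch sketch, not the paper's implementation: each block output is scaled by a trainable scalar and then passed through a squeeze-and-excitation style channel-attention gate, so the fused feature stays the size of a single feature map instead of growing with the number of blocks as concatenation does. All names (`AdaptiveLocalAggregation`, `num_blocks`, `reduction`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (illustrative)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # squeeze: B x C x 1 x 1
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                  # per-channel weight in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.gate(x)

class AdaptiveLocalAggregation(nn.Module):
    """Aggregates block outputs by a learned weighted sum instead of concatenation."""
    def __init__(self, channels: int, num_blocks: int):
        super().__init__()
        # One trainable scalar per block controls how much it contributes.
        self.weights = nn.Parameter(torch.ones(num_blocks))
        self.attention = ChannelAttention(channels)

    def forward(self, block_outputs: list[torch.Tensor]) -> torch.Tensor:
        # Weighted sum keeps memory at one feature map; concatenating the
        # outputs would grow memory linearly with the number of blocks.
        fused = sum(w * f for w, f in zip(self.weights, block_outputs))
        return self.attention(fused)
```

The memory argument is visible here: for N blocks of C channels, concatenation-based aggregation materializes an N·C-channel tensor plus a fusion convolution, while the weighted sum never exceeds C channels.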
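Likewise, the following RepVGG-style sketch illustrates how a multi-branch training structure folds into a single convolution at inference; it assumes a 3x3 branch plus a lightweight 1x1 branch (the paper's actual branch design differs, and batch normalization is omitted for brevity).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RepBranchBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Multi-branch structure used only during training.
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv1 = nn.Conv2d(channels, channels, 1)  # lightweight branch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv3(x) + self.conv1(x)

    @torch.no_grad()
    def reparameterize(self) -> nn.Conv2d:
        """Fold both branches into a single 3x3 conv for inference."""
        fused = nn.Conv2d(self.conv3.in_channels, self.conv3.out_channels,
                          3, padding=1)
        # Pad the 1x1 kernel to 3x3 so the two kernels can be summed directly;
        # convolution is linear, so summing kernels equals summing branch outputs.
        k1 = F.pad(self.conv1.weight, [1, 1, 1, 1])
        fused.weight.copy_(self.conv3.weight + k1)
        fused.bias.copy_(self.conv3.bias + self.conv1.bias)
        return fused
```

Because the branches exist only during training, their cost is paid at training time; this is why fewer, more carefully chosen branches improve the training-time/expressivity trade-off on parallel hardware.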