Question Answering (QA) is an important task in Natural Language Processing (NLP). An effective and efficient question classification model is essential to question answering: it not only reduces the answer search space but also guides the QA system in selecting the correct knowledge base and search strategy. In the literature, most existing approaches adopt self-attention mechanisms to classify questions; as a result, they capture only global dependencies and ignore the influence of adjacent local representations. To address this issue, this paper proposes a novel self-attention distribution network that enhances local context awareness by exploiting neighbor relations (SNR) among representations in the text. We introduce a learnable Gaussian bias to obtain an adaptive self-attention distribution via a dynamic window. This learnable Gaussian bias is then integrated into the original attention distribution to capture both long-term dependencies and local semantic relations among words across sentences. Furthermore, a novel position-aware part-of-speech (POS) attention embedding layer is proposed, which helps select the appropriate word corresponding to the original word as the central word. Extensive experiments are conducted on the Experimental Data for Question Classification (EDQC) dataset and Yahoo! Answers Comprehensive Questions and Answers 1.0; the results demonstrate that our model significantly improves performance, achieving 95.59% coarse-grained accuracy and 92.91% fine-grained accuracy, respectively.
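The abstract does not give implementation details, so the following is a minimal sketch, in PyTorch, of the general idea of biasing self-attention with a learnable Gaussian: a per-query center p_i and window sigma_i are predicted from the query vector, and a bias G[i, j] = -(j - p_i)^2 / (2 * sigma_i^2) is added to the dot-product logits before the softmax, pulling the distribution toward a local neighborhood while preserving the original global attention. All names here (GaussianBiasedSelfAttention, center_proj, window_proj) and the exact parameterization of p_i and sigma_i are assumptions for illustration; the paper's SNR model, including its position-aware POS attention embedding layer, may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GaussianBiasedSelfAttention(nn.Module):
    """Single-head self-attention with a learnable Gaussian bias (illustrative sketch).

    A bias G[i, j] = -(j - p_i)^2 / (2 * sigma_i^2) is added to the attention
    logits, where the center p_i and window sigma_i of each query position are
    predicted from its query vector, so the softmax distribution is biased
    toward a local neighborhood while global dependencies are preserved.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Hypothetical parameterization: predict a scalar center and window per position.
        self.center_proj = nn.Linear(d_model, 1)
        self.window_proj = nn.Linear(d_model, 1)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        batch, seq_len, _ = x.shape
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)

        # Standard scaled dot-product logits: (batch, seq_len, seq_len).
        logits = torch.matmul(q, k.transpose(-2, -1)) * self.scale

        # Predicted center p_i in [0, seq_len) and dynamic window sigma_i > 0.
        p = torch.sigmoid(self.center_proj(q)) * seq_len       # (batch, seq_len, 1)
        sigma = F.softplus(self.window_proj(q)) + 1.0          # (batch, seq_len, 1)

        # Gaussian bias G[i, j] = -(j - p_i)^2 / (2 * sigma_i^2),
        # broadcast to (batch, seq_len, seq_len).
        positions = torch.arange(seq_len, device=x.device).float()
        gaussian_bias = -((positions - p) ** 2) / (2 * sigma ** 2)

        # Integrate the bias into the original distribution, then attend as usual.
        attn = torch.softmax(logits + gaussian_bias, dim=-1)
        return torch.matmul(attn, v)
```

In a full model this layer would stand in for a standard self-attention sublayer; since the biased weights still form a valid softmax distribution, downstream components need no changes.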