Distant supervision for relation extraction has proven an effective way to exploit large text corpora and is widely used to discover new relational facts from unstructured text. Recent neural approaches have made significant progress on this task by representing sentences as compact, low-dimensional vectors, but they rarely exploit syntactic information when modeling entities. We propose a method for learning syntax-aware entity embeddings for neural relation extraction. First, we encode the context of each entity over its dependency tree with a tree-GRU to obtain sentence-level entity embeddings. We then apply both intra-sentence and inter-sentence attention to derive sentence-set-level entity embeddings over all sentences containing the target entity pair. Finally, we combine the sentence and entity embeddings for relation classification. Experiments on a widely used benchmark dataset show that our model exploits most of the informative instances and achieves state-of-the-art relation extraction performance.
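For concreteness, the sketch below illustrates the sentence-level encoding step with a child-sum tree-GRU cell that composes a node's hidden state from its word embedding and the summed hidden states of its dependency children. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the class name `ChildSumTreeGRU`, the gate parameterization, and the dictionary-based tree representation are all hypothetical choices made for readability.

```python
import torch
import torch.nn as nn


class ChildSumTreeGRU(nn.Module):
    """Illustrative child-sum tree-GRU cell (an assumption, not the paper's
    exact parameterization): each node's hidden state is composed from its
    word embedding and the summed hidden states of its dependency children."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.W_z = nn.Linear(input_dim, hidden_dim)               # update gate (input)
        self.U_z = nn.Linear(hidden_dim, hidden_dim, bias=False)  # update gate (children)
        self.W_r = nn.Linear(input_dim, hidden_dim)               # reset gate (input)
        self.U_r = nn.Linear(hidden_dim, hidden_dim, bias=False)  # reset gate (children)
        self.W_h = nn.Linear(input_dim, hidden_dim)               # candidate (input)
        self.U_h = nn.Linear(hidden_dim, hidden_dim, bias=False)  # candidate (children)

    def node_forward(self, x: torch.Tensor, child_h: torch.Tensor) -> torch.Tensor:
        # Summing the children's hidden states makes the cell invariant
        # to the order of a head word's dependents.
        h_sum = child_h.sum(dim=0)
        z = torch.sigmoid(self.W_z(x) + self.U_z(h_sum))
        r = torch.sigmoid(self.W_r(x) + self.U_r(h_sum))
        h_tilde = torch.tanh(self.W_h(x) + self.U_h(r * h_sum))
        return z * h_sum + (1.0 - z) * h_tilde

    def forward(self, children: dict, embeddings: torch.Tensor, root: int) -> torch.Tensor:
        """Encode a dependency subtree bottom-up. `children` maps a token
        index to the indices of its dependents, `embeddings` holds one word
        vector per token, and `root` is the entity's head word."""
        def encode(node: int) -> torch.Tensor:
            kids = children.get(node, [])
            if kids:
                child_h = torch.stack([encode(k) for k in kids])
            else:
                child_h = torch.zeros(1, self.hidden_dim)  # leaf: no children
            return self.node_forward(embeddings[node], child_h)

        return encode(root)


# Hypothetical usage: a 4-token sentence whose entity head (token 1)
# governs tokens 0 and 2, and token 2 governs token 3.
cell = ChildSumTreeGRU(input_dim=50, hidden_dim=100)
embeddings = torch.randn(4, 50)
entity_embedding = cell({1: [0, 2], 2: [3]}, embeddings, root=1)
print(entity_embedding.shape)  # torch.Size([100])
```

The root's hidden state then serves as the sentence-level entity embedding; in this sketch, the subsequent sentence-set-level step would weight such embeddings across all sentences mentioning the entity pair via the intra- and inter-sentence attention described above.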