In the realm of information extraction, event detection (ED) is a crucial subtask focused on identifying instances of specific events in text. While prior work has validated the effectiveness of integrating syntactic dependencies into graph convolutional networks, these approaches often overlook dependency label information, a rich linguistic resource beneficial for ED. This study introduces a novel architecture, Dependency-aided Graph Convolution Networks (DaGCN), which leverages both syntactic structure and typed dependency labels. At the core of DaGCN is an edge-aware node update module that enriches word representations by propagating information along syntactic connections according to their specific dependency types. Complementing this, a node-aware edge update module refines relation representations with context-specific details, further improving performance. Evaluated on the ACE2005 dataset, DaGCN significantly outperforms established baselines, highlighting the value of integrating dependency labels into ED.
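To make the edge-aware node update concrete, the following is a minimal NumPy sketch of one plausible formulation: each dependency label gets its own weight matrix, and a node aggregates messages from its syntactic neighbors transformed by the matrix of the connecting label. All names (`edge_aware_update`, the averaging normalization, the ReLU) are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def edge_aware_update(H, edges, W, b):
    """One edge-aware graph convolution layer (illustrative sketch).

    H:     (n, d) array of node (word) representations.
    edges: list of (src, dst, label_id) typed dependency arcs.
    W:     (num_labels, d, d) per-dependency-type weight matrices.
    b:     (d,) bias vector.
    """
    n = H.shape[0]
    out = np.zeros_like(H)
    deg = np.zeros(n)
    for src, dst, label_id in edges:
        # Message from head to dependent, transformed by the
        # weight matrix of this arc's dependency label.
        out[dst] += W[label_id] @ H[src]
        deg[dst] += 1
    deg = np.maximum(deg, 1)          # avoid division by zero
    out = out / deg[:, None] + b      # mean-aggregate, then bias
    return np.maximum(out, 0.0)       # ReLU nonlinearity

# Tiny example: 3 words, 2 dependency labels, arcs 0->1 and 2->1.
H = np.ones((3, 2))
W = np.stack([np.eye(2), np.eye(2)])  # identity transforms for clarity
b = np.zeros(2)
edges = [(0, 1, 0), (2, 1, 1)]
H_new = edge_aware_update(H, edges, W, b)
```

In a full model, separate per-label matrices are often factorized or shared (as in relational GCNs) to keep parameter counts manageable; the sketch keeps them distinct for readability.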