Vehicular edge computing (VEC) is emerging as a new computing paradigm that improves the quality of vehicular services and enhances the capabilities of vehicles. By deploying computing and storage resources close to vehicles, it helps vehicles execute tasks with low latency. However, traditional task offloading schemes miss fine-grained offloading opportunities for subtasks by neglecting the decomposability of tasks and the dependencies among subtasks. Furthermore, the continuous control problem that arises in learning-based offloading schemes must be taken into account. In this paper, we propose an efficient dependency-aware task offloading scheme that minimizes the average processing delay of tasks by fully exploiting end-edge-cloud collaborative computing. Specifically, we first use the directed acyclic graph (DAG) technique to model inter-subtask dependencies, which establishes the priority of subtask execution. Second, we propose a task offloading algorithm based on Deep Deterministic Policy Gradient (DDPG), a reinforcement learning (RL) method, to obtain the optimal offloading strategy in a dynamic environment; it efficiently handles continuous control problems and converges quickly. Finally, we conduct extensive simulation experiments, and the results show that the proposed dependency-aware task offloading scheme achieves good performance.
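As a concrete illustration of the DAG-based dependency modeling mentioned above (a minimal sketch, not the paper's exact formulation), the execution priority of subtasks can be derived with a topological sort over the dependency graph. The function name, graph encoding, and use of Kahn's algorithm here are illustrative assumptions:

```python
from collections import deque

def execution_priority(num_subtasks, edges):
    """Return one valid execution order for subtasks whose
    dependencies form a directed acyclic graph (DAG).

    edges: list of (u, v) pairs meaning subtask u must finish before v.
    """
    successors = {i: [] for i in range(num_subtasks)}
    indegree = {i: 0 for i in range(num_subtasks)}
    for u, v in edges:
        successors[u].append(v)
        indegree[v] += 1

    # Kahn's algorithm: repeatedly schedule subtasks with no pending
    # (unfinished) predecessors.
    ready = deque(i for i in range(num_subtasks) if indegree[i] == 0)
    order = []
    while ready:
        u = ready.popleft()
        order.append(u)
        for v in successors[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                ready.append(v)

    if len(order) != num_subtasks:
        raise ValueError("dependency graph contains a cycle, not a DAG")
    return order
```

Any order produced this way guarantees that a subtask is only scheduled after all of its predecessors, which is the precondition for offloading subtasks independently to end, edge, or cloud resources.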