Edge computing is an emerging paradigm that provides cloud computing capacity at the edge of the network, near mobile users. It offers an effective way to support mobile devices running computation-intensive and delay-sensitive tasks. However, the network edge is a dynamic environment with a large number of devices, highly mobile end users, heterogeneous applications, and intermittent traffic. In such an environment, edge computing faces the workload scheduling problem: how to efficiently schedule incoming tasks from mobile devices to edge servers or cloud servers, which is a hard, online problem. In this work, we focus on the workload scheduling problem with the goals of balancing the workload, reducing service time, and minimizing the failed-task rate. We propose a reinforcement learning-based approach that learns from previous actions and makes effective scheduling decisions in the absence of a mathematical model of the environment. Simulation results show that, compared with other approaches, our approach achieves the best performance in terms of service time, virtual machine utilization, and failed-task rate. Our reinforcement learning-based approach thus provides an efficient solution to the workload scheduling problem in edge computing.
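The abstract does not specify the paper's state, action, or reward design. As a minimal illustrative sketch, assuming a tabular Q-learning agent that dispatches each incoming task to one of several servers, with a toy environment in which service time grows with the chosen server's load and an overloaded server fails the task (all names and dynamics below are assumptions for illustration, not the paper's simulator):

```python
import random

NUM_SERVERS = 3          # action = index of the server that receives the task
LOAD_LEVELS = 5          # state = discretized load level of each server

def simulate_task(loads, server):
    """Toy environment step: service a task on the chosen server."""
    service_time = 1.0 + loads[server]          # busier server -> slower
    failed = loads[server] >= LOAD_LEVELS - 1   # overloaded server drops the task
    new_loads = list(loads)
    if not failed:
        new_loads[server] += 1
    # other servers drain one unit of load per step
    for s in range(NUM_SERVERS):
        if s != server and new_loads[s] > 0:
            new_loads[s] -= 1
    # reward favors short service time and strongly penalizes failed tasks
    reward = -service_time - (10.0 if failed else 0.0)
    return tuple(new_loads), reward

def train(episodes=2000, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    """Tabular Q-learning over (load vector, server choice) pairs."""
    rng = random.Random(seed)
    q = {}  # (state, action) -> estimated value
    for _ in range(episodes):
        state = (0,) * NUM_SERVERS
        for _ in range(20):  # tasks handled per episode
            if rng.random() < epsilon:               # explore
                action = rng.randrange(NUM_SERVERS)
            else:                                    # exploit current estimates
                action = max(range(NUM_SERVERS),
                             key=lambda a: q.get((state, a), 0.0))
            next_state, reward = simulate_task(state, action)
            best_next = max(q.get((next_state, a), 0.0)
                            for a in range(NUM_SERVERS))
            old = q.get((state, action), 0.0)
            q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
            state = next_state
    return q

q = train()
# With server 0 heavily loaded, the learned policy should steer tasks elsewhere.
state = (4, 0, 0)
action = max(range(NUM_SERVERS), key=lambda a: q.get((state, a), 0.0))
print(action)
```

This sketches the model-free property the abstract highlights: the agent needs no analytical model of the edge environment, only observed rewards, to learn a policy that balances load and avoids failed tasks.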