Video streaming is the dominant traffic type consumed by mobile devices. This growing demand has been a major driver of research on bitrate adaptation algorithms. Bitrate adaptation ensures high user-perceived quality, which in turn correlates with higher profits for content providers and delivery systems. Dynamic adaptive streaming over HTTP (DASH) is a widely adopted video streaming standard used by service providers to deliver a competitive quality of experience (QoE). It enables seamless streaming under uncertain network conditions by switching among video quality levels and their corresponding segment bitrates. The complexity of the video streaming environment makes it a good candidate for learning-based approaches. Accordingly, this paper proposes a reinforcement-learning (RL) deep Q-network, called DQNReg, that enhances the classical deep Q-learning method. A segment-wise, QoE-based reward function is formulated so that the learning strategy converges towards maximizing QoE. The proposed RL-based adaptation approach is evaluated through trace-driven simulation over both wireless local area network (WLAN) and 5G mobile channels. Its performance is compared against three baselines: a heuristic method, a model-based method, and a classical learning-based method. The comparison shows that the RL-based method converges faster while achieving a high QoE score. In addition, it reduces re-buffering duration while maintaining higher video quality and relatively low quality variation.