At present, most image quality assessment techniques are restricted to extracting features from an image in a single color space. To improve the accuracy of image quality evaluation, this paper proposes a new dual-space multi-feature fusion approach to full-reference image quality assessment that simultaneously extracts features in the YIQ and L*a*b* color spaces. First, luminance features and slope features are extracted from the luminance channel to describe differences in the image's brightness and texture. Chroma features are extracted from the color channels to describe differences in the image's color and saturation. Gradient features are extracted to capture the image's structural details, and spatial-frequency features are extracted to represent the distribution of different spatial frequencies within the image. Next, the features extracted in both spaces are fused into a single feature vector. Finally, the feature vector and the image's subjective score are fed into a random forest model to predict image quality. Extensive experiments were carried out on four public databases, with comparisons against alternative methods; the results confirm that the proposed algorithm predicts image quality more accurately. The MATLAB source code and dataset for this article will be published on GitHub, and the corresponding author can be contacted if necessary.
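The pipeline described above (color-space conversion, per-channel feature extraction, fusion into one vector, and random forest regression) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the conversion uses the standard NTSC RGB-to-YIQ matrix, the features are toy stand-ins (per-channel mean absolute difference plus a luminance gradient-magnitude difference), and the training data and subjective scores are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Standard NTSC RGB -> YIQ conversion matrix
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def rgb_to_yiq(img):
    """Convert an HxWx3 RGB image (floats in [0, 1]) to YIQ."""
    return img @ RGB2YIQ.T

def extract_features(ref, dist):
    """Toy stand-ins for the paper's luminance, chroma, and gradient
    features: mean absolute difference per YIQ channel, plus the mean
    absolute difference of luminance gradient magnitudes."""
    ref_yiq, dist_yiq = rgb_to_yiq(ref), rgb_to_yiq(dist)
    feats = [np.mean(np.abs(ref_yiq[..., c] - dist_yiq[..., c]))
             for c in range(3)]  # Y, I, Q channels
    # Gradient magnitude on the luminance (Y) channel captures structure
    gy_r, gx_r = np.gradient(ref_yiq[..., 0])
    gy_d, gx_d = np.gradient(dist_yiq[..., 0])
    feats.append(np.mean(np.abs(np.hypot(gx_r, gy_r) -
                                np.hypot(gx_d, gy_d))))
    return np.array(feats)  # fused feature vector

# Train a random forest on synthetic reference/distorted pairs with
# placeholder subjective scores (real use would supply MOS values).
rng = np.random.default_rng(0)
X, y = [], []
for _ in range(50):
    ref = rng.random((32, 32, 3))
    dist = np.clip(ref + rng.normal(0, 0.1, ref.shape), 0, 1)
    X.append(extract_features(ref, dist))
    y.append(rng.random())  # placeholder subjective score
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
score = model.predict([extract_features(ref, dist)])
```

In the paper's full method, analogous features would also be extracted in L*a*b* space and concatenated into the same vector before training.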