Multimodal recommendation systems aim to capture diverse user preferences through data such as text and images, offering more personalized recommendation services. Accurately grasping user preferences can improve recommendation precision and enhance the user experience. Existing models often exploit items' external attributes to capture user preferences while neglecting intrinsic attributes, such as ease of cleaning and cost-effectiveness, which are revealed in often-overlooked reviews. However, directly incorporating reviews into the representation introduces additional noise. We therefore propose the Personalized Multi-Preference Recommender (PMPR) model, which integrates reviews with multimodal data to extract multifaceted user preferences and enhance the personalization of recommendations. Specifically, we design a heterogeneous graph learning module based on reviews and a homogeneous graph learning module that combines review features with multimodal features to capture users' diverse preferences. Because reviews vary in informational content, PMPR processes each review individually and uses user IDs to generate review attention vectors that aggregate the review features and reduce review noise. Finally, we apply the Top-K method to generate recommendations. We compare common review-processing methods with PMPR's approach to validate its effectiveness. In comparative analyses across three publicly available datasets, PMPR consistently outperforms seven widely recognized baseline models. The results show an improvement of 2.76% to 22.52% in average performance over the current state-of-the-art (SOTA) model.
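
To make the review-aggregation step concrete, the following is a minimal sketch of user-ID-conditioned review attention in PyTorch. The module name `ReviewAttentionAggregator`, the dimensions, and the dot-product scoring are illustrative assumptions for this sketch, not the paper's exact formulation; each review is assumed to be encoded individually into a feature vector beforehand.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReviewAttentionAggregator(nn.Module):
    """Hypothetical sketch: aggregate individually encoded review vectors
    into one feature per item, weighting each review by an attention score
    conditioned on the user's ID embedding to down-weight noisy reviews."""

    def __init__(self, num_users: int, review_dim: int, attn_dim: int = 64):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, attn_dim)   # user-ID embedding as attention query
        self.review_proj = nn.Linear(review_dim, attn_dim)  # project reviews into the attention space

    def forward(self, user_ids: torch.Tensor, review_feats: torch.Tensor,
                mask: torch.Tensor) -> torch.Tensor:
        # review_feats: (batch, max_reviews, review_dim), one vector per review
        # mask:         (batch, max_reviews), 1 for real reviews, 0 for padding
        query = self.user_emb(user_ids)                      # (batch, attn_dim)
        keys = self.review_proj(review_feats)                # (batch, max_reviews, attn_dim)
        # Dot-product attention between the user query and each review key.
        scores = torch.einsum('bd,brd->br', query, keys)
        scores = scores.masked_fill(mask == 0, float('-inf'))
        weights = F.softmax(scores, dim=-1)                  # per-review attention weights
        # Weighted sum yields a user-specific, noise-reduced review feature.
        return torch.einsum('br,brd->bd', weights, review_feats)

# Usage example with toy shapes: 2 users, up to 5 reviews of dimension 128 each.
agg = ReviewAttentionAggregator(num_users=1000, review_dim=128)
out = agg(torch.tensor([3, 7]), torch.randn(2, 5, 128), torch.ones(2, 5))
print(out.shape)  # torch.Size([2, 128])
```

Under these assumptions, the same set of reviews yields different aggregated features for different users, since the attention query comes from the user-ID embedding rather than from the item alone.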