The long-tail distribution of relations poses a difficult challenge for knowledge graph embedding. We aim to address this problem by enriching the information of few-shot relations through the neighbor aggregation mechanism of GCNs. However, the GCN method and its derivatives ignore semantic information and cannot learn representations of edges. To address this problem, we propose RCGCN-TE, a Relation Correlations-aware Graph Convolutional Network with Text Enhancement for knowledge graph embedding. To the best of our knowledge, this is the first effort to enable a GCN to learn relation representations directly. First, a pre-trained language model is used to extract semantic information from description text. Then, a relation correlation graph is constructed by defining a relation relevance function based on the co-occurrence patterns and semantic similarity of relations. Finally, two GCNs are designed to learn entity and relation representations respectively, with a loss function that allows them to learn jointly. The experimental results show that the proposed model outperforms the baselines on tasks such as triple classification and link prediction. For example, Hits@10, Hits@3, and Hits@1 improve by 8.23%, 37.49%, and 46.94%, respectively, on the entity prediction task.
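The relation relevance function described above could be sketched as follows. This is a minimal illustration, not the paper's actual formulation: the function name `relation_relevance`, the mixing weight `alpha`, and the specific normalization are all assumptions; the paper only states that the score combines relation co-occurrence patterns with the semantic similarity of text-derived relation embeddings.

```python
import numpy as np

def relation_relevance(cooc, emb, alpha=0.5):
    """Hypothetical relevance score between every pair of relations:
    a convex combination (weight `alpha`, an assumed parameter) of
    normalized co-occurrence counts and the cosine similarity of
    relation embeddings extracted from description text."""
    cooc = np.asarray(cooc, dtype=float)
    # Scale co-occurrence counts into [0, 1] (assumed normalization).
    max_count = cooc.max()
    cooc_norm = cooc / max_count if max_count > 0 else cooc
    # Cosine similarity between all pairs of relation embeddings.
    emb = np.asarray(emb, dtype=float)
    unit = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    cos_sim = unit @ unit.T
    # Edges of the relation correlation graph would be drawn where
    # this score exceeds some threshold.
    return alpha * cooc_norm + (1 - alpha) * cos_sim

# Toy usage: two relations that co-occur but have orthogonal
# text embeddings.
cooc = np.array([[0, 4], [4, 0]])
emb = np.array([[1.0, 0.0], [0.0, 1.0]])
scores = relation_relevance(cooc, emb, alpha=0.5)
```

The resulting symmetric score matrix would then define the adjacency of the relation correlation graph over which the relation-side GCN aggregates.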