We implement a decentralized method for collaborative filtering via matrix factorization that can rest on peer-to-peer Distributed Hash Table (DHT) architectures such as Pastry and Chord, with the goal of realizing a scalable and robust decentralized search engine. We begin by introducing a centralized matrix factorization method that minimizes a cost function over the ratings and the user and item latent factors. We introduce user and item biases and derive the gradients for the error update step. We then introduce gossip learning, a matrix-factorization-based collaborative filtering approach for decentralized networks. Since we could not obtain a dataset of users' browsing activity in which the rating would be the number of times a website is visited, we tested our method on a dataset of movie ratings; this preserves the user and item biases and supplies the user-item ratings needed to compute the prediction error against the latent factors. We performed a train-test split and loaded the training and validation sets onto each node separately. The method is then evaluated with the PeerSim peer-to-peer network simulator to find the best learning rate $\eta$, regularization parameter $\lambda$, and number of latent factors $k$ for a $10,000$-node peer-to-peer network. The optimization metric is the Root Mean Squared Error (RMSE) between the ratings in the validation set and their predictions by the model updated on the training set at each node. We found that by the fortieth cycle of the PeerSim simulator, the training set is fully distributed across the nodes. The best-performing parameter setting over our $63$ experiment runs has learning rate $\eta = 0.001$, regularization parameter $\lambda = 1$, and $k = 20$ latent factors. These settings are then built into a basic working prototype of a Pastry-based peer-to-peer distributed search engine with message passing, tested with $50$ nodes.
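The biased matrix-factorization update described above can be sketched as a single stochastic gradient step per observed rating. The sketch below is illustrative only: the function name `sgd_update`, the global-mean term `mu`, and the exact gradient form are our assumptions about a standard biased-MF formulation, not code from the paper; the defaults match the best-performing setting reported ($\eta = 0.001$, $\lambda = 1$).

```python
import numpy as np

def sgd_update(p_u, q_i, b_u, b_i, mu, r_ui, eta=0.001, lam=1.0):
    """One SGD step of biased matrix factorization for a single rating r_ui.

    p_u, q_i : latent factor vectors (length k) for user u and item i.
    b_u, b_i : user and item biases; mu is the global mean rating.
    eta, lam : learning rate and L2 regularization (assumed standard form).
    """
    pred = mu + b_u + b_i + p_u @ q_i          # predicted rating
    e = r_ui - pred                            # prediction error
    # Gradient steps with L2 regularization on biases and factors
    b_u_new = b_u + eta * (e - lam * b_u)
    b_i_new = b_i + eta * (e - lam * b_i)
    p_u_new = p_u + eta * (e * q_i - lam * p_u)
    q_i_new = q_i + eta * (e * p_u - lam * q_i)
    return p_u_new, q_i_new, b_u_new, b_i_new, e
```

In a gossip-learning setting, each node would apply this update to the models it holds using its local ratings before forwarding the models to a peer; the RMSE reported in the abstract aggregates the error $e$ over the validation set.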