With recent advances in deep learning-based image classification, huge amounts of training data are needed to avoid overfitting. Moreover, supervised deep learning models require labeled datasets, and preparing such large labeled datasets takes considerable human effort and time. Self-supervised models are therefore becoming popular because of their ability to learn even from unlabeled datasets. This paper presents a method to automatically tune the hyperparameters of a self-supervised model for image classification. The learned features are transferred from a self-supervised model to a set of Fully Connected (FC) layers, whose architecture-related hyperparameters are tuned; a softmax layer follows the FC layers for classification. Hyperparameters such as the number of layers, the number of units in each layer, the learning rate, the dropout rate, and the optimizer are tuned automatically in these FC layers using a Bayesian optimization technique, the Tree-structured Parzen Estimator (TPE) algorithm. To evaluate the performance of the proposed method, state-of-the-art self-supervised models such as SimCLR and SwAV are used to extract the learned features. Experiments are carried out on the CIFAR-10, CIFAR-100, and Tiny ImageNet datasets, and the proposed method outperforms the state of the art.