Jaydeep Kishore, Snehasis Mukherjee. Impact of Autotuned Fully Connected Layers on Performance of Self-supervised Models for Image Classification[J]. Machine Intelligence Research. DOI: 10.1007/s11633-023-1435-7

Impact of Autotuned Fully Connected Layers on Performance of Self-supervised Models for Image Classification

  • With the recent advances in deep learning-based methods for image classification, a huge amount of training data is required to avoid overfitting. Moreover, supervised deep learning models require labelled datasets for training, and preparing such a huge amount of labelled data demands considerable human effort and time. In this scenario, self-supervised models are becoming popular because of their ability to learn even from unlabelled datasets. However, efficiently transferring the knowledge learned by a self-supervised model to a target task remains an open problem. This paper proposes a method for such transfer, in which fully connected (FC) layers are appended to the pretrained self-supervised model. Hyperparameters of these FC layers, such as the number of layers, the number of units in each layer, the learning rate, and the dropout rate, are tuned automatically using a Bayesian optimization technique called the tree-structured Parzen estimator (TPE). To evaluate the performance of the proposed method, state-of-the-art self-supervised models such as SimCLR and SwAV are used to extract the learned features. Experiments are carried out on the CIFAR-10, CIFAR-100, and Tiny ImageNet datasets. The proposed method outperforms the baseline approach by margins of 2.97%, 2.45%, and 0.91% on CIFAR-100, Tiny ImageNet, and CIFAR-10, respectively.
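The abstract describes a search over the FC-head hyperparameters (number of layers, units per layer, learning rate, dropout) driven by a TPE sampler. The paper's actual search ranges and objective are not given here; the sketch below is a minimal, stdlib-only illustration of such a search loop, with plain random sampling standing in for the TPE sampler (a real run would use a TPE implementation such as Optuna's `TPESampler`) and a toy objective standing in for validation accuracy. All ranges and names are illustrative assumptions, not values from the paper.

```python
import random

# Hypothetical search space for the FC classification head
# (ranges are illustrative, not taken from the paper).
SEARCH_SPACE = {
    "num_layers": (1, 3),    # number of FC layers appended to the backbone
    "units": (64, 1024),     # units in each FC layer
    "dropout": (0.0, 0.5),   # dropout rate between FC layers
}

def sample_config(rng: random.Random) -> dict:
    """Draw one hyperparameter configuration from the search space.

    A TPE sampler would instead fit Parzen estimators to the 'good'
    and 'bad' trials seen so far and sample where their density ratio
    favours good trials; uniform random sampling stands in here.
    """
    num_layers = rng.randint(*SEARCH_SPACE["num_layers"])
    return {
        "num_layers": num_layers,
        "units": [rng.randint(*SEARCH_SPACE["units"]) for _ in range(num_layers)],
        "learning_rate": 10 ** rng.uniform(-5, -2),  # log-uniform in [1e-5, 1e-2]
        "dropout": rng.uniform(*SEARCH_SPACE["dropout"]),
    }

def best_of(n_trials: int, objective, seed: int = 0) -> tuple:
    """Evaluate n_trials sampled configs; return (best_score, best_config)."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        cfg = sample_config(rng)
        score = objective(cfg)
        if best is None or score > best[0]:
            best = (score, cfg)
    return best

# Toy objective preferring moderate dropout, a stand-in for training the
# FC head on frozen SimCLR/SwAV features and measuring validation accuracy.
score, cfg = best_of(20, lambda c: 1.0 - abs(c["dropout"] - 0.25))
```

The same loop structure carries over to a real TPE study: only `sample_config` (replaced by the sampler) and the objective (replaced by an actual train-and-validate run) change.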
