Advisor: 黃聰明 (Huang, Tsung-Ming)
Author: 劉又寧 (Liu, You-Ning)
Dates: 2022-06-08; 2022-11-05; 2022-06-08; 2022
URL: https://etds.lib.ntnu.edu.tw/thesis/detail/4696e22f128e07651f0e1771157d5da5/
URI: http://rportal.lib.ntnu.edu.tw/handle/20.500.12235/117059
Abstract: Recently, many deep learning methods have been proposed to learn representations or to cluster data without labels. Using the well-known ResNet [1] backbone as an effective feature extractor, we present an efficient deep clustering method that jointly optimizes the data representation and learns the clustering map. Despite the many successful applications of the Kullback–Leibler divergence and Shannon entropy, we adopt the alpha-divergence and Tsallis entropy as generalizations of these common loss functions. For a more detailed interpretation, we further analyze the relation between clustering accuracy and distinct alpha values. We achieve 53.96% test accuracy on the CIFAR-10 [2] dataset and 27.24% accuracy on the CIFAR-100-20 [2] dataset in the unsupervised setting.
Keywords: Alpha-Divergence; Deep Learning; Deep Clustering; Contrastive Learning; ResNet; Tsallis Entropy; KL Divergence; Shannon Entropy
Title: Unsupervised Clustering Based on Alpha-Divergence
Type: Academic thesis (學術論文)
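The abstract describes replacing the KL divergence and Shannon entropy with their one-parameter generalizations. As a minimal sketch of that idea (using the common Amari parameterization of the alpha-divergence, which may differ from the exact form used in the thesis; all function names here are illustrative), both quantities recover their classical counterparts in the limit alpha → 1:

```python
import math

def alpha_divergence(p, q, alpha):
    """Amari alpha-divergence between discrete distributions p and q.

    Approaches KL(p || q) as alpha -> 1. This is one common
    parameterization; the thesis may use a different convention.
    """
    if abs(alpha) < 1e-12 or abs(alpha - 1.0) < 1e-12:
        raise ValueError("use the KL limit for alpha in {0, 1}")
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return (1.0 - s) / (alpha * (1.0 - alpha))

def tsallis_entropy(p, alpha):
    """Tsallis entropy; recovers Shannon entropy (in nats) as alpha -> 1."""
    if abs(alpha - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

def kl_divergence(p, q):
    """Classical KL divergence, used here only to check the limit."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
# Near alpha = 1 the alpha-divergence is close to KL(p || q),
# and the Tsallis entropy of a fair coin is close to ln 2.
print(abs(alpha_divergence(p, q, 0.999) - kl_divergence(p, q)))
print(abs(tsallis_entropy([0.5, 0.5], 0.999) - math.log(2)))
```

Varying alpha away from 1 reweights how strongly the loss penalizes mass mismatches between the two distributions, which is the knob the thesis analyzes against clustering accuracy.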