Unsupervised Clustering Based on Alpha-Divergence
dc.contributor | 黃聰明 | zh_TW |
dc.contributor | Huang, Tsung-Ming | en_US |
dc.contributor.author | 劉又寧 | zh_TW |
dc.contributor.author | Liu, You-Ning | en_US |
dc.date.accessioned | 2022-06-08T02:38:50Z | |
dc.date.available | 2022-11-05 | |
dc.date.available | 2022-06-08T02:38:50Z | |
dc.date.issued | 2022 | |
dc.description.abstract | none | zh_TW |
dc.description.abstract | Recently, many deep learning methods have been proposed to learn representations or to cluster without labelled data. Using the well-known ResNet[1] backbone as an effective feature extractor, we present an efficient deep clustering method that jointly optimizes the data representation and learns the clustering map. Building on the many successful applications of Kullback–Leibler divergence and Shannon entropy, we use alpha-divergence and Tsallis entropy as extensions of these common loss functions. For a more detailed interpretation, we further analyze the relation between clustering accuracy and distinct values of alpha. We achieve 53.96% test accuracy on the CIFAR-10[2] dataset and 27.24% accuracy on the CIFAR-100-20[2] dataset in unsupervised tasks. | en_US |
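The abstract does not reproduce the thesis's loss functions, but the two quantities it names have standard textbook definitions. The following NumPy sketch shows Amari's alpha-divergence and the Tsallis entropy, both of which reduce to the KL divergence and the Shannon entropy as alpha approaches 1; the function names and the handling of the limiting cases are illustrative assumptions, not the thesis implementation.

    import numpy as np

    def alpha_divergence(p, q, alpha):
        # Amari alpha-divergence between discrete distributions p and q
        # (entries assumed strictly positive and summing to 1).
        # Limits: alpha -> 1 gives KL(p||q); alpha -> 0 gives KL(q||p).
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        if np.isclose(alpha, 1.0):
            return np.sum(p * np.log(p / q))
        if np.isclose(alpha, 0.0):
            return np.sum(q * np.log(q / p))
        return (1.0 - np.sum(p**alpha * q**(1.0 - alpha))) / (alpha * (1.0 - alpha))

    def tsallis_entropy(p, alpha):
        # Tsallis entropy of a discrete distribution p;
        # reduces to the Shannon entropy as alpha -> 1.
        p = np.asarray(p, dtype=float)
        if np.isclose(alpha, 1.0):
            return -np.sum(p * np.log(p))
        return (1.0 - np.sum(p**alpha)) / (alpha - 1.0)

A clustering loss of the kind described can then be assembled from these pieces, e.g. an alpha-divergence term between predicted and target cluster assignments plus a Tsallis-entropy term on the marginal cluster distribution; the exact combination used in the thesis is not stated in this record.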
dc.description.sponsorship | 數學系 | zh_TW |
dc.identifier | 60740018S-40911 | |
dc.identifier.uri | https://etds.lib.ntnu.edu.tw/thesis/detail/4696e22f128e07651f0e1771157d5da5/ | |
dc.identifier.uri | http://rportal.lib.ntnu.edu.tw/handle/20.500.12235/117059 | |
dc.language | English | |
dc.subject | none | zh_TW |
dc.subject | Alpha-Divergence | en_US |
dc.subject | Deep Learning | en_US |
dc.subject | Deep Clustering | en_US |
dc.subject | Contrastive Learning | en_US |
dc.subject | ResNet | en_US |
dc.subject | Tsallis Entropy | en_US |
dc.subject | KL Divergence | en_US |
dc.subject | Shannon Entropy | en_US |
dc.title | Unsupervised Clustering Based on Alpha-Divergence | zh_TW |
dc.title | Unsupervised Clustering Based on Alpha-Divergence | en_US |
dc.type | Academic thesis |