Improve Fine-grained Visual Classification Accuracy by Controllable Location Knowledge Distillation
| dc.contributor | 林政宏 | zh_TW |
| dc.contributor | Lin, Cheng-Hung | en_US |
| dc.contributor.author | 蔡侑霖 | zh_TW |
| dc.contributor.author | Tsai, You-Lin | en_US |
| dc.date.accessioned | 2024-12-17T03:22:20Z | |
| dc.date.available | 2024-08-12 | |
| dc.date.issued | 2024 | |
| dc.description.abstract | Current neural network models have achieved impressive performance, but such models often share a common problem: their architectures are too large to deploy easily on edge devices. To address this drawback, prior work proposed a breakthrough solution: knowledge distillation. Knowledge distillation can effectively transfer the features learned by a large network architecture to a simple model, thereby reducing model complexity, and past approaches have achieved good transfer results. In the field of fine-grained image classification, however, few distillation methods tailored specifically to fine-grained image recognition have been proposed. In this thesis, we add a strategy that adjusts the model's attention distribution over image regions to the knowledge distillation process for fine-grained image recognition, so that the model can focus more on fine-grained image features during knowledge transfer. The strategy mainly adjusts the spatial characteristics of heatmaps: it enhances secondary feature regions so that they serve better as distilled knowledge, and it filters out low-feedback regions, effectively reducing noise and improving learning focus. Our experiments cover multiple fine-grained datasets; on CUB200-2011, our approach improves accuracy by 4.86% over the original non-distilled model and by 1.05% over traditional knowledge distillation. | zh_TW |
| dc.description.abstract | State-of-the-art neural network models exhibit impressive performance but often suffer from large architectures, making them challenging to deploy on edge devices. Knowledge distillation offers a solution by transferring the features learned by a complex network to a simpler model, effectively reducing the model’s complexity. While past approaches have achieved good transfer results, there has been limited exploration of distillation methods specifically tailored for fine-grained image classification. In this paper, we introduce a knowledge distillation process for fine-grained image recognition that incorporates an adjustment strategy for the model's attention distribution over region importance. This allows the model to focus more on the fine-grained features of images. The strategy primarily involves adjusting the spatial characteristics of heatmaps to enhance secondary feature regions so that they serve better as distilled knowledge, while filtering out low-feedback areas, effectively reducing noise and improving learning focus. Our experiments conducted on multiple fine-grained datasets, including CUB200-2011, show that compared to the original non-distilled model, our approach achieves a 4.86% increase in accuracy. Additionally, it outperforms traditional knowledge distillation methods by 1.05%, demonstrating its effectiveness and potential benefits. | en_US |
| dc.description.sponsorship | Department of Electrical Engineering | zh_TW |
| dc.identifier | 61075078H-45988 | |
| dc.identifier.uri | https://etds.lib.ntnu.edu.tw/thesis/detail/a042700e6d647efa81ab75ac54c2dcc2/ | |
| dc.identifier.uri | http://rportal.lib.ntnu.edu.tw/handle/20.500.12235/122921 | |
| dc.language | Chinese | |
| dc.subject | Fine-grained Image Recognition | zh_TW |
| dc.subject | Knowledge Distillation | zh_TW |
| dc.subject | Class Activation Map | zh_TW |
| dc.subject | Fine-grained Visual Classification | en_US |
| dc.subject | Knowledge Distillation | en_US |
| dc.subject | CAM | en_US |
| dc.title | Improve Fine-grained Visual Classification Accuracy by Controllable Location Knowledge Distillation | zh_TW |
| dc.title | Improve Fine-grained Visual Classification Accuracy by Controllable Location Knowledge Distillation | en_US |
| dc.type | Academic thesis |
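
For readers who want a concrete picture of the region-controlled distillation strategy the abstracts describe, here is a minimal sketch, assuming a PyTorch teacher/student setup with feature maps of matching spatial size. Everything in it is an assumption rather than the author's actual implementation: the function name `cam_weighted_distill_loss`, the `floor`, `alpha`, and `tau` parameters, and the channel-mean heatmap used as a simple stand-in for the class activation maps (CAM) the thesis refers to.

```python
# Hypothetical sketch of CAM-guided knowledge distillation (not the thesis code).
import torch
import torch.nn.functional as F

def cam_weighted_distill_loss(t_feat, s_feat, t_logits, s_logits,
                              tau=4.0, floor=0.2, alpha=0.5):
    """Distillation loss that reweights spatial regions by a teacher heatmap.

    t_feat, s_feat:     (B, C, H, W) teacher/student feature maps.
    t_logits, s_logits: (B, num_classes) classification logits.
    floor: heatmap values below this threshold are treated as
           low-feedback regions and masked out (noise filtering).
    """
    # Channel-mean activation, min-max normalized to [0, 1], as a simple
    # stand-in for a class activation map (CAM).
    heat = t_feat.mean(dim=1, keepdim=True)                 # (B, 1, H, W)
    heat = heat - heat.amin(dim=(2, 3), keepdim=True)
    heat = heat / (heat.amax(dim=(2, 3), keepdim=True) + 1e-6)

    # Enhance secondary regions: the square root flattens the peak so that
    # moderately activated areas count more; the mask drops low-feedback noise.
    weight = heat.sqrt() * (heat > floor).float()

    # Region-weighted feature matching between student and teacher.
    feat_loss = (weight * (s_feat - t_feat.detach()).pow(2)).mean()

    # Standard soft-label KD term (Hinton et al.) on the logits.
    kd_loss = F.kl_div(F.log_softmax(s_logits / tau, dim=1),
                       F.softmax(t_logits.detach() / tau, dim=1),
                       reduction="batchmean") * tau * tau

    return alpha * feat_loss + (1.0 - alpha) * kd_loss

# Usage with random tensors standing in for real teacher/student outputs:
s_feat = torch.randn(8, 256, 7, 7, requires_grad=True)
s_logits = torch.randn(8, 200, requires_grad=True)  # 200 classes, as in CUB200-2011
loss = cam_weighted_distill_loss(torch.randn(8, 256, 7, 7), s_feat,
                                 torch.randn(8, 200), s_logits)
loss.backward()
```

The square-root reweighting and the floor threshold are just one plausible reading of "enhance secondary regions" and "filter low-feedback areas"; the thesis itself may shape the heatmap differently.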
Files
Original bundle
- Name: 202400045988-108452.pdf
- Size: 2.13 MB
- Format: Adobe Portable Document Format
- Description: Academic thesis