Abstract
To learn more comprehensive representational knowledge, a multi-level knowledge distillation method was proposed for the student model, which divides the teacher model's knowledge into high, middle, and low levels for distillation. These levels of knowledge are, respectively, the model predictions, the multi-scale fusion feature map, and the activations of intermediate feature layers. Distillation terms were then designed on the basis of this multi-level knowledge. Feature distillation based on the low-level knowledge keeps the feature distributions of the student and teacher models as close as possible. Based on the middle-level knowledge, the spatial structure knowledge of images is transferred to the student model. The high-level knowledge is used to encode the dependency between adjacent frames, and this implicit knowledge is then transmitted to the student model. Moreover, an additional semantic consistency loss effectively reduces inconsistent predictions between adjacent frames. Experiments showed that the proposed distillation method significantly improves the accuracy of the student model and achieves a better balance between accuracy and efficiency, indicating good application prospects.
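The abstract does not give the exact loss formulations, but two of the distillation terms it describes can be sketched under common assumptions: MSE matching for the low-level feature distillation, and temperature-softened KL divergence for distilling the model predictions. The function names, the temperature `T`, and the choice of MSE/KL here are illustrative assumptions, not the paper's verified formulation.

```python
import numpy as np

def softmax(x, T=1.0):
    # Temperature-scaled softmax along the last axis (numerically stable).
    z = x / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def feature_distillation_loss(f_student, f_teacher):
    # Low-level term (assumed form): pull student feature activations
    # toward the teacher's with a mean-squared-error penalty.
    return float(np.mean((f_student - f_teacher) ** 2))

def prediction_distillation_loss(logits_student, logits_teacher, T=2.0):
    # High-level term (assumed form): soften both models' predictions
    # with temperature T and match them via KL(teacher || student).
    # The T*T factor keeps gradient magnitudes comparable across T.
    p_t = softmax(logits_teacher, T=T)
    log_p_s = np.log(softmax(logits_student, T=T) + 1e-12)
    log_p_t = np.log(p_t + 1e-12)
    kl = np.sum(p_t * (log_p_t - log_p_s), axis=-1)
    return float(np.mean(kl) * T * T)
```

In a training loop these terms would be weighted and summed with the task loss; the middle-level (spatial structure) and semantic consistency terms would add analogous penalties over the multi-scale fusion features and adjacent-frame predictions.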
| Translated title of the contribution | Semantic segmentation method for continuous images based on multi-level knowledge distillation |
|---|---|
| Original language | Chinese (Traditional) |
| Pages (from-to) | 1244-1253 |
| Number of pages | 10 |
| Journal | Jisuanji Jicheng Zhizao Xitong/Computer Integrated Manufacturing Systems, CIMS |
| Volume | 29 |
| Issue number | 4 |
| DOIs | |
| State | Published - 30 Apr 2023 |
| Externally published | Yes |