Abstract
Real-world data usually exhibit long-tailed distributions. Training on such imbalanced data tends to make neural networks perform well on head classes but much worse on tail classes. The severe scarcity of training instances for tail classes is the main challenge, as it leads to biased distribution estimation during training. Considerable effort has been devoted to mitigating this challenge, including data resampling and synthesizing new training instances for tail classes. However, no prior research has exploited transferable knowledge from head classes to calibrate the distribution of tail classes. In this article, we posit that tail classes can be enriched by similar head classes and propose a novel distribution calibration (DC) approach named label-aware DC (LADC). LADC transfers statistics from relevant head classes to infer the distribution of tail classes; sampling from the calibrated distribution further facilitates rebalancing the classifier. Experiments on both image and text long-tailed datasets demonstrate that LADC significantly outperforms existing methods, and visualizations show that LADC provides a more accurate distribution estimation.
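The general idea described in the abstract — estimating tail-class distributions by borrowing statistics from similar head classes, then sampling synthetic features to rebalance the classifier — can be illustrated with a minimal sketch. The code below is an assumption-laden toy implementation of this generic distribution-calibration scheme, not the paper's exact formulation: the blending weight `alpha`, the nearest-neighbor selection by class-mean distance, and the simple covariance averaging are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def calibrate_tail_class(tail_feats, head_stats, k=2, alpha=0.5, n_samples=100):
    """Toy distribution calibration: borrow mean/covariance statistics
    from the k nearest head classes to enrich a tail class.
    (A simplified sketch of the general idea, not the paper's method.)"""
    mu_t = tail_feats.mean(axis=0)
    # Rank head classes by Euclidean distance between class means.
    dists = {c: np.linalg.norm(mu - mu_t) for c, (mu, _cov) in head_stats.items()}
    nearest = sorted(dists, key=dists.get)[:k]
    # Calibrated mean: blend the (noisy) tail mean with nearby head means.
    mu_c = alpha * mu_t + (1 - alpha) * np.mean(
        [head_stats[c][0] for c in nearest], axis=0)
    # Calibrated covariance: average the head covariances, since the
    # tail covariance cannot be reliably estimated from a few samples.
    cov_c = np.mean([head_stats[c][1] for c in nearest], axis=0)
    # Sample synthetic features from the calibrated Gaussian to
    # augment the tail class when rebalancing the classifier.
    return rng.multivariate_normal(mu_c, cov_c, size=n_samples)

# Toy example: two well-populated head classes, one under-sampled tail class.
head_stats = {}
for c, center in enumerate([np.zeros(4), np.ones(4) * 3.0]):
    feats = rng.normal(center, 1.0, size=(500, 4))
    head_stats[c] = (feats.mean(axis=0), np.cov(feats, rowvar=False))

tail_feats = rng.normal(0.5, 1.0, size=(5, 4))  # only 5 tail instances
synthetic = calibrate_tail_class(tail_feats, head_stats, k=1)
print(synthetic.shape)  # (100, 4)
```

In this sketch the synthetic features would be appended to the tail class's training set before retraining (or fine-tuning) the classifier head, which is the rebalancing step the abstract refers to.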
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 6963-6975 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 35 |
| Issue number | 5 |
| DOIs | |
| State | Published - 1 May 2024 |
| Externally published | Yes |
Keywords
- Deep learning
- long-tailed classification
- neural networks