Abstract
Federated domain generalization (FedDG) aims to enhance the generalization ability of federated models when they face new clients with domain shifts. Loss-landscape flatness is a critical factor for generalization, but pursuing flatness of the global model in FedDG is challenging due to privacy restrictions. To address this, we propose DCF, a Data-Centric Flatness optimization algorithm that approaches global flatness via upper bounds computed on surrogates of the global data. DCF generates surrogate data within local clients through a globally constrained adversarial data augmentation strategy, decomposing the pursuit of global flatness into objectives on the local models. Theoretical analysis supports this design by showing that the proposed objective upper-bounds the robust risk of the global model on the global data distribution. Extensive experiments on multiple FedDG benchmarks demonstrate that our method consistently outperforms previous FedDG and federated learning methods.
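To make the data-centric idea concrete, the following is a minimal sketch of adversarial data augmentation: perturbing an input within a norm ball to increase the loss, so that training on the perturbed (surrogate) data approximates a worst-case, flatness-related objective. This is a generic illustration under a simple linear model with squared loss, not the paper's DCF algorithm; the function name, loss, and all parameters are hypothetical.

```python
import numpy as np

def adversarial_augment(x, w, y, step=0.1, radius=0.5, iters=5):
    """Perturb x within an L2 ball of the given radius to increase
    the squared loss 0.5*(w.x - y)^2 -- a generic sketch of
    adversarial data augmentation, not the paper's DCF method."""
    x_adv = x.copy()
    for _ in range(iters):
        # Gradient of 0.5*(w.x - y)^2 with respect to the input x.
        grad = (x_adv @ w - y) * w
        # Normalized gradient-ascent step on the input.
        x_adv = x_adv + step * grad / (np.linalg.norm(grad) + 1e-12)
        # Project back into the L2 ball around the original input.
        delta = x_adv - x
        norm = np.linalg.norm(delta)
        if norm > radius:
            x_adv = x + delta * (radius / norm)
    return x_adv
```

Training on such perturbed samples yields a loss no smaller than the clean loss, which is the sense in which surrogate data can serve as an upper bound on the risk under small model or data shifts.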
| Field | Value |
|---|---|
| Original language | English |
| Article number | 113023 |
| Journal | Pattern Recognition |
| Volume | 174 |
| DOIs | |
| State | Published - Jun 2026 |
| Externally published | Yes |
Keywords
- Data augmentation
- Federated domain generalization
- Flatness
Fingerprint
Dive into the research topics of 'Federated domain generalization via data-centric flatness optimization'. Together they form a unique fingerprint.