Federated domain generalization via data-centric flatness optimization

  • School of Computer Science and Technology, Harbin Institute of Technology
  • Tsinghua University

Research output: Contribution to journal › Article › peer-review

Abstract

Federated domain generalization (FedDG) aims to improve the generalization of models trained in a federated manner when they face new clients with domain shifts. Flatness of the loss landscape is a critical factor for generalization, but pursuing flatness of the global model in FedDG is challenging because privacy restrictions prevent direct access to global data. To address this, we propose DCF, a Data-Centric Flatness optimization algorithm that approximates global flatness through upper bounds computed on surrogates of the global data. DCF generates surrogate data within each local client using a globally constrained adversarial data augmentation strategy, thereby decomposing the pursuit of global flatness into flatness objectives for the local models. Theoretical analysis supports this design by showing that the proposed objective is an upper bound on the robust risk of the global model under the global data distribution. Extensive experiments on multiple FedDG benchmarks demonstrate that our method consistently outperforms previous FedDG and federated learning methods.
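The abstract's two local ingredients — adversarial surrogate-data generation and a flatness-seeking weight update — can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the scalar linear model, the loss-ascent-plus-clipping surrogate step (standing in for the paper's "globally constrained" augmentation), and the SAM-style perturbed-gradient update (a common flatness-seeking technique, not necessarily the authors' exact objective).

```python
# Hypothetical sketch of a DCF-style local round (NOT the authors' code):
# (1) generate an adversarial surrogate sample inside a trust region,
# (2) take a SAM-style flatness-seeking step on that surrogate.
# Model is a scalar linear fit y ~ w * x with squared-error loss.

def loss(w, x, y):
    return (w * x - y) ** 2

def grad_w(w, x, y):
    # d(loss)/dw
    return 2 * (w * x - y) * x

def grad_x(w, x, y):
    # d(loss)/dx, used to perturb the data itself
    return 2 * (w * x - y) * w

def adversarial_surrogate(w, x, y, step=0.1, radius=0.5):
    # Ascend the loss in data space, then clip to a radius around the
    # original sample -- a stand-in for the "globally constrained"
    # part of the augmentation.
    x_adv = x + step * grad_x(w, x, y)
    return max(x - radius, min(x + radius, x_adv))

def sam_step(w, x, y, rho=0.05, lr=0.1):
    # SAM-style update: perturb the weight toward higher loss, then
    # descend using the gradient taken at the perturbed weight.
    g = grad_w(w, x, y)
    eps = rho if g >= 0 else -rho  # sign normalization (1-D case)
    return w - lr * grad_w(w + eps, x, y)

def local_round(w, data, n_steps=50):
    # One client's local training: flatness-seeking steps on
    # adversarially generated surrogate data.
    for _ in range(n_steps):
        for x, y in data:
            x_s = adversarial_surrogate(w, x, y)
            w = sam_step(w, x_s, y)
    return w

# Toy data generated by w_true = 2; training should land near 2,
# oscillating slightly because of the perturbations.
w_final = local_round(0.0, [(1.0, 2.0), (2.0, 4.0)])
```

In a real FedDG setting each client would run such a round on its own domain's data and send only the updated model to the server, so the global flatness objective is approached without sharing raw data.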

Original language: English
Article number: 113023
Journal: Pattern Recognition
Volume: 174
DOIs
State: Published - Jun 2026
Externally published: Yes

Keywords

  • Data augmentation
  • Federated domain generalization
  • Flatness
