PARS: Performance-driven Adam-based resilient scaffold for federated learning


Akpinar E., Taşkıran M., Bolat B.

INFORMATION SCIENCES, vol.744, pp.123374, 2026 (Scopus)

  • Publication Type: Article / Full Article
  • Publication Date: 2026
  • DOI: 10.1016/j.ins.2026.123374
  • Journal Name: INFORMATION SCIENCES
  • Journal Indexes: Scopus, Compendex, Library, Information Science & Technology Abstracts (LISTA), MLA - Modern Language Association Database, zbMATH
  • Page Numbers: pp.123374
  • Yıldız Teknik Üniversitesi Affiliated: Yes

Abstract

Federated learning suffers from two persistent challenges: client drift under heterogeneous data distributions and unstable training dynamics due to exploding gradients. While SCAFFOLD mitigates drift through control variates, it has been limited to SGD-based updates, and Adam-based optimizers in federated learning remain vulnerable to drift. To address these issues, we propose Performance-driven Adam-based Resilient Scaffold (PARS), a federated optimization framework that unifies variance reduction with adaptive moment estimation. PARS incorporates control variate corrections directly into Adam’s update rule, leading to higher accuracy in early communication rounds and providing a natural remedy for the exploding gradient issue observed in SCAFFOLD. Furthermore, we introduce Directional Confidence (DC), a lightweight mechanism that adaptively scales control variates using the cosine similarity between consecutive gradient updates. This design accelerates aligned updates and suppresses conflicting ones, thereby reducing oscillations and improving the reliability of control variates. Extensive experiments with 10 and 100 clients under IID and non-IID partitions demonstrate that PARS consistently outperforms FedAvg, FedProx, FedNova, FedYogi, FedAdam, and SCAFFOLD, particularly by achieving higher accuracy early on. Moreover, the heuristic-based DC mechanism further improves the performance of both SCAFFOLD and PARS, especially in certain scenarios.
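To make the two ideas in the abstract concrete, the following is a minimal NumPy sketch, not the paper's actual implementation: a SCAFFOLD-style control-variate correction (the drift term `c_global - c_local`) folded into a standard Adam client update, with the correction scaled by a Directional Confidence factor derived from the cosine similarity of consecutive gradients. The exact placement of the correction, the clipping of negative cosine values to zero, and all hyperparameter defaults are illustrative assumptions.

```python
import numpy as np

def directional_confidence(g_curr, g_prev, eps=1e-12):
    """Cosine similarity between consecutive gradient updates, clipped to
    [0, 1] so that aligned directions are kept and conflicting directions
    suppress the control-variate correction (illustrative mapping)."""
    cos = np.dot(g_curr, g_prev) / (
        np.linalg.norm(g_curr) * np.linalg.norm(g_prev) + eps
    )
    return max(0.0, float(cos))

def pars_local_step(w, grad, m, v, c_global, c_local, t,
                    lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, dc=1.0):
    """One Adam step on a client, with the SCAFFOLD drift correction
    (c_global - c_local) added to the local gradient and scaled by the
    Directional Confidence factor `dc` before the moment updates."""
    g = grad + dc * (c_global - c_local)       # variance-reduced gradient
    m = beta1 * m + (1.0 - beta1) * g          # first-moment estimate
    v = beta2 * v + (1.0 - beta2) * g ** 2     # second-moment estimate
    m_hat = m / (1.0 - beta1 ** t)             # bias correction
    v_hat = v / (1.0 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

Because `dc` multiplies only the correction term, a client whose successive gradients disagree falls back toward a plain Adam step, which matches the abstract's claim that DC suppresses conflicting updates while leaving aligned ones to accelerate.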