Version 1: Received: 30 October 2024 / Approved: 30 October 2024 / Online: 30 October 2024 (11:37:32 CET)
How to cite:
MDPI and ACS Style
Najafi, M. Enhancing Privacy-Preserving of Heterogeneous Federated Learning Algorithms Using Data-Free Knowledge Distillation. Preprints 2024, 2024102389. https://doi.org/10.20944/preprints202410.2389.v1
APA Style
Najafi, M. (2024). Enhancing Privacy-Preserving of Heterogeneous Federated Learning Algorithms Using Data-Free Knowledge Distillation. Preprints. https://doi.org/10.20944/preprints202410.2389.v1
Chicago/Turabian Style
Najafi, M. 2024. "Enhancing Privacy-Preserving of Heterogeneous Federated Learning Algorithms Using Data-Free Knowledge Distillation." Preprints. https://doi.org/10.20944/preprints202410.2389.v1
Abstract
Federated learning (FL) is a decentralized machine learning paradigm that allows multiple local clients to collaboratively train a global model by sharing their model parameters instead of their private data, thereby mitigating privacy leakage. However, recent studies have shown that gradient-based data reconstruction attacks (DRAs) can still expose private information by exploiting the model parameters shared by local clients. Existing privacy-preserving FL strategies provide some defence against these attacks, but at the cost of significantly reduced model accuracy. Moreover, client heterogeneity further degrades these FL methods, resulting in drifted global models, slower convergence, and lower performance. This study addresses two main challenges of FL: data heterogeneity, particularly among clients with non-identically and independently distributed (Non-IID) data, and the threat DRAs pose to client privacy. By leveraging a Lagrange duality (dual decomposition) approach and employing a generator model to facilitate data-free knowledge distillation (KD) between clients, the proposed method enhances local model performance while addressing both of these challenges as they arise in real-world FL deployments.
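To make the two ingredients named in the abstract concrete, the sketches below are illustrative only; the objective, losses, and hyperparameters are assumptions, not the paper's actual formulation. First, a generic consensus form of the dual decomposition idea: each client k minimizes a local loss f_k subject to agreement with a shared model z, and Lagrange duality splits the coupled problem into independent per-client subproblems.

```latex
\begin{aligned}
&\min_{\{w_k\},\, z}\ \sum_{k=1}^{K} f_k(w_k)
\quad \text{s.t.}\quad w_k = z,\ \ k = 1,\dots,K,\\[4pt]
&L\big(\{w_k\}, z, \{\lambda_k\}\big)
 = \sum_{k=1}^{K}\Big( f_k(w_k) + \lambda_k^{\top}(w_k - z) \Big).
\end{aligned}
```

For fixed multipliers, the Lagrangian separates across clients, and a dual ascent step such as \(\lambda_k \leftarrow \lambda_k + \alpha\,(w_k - \bar{w})\), with \(\bar{w}\) the average of the client models, pushes the clients toward consensus without exchanging raw data.

Second, a minimal PyTorch sketch of generator-driven, data-free KD between clients. The names (Generator, distill_step) and all dimensions are hypothetical, chosen only to show the mechanism: a generator synthesizes pseudo-samples on which one client (the student) matches the soft predictions of the others (the teachers).

```python
# Hypothetical sketch of data-free knowledge distillation between
# federated clients; not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps class labels plus noise to synthetic feature vectors,
    standing in for private training data."""
    def __init__(self, noise_dim=32, num_classes=10, feat_dim=64):
        super().__init__()
        self.embed = nn.Embedding(num_classes, noise_dim)
        self.net = nn.Sequential(
            nn.Linear(2 * noise_dim, 128), nn.ReLU(),
            nn.Linear(128, feat_dim),
        )

    def forward(self, labels):
        z = torch.randn(labels.size(0), self.embed.embedding_dim)
        return self.net(torch.cat([z, self.embed(labels)], dim=1))

def distill_step(student, teachers, generator, num_classes=10,
                 batch_size=64, temperature=2.0):
    """One distillation step: the student matches the averaged soft
    predictions of the other clients' models on synthetic features."""
    labels = torch.randint(0, num_classes, (batch_size,))
    with torch.no_grad():
        feats = generator(labels)  # synthetic "data", no real samples
        t_logits = torch.stack([t(feats) for t in teachers]).mean(0)
    s_logits = student(feats)
    # Standard KD loss: KL divergence between temperature-softened outputs.
    return F.kl_div(
        F.log_softmax(s_logits / temperature, dim=1),
        F.softmax(t_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

if __name__ == "__main__":
    # Three toy clients; client 0 distills from clients 1 and 2.
    clients = [nn.Linear(64, 10) for _ in range(3)]
    gen = Generator()
    opt = torch.optim.SGD(clients[0].parameters(), lr=0.1)
    loss = distill_step(clients[0], clients[1:], gen)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"distillation loss: {loss.item():.4f}")
```

In each communication round, every client could in turn play the student role, so knowledge flows between heterogeneous (Non-IID) clients through the generator's synthetic samples rather than through private data or raw gradients.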
Keywords
Federated learning; knowledge distillation; non-identically and independently distributed (Non-IID) data; data reconstruction attacks; dual decomposition optimization
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.