WANG Yajie, TANG Xiangyun, ZHU Liehuang. A knowledge distillation based heterogeneous federated forgetting learning algorithm[J]. Journal of Beijing Normal University(Natural Science), 2025, 61(3): 300-306. DOI: 10.12202/j.0476-0301.2025059

A knowledge distillation based heterogeneous federated forgetting learning algorithm

  • To address privacy concerns in federated learning under heterogeneous data environments, we propose a heterogeneous federated unlearning algorithm that integrates knowledge distillation (KD) with a forgetting mechanism. The method extracts generic knowledge from the global model and distills it to local clients, preserving the learning capability of local models while effectively removing information associated with sensitive data, thereby enabling privacy-oriented unlearning. The approach is found to maintain strong model performance across diverse data structures and device configurations while significantly enhancing data privacy. This study provides an effective technical solution for privacy preservation in heterogeneous federated learning settings.
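The core distillation step the abstract describes (transferring softened global-model outputs to a local client model) can be sketched as below. This is a minimal illustration of the standard temperature-scaled KD loss, not the paper's exact formulation; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature yields a
    # softer distribution, exposing more of the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between softened teacher (global model) and student
    # (local client) outputs, scaled by T^2 as in the usual KD objective.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In a federated unlearning setting of this kind, such a loss would pull each local model toward the global model's generic knowledge, while the forgetting mechanism (not shown here) suppresses contributions learned from the sensitive data to be removed.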
