A knowledge distillation-based heterogeneous federated unlearning algorithm

Abstract: To address privacy concerns in federated learning under heterogeneous data environments, this paper proposes a heterogeneous federated unlearning algorithm that integrates knowledge distillation (KD) with a forgetting mechanism. The method extracts generic knowledge from the global model and distills it to local clients, preserving the learning capability of local models while effectively removing information associated with sensitive data, thereby achieving privacy-oriented unlearning. Experimental results show that the approach maintains strong model performance across diverse data structures and device configurations while significantly enhancing data privacy, providing an effective technical solution for privacy preservation in heterogeneous federated learning settings.
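The abstract does not specify the loss formulation, so the following minimal PyTorch sketch illustrates one plausible client-side update consistent with the description: a distillation term that transfers the global (teacher) model's generic knowledge on retained data, plus a forgetting term that pushes the local model's outputs on the sensitive (forget) set toward a uniform distribution. The function name, hyperparameters, and the exact composition of the loss are assumptions for illustration, not the paper's method.

```python
import torch
import torch.nn.functional as F

def kd_unlearning_step(local_model, global_model, retain_batch, forget_batch,
                       optimizer, temperature=2.0, forget_weight=1.0):
    """One hypothetical client-side update combining KD with forgetting.

    Sketch under assumptions: the loss terms and their weighting are
    illustrative, not the paper's exact formulation.
    """
    local_model.train()
    global_model.eval()

    x_retain, y_retain = retain_batch
    x_forget, _ = forget_batch

    # Task loss on retained data keeps the local model useful.
    logits_r = local_model(x_retain)
    task_loss = F.cross_entropy(logits_r, y_retain)

    # KD loss: match the global model's softened outputs on retained data,
    # transferring generic knowledge from the global model to the client.
    with torch.no_grad():
        teacher_logits = global_model(x_retain)
    kd_loss = F.kl_div(
        F.log_softmax(logits_r / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # Forgetting term: drive predictions on sensitive samples toward the
    # uniform distribution so class-specific information is erased.
    logits_f = local_model(x_forget)
    n_classes = logits_f.size(1)
    uniform = torch.full_like(logits_f, 1.0 / n_classes)
    forget_loss = F.kl_div(F.log_softmax(logits_f, dim=1), uniform,
                           reduction="batchmean")

    loss = task_loss + kd_loss + forget_weight * forget_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, the KD term preserves model utility on heterogeneous client data while the forgetting term removes information tied to the sensitive samples; how the two are balanced (here a single `forget_weight`) would be an experimental design choice.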

     
