Citation: WANG Yajie, TANG Xiangyun, ZHU Liehuang. A knowledge distillation-based heterogeneous federated unlearning algorithm[J]. Journal of Beijing Normal University(Natural Science). DOI: 10.12202/j.0476-0301.2025059

A knowledge distillation-based heterogeneous federated unlearning algorithm

More Information
  • Received Date: April 08, 2025
  • Available Online: May 29, 2025
  • Abstract: To address privacy concerns in federated learning under heterogeneous data environments, we propose a heterogeneous federated unlearning algorithm that integrates knowledge distillation (KD) with a forgetting mechanism. The proposed method extracts generic knowledge from the global model and distills it to local clients, thereby preserving the learning capability of local models while effectively removing information associated with sensitive data. This enables privacy-oriented unlearning. Experimental results demonstrate that the proposed approach maintains strong model performance across diverse data structures and device configurations, while significantly enhancing data privacy. This study provides an effective technical solution for privacy preservation in heterogeneous federated learning settings. A minimal code sketch of the distillation-plus-forgetting idea follows below.
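To make the distillation-plus-forgetting idea concrete, the following is a minimal, hypothetical sketch of one client-side update, written in PyTorch. It assumes each client can split its local data into a retain set and a forget set; the helper name kd_unlearn_step, the retain_loader argument, and the temperature/alpha hyperparameters are illustrative assumptions, not the algorithm as published. The local (student) model fits only retained samples while a KL-divergence term distills the global (teacher) model's soft predictions, so knowledge tied to the forgotten data is not reinforced.

    # Hypothetical sketch, not the paper's implementation.
    import torch
    import torch.nn.functional as F

    def kd_unlearn_step(student, teacher, retain_loader, optimizer,
                        temperature=2.0, alpha=0.5, device="cpu"):
        """One local pass: fit the retained data while distilling the global
        (teacher) model's generic knowledge. Samples in the forget set are
        simply excluded from retain_loader, so their influence is not
        re-learned."""
        student.train()
        teacher.eval()
        for x, y in retain_loader:
            x, y = x.to(device), y.to(device)
            with torch.no_grad():
                teacher_logits = teacher(x)  # generic knowledge from the global model
            student_logits = student(x)

            # Hard-label loss on retained data only
            ce_loss = F.cross_entropy(student_logits, y)

            # Soft-label (distillation) loss toward the teacher's output distribution
            kd_loss = F.kl_div(
                F.log_softmax(student_logits / temperature, dim=1),
                F.softmax(teacher_logits / temperature, dim=1),
                reduction="batchmean",
            ) * temperature ** 2

            loss = alpha * ce_loss + (1 - alpha) * kd_loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        return student

In this simplified view, unlearning is approximated by retraining without the forget set under the teacher's guidance; the paper's actual forgetting mechanism may additionally counteract the residual influence of the forgotten data rather than merely excluding it.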

