Advancing Federated Learning with Efficient 'Unlearning'
Reference paper: here
With privacy regulations empowering users with the right to be forgotten, it's essential to enable federated learning models to forget specific clients without retraining from scratch. This paper introduces a novel method that removes the influence of a client's data while significantly reducing communication and computation costs.
Key Highlights:
Efficient Federated Unlearning: Removes a client's data influence using local unlearning followed by minimal federated learning rounds.
Performance: Achieves accuracy comparable to full retraining while cutting communication and computation costs by 5x to 24x.
Privacy-Enhanced: No need to store global or client update histories.
Innovative Approach: Utilizes Projected Gradient Descent (PGD) for local unlearning, maximizing the loss function to effectively remove a client's data influence.
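The PGD idea above can be sketched in a few lines: the departing client performs gradient *ascent* on its own loss to erase its influence, while a projection step keeps the unlearned weights within a bounded distance of a reference model so that a few follow-up federated rounds can restore utility. This is a minimal illustration assuming a logistic-regression client; the reference model, ball radius, learning rate, and step count are illustrative choices, not values from the paper.

```python
import numpy as np

def pgd_unlearn(w, X, y, w_ref, radius, lr=0.1, steps=50):
    """Local unlearning sketch: gradient ASCENT on the client's loss,
    projected onto an L2 ball of `radius` around a reference model
    `w_ref` (e.g. an average of the other clients' models).
    All hyperparameters here are illustrative assumptions."""
    w = w.copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid predictions
        grad = X.T @ (p - y) / len(y)         # gradient of the BCE loss
        w += lr * grad                        # ascend: increase the loss
        diff = w - w_ref
        norm = np.linalg.norm(diff)
        if norm > radius:                     # project back onto the ball
            w = w_ref + diff * (radius / norm)
    return w
```

After this local step, the server would run a small number of standard federated rounds (without the unlearned client) to recover accuracy, which is where the cost savings over full retraining come from.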
Additional Insights:
Scalability and Generalizability: Needs further evaluation across diverse datasets and real-world applications.
Single Client Focus: Primarily designed for unlearning a single client; unlearning multiple clients remains to be explored.
These advancements are crucial for developing robust, privacy-preserving federated learning systems, especially in sectors with stringent privacy requirements.
