Abstract
Model training in the Internet of Vehicles (IoV) requires federated unlearning in three scenarios: (1) vehicles want to erase their historical updates to protect their privacy, (2) servers need to eliminate the influence of dropped-out vehicles, and (3) servers need to recover the global model from poisoning attacks launched by malicious clients. However, existing retraining-based federated unlearning methods, such as FedRecover [1] and FedEraser [2], cannot be applied directly to IoV scenarios. First, in these schemes the server must store all the local gradients uploaded by clients, which incurs significant storage overhead. Second, they assume that all clients participate in federated learning (FL) from the beginning and never exit, which does not match FL in the IoV, where vehicles can join and leave at any time. To address these challenges, we propose a federated unlearning scheme suited to IoV scenarios. We use a backtracking mechanism to achieve unlearning instead of reinitializing the model as other approaches do. We then build a recovery function on the Cauchy mean value theorem to restore the performance of the global model. To reduce the server's storage burden, we present a novel method that saves only the direction of the local updates, cutting storage overhead by approximately 95%. Experimental results demonstrate the effectiveness of our scheme, which relies only on the direction of historical gradients together with historical models. Our work therefore enables federated unlearning to be applied in practical IoV settings.
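The abstract does not specify how the "direction" of a local update is encoded; the following is a minimal sketch, not the authors' implementation, assuming the direction is stored as one sign bit per parameter. Packing signs into bits instead of keeping full float32 gradients reduces storage by roughly 97%, which is consistent in magnitude with the ~95% saving reported above. The function names and the 1M-parameter example are hypothetical.

```python
import numpy as np

def compress_update(update: np.ndarray) -> np.ndarray:
    """Keep only the direction (sign) of a local update, bit-packed."""
    signs = (update >= 0).astype(np.uint8)   # 1 for non-negative, 0 for negative coordinates
    return np.packbits(signs)                # 8 sign bits per stored byte

def decompress_update(packed: np.ndarray, num_params: int) -> np.ndarray:
    """Recover a +1/-1 direction vector from the packed sign bits."""
    signs = np.unpackbits(packed)[:num_params]
    return np.where(signs == 1, 1.0, -1.0).astype(np.float32)

# Usage with a hypothetical 1M-parameter local gradient:
grad = np.random.randn(1_000_000).astype(np.float32)
packed = compress_update(grad)
print(grad.nbytes, "bytes ->", packed.nbytes, "bytes")   # ~4 MB -> ~125 KB
direction = decompress_update(packed, grad.size)
```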
| Original language | English |
| --- | --- |
| Title of host publication | 54th Annual IEEE/IFIP International Conference on Dependable Systems and Networks - Supplemental Volume (DSN-S) |
| Publisher | IEEE |
| Publication date | 2024 |
| Pages | 96-103 |
| ISBN (Print) | 979-8-3503-9571-6 |
| ISBN (Electronic) | 979-8-3503-9570-9 |
| DOIs | |
| Publication status | Published - 2024 |
Keywords
- Faculty of Science