Data Poisoning Attack Using Hybrid Particle Swarm Optimization in Connected and Autonomous Vehicles
journal contribution
posted on 2024-11-17, 13:54, authored by Chi Cui, Haiping Du, Zhijuan Jia, Yuchu He, Yanyan Yang, Menglu Jin
The development of connected and autonomous vehicles (CAVs) relies heavily on deep learning technology, which has been widely applied to perform a variety of tasks in CAVs. On the other hand, deep learning faces some security concerns. Data poisoning attacks, as one such security threat, can compromise deep learning models by injecting poisoned training samples. The poisoned models may make more false predictions and, in the worst case, may cause fatal accidents involving CAVs. Therefore, the principles of poisoning attacks are worth studying in order to propose countermeasures. In this work, we propose a black-box, clean-label data poisoning attack method that uses hybrid particle swarm optimization with simulated annealing to generate perturbations for poisoning. The attack method is evaluated by experiments on the deep learning models of traffic sign recognition systems in CAVs, and the results show that the classification accuracies of the target deep learning models are significantly degraded using our method, with only a small proportion of poisoned samples in the GTSRB dataset.
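Since only the abstract is available here, the following is a minimal illustrative sketch, not the authors' implementation: a generic particle swarm optimization loop combined with a simulated-annealing acceptance rule that searches for a bounded perturbation maximizing a black-box objective. The function `black_box_loss`, the perturbation budget `eps`, and all hyperparameter values are hypothetical placeholders for whatever the paper actually uses.

```python
# Hypothetical sketch of hybrid PSO + simulated annealing for generating a
# bounded perturbation against a black-box model. `black_box_loss` stands in
# for querying the victim model; it is NOT part of the original paper.
import numpy as np

def black_box_loss(image):
    # Placeholder objective: in a real attack this would query the target
    # model and return a score to maximize (e.g., loss on the true class).
    return float(np.sum(np.sin(image)))

def hybrid_pso_sa(clean_image, eps=8 / 255, n_particles=20, n_iters=50,
                  w=0.7, c1=1.5, c2=1.5, temp=1.0, cooling=0.95, seed=0):
    rng = np.random.default_rng(seed)
    shape = clean_image.shape
    # Initialize particle positions (perturbations) and velocities in [-eps, eps].
    pos = rng.uniform(-eps, eps, size=(n_particles,) + shape)
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([black_box_loss(clean_image + p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    gbest_val = pbest_val.max()

    for _ in range(n_iters):
        for i in range(n_particles):
            r1, r2 = rng.random(shape), rng.random(shape)
            # Standard PSO velocity/position update toward personal/global bests.
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            candidate = np.clip(pos[i] + vel[i], -eps, eps)
            cand_val = black_box_loss(clean_image + candidate)
            # Simulated-annealing acceptance: always keep improvements, and
            # occasionally accept worse candidates to escape local optima.
            delta = cand_val - black_box_loss(clean_image + pos[i])
            if delta > 0 or rng.random() < np.exp(delta / max(temp, 1e-8)):
                pos[i] = candidate
            if cand_val > pbest_val[i]:
                pbest[i], pbest_val[i] = candidate.copy(), cand_val
                if cand_val > gbest_val:
                    gbest, gbest_val = candidate.copy(), cand_val
        temp *= cooling  # cool the acceptance temperature each iteration
    # Return the clean image plus the best perturbation found, clipped to valid range.
    return np.clip(clean_image + gbest, 0.0, 1.0)

if __name__ == "__main__":
    img = np.random.default_rng(1).random((32, 32, 3))  # stand-in traffic-sign image
    poisoned = hybrid_pso_sa(img)
    print("max perturbation:", np.abs(poisoned - img).max())
```

In this sketch the simulated-annealing step only governs whether a particle moves to its new candidate position; how the annealing schedule is actually hybridized with PSO in the paper may differ.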
Funding
Natural Science Foundation of Henan Province (23A520043)
History
Journal title
Proceedings of IEEE Asia-Pacific Conference on Computer Science and Data Engineering, CSDE 2022