Privacy-preserving federated learning in medical diagnosis with homomorphic re-encryption

Publication Name

Computer Standards & Interfaces


Unlike traditional centralized machine learning, distributed machine learning enables more efficient and flexible application scenarios, but it may not satisfy certain security requirements. In medical treatment and diagnosis, for example, a growing number of people use IoT devices to record personal health data, yet when this medical data is used for training, users are unwilling to reveal it to the training party. How to collect and train on such data securely has therefore become the central problem to resolve. Federated learning can combine large amounts of scattered data for training while protecting user data, which makes it better suited than general distributed learning to training on dispersed data. In this paper, we propose a privacy-preserving federated learning scheme based on the cryptographic primitive of homomorphic re-encryption, which protects user data through homomorphic re-encryption and trains on it with batch gradient descent (BGD). In our scheme, IoT devices encrypt and upload user data, fog nodes collect it, and the server performs data aggregation and re-encryption. The security analysis and experimental results show that our scheme completes model training while preserving both user data and local models.
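As a rough illustration of the batch gradient descent step the abstract refers to, the sketch below runs plain (unencrypted) full-batch gradient descent on a toy linear-regression problem. It is our own minimal example, not the paper's implementation: in the actual scheme each gradient would be computed over encrypted user data and aggregated under homomorphic re-encryption before any update is applied.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, epochs=500):
    """Full-batch gradient descent minimizing mean squared error.

    Simplified sketch: the paper's scheme would perform this update
    on securely aggregated (re-encrypted) gradients, not plaintext.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Gradient of (1/2n) * ||Xw - y||^2 over the whole batch
        grad = X.T @ (X @ w - y) / n
        w -= lr * grad
    return w

# Toy usage: recover the true weights [2, -1] from noiseless data
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = X @ np.array([2.0, -1.0])
w = batch_gradient_descent(X, y)
```

Because every client contributes to one aggregated full-batch gradient per round, this update rule maps naturally onto federated aggregation: the server only ever needs the (encrypted) sum of per-client gradients, never the raw data.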

Open Access Status

This publication is not available as open access



Funding Sponsor

National Natural Science Foundation of China


