FedTD: Efficiently Share Telemedicine Data with Federated Distillation Learning
4th International Conference on Machine Learning for Cyber Security, ML4CS 2022; LNCS 13655:501-515, 2023.
Article in English | Scopus | ID: covidwho-2268770
ABSTRACT
With the development of the Internet of Things and medical technology, patients use wearable telemedicine devices to transmit health data to hospitals. The need for data sharing for public health has become more urgent during the COVID-19 pandemic. Previously, security protection technologies struggled to address the growing security risks and challenges of telemedicine. To address these hindrances, federated learning (FL) makes it possible for companies and institutions to share user data securely: the global server iteratively aggregates model parameters from the local servers instead of having users upload their data directly to the cloud server. We propose a new federated distillation learning model, FedTD, which allows the models on local hospital servers and the global server to differ. Unlike traditional federated learning, we combine it with knowledge distillation to address the non-independent and identically distributed (non-IID) nature of patient medical data, providing a secure solution for sharing patients' medical information among hospitals. We evaluated our approach on the COVID-19 Radiography and COVID-Chestxray datasets, improving model performance and reducing communication costs. Extensive experiments show that FedTD significantly outperforms the state of the art. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
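The abstract's core idea — hospitals with heterogeneous models exchanging distilled knowledge rather than raw patient data — can be illustrated with a minimal sketch. The paper's actual FedTD algorithm is not given here; the code below is a generic federated-distillation toy in NumPy, under the common assumption that clients share soft predictions on a small public dataset and each locally distills toward the averaged "consensus" predictions. All names (`Hospital`, `distill_step`, the public set) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, t=1.0):
    # Temperature-scaled softmax; higher t gives softer probabilities.
    z = z / t
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

n_public, n_classes = 8, 2
# Hypothetical small public set used only for exchanging predictions,
# so no raw patient data ever leaves a hospital.
public_x = rng.normal(size=(n_public, 4))

class Hospital:
    def __init__(self, hidden):
        # Heterogeneous architectures: each hospital picks its own hidden width,
        # which plain FedAvg-style parameter averaging could not accommodate.
        self.w1 = rng.normal(scale=0.1, size=(4, hidden))
        self.w2 = rng.normal(scale=0.1, size=(hidden, n_classes))

    def logits(self, x):
        return np.tanh(x @ self.w1) @ self.w2

    def distill_step(self, x, teacher_probs, lr=0.1, t=2.0):
        # One gradient step on the cross-entropy between the student's softened
        # outputs and the consensus (teacher) probabilities; for simplicity only
        # the output layer is updated here.
        h = np.tanh(x @ self.w1)
        student = softmax(h @ self.w2, t)
        grad = h.T @ (student - teacher_probs) / len(x)
        self.w2 -= lr * grad

hospitals = [Hospital(h) for h in (8, 16, 32)]

for _ in range(20):
    # Each hospital shares only soft predictions on the public set.
    probs = [softmax(h.logits(public_x), t=2.0) for h in hospitals]
    consensus = np.mean(probs, axis=0)       # server-side knowledge aggregation
    for h in hospitals:
        h.distill_step(public_x, consensus)  # local distillation toward consensus
```

Because only class-probability vectors of shape `(n_public, n_classes)` are communicated, the per-round traffic is independent of each hospital's model size, which is the usual communication-cost argument for distillation-based FL over parameter averaging.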
Full text: Available
Collection: Databases of international organizations
Database: Scopus
Language: English
Journal: 4th International Conference on Machine Learning for Cyber Security, ML4CS 2022
Year: 2023
Document Type: Article