A Novel Decentralized Federated Learning Approach to Train on Globally Distributed, Poor Quality, and Protected Private Medical Data

Published in: Nature Scientific Reports, May 25, 2022

Authors: T. V. Nguyen, M. A. Dakka, S. M. Diakiw, M. D. VerMilyea, M. Perugini, J. M. M. Hall, and D. Perugini

Training on multiple diverse data sources is critical to ensure unbiased and generalizable AI. In healthcare, data privacy laws prohibit data from being moved outside the country of origin, preventing global medical datasets from being centralized for AI training. Data-centric, cross-silo federated learning represents a pathway forward for training on distributed medical datasets. Existing approaches typically require updates to a training model to be transferred to a central server, potentially breaching data privacy laws unless the updates are sufficiently disguised or abstracted to prevent reconstruction of the dataset.

Here we present a completely decentralized federated learning approach that uses knowledge distillation to ensure data privacy and protection. Each node operates independently without needing to access external data. AI accuracy using this approach is found to be comparable to centralized training, and when nodes contain poor-quality data, which is common in healthcare, AI accuracy can exceed the performance of traditional centralized training.
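
To illustrate the general idea, the sketch below shows decentralized federated learning with knowledge distillation in PyTorch: each node trains only on its own local data, and only model weights (never data) move between nodes, with a received peer model acting as a teacher. The node ordering, loss weighting, synthetic data, and model architecture are illustrative assumptions, not the authors' published protocol.

```python
# Minimal sketch of decentralized federated learning with knowledge
# distillation. Assumptions: PyTorch, synthetic per-node data, a simple
# sequential node-to-node handover. Not the authors' exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_node_data(n=256, dim=20, seed=0):
    # Hypothetical local dataset for one node; in practice this is the
    # node's private medical data and never leaves the node.
    g = torch.Generator().manual_seed(seed)
    x = torch.randn(n, dim, generator=g)
    y = (x.sum(dim=1) > 0).long()
    return x, y

def make_model(dim=20, classes=2):
    return nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, classes))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Blend the usual supervised loss with a soft-target KL term so the
    # student absorbs knowledge from the previously trained peer model.
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1 - alpha) * soft

def train_node(local_x, local_y, teacher=None, epochs=20):
    # Each node trains on its own data only; the teacher (if any) is a
    # model received from a peer node, never the peer's raw data.
    student = make_model()
    opt = torch.optim.Adam(student.parameters(), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        logits = student(local_x)
        if teacher is None:
            loss = F.cross_entropy(logits, local_y)
        else:
            with torch.no_grad():
                teacher_logits = teacher(local_x)
            loss = distillation_loss(logits, teacher_logits, local_y)
        loss.backward()
        opt.step()
    return student

# Pass only model weights from node to node; data stays in place.
model = None
for node_seed in range(3):  # three hypothetical data silos
    x, y = make_node_data(seed=node_seed)
    model = train_node(x, y, teacher=model)
```

Because no gradients or model updates are sent to a central server, there is no aggregation point at which private data could be reconstructed; a node with poor-quality data influences later nodes only through soft targets, which the distillation weighting can down-weight.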


See the publication in Nature here.
