The future of digital health with federated learning

Federated Learning in research

Federated learning performs well at improving per-site performance in multicenter deep learning without data sharing. Its main objective is to enable multi-institutional training without centralizing or physically sharing the data. Under this concept, each institution uses its own imaging or clinical data to train a local deep learning or machine learning model. The trained models are sent to a central server, and each institution receives back a more robust, better-trained model. An important benefit of federated learning is that it avoids the privacy risks associated with sharing and pooling sensitive patient data. It is an effective methodology that merits further study to enable and accelerate model development in any institution or organization, especially in the medical domain. Software and technology for medical diagnostics, devices, and other interventions are advancing rapidly, and the main obstacle to developing such technology and research today is access to clinical data.

Federated learning in the medical domain

A significant driver of the demand for medical data is the rapid advance of artificial intelligence toward improved applications. Medical imaging research, for instance, has been propelled forward in recent years by the advent of deep learning, which has enabled rapid progress in imaging-based diagnosis, with recent significant results in ophthalmology, pathology, radiology, and dermatology.

Limitations of the deep learning approach in the medical domain

The main limitation of the deep learning approach in the medical domain is the bulk of medical data needed to train a model. Many other kinds of data are abundantly available in raw form and can even be captured with a digital camera for analysis, but this is not the case for medical data. Clinical imaging data are usually siloed within provider institutions, so assembling large-scale datasets generally requires transferring data between these silos. Such transfers raise ethical and legal difficulties around preserving patient privacy. As a result, very few public medical image datasets exist. This has led to a generalizability problem for deep learning models in clinical imaging research, which are frequently trained on single-institution datasets. Techniques are needed that enable the development of generalizable models for clinical use without requiring the creation of pooled datasets.

Distributed learning concept in federated learning

An alternative to centralizing multicenter datasets is "distributed" learning. In this paradigm, data are not consolidated into a single, pooled dataset. Instead, the data held at the various institutions are used to train the deep learning model by distributing the computational training operations across the sites. This approach requires only the exchange of learned model weights between institutions, which is the main idea of federated learning, and so eliminates direct sharing of individual data. Federated learning can be made to work with real-world private clinical data across multiple institutions, and this approach produces a model with improved generalizability, both within the participating organizations and on external data.
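The weight-exchange step above can be sketched in a few lines of plain Python. This is a minimal illustration, assuming each site's model reduces to a flat list of float weights and that the central server weights each site's contribution by its local dataset size (a FedAvg-style rule); the function names are illustrative, not the API of any particular framework.

```python
# Minimal sketch of federated weight averaging, assuming each site's
# trained model is represented as a flat list of float weights.

def federated_average(site_weights, site_sizes):
    """Average per-site weight vectors, weighted by local dataset size."""
    total = sum(site_sizes)
    n_params = len(site_weights[0])
    global_weights = [0.0] * n_params
    for weights, size in zip(site_weights, site_sizes):
        for i, w in enumerate(weights):
            global_weights[i] += w * (size / total)
    return global_weights

# Three hospitals share only their trained weights, never their images.
site_a = [0.2, 0.4]   # weights after local training at site A
site_b = [0.4, 0.8]
site_c = [0.6, 1.2]
global_w = federated_average([site_a, site_b, site_c], [100, 100, 100])
print(global_w)  # equal-sized sites, so roughly the plain mean [0.4, 0.8]
```

Only `site_a`, `site_b`, and `site_c` (the learned weights) ever leave the institutions; the underlying patient data stays local.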

Example of federated learning

This architecture illustrates the application of federated learning across three different organizations: Upstate Medical University, UCLA Health, and the National Cancer Institute (NCI). To demonstrate the model, the researchers used medical image analysis for prostate segmentation and MRI diagnosis of cancer. The research showed that federated training and fusion of the individual models can produce general predictive models with improved generalizability when tested on an external dataset.


How to implement a federated learning model

Federated learning is implemented in practice through frameworks and federated datasets; it is also worth seeing how it fits into the broader privacy-preserving machine learning toolkit. Normally, to train a machine learning model, you host both the model and the data on the same device; this is centralized machine learning. For consumer applications, that would mean Apple and Google uploading our private conversations to the cloud to train their models.

Federated learning flips the paradigm. Instead of sending our data to the cloud, the models are sent to our devices and trained locally, so the data never leaves the device. Once a model has been trained locally, the device sends model updates, rather than data, to the server. The server aggregates the updates from each device, updates the global model, and the process repeats over multiple rounds of training. In reality, devices communicate over Wi-Fi, which is much slower than the computation and becomes the bottleneck in practice. As a field, federated learning has seen an explosion in the number of papers submitted to arXiv year on year.
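The round-based loop described above can be simulated end to end in plain Python. This is a toy sketch under simplifying assumptions: the "model" is a single parameter w in y = w*x, each "device" runs a few epochs of gradient descent on its own private data, and the "server" simply averages the returned parameters each round.

```python
# Toy simulation of federated rounds: local training on-device,
# then server-side averaging of the returned model parameters.

def local_train(w, data, lr=0.02, epochs=5):
    """Run a few epochs of gradient descent on this device's own data."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

# Each device holds private (x, y) pairs drawn from y = 3x; this data
# never leaves the device, only the trained parameter does.
devices = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0), (4.0, 12.0)],
]

w_global = 0.0
for _round in range(10):                                    # repeated rounds
    updates = [local_train(w_global, d) for d in devices]   # train locally
    w_global = sum(updates) / len(updates)                  # server aggregates

print(round(w_global, 3))  # prints 3.0: the global model recovers y = 3x
```

The communication cost per round here is one float per device; in real deployments the updates are full weight tensors, which is why network transfer, not computation, tends to be the bottleneck.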

Google and Apple using Federated learning

In practice, Google and Apple are the biggest users of federated learning, as they have access to millions of Android and iOS devices. Federated learning is typically done at scale with tens of thousands of devices, and the more devices there are, the longer convergence takes; in practice it can take tens of days. The Google AI blog has described how they use these federated learning techniques in practice.

Federated learning implementation platforms

If we want to implement federated learning ourselves, luckily there are a few frameworks that can help. For TensorFlow there is TensorFlow Federated, and for PyTorch there is PySyft, developed by the OpenMined community. Both frameworks have vibrant, growing communities behind them and integrate tightly with their respective deep learning libraries. A third federated learning framework, Flower, spun out of the University of Cambridge, takes a different approach: instead of being tied to a particular deep learning framework, it is agnostic, and you can plug and play different components.
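The plug-and-play idea can be illustrated without any of these libraries: the server loop is parameterized by a swappable aggregation "strategy", which is the design direction Flower follows. The names below are illustrative, not Flower's actual API.

```python
# Framework-agnostic sketch: the aggregation strategy is a pluggable
# component that the server loop accepts as an argument.

def fed_avg(updates):
    """Plain averaging strategy: mean of each weight coordinate."""
    n = len(updates)
    return [sum(ws) / n for ws in zip(*updates)]

def fed_median(updates):
    """Robust strategy: coordinate-wise median, tolerant of outlier clients."""
    def median(xs):
        xs = sorted(xs)
        m = len(xs) // 2
        return xs[m] if len(xs) % 2 else (xs[m - 1] + xs[m]) / 2
    return [median(ws) for ws in zip(*updates)]

def run_round(client_updates, strategy):
    """One server round: aggregate client updates with the chosen strategy."""
    return strategy(client_updates)

updates = [[1.0, 2.0], [1.2, 2.2], [9.0, 9.0]]  # third client is an outlier
print(run_round(updates, fed_avg))     # mean is pulled toward the outlier
print(run_round(updates, fed_median))  # median ignores the outlier
```

Swapping `fed_avg` for `fed_median` changes the aggregation behavior without touching the training code, which is the kind of component decoupling an agnostic framework provides.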


Federated learning is one of the most powerful and advanced techniques in the artificial intelligence domain. Useful data is rarely publicly available today; clinical time-series data and medical imaging data in particular are not easily accessible, and few are willing to share them, for reasons including privacy. At the same time, the use of software and AI devices to diagnose disease is becoming more and more widespread, and developing accurate, robust diagnostic software or devices requires training models on large datasets. This is where federated learning helps: each institution trains its own model without sharing data. Once a model is fully trained, only the trained model weights are transferred to the federated global server, which fuses all the incoming models into a more robust and accurate model and sends it back to each institution or organization. This solves the bottleneck of sharing private data: only trained models and weights need to be sent.


Read more :  Multi-Task Learning for Deep Learning
