Authors: Dani Manjah and Pierre Remacle
Last update: 17/03/2026
This repository documents how to run simulations and deploy Federated Learning (FL) experiments using Flower in a distributed, multi-machine setup for OMOP-CDM multi-hospital data. The 30-day readmission use case is provided as an illustrative example.
Note
This repository uses a simplified demonstration dataset.
The full experimental archive described in the paper is not publicly distributed and is planned for a future release.
You can install the project either with venv or Conda.
With venv:

```bash
python -m venv fedomop
source fedomop/bin/activate
pip install --upgrade pip
pip install -e .
```

With Conda:

```bash
conda create -n fedomop python=3.10
conda activate fedomop
pip install -e .
```

This pipeline preprocesses MIMIC-IV v2.2 Electronic Health Record (EHR) data into structured static and time-series features.
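As a rough illustration of what "static plus time-series" means here (the feature names and values below are invented for illustration, not the pipeline's actual schema):

```python
# Hypothetical illustration of static vs. time-series patient features.
# Names and values are invented; inspect the pipeline's output for the real schema.
patient = {
    "static": {"age": 63, "sex": "F"},                # one value per hospital stay
    "time_series": {"heart_rate": [88, 91, 85, 90]},  # one value per time step
}
print(len(patient["time_series"]["heart_rate"]))  # 4 time steps
```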
The code provided here is dedicated to the readmission use case. The same overall pipeline can be adapted to other tasks such as:
- mortality prediction
- length of stay
- phenotyping
Access must first be approved through the official PhysioNet data use agreement.
PhysioNet portal:
https://physionet.org/content/mimiciv/2.2/
Scroll to the bottom of the page for instructions on becoming a credentialed user and the requirements you must fulfill.
Once access is granted:
- Download MIMIC-IV v2.2 (let's assume it is `mimic-iv-2.2.zip`).
- Unzip it into the folder `preprocess_MIMIC`.
- Change `RawDataPath` in the configuration file `config.py` to indicate the relative path, for example: `"RawDataPath": "mimic-iv-2.2/"`.
- Run the readmission dataset generation pipeline using the `base_config` defined in the code.
If you are in the root directory, run:
```bash
cd preprocess_MIMIC
python generate_dataset.py config.json
```

This generates CSV files containing the feature matrix X and the readmission target y in:

```
preprocess_MIMIC/data/output
```

For more details about the data pipeline and outputs, see:
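A minimal sketch of loading the generated CSVs into plain Python lists. The exact filenames under `preprocess_MIMIC/data/output` and the column names below are assumptions; adjust them to whatever `generate_dataset.py` actually produced.

```python
# Sketch: read a feature-matrix CSV into a header plus float rows.
# A StringIO stands in for preprocess_MIMIC/data/output/X.csv (hypothetical name).
import csv
import io

def load_matrix(fh):
    """Read a CSV file object into a header list and a list of float rows."""
    reader = csv.reader(fh)
    header = next(reader)
    rows = [[float(v) for v in row] for row in reader]
    return header, rows

x_csv = io.StringIO("age,num_admissions\n63,2\n71,5\n")  # invented sample content
header, X = load_matrix(x_csv)
print(header)  # ['age', 'num_admissions']
print(X)       # [[63.0, 2.0], [71.0, 5.0]]
```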
Simulation is the default mode in this repository.
To run a fully local federated simulation, make sure you are in the root directory where pyproject.toml is present, then execute:
```bash
flwr run . --stream
```

This will:
- spawn virtual clients
- partition the dataset
- train the federated model
- log metrics
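To build intuition for the Dirichlet partitioning used in the experiments, here is a standalone sketch of how a Dirichlet draw assigns data proportions to clients (an illustration of the idea, not the repository's partitioner): smaller `alpha` yields more skewed, non-IID splits.

```python
# Sketch: one Dirichlet sample via normalized Gamma draws (stdlib only).
import random

def dirichlet_proportions(alpha, num_clients, rng):
    """Return one Dirichlet(alpha) sample as per-client data proportions."""
    draws = [rng.gammavariate(alpha, 1.0) for _ in range(num_clients)]
    total = sum(draws)
    return [d / total for d in draws]

rng = random.Random(0)
props = dirichlet_proportions(0.8, 3, rng)
print(props)  # three non-negative fractions summing to 1
```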
The local-simulation runtime is defined in the Flower configuration file `~/.flwr/config.toml`. Example:

```toml
[superlink.local-simulation]
options.num-supernodes = 3
```

This configuration runs the simulation locally with 3 virtual SuperNodes (clients).
You can override parameters defined in pyproject.toml with --run-config:
```bash
flwr run . --run-config='partitioner="dirichlet" dirichlet_alpha=0.8 local-epochs=2' --stream
```

Deployment mode simulates a real multi-hospital distributed setup.
For each link and node, start a dedicated terminal.
Start the SuperLink:

```bash
flower-superlink --insecure
```

Example with 3 hospitals:
Terminal 1:

```bash
flower-supernode --insecure \
    --superlink 127.0.0.1:9092 \
    --clientappio-api-address 127.0.0.1:9104 \
    --node-config "partition-id=0 num-partitions=3"
```

Terminal 2:

```bash
flower-supernode --insecure \
    --superlink 127.0.0.1:9092 \
    --clientappio-api-address 127.0.0.1:9105 \
    --node-config "partition-id=1 num-partitions=3"
```

Terminal 3:

```bash
flower-supernode --insecure \
    --superlink 127.0.0.1:9092 \
    --clientappio-api-address 127.0.0.1:9106 \
    --node-config "partition-id=2 num-partitions=3"
```

The local-deployment runtime must be added to `config.toml`.
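When scaling beyond three hospitals, typing these commands by hand gets tedious. The helper below prints one `flower-supernode` command per partition, mirroring the terminals above; the port numbering (`9104 + i`) is an assumption matching this example, so adapt it to your own port plan.

```python
# Sketch: generate the flower-supernode command line for each hospital.
def supernode_commands(num_partitions, superlink="127.0.0.1:9092", base_port=9104):
    """Build one command string per partition, incrementing the ClientApp port."""
    cmds = []
    for i in range(num_partitions):
        cmds.append(
            "flower-supernode --insecure "
            f"--superlink {superlink} "
            f"--clientappio-api-address 127.0.0.1:{base_port + i} "
            f'--node-config "partition-id={i} num-partitions={num_partitions}"'
        )
    return cmds

for cmd in supernode_commands(3):
    print(cmd)  # paste each into its own terminal
```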
If it is not already present, add the following:
```toml
[superlink.local-deployment]
address = "127.0.0.1:9093"
insecure = true
```

Once it is included, run the following command in another terminal:

```bash
flwr run . local-deployment --stream
```

The framework reports both centralized and distributed metrics per round, including:
- loss
- accuracy
- AUROC
- AUPR
It also tracks summary statistics across clients, including:
- variance
- minimum
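The cross-client summaries can be reproduced with a small aggregation helper. This is a sketch of the idea, not the repository's actual aggregation code, and the per-hospital values below are invented for illustration.

```python
# Sketch: aggregate a per-client metric (e.g. accuracy) into summary statistics.
def summarize(metric_by_client, weights):
    """Weighted mean (by client size) plus unweighted variance and minimum."""
    total = sum(weights)
    weighted_mean = sum(m * w for m, w in zip(metric_by_client, weights)) / total
    plain_mean = sum(metric_by_client) / len(metric_by_client)
    variance = sum((m - plain_mean) ** 2 for m in metric_by_client) / len(metric_by_client)
    return {"weighted_mean": weighted_mean, "variance": variance, "min": min(metric_by_client)}

# Three hospitals with different cohort sizes (hypothetical numbers).
stats = summarize([0.80, 0.70, 0.90], weights=[100, 50, 50])
print(stats["min"])  # 0.7
```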
Simulation results are automatically saved in the results/ directory. The final model is also exported as a .pt file.
If you launch with 10 nodes using our standard parameters, you should obtain:
This project is open-source under the Apache 2.0 License.
This project was developed as part of the MAIDAM BioWin project funded by the Walloon Region under grant agreement:
PIT ATMP - Convention 8881
