Implicit Model Specialization through DAG-based Decentralized Federated Learning

Overview

Federated Learning DAG Experiments

This repository contains the software artifacts to reproduce the experiments presented in the Middleware '21 paper "Implicit Model Specialization through DAG-based Decentralized Federated Learning".

General Usage

Since we are still using TensorFlow 1, Python <=3.7 is required.

Depending on your setup, you can obtain this old Python version either with a Docker container or with a version manager such as pyenv:

cd federated-learning-dag
docker run -d --name federated-learning-dag \
  -v $PWD:/workspace \
  --workdir /workspace \
  --init --shm-size 8g \
  mcr.microsoft.com/vscode/devcontainers/python:3.7-bullseye \
    tail -f /dev/null
docker exec -it federated-learning-dag bash
# Run pipenv commands in this shell

# Clean up
docker rm -f federated-learning-dag 
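
Alternatively, a version manager such as pyenv can provide a local Python 3.7. A minimal sketch (assuming pyenv is installed; the exact patch version may differ):

# Install a Python 3.7 interpreter and select it for this repository
pyenv install 3.7.16
cd federated-learning-dag
pyenv local 3.7.16   # writes .python-version, so pipenv picks up Python 3.7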

Then, use pipenv to set up your environment. VS Code users can use the provided devcontainer template as a base environment. Run pipenv install to download the dependencies and run the code within a pipenv shell.
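
For example (a sketch; this assumes pipenv itself is already installed):

pipenv install   # create the virtualenv and install the Pipfile dependencies
pipenv shell     # enter the environment; run all further commands from this shell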

There are two execution variants: a default, single-threaded one and an extended version that uses the 'ray' parallelism library.

Basic usage: python -m tangle.lab --help (or python -m tangle.ray --help).

By default, all experiments_figure_[*].py scripts use ray for parallelism. This requires a lot of main memory and, when running inside Docker, an increased shared memory size. VS Code devcontainer users have to add "--shm-size", "8gb" (depending on the available memory) to the runArgs in .devcontainer/devcontainer.json.
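
For reference, the relevant part of .devcontainer/devcontainer.json could look like this (a sketch; all other fields of the existing file are omitted, and the value should be adapted to the available memory):

{
  "runArgs": ["--shm-size", "8gb"]
}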

To view a DAG (sometimes called a tangle) in a web browser, run python -m http.server in the repository root and open http://localhost:8000/viewer/. Enter the name of your experiment run and adjust the round slider to inspect how the DAG evolves over the training rounds.
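
Concretely:

cd federated-learning-dag
python -m http.server   # serves the repository root on port 8000 (the http.server default)
# then open http://localhost:8000/viewer/ in a browser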

Obtaining the datasets

The contents of the ./data directory can be obtained from https://data.osmhpi.de/ipfs/QmQMe1Bd8X7tqQHWqcuS17AQZUqcfRQmNRgrenJD2o8xsS/.
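
One way to mirror the data via the IPFS HTTP gateway is wget. A sketch (this assumes the gateway exposes a browsable directory listing; an IPFS client works just as well):

# Download the published data directory into ./data
wget --recursive --no-parent --no-host-directories --cut-dirs=2 \
  --directory-prefix=data \
  https://data.osmhpi.de/ipfs/QmQMe1Bd8X7tqQHWqcuS17AQZUqcfRQmNRgrenJD2o8xsS/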

Reproduction of the evaluation in the paper

The experiments in the paper can be reproduced by running Python scripts located in the root folder of this repository. They are organized by the figures in which the respective evaluation is presented and are named experiments_figure_[*].py.
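
For example, all of them can be run in sequence from within a pipenv shell (a sketch; each figure can also be reproduced individually by running the corresponding script directly):

# Run every experiment script, one figure after another
for script in experiments_figure_*.py; do
    python "$script"
done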

The federated averaging runs presented as a baseline in Figure 9 can be reproduced by running run_fed_avg_[fmnist,poets,cifar].py. The results presented in Table 2 are also generated by the DAG-IS scripts for Figure 9.
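
From within a pipenv shell, the baseline runs can be started like this (expanding the bracket shorthand into the three script names):

# Federated averaging baselines for Figure 9
python run_fed_avg_fmnist.py
python run_fed_avg_poets.py
python run_fed_avg_cifar.py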

Owner

Operating Systems and Middleware Group