AmbientGAN: Generative models from lossy measurements

Overview

This repository provides code to reproduce results from the paper AmbientGAN: Generative models from lossy measurements.

The training setup is as in the following diagram:

Here are a few example results:

(Columns, left to right: measured images, baseline, AmbientGAN (ours).)

A few more samples from AmbientGAN models trained with 1-D projections:

(Left: Pad-Rotate-Project; right: Pad-Rotate-Project-theta.)

The rest of the README describes how to reproduce the results.

Requirements

  • Python 2.7
  • Tensorflow >= 1.4.0
  • matplotlib
  • scipy
  • numpy
  • cvxpy
  • scikit-learn
  • tqdm
  • opencv-python
  • pandas

For pip installation, use $ pip install -r requirements.txt
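Since the code targets Python 2.7, it can help to install the dependencies into an isolated environment. A minimal sketch using virtualenv (assuming a python2.7 interpreter is available on your system; the environment name below is arbitrary):

    virtualenv -p python2.7 ambientgan-env
    source ambientgan-env/bin/activate
    pip install -r requirements.txt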

Get the data

  • MNIST data is automatically downloaded
  • Get the celebA dataset here and put the JPEG files in ./data/celebA/
  • Get the CIFAR-10 python data from here and put it in ./data/cifar10/cifar-10-batches-py/* (you can create these directories beforehand, as shown below)
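If the target directories do not exist yet, you can create them before copying the data; these are simply the paths expected above:

    # create the expected data directories
    mkdir -p ./data/celebA
    mkdir -p ./data/cifar10/cifar-10-batches-py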

Get inference models

We need inference models for computing the inception score.

  • For MNIST, you can train your own by running:

    cd ./src/mnist/inf
    python train.py
    

    [TODO]: Provide a pretrained model.

  • Inception model for use with CIFAR-10 is automatically downloaded.

Create experiment scripts

Run ./create_scripts/create_scripts.sh

This will create scripts for all the experiments in the paper.

[Optional] If you want to run only a subset of the experiments, you can define the grid in ./create_scripts/DATASET_NAME/grid_*.sh. If you wish to tweak a larger set of parameters, you can change ./create_scripts/DATASET_NAME/base_script.sh instead. Then run ./create_scripts/create_scripts.sh as above to create the corresponding scripts (remember to remove any previously generated files from ./scripts/ first), as sketched below.
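For example, after editing a grid file, regenerating the scripts might look like this (DATASET_NAME is a placeholder, as above):

    # remove previously generated scripts, then regenerate them
    rm -f ./scripts/*
    ./create_scripts/create_scripts.sh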

Run experiments

We provide scripts to train on multiple GPUs in parallel. For example, if you wish to use 4 GPUs, you can run: ./run_scripts/run_sequentially_parallel.sh "0 1 2 3"

This will start four GNU Screen sessions. The program in each session will pick up experiments from ./scripts/ and run them one at a time. Each experiment run saves samples, checkpoints, etc. to ./results/.
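To inspect the running experiments, you can use standard GNU Screen commands; this is just a sketch, and the exact session names on your machine may differ:

    # list the active screen sessions
    screen -ls
    # attach to one of them to watch its output; detach again with Ctrl-a d
    screen -r SESSION_NAME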

See results as you train

Samples

You can see samples for each experiment in ./results/samples/EXPT_DIR/

EXPT_DIR is derived from the hyperparameters of the experiment; see ./src/commons/dir_def.py for how it is constructed.
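To see which experiment directories have been created so far, you can list the samples folder:

    ls ./results/samples/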

Quantitative plots

Run

    python src/aggregator_mnist.py
    python src/aggregator_cifar.py

This will create pickle files in ./results/ with the relevant data in a Pandas dataframe.

Now use the IPython notebooks ./plotting_mnist.ipynb and ./plotting_cifar.ipynb to produce the relevant plots. The generated plots are also saved to ./results/plots/ (make sure this directory exists).
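For example, assuming Jupyter is installed, one way to create the plots directory and open a notebook is:

    mkdir -p ./results/plots
    jupyter notebook ./plotting_mnist.ipynb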

Owner

Ashish Bora