Chainer Implementation of Fully Convolutional Networks. (Training code to reproduce the original result is available.)

Overview

fcn - Fully Convolutional Networks


Chainer implementation of Fully Convolutional Networks.

Installation

pip install fcn
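
To sanity-check the install, an import is enough; `fcn.__version__` is assumed to be present in the installed release:

```python
# Quick installation check; fcn.__version__ is assumed to exist in the installed release.
import fcn

print(fcn.__version__)
```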

Inference

Inference is run as follows:

# forwarding of the network
img_file=https://farm2.staticflickr.com/1522/26471792680_a485afb024_z_d.jpg
fcn_infer.py --img-files $img_file --gpu -1 -o /tmp  # CPU mode
fcn_infer.py --img-files $img_file --gpu 0 -o /tmp   # GPU mode

Original Image: https://www.flickr.com/photos/faceme/26471792680/
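
For scripting, the forward pass can also be run from Python. The snippet below is a minimal sketch, not the repository's documented API: it assumes `fcn.models.FCN8s` can be instantiated with default arguments, that the pretrained fcn8s_from_caffe.npz (see the tables below) has already been downloaded, and that class scores are exposed via a `model.score` attribute after the call; the BGR mean is the standard one used by the Caffe VGG/FCN models.

```python
# Minimal Python inference sketch. fcn.models.FCN8s, its call signature, and the
# model.score attribute are assumptions; fcn8s_from_caffe.npz must be downloaded first.
import chainer
import numpy as np
import skimage.io

import fcn

MEAN_BGR = np.array([104.00698793, 116.66876762, 122.67891434], dtype=np.float32)

model = fcn.models.FCN8s()                         # assumed: defaults to the 21 VOC classes
chainer.serializers.load_npz('fcn8s_from_caffe.npz', model)

img = skimage.io.imread('example.jpg')             # H x W x 3, RGB, uint8
x = img[:, :, ::-1].astype(np.float32) - MEAN_BGR  # RGB -> BGR, subtract mean
x = np.ascontiguousarray(x.transpose(2, 0, 1)[None])  # to N x C x H x W

with chainer.using_config('train', False), chainer.no_backprop_mode():
    model(x)
label = np.argmax(model.score.data, axis=1)[0]     # assumed attribute; H x W label map
```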

Training

cd examples/voc
./download_datasets.py
./download_models.py

./train_fcn32s.py --gpu 0
# ./train_fcn16s.py --gpu 0
# ./train_fcn8s.py --gpu 0
# ./train_fcn8s_atonce.py --gpu 0

The accuracy of the original implementation is computed with evaluate.py after converting the Caffe models to Chainer ones using convert_caffe_to_chainermodel.py.
You can download the VGG16 model from here: vgg16_from_caffe.npz.
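
The four numbers in the tables below are the standard FCN segmentation metrics: overall pixel accuracy, mean per-class accuracy (read here as the "Accuracy Class" column), mean IU, and frequency-weighted accuracy (FWAVACC), computed as the frequency-weighted mean of the per-class IU following the FCN paper. The sketch below shows how they are conventionally computed from a confusion matrix; it illustrates the metric definitions and is not a copy of this repository's evaluate.py.

```python
# Standard FCN segmentation metrics from a confusion matrix (illustrative sketch,
# not a copy of this repository's evaluate.py).
import numpy as np

def fcn_metrics(label_trues, label_preds, n_class):
    """Return (accuracy, accuracy class, mean IU, FWAVACC) over paired label maps."""
    hist = np.zeros((n_class, n_class), dtype=np.int64)
    for lt, lp in zip(label_trues, label_preds):
        mask = (lt >= 0) & (lt < n_class)   # drop ignored pixels, e.g. the 255 'void' label
        hist += np.bincount(
            n_class * lt[mask].astype(np.int64) + lp[mask],
            minlength=n_class ** 2,
        ).reshape(n_class, n_class)

    with np.errstate(divide='ignore', invalid='ignore'):
        acc = np.diag(hist).sum() / hist.sum()                   # overall pixel accuracy
        acc_cls = np.nanmean(np.diag(hist) / hist.sum(axis=1))   # mean per-class accuracy
        iu = np.diag(hist) / (hist.sum(axis=1) + hist.sum(axis=0) - np.diag(hist))
        mean_iu = np.nanmean(iu)                                 # mean IU
        freq = hist.sum(axis=1) / hist.sum()
        fwavacc = (freq[freq > 0] * iu[freq > 0]).sum()          # frequency-weighted accuracy
    return acc, acc_cls, mean_iu, fwavacc
```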

FCN32s

| Implementation | Accuracy | Accuracy Class | Mean IU | FWAVACC | Model File |
|---|---|---|---|---|---|
| Original | 90.4810 | 76.4824 | 63.6261 | 83.4580 | fcn32s_from_caffe.npz |
| Ours (using vgg16_from_caffe.npz) | 90.5668 | 76.8740 | 63.8180 | 83.5067 | - |

FCN16s

| Implementation | Accuracy | Accuracy Class | Mean IU | FWAVACC | Model File |
|---|---|---|---|---|---|
| Original | 90.9971 | 78.0710 | 65.0050 | 84.2614 | fcn16s_from_caffe.npz |
| Ours (using fcn32s_from_caffe.npz) | 90.9671 | 78.0617 | 65.0911 | 84.2604 | - |
| Ours (using fcn32s_voc_iter00092000.npz) | 91.1009 | 77.2522 | 65.3628 | 84.3675 | - |

FCN8s

| Implementation | Accuracy | Accuracy Class | Mean IU | FWAVACC | Model File |
|---|---|---|---|---|---|
| Original | 91.2212 | 77.6146 | 65.5126 | 84.5445 | fcn8s_from_caffe.npz |
| Ours (using fcn16s_from_caffe.npz) | 91.2513 | 77.1490 | 65.4789 | 84.5460 | - |
| Ours (using fcn16s_voc_iter00100000.npz) | 91.2608 | 78.1484 | 65.8444 | 84.6447 | - |

FCN8sAtOnce

| Implementation | Accuracy | Accuracy Class | Mean IU | FWAVACC | Model File |
|---|---|---|---|---|---|
| Original | 91.1288 | 78.4979 | 65.3998 | 84.4326 | fcn8s-atonce_from_caffe.npz |
| Ours (using vgg16_from_caffe.npz) | 91.0883 | 77.3528 | 65.3433 | 84.4276 | - |

Left to right: FCN32s, FCN16s, and FCN8s, each fully trained with this repository. See the tables above for their accuracy.
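
Visualizations like the one described above can be produced by colorizing the predicted label map and blending it over the input image. The helper below is a generic sketch using a matplotlib colormap, not the repository's own visualization utility, and it does not reproduce the official PASCAL VOC palette.

```python
# Generic label-map visualization sketch (not the repo's utility; VOC palette not reproduced).
import matplotlib.pyplot as plt
import numpy as np

def colorize_label(label, n_class=21):
    """Map an H x W integer label image to an RGB uint8 image with a fixed colormap."""
    cmap = plt.get_cmap('tab20', n_class)
    rgb = cmap(label % n_class)[:, :, :3]   # drop the alpha channel
    rgb[label == 0] = 0                     # keep background (class 0) black
    return (rgb * 255).astype(np.uint8)

def overlay(img, label, alpha=0.5):
    """Blend the colorized label map over the original RGB image."""
    viz = colorize_label(label)
    return (alpha * viz + (1 - alpha) * img).astype(np.uint8)
```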

License

See LICENSE.

Cite This Project

If you use this project in your research or wish to refer to the baseline results published in the README, please use the following BibTeX entry.

@misc{chainer-fcn2016,
  author =       {Kentaro Wada},
  title =        {{fcn: Chainer Implementation of Fully Convolutional Networks}},
  howpublished = {\url{https://github.com/wkentaro/fcn}},
  year =         {2016}
}