A curated, up-to-date list of resources for Learning with Noisy Labels

Overview

Learning-with-Noisy-Labels

A curated list of the most recent papers & code in Learning with Noisy Labels


Papers & Code in 2021

This repo focuses on papers after 2019; for earlier works, please refer to https://github.com/subeeshvasu/Awesome-Learning-with-Label-Noise.

ICML 2021

Conference date: Jul 18, 2021 -- Jul 24, 2021

  • [UCSC REAL Lab] The importance of understanding instance-level noisy labels. [Paper]
  • [UCSC REAL Lab] Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels. [Paper][Code]
  • Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision. [Paper][Code]
  • Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization. [Paper][Code] (see the transition-matrix sketch after this list)
  • Class2Simi: A Noise Reduction Perspective on Learning with Noisy Labels. [Paper]
  • Provably End-to-end Label-noise Learning without Anchor Points. [Paper]
  • Asymmetric Loss Functions for Learning with Noisy Labels. [Paper][Code]
  • Confidence Scores Make Instance-dependent Label-noise Learning Possible. [Paper]
  • Provable Generalization of SGD-trained Neural Networks of Any Width in the Presence of Adversarial Label Noise. [Paper]
  • Wasserstein Distributional Normalization For Robust Distributional Certification of Noisy Labeled Data. [Paper]
  • Learning from Noisy Labels with No Change to the Training Process. [Paper]
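
Several of the titles above revolve around the label-noise transition matrix T, where T[i, j] = P(noisy label = j | clean label = i). As context for those entries, below is a minimal, hedged sketch of the classic "forward" loss correction that such a matrix enables once estimated; it is not the method of any single paper above, and the 3-class matrix is a made-up example.

```python
# A minimal sketch (illustrative, not from any one paper above) of "forward"
# loss correction with a known noise transition matrix T, where
# T[i, j] = P(noisy label = j | clean label = i). The papers above study the
# harder problem of *estimating* T from noisy labels alone.
import torch
import torch.nn.functional as F

def forward_corrected_nll(logits, noisy_targets, T):
    """NLL of the noisy labels under the corrected posterior softmax(logits) @ T."""
    clean_probs = F.softmax(logits, dim=1)   # model's clean-class posterior
    noisy_probs = clean_probs @ T            # implied noisy-label posterior
    return F.nll_loss(torch.log(noisy_probs + 1e-8), noisy_targets)

# Toy usage with symmetric 20% noise over 3 classes (rows sum to 1).
T = torch.tensor([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])
logits = torch.randn(4, 3, requires_grad=True)
noisy_targets = torch.tensor([0, 2, 1, 0])
forward_corrected_nll(logits, noisy_targets, T).backward()
```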

ICLR 2021

  • [UCSC REAL Lab] When Optimizing f-Divergence is Robust with Label Noise. [Paper][Code]
  • [UCSC REAL Lab] Learning with Instance-Dependent Label Noise: A Sample Sieve Approach. [Paper][Code]
  • Noise against noise: stochastic label noise helps combat inherent label noise. [Paper][Code]
  • Learning with Feature-Dependent Label Noise: A Progressive Approach. [Paper][Code]
  • Robust early-learning: Hindering the memorization of noisy labels. [Paper][Code]
  • MoPro: Webly Supervised Learning with Momentum Prototypes. [Paper] [Code]
  • Robust Curriculum Learning: from clean label detection to noisy label self-correction. [Paper]
  • How Does Mixup Help With Robustness and Generalization? [Paper]
  • Theoretical Analysis of Self-Training with Deep Networks on Unlabeled Data. [Paper]

CVPR 2021

Conference date: Jun 19, 2021 -- Jun 25, 2021

  • [UCSC REAL Lab] A Second-Order Approach to Learning with Instance-Dependent Label Noise. [Paper][Code]
  • Improving Unsupervised Image Clustering With Robust Learning. [Paper]
  • Multi-Objective Interpolation Training for Robustness to Label Noise. [Paper][Code]
  • Noise-resistant Deep Metric Learning with Ranking-based Instance Selection. [Paper][Code]
  • Augmentation Strategies for Learning with Noisy Labels. [Paper][Code]
  • Jo-SRC: A Contrastive Approach for Combating Noisy Labels. [Paper][Code]
  • Partially View-aligned Representation Learning with Noise-robust Contrastive Loss. [Paper][Code]
  • Correlated Input-Dependent Label Noise in Large-Scale Image Classification. [Paper]
  • DAT: Training Deep Networks Robust To Label-Noise by Matching the Feature Distributions. [Paper]
  • Faster Meta Update Strategy for Noise-Robust Deep Learning. [Paper][Code]
  • DualGraph: A graph-based method for reasoning about label noise. [Paper]
  • Background-Aware Pooling and Noise-Aware Loss for Weakly-Supervised Semantic Segmentation. [Paper]
  • Joint Negative and Positive Learning for Noisy Labels. [Paper] (see the negative-learning sketch after this list)
  • AutoDO: Robust AutoAugment for Biased Data with Label Noise via Scalable Probabilistic Implicit Differentiation. [Paper][Code]
  • Meta Pseudo Labels. [Paper][Code]
  • All Labels Are Not Created Equal: Enhancing Semi-supervision via Label Grouping and Co-training. [Paper][Code]
  • SimPLE: Similar Pseudo Label Exploitation for Semi-Supervised Classification. [Paper][Code]
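
As background for the "Joint Negative and Positive Learning" entry above, the sketch below shows the core negative-learning loss: teach the model that an input is *not* a randomly drawn complementary class, which remains correct with high probability even when the given label is noisy. This is a simplified sketch of that one idea; the paper's full method combines negative and positive learning in a joint pipeline.

```python
# A minimal sketch of the negative-learning loss that joint negative/positive
# learning builds on: sample a complementary label != the (possibly noisy)
# given label and push its predicted probability down. Simplified; the full
# method also uses positive learning and sample filtering.
import torch
import torch.nn.functional as F

def negative_learning_loss(logits, noisy_targets):
    num_classes = logits.size(1)
    probs = F.softmax(logits, dim=1)
    # Uniformly sample a class different from each given label.
    offsets = torch.randint(1, num_classes, noisy_targets.shape)
    complementary = (noisy_targets + offsets) % num_classes
    p_comp = probs.gather(1, complementary.view(-1, 1)).squeeze(1)
    return -torch.log(1.0 - p_comp + 1e-8).mean()  # "x is NOT class c"

logits = torch.randn(8, 10)
noisy_targets = torch.randint(0, 10, (8,))
print(negative_learning_loss(logits, noisy_targets))
```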

AISTATS 2021

Conference date: Apr 13, 2021 -- Apr 15, 2021

  • Collaborative Classification from Noisy Labels. [Paper]
  • Linear Models are Robust Optimal Under Strategic Behavior. [Paper]

AAAI 2021

  • Beyond Class-Conditional Assumption: A Primary Attempt to Combat Instance-Dependent Label Noise. [Paper][Code]
  • Learning to Purify Noisy Labels via Meta Soft Label Corrector. [Paper][Code]
  • Robustness of Accuracy Metric and its Inspirations in Learning with Noisy Labels. [Paper][Code]
  • Learning from Noisy Labels with Complementary Loss Functions. [Paper][Code]
  • Analysing the Noise Model Error for Realistic Noisy Label Data. [Paper][Code]
  • Tackling Instance-Dependent Label Noise via a Universal Probabilistic Model. [Paper]
  • Learning with Group Noise. [Paper]
  • Meta Label Correction for Noisy Label Learning. [Paper]

arXiv 2021

  • [UCSC REAL Lab] Understanding (Generalized) Label Smoothing when Learning with Noisy Labels. [Paper] (see the label-smoothing sketch after this list)
  • Pervasive Label Errors in Test Sets Destabilize Machine Learning Benchmarks. [Paper][Code]
  • Estimating Instance-dependent Label-noise Transition Matrix using DNNs. [Paper]
  • A Theoretical Analysis of Learning with Noisily Labeled Data. [Paper]
  • Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels. [Paper]
  • A Survey of Label-noise Representation Learning: Past, Present and Future. [Paper]
  • Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization. [Paper][Code]
  • Noisy-Labeled NER with Confidence Estimation. [Paper][Code]
  • Study Group Learning: Improving Retinal Vessel Segmentation Trained with Noisy Labels. [Paper][Code]
  • Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels. [Paper][Code]
  • Exponentiated Gradient Reweighting for Robust Training Under Label Noise and Beyond. [Paper]
  • Understanding the Interaction of Adversarial Training with Noisy Labels. [Paper]
  • Learning from Noisy Labels via Dynamic Loss Thresholding. [Paper]
  • Evaluating Multi-label Classifiers with Noisy Labels. [Paper]
  • Self-Supervised Noisy Label Learning for Source-Free Unsupervised Domain Adaptation. [Paper]
  • Transform consistency for learning with noisy labels. [Paper]
  • Learning to Combat Noisy Labels via Classification Margins. [Paper]
  • Joint Negative and Positive Learning for Noisy Labels. [Paper]
  • Robust Classification from Noisy Labels: Integrating Additional Knowledge for Chest Radiography Abnormality Assessment. [Paper]
  • DST: Data Selection and joint Training for Learning with Noisy Labels. [Paper]
  • LongReMix: Robust Learning with High Confidence Samples in a Noisy Label Environment. [Paper]
  • A Novel Perspective for Positive-Unlabeled Learning via Noisy Labels. [Paper]
  • Ensemble Learning with Manifold-Based Data Splitting for Noisy Label Correction. [Paper]
  • MetaLabelNet: Learning to Generate Soft-Labels from Noisy-Labels. [Paper]
  • On the Robustness of Monte Carlo Dropout Trained with Noisy Labels. [Paper]
  • Co-matching: Combating Noisy Labels by Augmentation Anchoring. [Paper]
  • Pathological Image Segmentation with Noisy Labels. [Paper]
  • CrowdTeacher: Robust Co-teaching with Noisy Answers & Sample-specific Perturbations for Tabular Data. [Paper]
  • Approximating Instance-Dependent Noise via Instance-Confidence Embedding. [Paper]
  • Rethinking Noisy Label Models: Labeler-Dependent Noise with Adversarial Awareness. [Paper]
  • ScanMix: Learning from Severe Label Noise via Semantic Clustering and Semi-Supervised Learning. [Paper]
  • Friends and Foes in Learning from Noisy Labels. [Paper]
  • Learning from Noisy Labels for Entity-Centric Information Extraction. [Paper]
  • A Framework Using Contrastive Learning for Classification with Noisy Labels. [Paper]
  • Contrastive Learning Improves Model Robustness Under Label Noise. [Paper][Code]
  • Noise-Resistant Deep Metric Learning with Probabilistic Instance Filtering. [Paper]
  • Compensation Learning. [Paper]
  • kNet: A Deep kNN Network To Handle Label Noise. [Paper]
  • Temporal-aware Language Representation Learning From Crowdsourced Labels. [Paper]
  • Memorization in Deep Neural Networks: Does the Loss Function matter? [Paper]
  • Mitigating Memorization in Sample Selection for Learning with Noisy Labels. [Paper]
  • P-DIFF: Learning Classifier with Noisy Labels based on Probability Difference Distributions. [Paper][Code]
  • Decoupling Representation and Classifier for Noisy Label Learning. [Paper]
  • Contrastive Representations for Label Noise Require Fine-Tuning. [Paper]
  • NGC: A Unified Framework for Learning with Open-World Noisy Data. [Paper]
  • Learning From Long-Tailed Data With Noisy Labels. [Paper]
  • Robust Long-Tailed Learning Under Label Noise. [Paper]
  • Instance-dependent Label-noise Learning under a Structural Causal Model. [Paper]
  • Assessing the Quality of the Datasets by Identifying Mislabeled Samples. [Paper]
  • Learning to Aggregate and Refine Noisy Labels for Visual Sentiment Analysis. [Paper]
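
The first arXiv entry above analyzes (generalized) label smoothing under label noise. As a rough illustration: standard smoothing mixes the one-hot target with a uniform distribution at rate r in [0, 1], and the generalized analysis also allows negative r. The sketch below is a plain smoothed cross-entropy; the r values are illustrative, not tuned.

```python
# A minimal sketch of (generalized) label smoothing: soft target
# (1 - r) * one_hot + r / K, where the generalized setting permits r < 0.
import torch
import torch.nn.functional as F

def smoothed_cross_entropy(logits, targets, r=0.1):
    num_classes = logits.size(1)
    log_probs = F.log_softmax(logits, dim=1)
    one_hot = F.one_hot(targets, num_classes).float()
    soft = (1.0 - r) * one_hot + r / num_classes   # smoothed target vector
    return -(soft * log_probs).sum(dim=1).mean()

logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss_pos = smoothed_cross_entropy(logits, targets, r=0.2)   # standard LS
loss_neg = smoothed_cross_entropy(logits, targets, r=-0.2)  # negative LS
```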

Papers & Code in 2020


ICML 2020

  • [UCSC REAL Lab] Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates. [Paper][Code 1] [Code 2] (see the peer-loss sketch after this list)
  • Normalized Loss Functions for Deep Learning with Noisy Labels. [Paper][Code]
  • SIGUA: Forgetting May Make Learning with Noisy Labels More Robust. [Paper][Code]
  • Error-Bounded Correction of Noisy Labels. [Paper][Code]
  • Training Binary Neural Networks through Learning with Noisy Supervision. [Paper][Code]
  • Improving generalization by controlling label-noise information in neural network weights. [Paper][Code]
  • Self-PU: Self Boosted and Calibrated Positive-Unlabeled Training. [Paper][Code]
  • Searching to Exploit Memorization Effect in Learning with Noisy Labels. [Paper][Code]
  • Learning with Bounded Instance and Label-dependent Label Noise. [Paper]
  • Label-Noise Robust Domain Adaptation. [Paper]
  • Beyond Synthetic Noise: Deep Learning on Controlled Noisy Labels. [Paper]
  • Does label smoothing mitigate label noise? [Paper]
  • Learning with Multiple Complementary Labels. [Paper]
  • Deep k-NN for Noisy Labels. [Paper]
  • Extreme Multi-label Classification from Aggregated Labels. [Paper]
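
Below is a minimal within-batch sketch of the peer-loss idea from the first ICML 2020 entry: subtract the loss the model incurs on randomly re-paired (input, label) combinations, so that blindly memorizing the noisy labels stops being optimal. The official code linked above includes the alpha weighting schedule omitted here.

```python
# A minimal within-batch sketch of peer loss: base loss minus the loss on
# independently shuffled (prediction, label) pairs. alpha = 1.0 is a
# placeholder; in practice it is scheduled during training.
import torch
import torch.nn.functional as F

def peer_loss(logits, noisy_targets, alpha=1.0):
    base = F.cross_entropy(logits, noisy_targets)
    # Peer term: predictions and labels drawn from independent random samples.
    idx1 = torch.randperm(len(noisy_targets))
    idx2 = torch.randperm(len(noisy_targets))
    peer = F.cross_entropy(logits[idx1], noisy_targets[idx2])
    return base - alpha * peer
```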

ICLR 2020

  • DivideMix: Learning with Noisy Labels as Semi-supervised Learning. [Paper][Code] (see the co-divide sketch after this list)
  • Learning from Rules Generalizing Labeled Exemplars. [Paper] [Code]
  • Robust training with ensemble consensus. [Paper][Code]
  • Self-labelling via simultaneous clustering and representation learning. [Paper][Code]
  • Can gradient clipping mitigate label noise? [Paper][Code]
  • Mutual Mean-Teaching: Pseudo Label Refinery for Unsupervised Domain Adaptation on Person Re-identification. [Paper][Code]
  • Curriculum Loss: Robust Learning and Generalization against Label Corruption. [Paper]
  • Simple and Effective Regularization Methods for Training on Noisily Labeled Data with Generalization Guarantee. [Paper]
  • SELF: Learning to Filter Noisy Labels with Self-Ensembling. [Paper]
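
As a taste of the first entry above, here is a hedged sketch of DivideMix's "co-divide" step: fit a two-component Gaussian mixture to per-sample training losses and read the posterior of the low-mean component as the probability that a sample is clean. The MixMatch-style semi-supervised training that follows in the paper is omitted.

```python
# A minimal sketch of loss-based clean/noisy division with a 2-component GMM.
# Hyperparameters (max_iter, reg_covar, threshold) are illustrative defaults.
import numpy as np
from sklearn.mixture import GaussianMixture

def clean_probability(per_sample_losses, threshold=0.5):
    losses = np.asarray(per_sample_losses, dtype=np.float64).reshape(-1, 1)
    losses = (losses - losses.min()) / (losses.max() - losses.min() + 1e-8)
    gmm = GaussianMixture(n_components=2, max_iter=100, reg_covar=5e-4)
    gmm.fit(losses)
    clean_comp = int(np.argmin(gmm.means_.ravel()))   # low loss ~ clean
    p_clean = gmm.predict_proba(losses)[:, clean_comp]
    return p_clean, p_clean > threshold               # probs + clean mask
```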

NeurIPS 2020

  • Part-dependent Label Noise: Towards Instance-dependent Label Noise. [Paper][Code]
  • Identifying Mislabeled Data using the Area Under the Margin Ranking. [Paper][Code] (see the AUM sketch after this list)
  • Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning. [Paper]
  • Early-Learning Regularization Prevents Memorization of Noisy Labels. [Paper][Code]
  • Coresets for Robust Training of Deep Neural Networks against Noisy Labels. [Paper][Code]
  • Modeling Noisy Annotations for Crowd Counting. [Paper][Code]
  • Robust Optimization for Fairness with Noisy Protected Groups. [Paper][Code]
  • Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping. [Paper][Code]
  • A Topological Filter for Learning with Label Noise. [Paper][Code]
  • Self-Adaptive Training: beyond Empirical Risk Minimization. [Paper][Code]
  • Disentangling Human Error from the Ground Truth in Segmentation of Medical Images. [Paper][Code]
  • Non-Convex SGD Learns Halfspaces with Adversarial Label Noise. [Paper]
  • Efficient active learning of sparse halfspaces with arbitrary bounded noise. [Paper]
  • Semi-Supervised Partial Label Learning via Confidence-Rated Margin Maximization. [Paper]
  • Labelling unlabelled videos from scratch with multi-modal self-supervision. [Paper][Code]
  • Distribution Aligning Refinery of Pseudo-label for Imbalanced Semi-supervised Learning. [Paper][Code]
  • MetaPoison: Practical General-purpose Clean-label Data Poisoning. [Paper][Code 1][Code 2]
  • Provably Consistent Partial-Label Learning. [Paper]
  • A Variational Approach for Learning from Positive and Unlabeled Data. [Paper][Code]
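
The AUM entry above ranks samples by a margin statistic accumulated over training: the gap between the assigned label's logit and the largest other logit. Persistently low or negative margins flag likely mislabeled samples. The sketch below tracks that statistic; the paper's threshold selection via indicator samples is omitted, and the class name is ours.

```python
# A minimal sketch of Area Under the Margin (AUM) tracking. Assumes each
# batch carries unique dataset indices (sample_ids) so += accumulates safely.
import torch

class AUMTracker:
    def __init__(self, num_samples):
        self.margin_sum = torch.zeros(num_samples)
        self.counts = torch.zeros(num_samples)

    def update(self, sample_ids, logits, targets):
        assigned = logits.gather(1, targets.view(-1, 1)).squeeze(1)
        masked = logits.clone()
        masked.scatter_(1, targets.view(-1, 1), float("-inf"))
        other = masked.max(dim=1).values  # largest non-assigned logit
        self.margin_sum[sample_ids] += (assigned - other).detach().cpu()
        self.counts[sample_ids] += 1

    def aum(self):
        return self.margin_sum / self.counts.clamp(min=1)
```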

AAAI 2020

  • [UCSC REAL Lab] Reinforcement Learning with Perturbed Rewards. [Paper] [Code]
  • Less Is Better: Unweighted Data Subsampling via Influence Function. [Paper] [Code]
  • Weakly Supervised Sequence Tagging from Noisy Rules. [Paper][Code]
  • Coupled-View Deep Classifier Learning from Multiple Noisy Annotators. [Paper]
  • Partial multi-label learning with noisy label identification. [Paper]
  • Self-Paced Robust Learning for Leveraging Clean Labels in Noisy Data. [Paper]
  • Label Error Correction and Generation Through Label Relationships. [Paper]

CVPR 2020

  • Combating noisy labels by agreement: A joint training method with co-regularization. [Paper][Code] (see the agreement sketch after this list)
  • Distilling Effective Supervision From Severe Label Noise. [Paper][Code]
  • Self-Training With Noisy Student Improves ImageNet Classification. [Paper][Code]
  • Noise Robust Generative Adversarial Networks. [Paper][Code]
  • Global-Local GCN: Large-Scale Label Noise Cleansing for Face Recognition. [Paper]
  • DLWL: Improving Detection for Lowshot Classes With Weakly Labelled Data. [Paper]
  • Spherical Space Domain Adaptation With Robust Pseudo-Label Loss. [Paper][Code]
  • Training Noise-Robust Deep Neural Networks via Meta-Learning. [Paper][Code]
  • Shoestring: Graph-Based Semi-Supervised Classification With Severely Limited Labeled Data. [Paper][Code]
  • Noise-Aware Fully Webly Supervised Object Detection. [Paper][Code]
  • Learning From Noisy Anchors for One-Stage Object Detection. [Paper][Code]
  • Generating Accurate Pseudo-Labels in Semi-Supervised Learning and Avoiding Overconfident Predictions via Hermite Polynomial Activations. [Paper][Code]
  • Revisiting Knowledge Distillation via Label Smoothing Regularization. [Paper][Code]
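
The first entry above trains two networks by agreement: a shared loss mixes supervised cross-entropy with a symmetric-KL co-regularization term, and only the small-loss fraction of each batch drives the update. The sketch below captures that structure; lambda_agree and keep_ratio are made-up values, whereas the paper schedules the kept fraction from the estimated noise rate.

```python
# A minimal sketch of agreement-based co-training with small-loss selection.
import torch
import torch.nn.functional as F

def agreement_loss(logits1, logits2, targets, lambda_agree=0.7, keep_ratio=0.8):
    sup = F.cross_entropy(logits1, targets, reduction="none") \
        + F.cross_entropy(logits2, targets, reduction="none")
    log_p1 = F.log_softmax(logits1, dim=1)
    log_p2 = F.log_softmax(logits2, dim=1)
    # Symmetric KL between the two networks' predictions (agreement term).
    agree = F.kl_div(log_p1, log_p2.exp(), reduction="none").sum(dim=1) \
          + F.kl_div(log_p2, log_p1.exp(), reduction="none").sum(dim=1)
    per_sample = (1 - lambda_agree) * sup + lambda_agree * agree
    keep = max(1, int(keep_ratio * targets.numel()))
    small_loss, _ = torch.topk(per_sample, keep, largest=False)
    return small_loss.mean()
```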

ECCV 2020

  • Learning with Noisy Class Labels for Instance Segmentation. [Paper][Code]
  • Suppressing Mislabeled Data via Grouping and Self-Attention. [Paper][Code]
  • NoiseRank: Unsupervised Label Noise Reduction with Dependence Models. [Paper]
  • Weakly Supervised Learning with Side Information for Noisy Labeled Images. [Paper]
  • Learning Noise-Aware Encoder-Decoder from Noisy Labels by Alternating Back-Propagation for Saliency Detection. [Paper]
  • Graph convolutional networks for learning with few clean and many noisy labels. [Paper]

arXiv 2020

  • No Regret Sample Selection with Noisy Labels. [Paper][Code]
  • Meta Soft Label Generation for Noisy Labels. [Paper][Code]
  • Learning from Noisy Labels with Deep Neural Networks: A Survey. [Paper]
  • RAR-U-Net: a Residual Encoder to Attention Decoder by Residual Connections Framework for Spine Segmentation under Noisy Labels. [Paper]
  • Learning from Small Amount of Medical Data with Noisy Labels: A Meta-Learning Approach. [Paper]

Owner

Jiaheng Wei (Ph.D. student @ UCSC CSE)