EncT5: Fine-tuning T5 Encoder for Non-autoregressive Tasks

Overview

(Unofficial) PyTorch Implementation of EncT5: Fine-tuning T5 Encoder for Non-autoregressive Tasks

About

  • Fine-tune the T5 model for classification & regression using only the encoder layers.
  • Implements the Tokenizer and Model for EncT5.
  • Adds a BOS token (<s>) to the tokenizer, and uses this token for classification & regression.
    • The token embeddings need to be resized because the vocab size changes. (model.resize_token_embeddings())
  • BOS and EOS tokens are added automatically, as below (see the sketch after this list).
    • single sequence: <s> X </s>
    • pair of sequences: <s> A </s> B </s>
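
A minimal sketch of what this looks like in practice, assuming the EncT5Tokenizer shipped with this repo (the example sentences are placeholders):

    from enc_t5 import EncT5Tokenizer

    tokenizer = EncT5Tokenizer.from_pretrained("t5-base")

    # Encode a single sequence and a sequence pair, then decode them
    # to see where the BOS and EOS tokens were inserted.
    single = tokenizer("T5 with only the encoder.")
    pair = tokenizer("First sentence.", "Second sentence.")

    print(tokenizer.decode(single["input_ids"]))
    print(tokenizer.decode(pair["input_ids"]))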

Requirements

We highly recommend using the same version of transformers.

transformers==4.15.0
torch==1.8.1
sentencepiece==0.1.96
datasets==1.17.0
scikit-learn==0.24.2

How to Use

from enc_t5 import EncT5ForSequenceClassification, EncT5Tokenizer

model = EncT5ForSequenceClassification.from_pretrained("t5-base")
tokenizer = EncT5Tokenizer.from_pretrained("t5-base")

# Resize the token embeddings since we added a BOS token
if model.config.vocab_size < len(tokenizer.get_vocab()):
    model.resize_token_embeddings(len(tokenizer.get_vocab()))
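
Once the embeddings are resized, the model can be used for inference. A minimal sketch, assuming EncT5ForSequenceClassification follows the standard Hugging Face *ForSequenceClassification interface (the input text is a placeholder):

    import torch

    inputs = tokenizer("This movie was great!", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Logits with shape (batch_size, num_labels).
    print(outputs.logits)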

Finetune on GLUE

Setup

  • Use T5 1.1 base for fine-tuning.
  • Evaluate on TPU. See run_glue_tpu.sh for more details.
  • Use the AdamW optimizer instead of Adafactor.
  • Select the best checkpoint at each epoch using EarlyStoppingCallback (see the sketch below).
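
A minimal sketch of that setup with the Hugging Face Trainer (hyperparameters and dataset variables below are illustrative, not the values used in run_glue_tpu.sh):

    import numpy as np
    from sklearn.metrics import accuracy_score
    from transformers import EarlyStoppingCallback, Trainer, TrainingArguments

    def compute_metrics(eval_pred):
        logits, labels = eval_pred
        preds = np.argmax(logits, axis=-1)
        return {"accuracy": accuracy_score(labels, preds)}

    training_args = TrainingArguments(
        output_dir="outputs/glue",
        evaluation_strategy="epoch",       # evaluate and checkpoint every epoch
        save_strategy="epoch",
        learning_rate=1e-4,
        num_train_epochs=10,
        load_best_model_at_end=True,       # restore the best checkpoint at the end
        metric_for_best_model="accuracy",
    )

    trainer = Trainer(
        model=model,                        # EncT5ForSequenceClassification from above
        args=training_args,
        train_dataset=train_dataset,        # tokenized GLUE splits (hypothetical variables)
        eval_dataset=eval_dataset,
        compute_metrics=compute_metrics,
        callbacks=[EarlyStoppingCallback(early_stopping_patience=3)],
    )
    trainer.train()                         # Trainer uses AdamW by default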

Results

Task   Metric       Result (Paper)  Result (Implementation)
CoLA   Matthews     53.1            52.4
SST-2  Acc          94.0            94.5
MRPC   F1/Acc       91.5/88.3       91.7/88.0
STS-B  PCC/SCC      80.5/79.3       88.0/88.3
QQP    F1/Acc       72.9/89.8       88.4/91.3
MNLI   Mis/Matched  88.0/86.7       87.5/88.1
QNLI   Acc          93.3            93.2
RTE    Acc          67.8            69.7
Comments
  • Enable tokenizer to be loaded by sentence-transformer

    🚀 Feature Request

    Integration into sentence-transformer library.

    📎 Additional context

    I tried to load this tokenizer with the sentence-transformers library, but it failed: AutoTokenizer couldn't load this tokenizer. So I simply added code to override save_pretrained and its dependencies so that this tokenizer is saved as T5Tokenizer, its superclass.

        def save_pretrained(
            self,
            save_directory,
            legacy_format: Optional[bool] = None,
            filename_prefix: Optional[str] = None,
            push_to_hub: bool = False,
            **kwargs,
        ):
            if os.path.isfile(save_directory):
                logger.error(f"Provided path ({save_directory}) should be a directory, not a file")
                return
    
            if push_to_hub:
                commit_message = kwargs.pop("commit_message", None)
                repo = self._create_or_get_repo(save_directory, **kwargs)
    
            os.makedirs(save_directory, exist_ok=True)
    
            special_tokens_map_file = os.path.join(
                save_directory, (filename_prefix + "-" if filename_prefix else "") + SPECIAL_TOKENS_MAP_FILE
            )
            tokenizer_config_file = os.path.join(
                save_directory, (filename_prefix + "-" if filename_prefix else "") + TOKENIZER_CONFIG_FILE
            )
    
            tokenizer_config = copy.deepcopy(self.init_kwargs)
            if len(self.init_inputs) > 0:
                tokenizer_config["init_inputs"] = copy.deepcopy(self.init_inputs)
            for file_id in self.vocab_files_names.keys():
                tokenizer_config.pop(file_id, None)
    
            # Sanitize AddedTokens
            def convert_added_tokens(obj: Union[AddedToken, Any], add_type_field=True):
                if isinstance(obj, AddedToken):
                    out = obj.__getstate__()
                    if add_type_field:
                        out["__type"] = "AddedToken"
                    return out
                elif isinstance(obj, (list, tuple)):
                    return list(convert_added_tokens(o, add_type_field=add_type_field) for o in obj)
                elif isinstance(obj, dict):
                    return {k: convert_added_tokens(v, add_type_field=add_type_field) for k, v in obj.items()}
                return obj
    
            # add_type_field=True to allow dicts in the kwargs / differentiate from AddedToken serialization
            tokenizer_config = convert_added_tokens(tokenizer_config, add_type_field=True)
    
            # Add tokenizer class to the tokenizer config to be able to reload it with from_pretrained
            ############################################################################
            tokenizer_class = self.__class__.__base__.__name__
            ############################################################################
            # Remove the Fast at the end unless we have a special `PreTrainedTokenizerFast`
            if tokenizer_class.endswith("Fast") and tokenizer_class != "PreTrainedTokenizerFast":
                tokenizer_class = tokenizer_class[:-4]
            tokenizer_config["tokenizer_class"] = tokenizer_class
            if getattr(self, "_auto_map", None) is not None:
                tokenizer_config["auto_map"] = self._auto_map
            if getattr(self, "_processor_class", None) is not None:
                tokenizer_config["processor_class"] = self._processor_class
    
            # If we have a custom model, we copy the file defining it in the folder and set the attributes so it can be
            # loaded from the Hub.
            if self._auto_class is not None:
                custom_object_save(self, save_directory, config=tokenizer_config)
    
            with open(tokenizer_config_file, "w", encoding="utf-8") as f:
                f.write(json.dumps(tokenizer_config, ensure_ascii=False))
            logger.info(f"tokenizer config file saved in {tokenizer_config_file}")
    
            # Sanitize AddedTokens in special_tokens_map
            write_dict = convert_added_tokens(self.special_tokens_map_extended, add_type_field=False)
            with open(special_tokens_map_file, "w", encoding="utf-8") as f:
                f.write(json.dumps(write_dict, ensure_ascii=False))
            logger.info(f"Special tokens file saved in {special_tokens_map_file}")
    
            file_names = (tokenizer_config_file, special_tokens_map_file)
    
            save_files = self._save_pretrained(
                save_directory=save_directory,
                file_names=file_names,
                legacy_format=legacy_format,
                filename_prefix=filename_prefix,
            )
    
            if push_to_hub:
                url = self._push_to_hub(repo, commit_message=commit_message)
                logger.info(f"Tokenizer pushed to the hub in this commit: {url}")
    
            return save_files
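
    With that override in place, a round trip through AutoTokenizer should work. A minimal sketch (the save directory name is just an example):

        from transformers import AutoTokenizer
        from enc_t5 import EncT5Tokenizer

        tokenizer = EncT5Tokenizer.from_pretrained("t5-base")
        # Saved with tokenizer_class = "T5Tokenizer" thanks to the override above.
        tokenizer.save_pretrained("enc-t5-tokenizer")

        # Reload via AutoTokenizer, e.g. the way sentence-transformers does.
        reloaded = AutoTokenizer.from_pretrained("enc-t5-tokenizer")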
    
    enhancement 
    opened by kwonmha 0
Releases(v1.0.0)
  • v1.0.0(Jan 22, 2022)

    What's Changed

    :rocket: Features

    • Add GLUE Trainer (#2) @monologg
    • Add Template & EncT5 model and tokenizer (#1) @monologg

    :pencil: Documentation

    • Add readme & script (#3) @monologg
Owner
Jangwon Park