A repo for open resources & information for people to succeed in PhD in CS & career in AI / NLP

Overview

Resources to Help Global Equality for PhDs in NLP / AI

This repo originates from a wish to promote Global Equality for people who want to do a PhD in NLP, following the idea that mentorship programs are an effective way to fight against segregation, according to The Human Network (Jackson, 2019). Specifically, we hope that people from all over the world, with all types of backgrounds, can share the same sources of information, so that success becomes a reward for those who are determined and hardworking, regardless of external constraints.

One non-negligible factor in success is access to information, such as (1) knowing what a PhD in NLP is like, (2) knowing what top grad schools look for when reviewing PhD applications, (3) broadening your horizons about what good work looks like, (4) knowing what careers in NLP, in both academia and industry, are like, and many others.

Contributor: Zhijing Jin (PhD student in NLP at Max Planck Institute, co-organizer of the ACL Year-Round Mentorship Program).

You are welcome to become a collaborator -- just open an issue or pull request, and I can add you :).

Endorsers of this repo: Prof Rada Mihalcea (University of Michigan). Please add your name here (by a pull request) if you endorse this repo :).

Contents (Actively Updating)

Top Resources

  1. Online ACL Year-Round Mentorship Program: https://acl-mentorship.github.io (You can apply as a mentee, as a mentor, or as a volunteer. As a mentee, you will be able to attend monthly Zoom Q&A sessions hosted by senior researchers in NLP. You will also join a global Slack channel, where you can post your questions at any time, and we will collect answers from senior NLP researchers.)

Stage 1. (Non-PhD -> PhD) How to Apply for a PhD?

  1. (Prof Philip Guo@UCSD) Finding CS Ph.D. programs to apply to. [Video]

  2. (Prof Mor Harchol-Balter@CMU) Applying to Ph.D. Programs in Computer Science (2014). [Guide]

  3. (Prof Jason Eisner@JHU) Advice for Research Students (last updated: 2021). [List of suggestions]

  4. (CS Rankings) Advice on Applying to Grad School in Computer Science. [Pointers]

  5. (Nelson Liu, PhD@Stanford) Student Perspectives on Applying to NLP PhD Programs (2019). [Suggestions Based on Surveys]

  6. A Princeton CS Major's Guide to Applying to Graduate School. [List of suggestions]

  7. (John Hewitt, PhD@Stanford) Undergrad to PhD, or not - advice for undergrads interested in research (2018). [Suggestions]

  8. (Kalpesh Krishna, PhD@UMass Amherst) Grad School Resources (2018). [Article] (This list has lots of useful pointers!)

  9. (Prof Scott E. Fahlman@CMU) Quora answers on the LTI program at CMU (2017). [Article]

  10. (Albert Webson et al., PhD@Brown University) Resources for Underrepresented Groups, including Brown's Own Applicant Mentorship Program (2020, but we will keep updating it throughout the 2021 application season). [List of Resources]

Specific Suggestions

  1. (Prof Nathan Schneider@Georgetown University) Inside Ph.D. admissions: What readers look for in a Statement of Purpose. [Article]

Improve Your Proficiency with Tools

  1. (MIT 2020) The Missing Semester of Your CS Education (e.g., master the command-line, ssh into remote machines, use fancy features of version control systems).

Stage 2. (Doing a PhD) How to Succeed in Your PhD?

  1. (Maxwell Forbes, PhD@UW) Every PhD Is Different. [Suggestions]

  2. (Prof Mark Dredze@JHU, Prof Hanna M. Wallach@UMass Amherst) How to be a successful PhD student (in computer science (in NLP/ML)). [Suggestions]

  3. (Andrej Karpathy) A Survival Guide to a PhD (2016). [Suggestions]

  4. (Prof Kevin Gimpel@TTIC) Kevin Gimpel's Advice to PhD Students. [Suggestions]

  5. (Prof Marie desJardins@Simmons University) How to Succeed in Graduate School: A Guide for Students and Advisors (1994). [Article] [Part II]

  6. (Prof Eric Gilbert@UMich) Syllabus for Eric’s PhD students (incl. the professor's expectations for PhD students). [Syllabus]

  7. (Prof H.T. Kung@Harvard) Useful Thoughts about Research (1987). [Suggestions]

  8. (Prof Phil Agre@UCLA) Networking on the Network: A Guide to Professional Skills for PhD Students (last updated: 2015). [Suggestions]

  9. (Prof Stephen C. Stearns@Yale) Some Modest Advice for Graduate Students. [Article]

  10. (Prof Tao Xie@UIUC) Graduate Student Survival/Success Guide. [Slides]

  11. (Mu Li@Amazon) 博士这五年 (These Five Years of My PhD; a Chinese article about five years as a PhD student at CMU). [Article]

  12. (Karl Stratos) A Note to a Prospective Student. [Suggestions]

What Are Weekly Meetings with Advisors Like?

  1. (Prof Jason Eisner@JHU) What do PhD students talk about in their once-a-week meetings with their advisers during their first year? (2015). [Article]

  2. (Brown University) Guide to Meetings with Your Advisor. [Suggestions]

Practical Guides

  1. (Prof Srinivasan Keshav@Cambridge) How to Read a Paper (2007). [Suggestions]

  2. (Prof Jason Eisner@JHU) How to Read a Technical Paper (2009). [Suggestions]

  3. (Prof Jason Eisner@JHU) How to write a paper? (2010). [Suggestions]

Memoir-Like Narratives

  1. (Prof Philip Guo@UCSD) The Ph.D. Grind: A Ph.D. Student Memoir (last updated: 2015). [Video] (The link is to a video; to find the book itself, you will have to dig around a bit.)

  2. (Prof Tianqi Chen@CMU) 陈天奇:机器学习科研的十年 (2019) (a Chinese article about his ten years of research in ML). [Article]

  3. (Jean Yang) What My PhD Was Like. [Article]

How to Excel in Your Research

  1. The most important step: (Prof Jason Eisner@JHU) How to Find Research Problems (1997). [Suggestions]

Grad School Fellowships

  1. (List compiled by CMU) Graduate Fellowship Opportunities [link]
  2. CYD Fellowship for Grad Students in Switzerland [link]

Other Books

  1. The Craft of Research by Wayne Booth, Gregory Colomb, and Joseph Williams.

  2. How to Write a Better Thesis by Paul Gruba and David Evans.

  3. Helping Doctoral Students Write by Barbara Kamler and Pat Thomson.

  4. The Unwritten Rules of PhD Research by Marian Petre and Gordon Rugg.

Stage 3. (After PhD -> Industry) What is life like as an industry researcher?

  1. (Mu Li@Amazon) 工作五年反思 (a Chinese article reflecting on his five years of working in industry). [Article]

Stage 4. (Being a Prof) How to get an academic position? And how to be a good prof?

  1. (Prof Jason Eisner@JHU) How to write an academic research statement (when applying for a faculty job) (2017). [Article]

  2. (Prof Jason Eisner@JHU) How to Give a Talk (2015). [Suggestions]

  3. (Prof Jason Eisner@JHU) Teaching Philosophy. [Article]

Stage 5. (Whole Career Path) How to build a lifelong career as an NLP researcher?

  1. (Prof Charles Ling@Western University, Prof Qiang Yang@HKUST) Crafting Your Research Future: A Guide to Successful Master's and Ph.D. Degrees in Science & Engineering. [Book]

Further Readings: Technical Materials to Improve Your NLP Research Skills

  1. (Prof Jason Eisner@JHU) Technical Tutorials, Notes, and Suggested Reading (last updated: 2018). [Reading list]

Contributions

All types of contributions to this resource list are welcome. Feel free to open a pull request.

Contact: Zhijing Jin, PhD in NLP at Max Planck Institute for Intelligent Systems, working on NLP & Causality.

How to Cite This Repo

@misc{resources2021jin,
  author = {Zhijing Jin},
  title = {Resources to Help Global Equality for PhDs in NLP},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/zhijing-jin/nlp-phd-global-equality}}
}