TensorFlow Tutorial and Examples for Beginners (support TF v1 & v2)

Overview

TensorFlow Examples

This tutorial is designed to make it easy to dive into TensorFlow through examples. For readability, it includes both notebooks and source code with explanations, for both TF v1 and v2.

It is suitable for beginners who want clear and concise TensorFlow examples. Besides the traditional 'raw' TensorFlow implementations, you can also find the latest TensorFlow API practices (such as layers, estimator, dataset, ...).

Update (05/16/2020): All default examples have been moved to TF2. For TF v1 examples, check here.

Tutorial index

0 - Prerequisite

1 - Introduction

  • Hello World (notebook). Very simple example to learn how to print "hello world" using TensorFlow 2.0+.
  • Basic Operations (notebook). A simple example that covers TensorFlow 2.0+ basic operations (see the short sketch after this list).
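
A minimal TF 2.x sketch of what these introductory notebooks cover (illustrative only, not the notebooks' exact code): print a constant tensor and run a couple of basic operations eagerly.

import tensorflow as tf

# Eager execution is enabled by default in TF2, so tensors can be evaluated directly.
hello = tf.constant("hello world")
print(hello.numpy())

# Basic operations on constant tensors.
a = tf.constant(2)
b = tf.constant(3)
print("add:", tf.add(a, b).numpy())            # 5
print("multiply:", tf.multiply(a, b).numpy())  # 6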

2 - Basic Models

  • Linear Regression (notebook). Implement a Linear Regression with TensorFlow 2.0+.
  • Logistic Regression (notebook). Implement a Logistic Regression with TensorFlow 2.0+.
  • Word2Vec (Word Embedding) (notebook). Build a Word Embedding Model (Word2Vec) from Wikipedia data, with TensorFlow 2.0+.
  • GBDT (Gradient Boosted Decision Trees) (notebook). Implement Gradient Boosted Decision Trees with TensorFlow 2.0+ to predict house values using the Boston Housing dataset.
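
To give a feel for the 'raw' TF2 style used in the basic-model notebooks, here is a minimal linear-regression sketch with GradientTape; the toy data and hyperparameters are made up for illustration and are not taken from the notebooks.

import numpy as np
import tensorflow as tf

# Toy data: y ~ 3x + 2 with a little noise (illustrative only).
X = np.random.rand(100).astype(np.float32)
Y = 3.0 * X + 2.0 + 0.1 * np.random.randn(100).astype(np.float32)

W = tf.Variable(0.0)
b = tf.Variable(0.0)
optimizer = tf.optimizers.SGD(learning_rate=0.1)

for step in range(500):
    with tf.GradientTape() as tape:
        pred = W * X + b
        loss = tf.reduce_mean(tf.square(pred - Y))
    grads = tape.gradient(loss, [W, b])
    optimizer.apply_gradients(zip(grads, [W, b]))

print("W:", W.numpy(), "b:", b.numpy())  # should approach 3 and 2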

3 - Neural Networks

Supervised
  • Simple Neural Network (notebook). Use TensorFlow 2.0 'layers' and 'model' API to build a simple neural network to classify MNIST digits dataset.
  • Simple Neural Network (low-level) (notebook). Raw implementation of a simple neural network to classify MNIST digits dataset.
  • Convolutional Neural Network (notebook). Use TensorFlow 2.0+ 'layers' and 'model' API to build a convolutional neural network to classify MNIST digits dataset.
  • Convolutional Neural Network (low-level) (notebook). Raw implementation of a convolutional neural network to classify MNIST digits dataset.
  • Recurrent Neural Network (LSTM) (notebook). Build a recurrent neural network (LSTM) to classify MNIST digits dataset, using TensorFlow 2.0 'layers' and 'model' API.
  • Bi-directional Recurrent Neural Network (LSTM) (notebook). Build a bi-directional recurrent neural network (LSTM) to classify MNIST digits dataset, using TensorFlow 2.0+ 'layers' and 'model' API.
  • Dynamic Recurrent Neural Network (LSTM) (notebook). Build a recurrent neural network (LSTM) that performs dynamic calculation to classify sequences of variable length, using TensorFlow 2.0+ 'layers' and 'model' API.
Unsupervised
  • Auto-Encoder (notebook). Build an auto-encoder to encode an image to a lower dimension and reconstruct it.
  • DCGAN (Deep Convolutional Generative Adversarial Networks) (notebook). Build a Deep Convolutional Generative Adversarial Network (DCGAN) to generate images from noise.
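
The supervised examples above roughly follow the pattern below: a small Keras model trained on MNIST. This is a minimal sketch, not the notebooks' exact architecture or training setup.

import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected classifier built with the Keras 'layers' and 'model' API.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2)
model.evaluate(x_test, y_test)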

4 - Utilities

  • Save and Restore a model (notebook). Save and Restore a model with TensorFlow 2.0+.
  • Build Custom Layers & Modules (notebook). Learn how to build your own layers / modules and integrate them into TensorFlow 2.0+ Models.
  • Tensorboard (notebook). Track and visualize neural network computation graph, metrics, weights and more using TensorFlow 2.0+ tensorboard.
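
As a rough illustration of the custom layers/modules notebook, here is a generic sketch of a dense layer built by subclassing tf.keras.layers.Layer; the class name and sizes are hypothetical.

import tensorflow as tf

# A hypothetical custom layer: a ReLU dense layer that creates its own weights.
class MyDense(tf.keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer="glorot_uniform", trainable=True)
        self.b = self.add_weight(shape=(self.units,),
                                 initializer="zeros", trainable=True)

    def call(self, inputs):
        return tf.nn.relu(tf.matmul(inputs, self.w) + self.b)

layer = MyDense(8)
print(layer(tf.random.normal([2, 4])).shape)  # (2, 8)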

5 - Data Management

  • Load and Parse data (notebook). Build efficient data pipeline with TensorFlow 2.0 (Numpy arrays, Images, CSV files, custom data, ...).
  • Build and Load TFRecords (notebook). Convert data into TFRecords format, and load them with TensorFlow 2.0+.
  • Image Transformation (i.e. Image Augmentation) (notebook). Apply various image augmentation techniques with TensorFlow 2.0+, to generate distorted images for training.
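
These data notebooks revolve around tf.data. A minimal pipeline sketch over in-memory NumPy arrays (array shapes and batch size are illustrative):

import numpy as np
import tensorflow as tf

# Build an input pipeline from in-memory NumPy arrays.
features = np.random.rand(1000, 4).astype(np.float32)
labels = np.random.randint(0, 2, size=(1000,))

dataset = (tf.data.Dataset.from_tensor_slices((features, labels))
           .shuffle(buffer_size=1000)
           .batch(32)
           .prefetch(tf.data.experimental.AUTOTUNE))  # tf.data.AUTOTUNE on TF >= 2.4

for batch_x, batch_y in dataset.take(1):
    print(batch_x.shape, batch_y.shape)  # (32, 4) (32,)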

6 - Hardware

  • Multi-GPU Training (notebook). Train a convolutional neural network with multiple GPUs on CIFAR-10 dataset.
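
One common way to spread training over several GPUs in TF2 (not necessarily the exact approach used in the notebook) is tf.distribute.MirroredStrategy. A minimal sketch on CIFAR-10:

import tensorflow as tf

# MirroredStrategy replicates the model on all visible GPUs and aggregates
# gradients across them; it falls back to a single device if no GPU is found.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

(x_train, y_train), _ = tf.keras.datasets.cifar10.load_data()
model.fit(x_train / 255.0, y_train, epochs=1, batch_size=256)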

TensorFlow v1

The tutorial index for TF v1 is available here: TensorFlow v1.15 Examples, or see the list of examples below.

Dataset

Some examples require the MNIST dataset for training and testing. Don't worry, this dataset is downloaded automatically when the examples run. MNIST is a database of handwritten digits; for a quick description of the dataset, you can check this notebook.

Official Website: http://yann.lecun.com/exdb/mnist/.

Installation

To download all the examples, simply clone this repository:

git clone https://github.com/aymericdamien/TensorFlow-Examples

To run them, you also need the latest version of TensorFlow. To install it:

pip install tensorflow

or (with GPU support):

pip install tensorflow_gpu
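
To check that the installation works, print the installed version:

python -c "import tensorflow as tf; print(tf.__version__)"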

For more details about TensorFlow installation, you can check the TensorFlow Installation Guide.

TensorFlow v1 Examples - Index

The tutorial index for TF v1 is available here: TensorFlow v1.15 Examples.

0 - Prerequisite

1 - Introduction

  • Hello World (notebook) (code). Very simple example to learn how to print "hello world" using TensorFlow.
  • Basic Operations (notebook) (code). A simple example that covers TensorFlow basic operations.
  • TensorFlow Eager API basics (notebook) (code). Get started with TensorFlow's Eager API.
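
Unlike the TF2 examples, the v1 examples build a graph first and then run it in a session. A generic hello-world/basic-operations sketch (assuming a TF 1.x installation, not the notebooks' exact code):

import tensorflow as tf  # TF 1.x

# Define graph nodes first...
hello = tf.constant("Hello, TensorFlow!")
a = tf.constant(2)
b = tf.constant(3)

# ...then evaluate them inside a session.
with tf.Session() as sess:
    print(sess.run(hello))
    print(sess.run(a + b))  # 5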

2 - Basic Models

  • Linear Regression (notebook) (code). Implement a Linear Regression with TensorFlow.
  • Linear Regression (eager api) (notebook) (code). Implement a Linear Regression using TensorFlow's Eager API.
  • Logistic Regression (notebook) (code). Implement a Logistic Regression with TensorFlow.
  • Logistic Regression (eager api) (notebook) (code). Implement a Logistic Regression using TensorFlow's Eager API.
  • Nearest Neighbor (notebook) (code). Implement Nearest Neighbor algorithm with TensorFlow.
  • K-Means (notebook) (code). Build a K-Means classifier with TensorFlow.
  • Random Forest (notebook) (code). Build a Random Forest classifier with TensorFlow.
  • Gradient Boosted Decision Tree (GBDT) (notebook) (code). Build a Gradient Boosted Decision Tree (GBDT) with TensorFlow.
  • Word2Vec (Word Embedding) (notebook) (code). Build a Word Embedding Model (Word2Vec) from Wikipedia data, with TensorFlow.

3 - Neural Networks

Supervised
  • Simple Neural Network (notebook) (code). Build a simple neural network (a.k.a Multi-layer Perceptron) to classify MNIST digits dataset. Raw TensorFlow implementation.
  • Simple Neural Network (tf.layers/estimator api) (notebook) (code). Use TensorFlow 'layers' and 'estimator' API to build a simple neural network (a.k.a Multi-layer Perceptron) to classify MNIST digits dataset.
  • Simple Neural Network (eager api) (notebook) (code). Use TensorFlow Eager API to build a simple neural network (a.k.a Multi-layer Perceptron) to classify MNIST digits dataset.
  • Convolutional Neural Network (notebook) (code). Build a convolutional neural network to classify MNIST digits dataset. Raw TensorFlow implementation.
  • Convolutional Neural Network (tf.layers/estimator api) (notebook) (code). Use TensorFlow 'layers' and 'estimator' API to build a convolutional neural network to classify MNIST digits dataset.
  • Recurrent Neural Network (LSTM) (notebook) (code). Build a recurrent neural network (LSTM) to classify MNIST digits dataset.
  • Bi-directional Recurrent Neural Network (LSTM) (notebook) (code). Build a bi-directional recurrent neural network (LSTM) to classify MNIST digits dataset.
  • Dynamic Recurrent Neural Network (LSTM) (notebook) (code). Build a recurrent neural network (LSTM) that performs dynamic calculation to classify sequences of different length.
Unsupervised
  • Auto-Encoder (notebook) (code). Build an auto-encoder to encode an image to a lower dimension and reconstruct it.
  • Variational Auto-Encoder (notebook) (code). Build a variational auto-encoder (VAE), to encode and generate images from noise.
  • GAN (Generative Adversarial Networks) (notebook) (code). Build a Generative Adversarial Network (GAN) to generate images from noise.
  • DCGAN (Deep Convolutional Generative Adversarial Networks) (notebook) (code). Build a Deep Convolutional Generative Adversarial Network (DCGAN) to generate images from noise.

4 - Utilities

  • Save and Restore a model (notebook) (code). Save and Restore a model with TensorFlow.
  • Tensorboard - Graph and loss visualization (notebook) (code). Use Tensorboard to visualize the computation Graph and plot the loss.
  • Tensorboard - Advanced visualization (notebook) (code). Going deeper into Tensorboard; visualize the variables, gradients, and more...
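
For a rough idea of how Tensorboard logging looks in TF v1, here is a generic sketch (not the notebooks' code; the scalar being logged and the log directory are illustrative):

import tensorflow as tf  # TF 1.x

x = tf.placeholder(tf.float32, name="x")
loss = tf.square(x, name="loss")
tf.summary.scalar("loss", loss)
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter("./logs", sess.graph)  # writes the graph too
    for step in range(10):
        summary, _ = sess.run([merged, loss], feed_dict={x: float(step)})
        writer.add_summary(summary, step)
    writer.close()

Then run tensorboard --logdir ./logs to view the graph and the scalar curve.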

5 - Data Management

  • Build an image dataset (notebook) (code). Build your own image dataset with TensorFlow data queues, from image folders or a dataset file.
  • TensorFlow Dataset API (notebook) (code). Introducing TensorFlow Dataset API for optimizing the input data pipeline.
  • Load and Parse data (notebook). Build efficient data pipeline (Numpy arrays, Images, CSV files, custom data, ...).
  • Build and Load TFRecords (notebook). Convert data into TFRecords format, and load them.
  • Image Transformation (i.e. Image Augmentation) (notebook). Apply various image augmentation techniques, to generate distorted images for training.

6 - Multi GPU

  • Basic Operations on multi-GPU (notebook) (code). A simple example to introduce multi-GPU in TensorFlow.
  • Train a Neural Network on multi-GPU (notebook) (code). A clear and simple TensorFlow implementation to train a convolutional neural network on multiple GPUs.
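
In TF v1, multi-GPU code typically relies on explicit device placement with tf.device. A generic sketch (the device names assume two GPUs; soft placement lets it run anyway if they are absent):

import tensorflow as tf  # TF 1.x

# Pin operations to specific devices.
with tf.device("/gpu:0"):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
with tf.device("/gpu:1"):
    b = tf.constant([[1.0, 1.0], [1.0, 1.0]])
with tf.device("/cpu:0"):
    c = tf.matmul(a, b)

config = tf.ConfigProto(allow_soft_placement=True, log_device_placement=True)
with tf.Session(config=config) as sess:
    print(sess.run(c))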

More Examples

The following examples come from TFLearn, a library that provides a simplified interface for TensorFlow. It is worth a look: it offers many examples and pre-built operations and layers.

Tutorials

  • TFLearn Quickstart. Learn the basics of TFLearn through a concrete machine learning task. Build and train a deep neural network classifier.

Examples
