TensorFlow Tutorials with YouTube Videos

Overview

TensorFlow Tutorials

Original repository on GitHub

Original author is Magnus Erik Hvass Pedersen

Introduction

  • These tutorials are intended for beginners in Deep Learning and TensorFlow.
  • Each tutorial covers a single topic.
  • The source-code is well-documented.
  • There is a YouTube video for each tutorial.

Tutorials for TensorFlow 2

The following tutorials have been updated and work with TensorFlow 2 (some of them run in "v1 compatibility mode"; a minimal sketch of that mode is shown after the list).

  1. Simple Linear Model (Notebook) (Google Colab)

  2. Convolutional Neural Network (Notebook) (Google Colab)

3-C. Keras API (Notebook) (Google Colab)

  10. Fine-Tuning (Notebook) (Google Colab)

13-B. Visual Analysis for MNIST (Notebook) (Google Colab)

  16. Reinforcement Learning (Notebook) (Google Colab)

  19. Hyper-Parameter Optimization (Notebook) (Google Colab)

  20. Natural Language Processing (Notebook) (Google Colab)

  21. Machine Translation (Notebook) (Google Colab)

  22. Image Captioning (Notebook) (Google Colab)

  23. Time-Series Prediction (Notebook) (Google Colab)
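
The "v1 compatibility mode" mentioned above means that a Notebook uses the TensorFlow 1 API through TensorFlow 2's compatibility layer. A minimal sketch of what this looks like is given below; the exact imports used in each Notebook may differ.

# Run the TensorFlow 1 API on a TensorFlow 2 installation.
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

# Graph-style code from the older tutorials then works largely unchanged,
# e.g. placeholders, which were removed from the normal TensorFlow 2 API.
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')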

Tutorials for TensorFlow 1

The following tutorials only work with the older TensorFlow 1 API, so you would need to install an older version of TensorFlow to run these. It would take too much time and effort to convert these tutorials to TensorFlow 2.
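
One way to do this is to install the last TensorFlow 1.x release in a separate conda environment, for example (1.15 was the final 1.x release):

pip install tensorflow==1.15

Note that the TensorFlow 1.x packages only exist for older Python versions (3.7 and earlier), so you may also need to create that environment with an older Python.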

  3. Pretty Tensor (Notebook) (Google Colab)

3-B. Layers API (Notebook) (Google Colab)

  4. Save & Restore (Notebook) (Google Colab)

  5. Ensemble Learning (Notebook) (Google Colab)

  6. CIFAR-10 (Notebook) (Google Colab)

  7. Inception Model (Notebook) (Google Colab)

  8. Transfer Learning (Notebook) (Google Colab)

  9. Video Data (Notebook) (Google Colab)

  11. Adversarial Examples (Notebook) (Google Colab)

  12. Adversarial Noise for MNIST (Notebook) (Google Colab)

  13. Visual Analysis (Notebook) (Google Colab)

  14. DeepDream (Notebook) (Google Colab)

  15. Style Transfer (Notebook) (Google Colab)

  17. Estimator API (Notebook) (Google Colab)

  18. TFRecords & Dataset API (Notebook) (Google Colab)

Videos

These tutorials are also available as YouTube videos.

Translations

These tutorials have been translated into the following languages:

New Translations

You can help by translating the remaining tutorials, by reviewing the translations that have already been made, or by translating to other languages.

It is a very big job to translate all the tutorials, so you should just start with Tutorials #01, #02 and #03-C, which are the most important for beginners.

New Videos

You are also very welcome to record your own YouTube videos in other languages. It is strongly recommended that you get a decent microphone because good sound quality is very important. I used vokoscreen for recording the videos and the free DaVinci Resolve for editing the videos.

Forks

See the selected list of forks for community modifications to these tutorials.

Installation

There are different ways of installing and running TensorFlow. This section describes how I did it for these tutorials. You may want to do it differently and you can search the internet for instructions.

If you are new to Python and Linux, this may be challenging to get working, and you may need to search the internet for the error messages you encounter. It will get easier with practice. You can also run the tutorials without installing anything by using Google Colab; see further below.

Some of the Python Notebooks use source-code located in different files to allow for easy re-use across multiple tutorials. It is therefore recommended that you download the whole repository from GitHub, instead of just downloading the individual Python Notebooks.
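
For example, before running a Notebook you can check from Python that it is being run from the repository directory, so that the shared source files can be found; this simple check just looks for the requirements file in the current directory:

# Check that the current directory is the repository root,
# so the Notebooks can import the shared helper files.
import os
print(os.path.isfile('requirements.txt'))  # True if run from the repository root.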

Git

The easiest way to download and install these tutorials is by using git from the command-line:

git clone https://github.com/Hvass-Labs/TensorFlow-Tutorials.git

This will create the directory TensorFlow-Tutorials and download all the files to it.

This also makes it easy to update the tutorials, simply by executing this command inside that directory:

git pull

Download Zip-File

You can also download the contents of the GitHub repository as a Zip-file and extract it manually.
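
On Linux this can also be done from the command-line; the URL below assumes the repository's default branch is still named master:

wget https://github.com/Hvass-Labs/TensorFlow-Tutorials/archive/refs/heads/master.zip
unzip master.zip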

Environment

I use Anaconda because it comes with many Python packages already installed and it is easy to work with. After installing Anaconda, you should create a conda environment so you do not destroy your main installation in case you make a mistake somewhere:

conda create --name tf python=3

When Python gets updated to a new version, it takes a while before TensorFlow also uses the new Python version. So if the TensorFlow installation fails, then you may have to specify an older Python version for your new environment, such as:

conda create --name tf python=3.6

Now you can switch to the new environment by running the following (on Linux):

source activate tf
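
Newer versions of conda use a slightly different command for this:

conda activate tf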

Required Packages

The tutorials require several Python packages to be installed. The packages are listed in requirements.txt.

To install the required Python packages and dependencies you first have to activate the conda-environment as described above, and then you run the following command in a terminal:

pip install -r requirements.txt

Starting with TensorFlow 2.1, the pip package includes both the CPU and GPU versions, and TensorFlow will automatically use the GPU if you have one. But this requires the installation of various NVIDIA drivers, which is a bit complicated and is not described here.
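
If you want to check whether your TensorFlow installation can actually see the GPU, a quick check like the following should work with TensorFlow 2.x:

# List the GPU devices visible to TensorFlow (an empty list means CPU only).
import tensorflow as tf
print(tf.config.list_physical_devices('GPU'))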

Python Version 3.5 or Later

These tutorials were developed on Linux using Python 3.5 / 3.6 (the Anaconda distribution) and PyCharm.

There are reports that Python 2.7 gives error messages with these tutorials. Please make sure you are using Python 3.5 or later!
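
If you are unsure which Python version a given environment uses, you can check it from Python itself:

# Print the Python version of the current environment.
import sys
print(sys.version)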

How To Run

If you have followed the above installation instructions, you should now be able to run the tutorials in the Python Notebooks:

cd ~/development/TensorFlow-Tutorials/  # Your installation directory.
jupyter notebook

This should start a web-browser that shows the list of tutorials. Click on a tutorial to load it.

Run in Google Colab

If you do not want to install anything on your own computer, then the Notebooks can be viewed, edited and run entirely on the internet by using Google Colab. There is a YouTube video explaining how to do this. Click the "Google Colab" link next to each tutorial listed above. You can view the Notebook on Colab, but in order to run it you need to log in with your Google account. Then you need to execute the following commands at the top of the Notebook, which clone the contents of this repository to your work-directory on Colab.

# Clone the repository from GitHub to Google Colab's temporary drive.
import os
work_dir = "/content/TensorFlow-Tutorials/"
if not os.path.exists(work_dir):
    !git clone https://github.com/Hvass-Labs/TensorFlow-Tutorials.git
os.chdir(work_dir)

All required packages should already be installed on Colab, otherwise you can run the following command:

!pip install -r requirements.txt

Older Versions

Sometimes the source-code has changed from that shown in the YouTube videos. This may be due to bug-fixes, improvements, or because code-sections are moved to separate files for easy re-use.

If you want to see the exact versions of the source-code that were used in the YouTube videos, then you can browse the history of commits to the GitHub repository.
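
For example, you can list the history of commits and then check out the repository as it looked at one of them (replace the placeholder with an actual commit hash from the log):

git log --oneline
git checkout <commit-hash>

Afterwards you can return to the newest version with git checkout master (assuming the default branch is named master).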

License (MIT)

These tutorials and source-code are published under the MIT License which allows very broad use for both academic and commercial purposes.

A few of the images used for demonstration purposes may be under copyright. These images are included under "fair use".

You are very welcome to modify these tutorials and use them in your own projects. Please keep a link to the original repository.
