Description

This course is a comprehensive guide to Deep Learning and Neural Networks. The theories are explained in depth and in a friendly manner. After that, we'll have a hands-on session where we'll learn how to code Neural Networks in PyTorch, a very advanced and powerful deep learning framework!

The course includes the following Sections:

--------------------------------------------------------------------------------------------------------

Section 1 - How Neural Networks and Backpropagation Work

In this section, you will deeply understand the theory of how neural networks and the backpropagation algorithm work, in a friendly manner. We will walk through an example and do the calculations step-by-step. We will also discuss the activation functions used in Neural Networks, with their advantages and disadvantages!

Section 2 - Loss Functions

In this section, we will introduce the famous loss functions that are used in Deep Learning and Neural Networks. We will walk through when to use them and how they work.
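
To give you a taste, here is a minimal PyTorch sketch of two of these losses (the tensor shapes and values are illustrative, not taken from the course code):

```python
import torch
import torch.nn as nn

# Regression: Mean Squared Error penalizes the squared difference.
mse = nn.MSELoss()
pred = torch.tensor([2.5, 0.0, 2.0])
target = torch.tensor([3.0, -0.5, 2.0])
print(mse(pred, target))             # mean of (pred - target)^2

# Classification: CrossEntropyLoss takes raw logits and class indices.
ce = nn.CrossEntropyLoss()
logits = torch.randn(4, 10)          # batch of 4 samples, 10 classes
labels = torch.randint(0, 10, (4,))  # ground-truth class per sample
print(ce(logits, labels))
```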

Section 3 - Optimization

In this section, we will discuss the optimization techniques used in Neural Networks to reach the optimal point, including Gradient Descent, Stochastic Gradient Descent, Momentum, RMSProp, Adam, AMSGrad, Weight Decay and decoupled Weight Decay, LR Schedulers and others.
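
As a preview, here is a small sketch of how a few of these look in PyTorch (the model, data and hyperparameters are placeholders):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # placeholder model

# SGD with Momentum
opt_sgd = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Adam with decoupled weight decay (AdamW)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

# LR Scheduler: halve the learning rate every 10 epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(3):  # placeholder training loop
    x, y = torch.randn(8, 10), torch.randn(8, 1)
    loss = nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()
```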

Section 4 - Weight Initialization

In this section, we will introduce you to the concepts of weight initialization in neural networks, and we will discuss some weight initialization techniques, including Xavier initialization and He norm initialization.
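
For a flavor of what this looks like in practice, here is a small sketch using PyTorch's built-in initializers (the layer sizes are arbitrary):

```python
import torch.nn as nn

tanh_layer = nn.Linear(256, 128)
relu_layer = nn.Linear(128, 64)

# Xavier (Glorot) initialization: a good default for tanh/sigmoid layers.
nn.init.xavier_uniform_(tanh_layer.weight)

# He (Kaiming) initialization: accounts for ReLU zeroing half its inputs.
nn.init.kaiming_normal_(relu_layer.weight, nonlinearity='relu')

nn.init.zeros_(tanh_layer.bias)
nn.init.zeros_(relu_layer.bias)
```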

Section 5 - Regularization Techniques

In this section, we will introduce you to the regularization techniques in neural networks. We will first introduce overfitting and then show how to prevent it with regularization techniques, including L1, L2 and Dropout. We'll also talk about normalization, including Batch Normalization and Layer Normalization.
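
Here is a minimal sketch of these ideas in PyTorch (the network is a placeholder; L2 regularization appears as the optimizer's weight_decay term):

```python
import torch.nn as nn
import torch.optim as optim

# Dropout and Batch Normalization inside a small placeholder network.
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.BatchNorm1d(64),  # normalizes activations over the batch
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zeroes activations during training
    nn.Linear(64, 1),
)

# L2 regularization via the optimizer's weight_decay argument.
optimizer = optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
```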

Section 6 - Introduction to PyTorch

In this section, we will introduce the deep learning framework we'll be using throughout this course, which is PyTorch. We will show you how to install it, how it works and why it's special, and then we will code some PyTorch tensors and show you some operations on tensors, as well as show you Autograd in code!
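
As a quick preview, here is a tiny example of tensors and Autograd (the values are illustrative):

```python
import torch

# Tensors behave much like NumPy arrays, with optional gradient tracking.
x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()  # y = x1^2 + x2^2

# Autograd computes dy/dx for us.
y.backward()
print(x.grad)       # tensor([4., 6.]), that is, 2*x
```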

Section 7 - Practical Neural Networks in PyTorch - Application 1

In this section, you will apply what you've learned to build a Feed Forward Neural Network to classify handwritten digits. This is the first application of Feed Forward Networks we will be showing.
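
A minimal sketch of such a network (the layer sizes are illustrative, not the exact course architecture):

```python
import torch.nn as nn

# A minimal feed-forward classifier for 28x28 digit images (MNIST-style).
class FeedForward(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),        # 28x28 image -> 784-dimensional vector
            nn.Linear(784, 128),
            nn.ReLU(),
            nn.Linear(128, 10),  # one logit per digit class
        )

    def forward(self, x):
        return self.net(x)
```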

Section 8 - Practical Neural Networks in PyTorch - Application 2

In this section, we will build a feed forward Neural Network to classify whether a person has diabetes or not. We will train the network on a large diabetes dataset!

Section 9 - Visualize the Learning Process

In this section, we will visualize how neural networks are learning, and how good they are at separating non-linear data!

Section 10 - Implementing a Neural Network from Scratch with Python and Numpy

In this section, we will understand and code up a neural network without using any deep learning library (from scratch, using only Python and NumPy). This is necessary to understand how the underlying structure works.
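
To preview the idea, here is a compact sketch of one hand-written gradient-descent step for a tiny two-layer network (the data and layer sizes are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))   # 4 samples, 3 features
y = rng.normal(size=(4, 1))
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 1))

# Forward propagation
h = np.maximum(0, X @ W1)     # ReLU hidden layer
y_hat = h @ W2
loss = ((y_hat - y) ** 2).mean()

# Backpropagation (the chain rule, written out by hand)
d_yhat = 2 * (y_hat - y) / len(y)
dW2 = h.T @ d_yhat
dh = d_yhat @ W2.T
dh[h <= 0] = 0                # ReLU passes gradient only where h > 0
dW1 = X.T @ dh

# Gradient descent update
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```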

Section 11 - Convolutional Neural Networks

In this section, we will introduce you to the Convolutional Networks that are used for images. We will first show you their relationship to Feed Forward Networks, and then we will introduce you to the concepts of Convolutional Networks one by one!

Section 12 - Practical Convolutional Networks in PyTorch

In this section, we will apply Convolutional Networks to classify handwritten digits. This is the first application of CNNs we will do.
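
A minimal sketch of such a CNN (channel counts and layer sizes are illustrative, not the exact course architecture):

```python
import torch.nn as nn

# A small CNN for 1x28x28 digit images.
class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))
```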

Section 13 - Deeper into CNN: Improving and Plotting

In this section, we will improve the CNN that we built in the previous section, as well as show you how to plot the results of training and testing! Moreover, we will show you how to classify your own handwritten images through the network!

Section 14 - CNN Architectures

In this section, we will introduce the CNN architectures that are widely used in all deep learning applications. These architectures are: AlexNet, VGG net, Inception Net, Residual Networks and Densely Connected Networks. We will also discuss some object detection architectures.

Section 15 - Residual Networks

In this section, we will dive deep into the details and theory of Residual Networks, and then we'll build a Residual Network in PyTorch from scratch!
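
The core idea is the skip connection, out = F(x) + x. Here is a minimal sketch of a basic residual block (simplified: no downsampling path):

```python
import torch.nn as nn

class BasicBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # add the identity shortcut
```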

Section 16 - Transfer Learning in PyTorch - Image Classification

In this section, we will apply transfer learning on a Residual Network, to classify ants and bees. We will also show you how to use your own dataset and apply image augmentation. After completing this section, you will be able to classify any images you want!
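
A minimal sketch of the idea with torchvision (assuming a recent torchvision where pretrained weights are requested via the `weights` argument; the two output classes stand for ants and bees):

```python
import torch.nn as nn
from torchvision import models

# Load a ResNet pretrained on ImageNet and freeze its backbone.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer with a new, trainable 2-class head.
model.fc = nn.Linear(model.fc.in_features, 2)
```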

Section 17 - Convolutional Networks Visualization

In this section, we will visualize what the neural networks output and what they are really learning. We will observe the feature maps of every layer of the network!
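
One common way to capture these maps in PyTorch is a forward hook. Here is a small sketch (the chosen model, layer and input size are illustrative):

```python
import torch
from torchvision import models

model = models.resnet18(weights=None)
model.eval()
feature_maps = {}

# The hook stores the layer's output every time the model runs.
def hook(module, inputs, output):
    feature_maps['layer1'] = output.detach()

model.layer1.register_forward_hook(hook)
model(torch.randn(1, 3, 224, 224))
print(feature_maps['layer1'].shape)  # torch.Size([1, 64, 56, 56])
```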

Section 18 - YOLO Object Detection (Theory)

In this section, we will learn one of the most famous Object Detection Frameworks: YOLO!! This section covers the theory of YOLO in depth.

Section 19 - Autoencoders and Variational Autoencoders

In this section, we will cover Autoencoders and Denoising Autoencoders. We will then see the problem they face and learn how to mitigate it with Variational Autoencoders.
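
A minimal sketch of a plain autoencoder (the sizes are illustrative, e.g. flattened 28x28 images compressed to a 32-dimensional code):

```python
import torch.nn as nn

# Compress the input to a small code, then try to reconstruct it.
class Autoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(784, 32), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(32, 784), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))
```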

Section 20 - Recurrent Neural Networks

In this section, we will introduce you to Recurrent Neural Networks and all their concepts. We will then discuss Backpropagation Through Time, the vanishing gradient problem, and finally Long Short-Term Memory (LSTM), which solves the problems RNNs suffer from.
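
As a preview, here is a tiny nn.LSTM example (all shapes are illustrative):

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=2, batch_first=True)
x = torch.randn(4, 7, 10)  # batch of 4 sequences, 7 time steps, 10 features

output, (h_n, c_n) = lstm(x)
print(output.shape)        # torch.Size([4, 7, 20]): hidden state at every step
print(h_n.shape)           # torch.Size([2, 4, 20]): final state of each layer
```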

Section 21 - Word Embeddings

In this section, we will discuss how words are represented as features. We will then show you some Word Embedding models. We will also show you how to implement word embeddings in PyTorch!
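
A minimal sketch of an embedding layer in PyTorch (the vocabulary size and dimension are arbitrary):

```python
import torch
import torch.nn as nn

# Each word index maps to a learned dense vector.
vocab_size, embed_dim = 1000, 50
embedding = nn.Embedding(vocab_size, embed_dim)

word_ids = torch.tensor([4, 13, 7])  # indices of three words
vectors = embedding(word_ids)
print(vectors.shape)                 # torch.Size([3, 50])
```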

Section 22 - Practical Recurrent Networks in PyTorch

In this section, we will apply Recurrent Neural Networks using LSTMs in PyTorch to generate text similar to the story of Alice in Wonderland! You can just replace the story with any other text you want, and the RNN will be able to generate text similar to it!

Section 23 - Sequence Modelling

In this section, we will learn about Sequence-to-Sequence Modelling. We will see how Seq2Seq models work and where they are applied. We'll also talk about Attention mechanisms and see how they work.

Section 24 - Practical Sequence Modelling in PyTorch - Build a Chatbot

In this section, we will apply what we learned about sequence modeling and build a Chatbot with an Attention Mechanism.

Section 25 - Saving and Loading Models

In this section, we will show you how to save and load models in PyTorch, so you can use these models either for later testing, or for resuming training!
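
The usual pattern saves the model's state_dict. Here is a minimal sketch (the model is a placeholder and the filename arbitrary):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # placeholder model

# Save only the learned parameters...
torch.save(model.state_dict(), 'checkpoint.pth')

# ...and load them back into a freshly constructed model of the same shape.
model = nn.Linear(10, 2)
model.load_state_dict(torch.load('checkpoint.pth'))
model.eval()              # switch to inference mode before testing
```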

Section 26 - Transformers

In this section, we will cover the Transformer, which is the current state-of-the-art model for NLP and language modeling tasks. We will go through each component of a transformer.
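
At the heart of every component is scaled dot-product attention. Here is a minimal sketch of that computation (the shapes are illustrative):

```python
import torch
import torch.nn.functional as F

# Scaled dot-product attention, the building block of MultiHead Attention.
def attention(q, k, v):
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    weights = F.softmax(scores, dim=-1)  # weights sum to 1 for each query
    return weights @ v

q = k = v = torch.randn(1, 5, 16)        # (batch, sequence length, dimension)
print(attention(q, k, v).shape)          # torch.Size([1, 5, 16])
```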

Section 27 - Build a Chatbot with Transformers

In this section, we will apply everything we learned in the previous section to build a Chatbot using Transformers.

What you'll learn

Understand How Neural Networks Work (Theory and Applications)

Understand How Convolutional Networks Work (Theory and Applications)

Understand How Recurrent Networks and LSTMs work (Theory and Applications)

Learn how to use PyTorch in depth

Understand how the Backpropagation algorithm works

Understand Loss Functions in Neural Networks

Understand Weight Initialization and Regularization Techniques

Code-up a Neural Network from Scratch using Numpy

Apply Transfer Learning to CNNs

CNN Visualization

Learn the CNN Architectures that are widely used nowadays

Understand Residual Networks in Depth

Understand YOLO Object Detection in Depth

Visualize the Learning Process of Neural Networks

Learn how to Save and Load trained models

Learn Sequence Modeling with Attention Mechanisms

Build a Chatbot with Attention

Transformers

Build a Chatbot with Transformers

BERT

Build an Image Captioning Model

Requirements

  • Some basic Python experience is preferable
  • Some high school mathematics

Course content

39 sections

How Neural Networks and Backpropagation Work

10 lectures
BEFORE STARTING...PLEASE READ THIS
00:26
What Can Deep Learning Do?
13:58
The Rise of Deep Learning
06:02
The Essence of Neural Networks
09:15
The Perceptron
16:03
Gradient Descent
11:39
The Forward Propagation
10:43
Before Proceeding with the Backpropagation
00:14
Backpropagation Part 1
10:30
Backpropagation Part 2
09:27

Loss Functions

12 lectures
Mean Squared Error (MSE)
07:01
L1 Loss (MAE)
08:26
Huber Loss
05:55
Binary Cross Entropy Loss
12:56
Cross Entropy Loss
07:56
Softmax Function
06:36
Softmax with Temperature: Controlling your distribution
00:16
KL divergence Loss
06:48
Contrastive Loss
11:09
Hinge Loss
11:14
Triplet Ranking Loss
11:35
Practical Loss Functions Note
00:06

Activation Functions

8 lectures
Why we need activation functions
03:58
Sigmoid Activation
05:47
Tanh Activation
03:03
ReLU and PReLU
06:44
Exponential Linear Units (ELU)
03:43
Gated Linear Units (GLU)
03:07
Swish Activation
03:59
Mish Activation
05:52

Regularization and Normalization

8 lectures
Overfitting
04:51
L1 and L2 Regularization
09:30
Dropout
08:56
DropConnect
01:48
Normalization
03:51
Batch Normalization
11:58
Layer Normalization
07:12
Group Normalization
05:40

Optimization

13 lectures
Batch Gradient Descent
06:05
Stochastic Gradient Descent
04:53
Mini-Batch Gradient Descent
02:21
Exponentially Weighted Average Intuition
05:16
Exponentially Weighted Average Implementation
08:02
Bias Correction in Exponentially Weighted Averages
05:37
Momentum
06:05
RMSProp
10:25
Adam Optimization
07:01
SWATS - Switching from Adam to SGD
01:33
Weight Decay
06:45
Decoupling Weight Decay
04:13
AMSGrad
08:29

Hyperparameter Tuning and Learning Rate Scheduling

5 lectures
Introduction to Hyperparameter Tuning and Learning Rate Recap
05:02
Step Learning Rate Decay
10:21
Cyclic Learning Rate
09:13
Cosine Annealing with Warm Restarts
05:05
Batch Size vs Learning Rate
02:55

Weight Initialization

5 lectures
Normal Distribution
05:54
What happens when all weights are initialized to the same value?
09:27
Xavier Initialization
10:05
He Norm Initialization
03:52
Practical Weight Initialization Note
00:06

Introduction to PyTorch

10 lectures
CODE FOR THIS COURSE
00:24
Computation Graphs and Deep Learning Frameworks
13:23
Installing PyTorch and an Introduction
10:16
How PyTorch Works
18:38
Torch Tensors - Part 1
09:55
Torch Tensors - Part 2
08:48
Numpy Bridge, Tensor Concatenation and Adding Dimensions
09:23
Automatic Differentiation
08:10
Loss Functions in PyTorch
28:10
Weight Initialization in PyTorch
11:51

Data Augmentation

4 lectures
Introduction to Data Augmentation
07:09
Data Augmentation Techniques Part 1
09:28
Data Augmentation Techniques Part 2
17:10
Data Augmentation Techniques Part 3
07:37

Practical Neural Networks in PyTorch - Application 1: Diabetes

6 lectures
Download the Dataset
00:06
Part 1: Data Preprocessing
13:55
Part 2: Data Normalization
07:16
Part 3: Creating and Loading the Dataset
07:36
Part 4: Building the Network
16:23
Part 5: Training the Network
17:09

Visualize the Learning Process

7 lectures
Visualize Learning Part 1
08:58
Visualize Learning Part 2
01:49
Visualize Learning Part 3
07:35
Visualize Learning Part 4
05:17
Visualize Learning Part 5
11:16
Visualize Learning Part 6
07:27
Neural Networks Playground
04:45

Implementing a Neural Network from Scratch with Numpy

10 lectures
The Dataset and Hyperparameters
11:43
Understanding the Implementation
07:32
Forward Propagation
12:16
Loss Function
15:11
Prediction
04:51
Notebook for the following Lecture
00:13
Backpropagation Equations
11:29
Backpropagation
20:58
Initializing the Network
06:27
Training the Model
03:58

Practical Neural Networks in PyTorch - Application 2: Handwritten Digits

7 lectures
The MNIST Dataset
00:11
Code Details
01:52
Importing and Defining Parameters
10:37
Defining the Network Class
08:16
Creating the network class and the network functions
05:10
Training the Network
22:03
Testing the Network
04:03

Convolutional Neural Networks

16 lectures
Prerequisite: Filters
04:54
Introduction to Convolutional Networks and the need for them
07:04
Filters and Features
09:24
Convolution over Volume Animation Resource
00:07
Convolution over Volume Animation
03:31
More on Convolutions
05:58
Test your Understanding
3 questions
Quiz Solution Discussion
03:04
A Tool for Convolution Visualization
04:05
Activation, Pooling and FC
12:17
CNN Visualization
01:58
Important formulas
04:43
CNN Characteristics
07:39
Regularization and Batch Normalization in CNNs
03:26
DropBlock: Dropout in CNNs
11:14
Softmax with Temperature
09:24

Practical Convolutional Networks in PyTorch - Image Classification

10 lectures
Loading and Normalizing the Dataset
11:19
Visualizing and Loading the Dataset
08:41
Building the CNN
22:29
Defining the Model
03:48
Understanding the Propagation
05:22
Training the CNN
15:13
Testing the CNN
06:46
Plotting and Putting into Action
04:30
Predicting an image
04:35
Classifying your own Handwritten images
11:25

CNN Architectures

11 lectures
CNN Architectures Part 1
11:33
Residual Networks Part 1
09:55
Residual Networks Part 2
16:55
Note on Residual Networks Implementation
00:04
Stochastic Depth
13:32
CNN Architectures Part 2
03:40
Densely Connected Networks
13:07
Squeeze-Excite Networks
09:21
Separable Convolutions
10:24
Transfer Learning
08:31
Is a 1x1 convolutional filter equivalent to a FC layer?
09:48

Practical Residual Networks in PyTorch

4 lectures
Practical ResNet Part 1
12:43
Practical ResNet Part 2
11:29
Practical ResNet Part 3
11:55
Practical ResNet Part 4
12:34

Transposed Convolutions

3 lectures
Introduction to Transposed Convolutions
07:05
Convolution Operation as Matrix Multiplication
08:10
Transposed Convolutions
06:15

Transfer Learning in PyTorch - Image Classification

6 lectures
Data Augmentation
12:46
Loading the Dataset
11:02
Modifying the Network
07:59
Understanding the data
09:49
Finetuning the Network
05:29
Testing and Visualizing the results
09:49

Convolutional Networks Visualization

3 lectures
Data and the Model
07:11
Processing the Model
12:02
Visualizing the Feature Maps
11:47

YOLO Object Detection (Theory)

13 lectures
YOLO Theory Part 1
04:49
YOLO Theory Part 2
11:47
YOLO Theory Part 3
09:03
YOLO Theory Part 4
06:33
YOLO Theory Part 5
07:38
YOLO Theory Part 6
08:56
YOLO Theory Part 7
06:23
YOLO Theory Part 8
05:31
YOLO Theory Part 9
03:52
YOLO Theory Part 10
02:04
YOLO Theory Part 11
05:36
YOLO Theory Part 12
10:27
YOLO Code Note
00:41

Autoencoders and Variational Autoencoders

7 lectures
Autoencoders
08:47
Denoising Autoencoders
06:43
The Problem in Autoencoders
04:51
Variational Autoencoders
10:13
Probability Distributions Recap
30:58
Loss Function Derivation for VAE
27:37
Deep Fake
07:33

Practical Variational Autoencoders in PyTorch

3 lectures
Practical VAE Part 1
19:09
Practical VAE Part 2
11:22
Practical VAE Part 3
11:38

Neural Style Transfer

3 lectures
NST Theory Part 1
06:33
NST Theory Part 2
06:00
NST Theory Part 3
10:09

Practical Neural Style Transfer in PyTorch

5 lectures
NST Practical Part 1
11:27
NST Practical Part 2
10:28
NST Practical Part 3
11:52
NST Practical Part 4
15:10
Fast Neural Style Transfer
03:42

Recurrent Neural Networks

11 lectures
Why do we need RNNs
05:09
Vanilla RNNs
07:42
Test your understanding
4 questions
Quiz Solution Discussion
03:37
Backpropagation Through Time
11:54
Stacked RNNs
02:37
Vanishing and Exploding Gradient Problem
09:36
LSTMs
19:44
Bidirectional RNNs
03:48
GRUs
06:35
CNN-LSTM
04:46

Word Embeddings

5 lectures
What are Word Embeddings
09:00
Visualizing Word Embeddings
03:12
Measuring Word Embeddings
01:49
Word Embeddings Models
03:16
Word Embeddings in PyTorch
05:54

Practical Recurrent Networks in PyTorch

7 lectures
Download the Dataset
00:06
Creating the Dictionary
06:02
Processing the Text
09:59
Defining and Visualizing the Parameters
07:13
Creating the Network
10:41
Training the Network
10:05
Generating Text
12:45

Saving and Loading Models

3 lectures
Saving and Loading Part 1
13:20
Saving and Loading Part 2
07:15
Saving and Loading Part 3
05:15

Sequence Modelling

4 lectures
Sequence Modeling
13:13
Image Captioning
04:49
Attention Mechanisms
05:45
How Attention Mechanisms Work
11:14

Practical Sequence Modelling in PyTorch: Chatbot Application

9 lectures
Download the Dataset
00:05
Introduction
05:21
Understanding the Encoder
05:40
Defining the Encoder
23:12
Understanding Pack Padded Sequence
07:26
Designing the Attention Model
15:21
Designing the Decoder Part 1
13:52
Designing the Decoder Part 2
16:12
Teacher Forcing
05:01

Practical Sequence Modelling in PyTorch: Image Captioning

14 lectures
Implementation Details
10:41
Utility Functions
13:05
Accuracy Calculation
09:45
Constructing the Dataset Part 1
12:51
Constructing the Dataset Part 2
11:12
Creating the Encoder
16:03
Creating the Decoder Part 1
14:38
Creating the Decoder Part 2
09:50
Creating the Decoder Part 3
12:00
Train Function
14:33
Defining Hyperparameters
12:27
Evaluation Function
15:03
Training
02:16
Results
02:11

Transformers

17 lectures
SANITY CHECK ON PREVIOUS SECTIONS
00:07
Introduction to Transformers
11:57
Input Embeddings
06:43
Positional Encoding
13:30
MultiHead Attention Part 1
09:51
MultiHead Attention Part 2
07:57
Concat and Linear
03:09
Residual Learning
06:26
Layer Normalization
06:35
Feed Forward
03:17
Masked MultiHead Attention
06:36
MultiHead Attention in Decoder
02:35
Cross Entropy Loss
11:52
KL Divergence Loss
05:52
Label Smoothing
04:10
Dropout
08:56
Learning Rate Warmup
06:08

Build a Chatbot with Transformers

22 lectures
CODE
00:05
Dataset Preprocessing Part 1
10:04
Dataset Preprocessing Part 2
16:16
Dataset Preprocessing Part 3
11:17
Dataset Preprocessing Part 4
05:01
Dataset Preprocessing Part 5
09:44
Data Loading and Masking
13:16
Embeddings
14:16
MultiHead Attention Implementation Part 1
07:02
MultiHead Attention Implementation Part 2
08:15
MultiHead Attention Implementation Part 3
13:13
Feed Forward Implementation
03:30
Encoder Layer
07:43
Decoder Layer
05:27
Transformer
11:51
AdamWarmup
06:49
Loss with Label Smoothing
19:15
Defining the Model
06:40
Training Function
11:24
Evaluation Function
17:02
Main Function and User Evaluation
10:15
Action
03:02

Universal Transformers

3 lectures
Universal Transformers
06:29
Practical Universal Transformers: Modifying the Transformers code
12:04
Transformers for other tasks
08:42

Google Colab and Gradient Accumulation

2 lectures
Running your models on Google Colab
08:22
Gradient Accumulation
15:44

BERT

5 lectures
What is BERT and its structure
08:24
Masked Language Modelling
05:03
Next Sentence Prediction
07:52
Fine-tuning BERT
06:28
Exploring Transformers
15:24

Vision Transformers

3 lectures
Vision Transformer Part 1
12:50
Vision Transformer Part 2
08:58
Vision Transformer Part 3
12:06

GPT

14 lectures
GPT Part 1
10:57
GPT Part 2
09:00
Zero-Shot Predictions with GPT
07:57
Byte-Pair Encoding
08:01
Technical Details of GPT
06:25
Playing with HuggingFace models
07:12
Implementation
00:05
GPT Implementation Part 1
09:11
GPT Implementation Part 2
13:38
GPT Implementation Part 3
13:53
GPT Implementation Part 4
09:03
GPT Implementation Part 5
12:17
GPT Implementation Part 6
15:31
GPT Implementation Part 7
14:10

Student Reviews

No reviews yet