Description

What is PyTorch and why should I learn it?

PyTorch is a machine learning and deep learning framework written in Python.

PyTorch enables you to craft new deep learning algorithms and use existing state-of-the-art ones, like the neural networks powering much of today's Artificial Intelligence (AI) applications.

Plus, it's so hot right now, so there are lots of jobs available!

PyTorch is used by companies like:

  • Tesla to build the computer vision systems for their self-driving cars

  • Meta to power the curation and understanding systems for their content timelines

  • Apple to create computationally enhanced photography.

Want to know what's even cooler?

Much of the latest machine learning research is done and published using PyTorch code, so knowing how it works means you'll be at the cutting edge of this highly in-demand field.

And you'll be learning PyTorch in good company.

Graduates of Zero To Mastery are now working at Google, Tesla, Amazon, Apple, IBM, Uber, Meta, Shopify + other top tech companies at the forefront of machine learning and deep learning.

This can be you.

By enrolling today, you’ll also get to join our exclusive live online community classroom to learn alongside thousands of students, alumni, mentors, TAs and Instructors.

Most importantly, you will be learning PyTorch from a professional machine learning engineer, with real-world experience, and who is one of the best teachers around!

What will this PyTorch course be like?

This PyTorch course is very hands-on and project based. You won't just be staring at your screen. We'll leave that for other PyTorch tutorials and courses.

In this course you'll actually be:

  • Running experiments

  • Completing exercises to test your skills

  • Building real-world deep learning models and projects to mimic real life scenarios

By the end of it all, you'll have the skillset needed to identify and develop solutions to the kinds of modern deep learning problems that Big Tech companies encounter.

⚠ Fair warning: this course is very comprehensive. But don't be intimidated, Daniel will teach you everything from scratch and step-by-step!

Here's what you'll learn in this PyTorch course:

1. PyTorch Fundamentals — We start with the bare-bones fundamentals, so even if you're a beginner you'll get up to speed.

In machine learning, data gets represented as a tensor (a collection of numbers). Learning how to craft tensors with PyTorch is paramount to building machine learning algorithms. In PyTorch Fundamentals we cover the PyTorch tensor datatype in-depth.
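
To give you a taste, here's a minimal sketch of what working with tensors looks like (illustrative code, not taken from the course materials):

    import torch

    # Create tensors directly from Python data
    scalar = torch.tensor(7)
    matrix = torch.tensor([[1., 2.], [3., 4.]])

    # Create tensors filled with random numbers, zeros or ones
    random_tensor = torch.rand(3, 4)
    zeros = torch.zeros(3, 4)
    ones = torch.ones(3, 4)

    # Inspect the attributes you'll use constantly: shape, datatype, device
    print(matrix.shape, matrix.dtype, matrix.device)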

2. PyTorch Workflow — Okay, you’ve got the fundamentals down, and you've made some tensors to represent data, but what now?

With PyTorch Workflow you’ll learn the steps to go from data -> tensors -> trained neural network model. You’ll see and use these steps wherever you encounter PyTorch code as well as for the rest of the course.
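
As a rough sketch (a toy linear regression with illustrative values, not the course's exact code), that workflow looks something like this:

    import torch
    from torch import nn

    # 1. Data -> tensors (a toy straight-line dataset: y = 0.7x + 0.3)
    X = torch.arange(0, 1, 0.02).unsqueeze(dim=1)
    y = 0.7 * X + 0.3

    # 2. Build a model
    model = nn.Linear(in_features=1, out_features=1)

    # 3. Pick a loss function and an optimizer
    loss_fn = nn.L1Loss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    # 4. Training loop: forward pass -> loss -> backprop -> update
    for epoch in range(100):
        y_pred = model(X)
        loss = loss_fn(y_pred, y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()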

3. PyTorch Neural Network Classification — Classification is one of the most common machine learning problems.

  • Is something one thing or another?

  • Is an email spam or not spam?

  • Is a credit card transaction fraud or not fraud?

With PyTorch Neural Network Classification you’ll learn how to code a neural network classification model using PyTorch so that you can classify things and answer these questions.
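
For a rough idea of what that can look like (a minimal sketch with made-up layer sizes, not the course's exact model), a binary classifier boils down to a stack of layers plus a way to turn raw outputs (logits) into labels:

    import torch
    from torch import nn

    # A small binary classifier: 2 input features -> 1 logit
    model = nn.Sequential(
        nn.Linear(in_features=2, out_features=8),
        nn.ReLU(),
        nn.Linear(in_features=8, out_features=1),
    )

    loss_fn = nn.BCEWithLogitsLoss()  # works directly on raw logits

    # Turn logits into prediction probabilities, then prediction labels
    X = torch.rand(5, 2)               # 5 made-up samples
    logits = model(X)
    probs = torch.sigmoid(logits)
    labels = torch.round(probs)        # 0 = "not spam", 1 = "spam"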

4. PyTorch Computer Vision — Neural networks have changed the game of computer vision forever. And now PyTorch drives many of the latest advancements in computer vision algorithms.

For example, Tesla uses PyTorch to build the computer vision algorithms for their self-driving software.

With PyTorch Computer Vision you’ll build a PyTorch neural network capable of seeing patterns in images and classifying them into different categories.
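
Concretely, that means building convolutional neural networks (CNNs). Here's a minimal sketch (illustrative layer sizes, not the course's exact architecture):

    import torch
    from torch import nn

    # A tiny CNN for 28x28 grayscale images, 10 output classes
    model = nn.Sequential(
        nn.Conv2d(in_channels=1, out_channels=10, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.MaxPool2d(kernel_size=2),
        nn.Flatten(),
        nn.Linear(10 * 14 * 14, 10),
    )

    dummy_images = torch.rand(32, 1, 28, 28)  # a batch of fake images
    logits = model(dummy_images)              # shape: (32, 10)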

5. PyTorch Custom Datasets — The magic of machine learning is building algorithms to find patterns in your own custom data. There are plenty of existing datasets out there, but how do you load your own custom dataset into PyTorch?

This is exactly what you'll learn with the PyTorch Custom Datasets section of this course.

You’ll learn how to load an image dataset for FoodVision Mini: a PyTorch computer vision model capable of classifying images of pizza, steak and sushi (am I making you hungry to learn yet?!).

We’ll be building upon FoodVision Mini for the rest of the course.
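
If your images are organised into one folder per class, loading them is surprisingly compact. A minimal sketch (the folder path below is hypothetical):

    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    # Turn images into tensors (resize so every image has the same shape)
    transform = transforms.Compose([
        transforms.Resize((64, 64)),
        transforms.ToTensor(),
    ])

    # "pizza_steak_sushi/train" is a hypothetical folder of class subfolders,
    # e.g. train/pizza/..., train/steak/..., train/sushi/...
    train_data = datasets.ImageFolder(root="pizza_steak_sushi/train",
                                      transform=transform)

    # Batch the dataset so it can be fed to a model
    train_dataloader = DataLoader(train_data, batch_size=32, shuffle=True)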

6. PyTorch Going Modular — The whole point of PyTorch is to be able to write Pythonic machine learning code.

There are two main tools for writing machine learning code with Python:

  1. A Jupyter/Google Colab notebook (great for experimenting)

  2. Python scripts (great for reproducibility and modularity)

In the PyTorch Going Modular section of this course, you’ll learn how to take your most useful Jupyter/Google Colab Notebook code and turn it into reusable Python scripts. This is often how you’ll find PyTorch code shared in the wild.
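
For example, a notebook cell that builds DataLoaders might become a small importable function in a script (the script and function names below are hypothetical, just to show the idea):

    # data_setup.py (hypothetical script name) -- notebook code turned
    # into a reusable function you can import from anywhere
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    def create_dataloaders(train_dir, test_dir, transform, batch_size=32):
        """Creates training and testing DataLoaders from image folders."""
        train_data = datasets.ImageFolder(train_dir, transform=transform)
        test_data = datasets.ImageFolder(test_dir, transform=transform)
        train_dataloader = DataLoader(train_data, batch_size=batch_size, shuffle=True)
        test_dataloader = DataLoader(test_data, batch_size=batch_size, shuffle=False)
        return train_dataloader, test_dataloader, train_data.classes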

7. PyTorch Transfer Learning — What if you could take what one model has learned and leverage it for your own problems? That’s what PyTorch Transfer Learning covers.

You’ll learn about the power of transfer learning and how it enables you to take a machine learning model trained on millions of images, modify it slightly, and enhance the performance of FoodVision Mini, saving you time and resources.
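
In code, transfer learning can be as short as downloading a pretrained model, freezing its base layers and swapping the output head for your own classes. A minimal sketch (assuming torchvision 0.13+ and an EfficientNet-B0 backbone as an example choice):

    import torch
    from torch import nn
    import torchvision

    # Download a model pretrained on ImageNet (torchvision >= 0.13 weights API)
    weights = torchvision.models.EfficientNet_B0_Weights.DEFAULT
    model = torchvision.models.efficientnet_b0(weights=weights)

    # Freeze the base layers so only the new head gets trained
    for param in model.features.parameters():
        param.requires_grad = False

    # Swap the classifier head for our 3 food classes (pizza, steak, sushi)
    model.classifier = nn.Sequential(
        nn.Dropout(p=0.2),
        nn.Linear(in_features=1280, out_features=3),
    )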

8. PyTorch Experiment Tracking — Now we're going to start cooking with heat by kicking off Part 1 of the course's Milestone Project!

At this point you’ll have built plenty of PyTorch models. But how do you keep track of which model performs the best?

That’s where PyTorch Experiment Tracking comes in.

Following the machine learning practitioner’s motto of experiment, experiment, experiment! you’ll set up a system to keep track of various FoodVision Mini experiment results and then compare them to find the best.
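
One common way to do that, and the tool this section's lectures use, is TensorBoard via PyTorch's SummaryWriter. A minimal sketch with placeholder loss values and a made-up run name:

    from torch.utils.tensorboard import SummaryWriter

    # One writer per experiment, e.g. runs/effnetb0_10_epochs (hypothetical name)
    writer = SummaryWriter(log_dir="runs/effnetb0_10_epochs")

    for epoch in range(10):
        # Placeholder values -- in practice these come from your training loop
        train_loss, test_loss = 0.5 / (epoch + 1), 0.6 / (epoch + 1)
        writer.add_scalars("Loss", {"train": train_loss, "test": test_loss}, epoch)

    writer.close()
    # Then compare runs side by side with: tensorboard --logdir runs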

9. PyTorch Paper Replicating — The field of machine learning advances quickly. New research papers get published every day. Being able to read and understand these papers takes time and practice.

So that’s what PyTorch Paper Replicating covers. You’ll learn how to go through a machine learning research paper and replicate it with PyTorch code.

At this point you'll also undertake Part 2 of our Milestone Project, where you’ll replicate the groundbreaking Vision Transformer architecture!
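
To give a flavour of what replicating the ViT paper involves, here's a minimal sketch of its first step: turning an image into a sequence of patch embeddings with a convolutional layer (sizes are illustrative, matching the paper's base configuration):

    import torch
    from torch import nn

    # Turn a 224x224 image into a sequence of 16x16 patch embeddings
    patch_size = 16
    embedding_dim = 768

    patch_embedding = nn.Conv2d(in_channels=3, out_channels=embedding_dim,
                                kernel_size=patch_size, stride=patch_size)

    image = torch.rand(1, 3, 224, 224)              # a fake image batch
    patches = patch_embedding(image)                # (1, 768, 14, 14)
    sequence = patches.flatten(2).permute(0, 2, 1)  # (1, 196, 768)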

10. PyTorch Model Deployment — By this stage your FoodVision model will be performing quite well. But up until now, you’ve been the only one with access to it.

How do you get your PyTorch models in the hands of others?

That’s what PyTorch Model Deployment covers. In Part 3 of your Milestone Project, you’ll learn how to take the best performing FoodVision Mini model and deploy it to the web so other people can access it and try it out with their own food images.
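
The deployment tooling in this section centres on Gradio and Hugging Face Spaces, and a demo app can be surprisingly small. A minimal sketch (the predict function below is a hypothetical stand-in rather than the real FoodVision Mini model):

    import gradio as gr

    def predict(image):
        # Hypothetical stand-in: a real app would run the image through
        # the trained FoodVision Mini model and return class probabilities
        return {"pizza": 0.8, "steak": 0.1, "sushi": 0.1}

    demo = gr.Interface(fn=predict,
                        inputs=gr.Image(type="pil"),
                        outputs=gr.Label(num_top_classes=3),
                        title="FoodVision Mini")
    demo.launch()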

What's the bottom line?

Machine learning's growth and adoption are exploding, and deep learning is how you take your machine learning knowledge to the next level. More and more job openings are looking for this specialized knowledge.

Companies like Tesla, Microsoft, OpenAI, Meta (Facebook + Instagram), Airbnb and many others are currently powered by PyTorch.

And this is the most comprehensive online bootcamp to learn PyTorch and kickstart your career as a Deep Learning Engineer.

So why wait? Advance your career and earn a higher salary by mastering PyTorch and adding deep learning to your toolkit!

What you'll learn

Everything from getting started with using PyTorch to building your own real-world models

Understand how to integrate Deep Learning into tools and applications

Build and deploy your own custom trained PyTorch neural network accessible to the public

Master deep learning and become a top candidate for recruiters seeking Deep Learning Engineers

The skills you need to become a Deep Learning Engineer and get hired with a chance of making US$100,000+ / year

Why PyTorch is a fantastic way to start working in machine learning

Create and utilize machine learning algorithms just like you would write a Python program

How to take data, build a ML algorithm to find patterns, and then use that algorithm as an AI to enhance your applications

To expand your Machine Learning and Deep Learning skills and toolkit

Requirements

  • A computer (Linux/Windows/Mac) with an internet connection is required
  • Basic Python knowledge is required
  • Previous Machine Learning knowledge is recommended, but not required (we provide sufficient supplementary resources to get you up to speed!)

Course content

14 sections

Introduction

7 lectures
PyTorch for Deep Learning
03:33
Course Welcome and What Is Deep Learning
05:53
Join Our Online Classroom!
04:01
Exercise: Meet Your Classmates + Instructor
01:42
Free Course Book + Code Resources + Asking Questions + Getting Help
01:05
ZTM Resources
04:23
Machine Learning + Python Monthly Newsletters
00:58

PyTorch Fundamentals

32 lectures
Why Use Machine Learning or Deep Learning
03:33
The Number 1 Rule of Machine Learning and What Is Deep Learning Good For
05:39
Machine Learning vs. Deep Learning
06:06
Anatomy of Neural Networks
09:21
Different Types of Learning Paradigms
04:30
What Can Deep Learning Be Used For
06:21
What Is and Why PyTorch
10:12
What Are Tensors
04:15
What We Are Going To Cover With PyTorch
06:05
How To and How Not To Approach This Course
05:09
Important Resources For This Course
05:21
Getting Setup to Write PyTorch Code
07:39
Introduction to PyTorch Tensors
13:24
Creating Random Tensors in PyTorch
09:58
Creating Tensors With Zeros and Ones in PyTorch
03:08
Creating a Tensor Range and Tensors Like Other Tensors
05:17
Dealing With Tensor Data Types
09:24
Getting Tensor Attributes
08:22
Manipulating Tensors (Tensor Operations)
05:59
Matrix Multiplication (Part 1)
09:34
Matrix Multiplication (Part 2): The Two Main Rules of Matrix Multiplication
07:51
Matrix Multiplication (Part 3): Dealing With Tensor Shape Errors
12:56
Finding the Min Max Mean and Sum of Tensors (Tensor Aggregation)
06:09
Finding The Positional Min and Max of Tensors
03:16
Reshaping, Viewing and Stacking Tensors
13:40
Squeezing, Unsqueezing and Permuting Tensors
11:55
Selecting Data From Tensors (Indexing)
09:31
PyTorch Tensors and NumPy
09:08
PyTorch Reproducibility (Taking the Random Out of Random)
10:46
Different Ways of Accessing a GPU in PyTorch
11:50
Setting up Device-Agnostic Code and Putting Tensors On and Off the GPU
07:43
PyTorch Fundamentals: Exercises and Extra-Curriculum
04:49

PyTorch Workflow

28 lectures
Introduction and Where You Can Get Help
02:45
Getting Setup and What We Are Covering
07:14
Creating a Simple Dataset Using the Linear Regression Formula
09:40
Splitting Our Data Into Training and Test Sets
08:19
Building a function to Visualize Our Data
07:45
Creating Our First PyTorch Model for Linear Regression
14:09
Breaking Down What's Happening in Our PyTorch Linear regression Model
06:10
Discussing Some of the Most Important PyTorch Model Building Classes
06:26
Checking Out the Internals of Our PyTorch Model
09:50
Making Predictions With Our Random Model Using Inference Mode
11:12
Training a Model Intuition (The Things We Need)
08:14
Setting Up an Optimizer and a Loss Function
12:51
PyTorch Training Loop Steps and Intuition
13:53
Writing Code for a PyTorch Training Loop
08:46
Reviewing the Steps in a Training Loop Step by Step
14:57
Running Our Training Loop Epoch by Epoch and Seeing What Happens
09:25
Writing Testing Loop Code and Discussing What's Happening Step by Step
11:37
Reviewing What Happens in a Testing Loop Step by Step
14:42
Writing Code to Save a PyTorch Model
13:45
Writing Code to Load a PyTorch Model
08:44
Setting Up to Practice Everything We Have Done Using Device Agnostic code
06:02
Putting Everything Together (Part 1): Data
06:07
Putting Everything Together (Part 2): Building a Model
10:07
Putting Everything Together (Part 3): Training a Model
12:39
Putting Everything Together (Part 4): Making Predictions With a Trained Model
05:17
Putting Everything Together (Part 5): Saving and Loading a Trained Model
09:10
Exercise: Imposter Syndrome
02:55
PyTorch Workflow: Exercises and Extra-Curriculum
03:57

PyTorch Neural Network Classification

32 lectures
Introduction to Machine Learning Classification With PyTorch
09:41
Classification Problem Example: Input and Output Shapes
09:06
Typical Architecture of a Classification Neural Network (Overview)
06:30
Making a Toy Classification Dataset
12:18
Turning Our Data into Tensors and Making a Training and Test Split
11:55
Laying Out Steps for Modelling and Setting Up Device-Agnostic Code
04:19
Coding a Small Neural Network to Handle Our Classification Data
10:57
Making Our Neural Network Visual
06:57
Recreating and Exploring the Insides of Our Model Using nn.Sequential
13:17
Loss Function Optimizer and Evaluation Function for Our Classification Network
14:50
Going from Model Logits to Prediction Probabilities to Prediction Labels
16:06
Coding a Training and Testing Optimization Loop for Our Classification Model
15:26
Writing Code to Download a Helper Function to Visualize Our Models Predictions
14:13
Discussing Options to Improve a Model
08:02
Creating a New Model with More Layers and Hidden Units
09:06
Writing Training and Testing Code to See if Our Upgraded Model Performs Better
12:45
Creating a Straight Line Dataset to See if Our Model is Learning Anything
08:07
Building and Training a Model to Fit on Straight Line Data
10:01
Evaluating Our Models Predictions on Straight Line Data
05:23
Introducing the Missing Piece for Our Classification Model Non-Linearity
10:00
Building Our First Neural Network with Non-Linearity
10:25
Writing Training and Testing Code for Our First Non-Linear Model
15:12
Making Predictions with and Evaluating Our First Non-Linear Model
05:47
Replicating Non-Linear Activation Functions with Pure PyTorch
09:34
Putting It All Together (Part 1): Building a Multiclass Dataset
11:24
Creating a Multi-Class Classification Model with PyTorch
12:27
Setting Up a Loss Function and Optimizer for Our Multi-Class Model
06:39
Logits to Prediction Probabilities to Prediction Labels with a Multi-Class Model
11:01
Training a Multi-Class Classification Model and Troubleshooting Code on the Fly
16:17
Making Predictions with and Evaluating Our Multi-Class Classification Model
07:59
Discussing a Few More Classification Metrics
09:17
PyTorch Classification: Exercises and Extra-Curriculum
02:58

PyTorch Computer Vision

34 lectures
What Is a Computer Vision Problem and What We Are Going to Cover
11:47
Computer Vision Input and Output Shapes
10:08
What Is a Convolutional Neural Network (CNN)
05:02
Discussing and Importing the Base Computer Vision Libraries in PyTorch
09:19
Getting a Computer Vision Dataset and Checking Out Its Input and Output Shapes
14:30
Visualizing Random Samples of Data
09:51
DataLoader Overview: Understanding Mini-Batches
07:17
Turning Our Datasets Into DataLoaders
12:23
Model 0: Creating a Baseline Model with Two Linear Layers
14:38
Creating a Loss Function and an Optimizer for Model 0
10:29
Creating a Function to Time Our Modelling Code
05:34
Writing Training and Testing Loops for Our Batched Data
21:25
Writing an Evaluation Function to Get Our Models Results
12:58
Setup Device-Agnostic Code for Running Experiments on the GPU
03:46
Model 1: Creating a Model with Non-Linear Functions
09:03
Model 1: Creating a Loss Function and Optimizer
03:04
Turning Our Training Loop into a Function
08:28
Turning Our Testing Loop into a Function
06:35
Training and Testing Model 1 with Our Training and Testing Functions
11:52
Getting a Results Dictionary for Model 1
04:08
Model 2: Convolutional Neural Networks High Level Overview
08:24
Model 2: Coding Our First Convolutional Neural Network with PyTorch
19:48
Model 2: Breaking Down Conv2D Step by Step
14:59
Model 2: Breaking Down MaxPool2D Step by Step
15:48
Model 2: Using a Trick to Find the Input and Output Shapes of Each of Our Layers
13:45
Model 2: Setting Up a Loss Function and Optimizer
02:38
Model 2: Training Our First CNN and Evaluating Its Results
07:54
Comparing the Results of Our Modelling Experiments
07:23
Making Predictions on Random Test Samples with the Best Trained Model
11:39
Plotting Our Best Model Predictions on Random Test Samples and Evaluating Them
08:10
Making Predictions and Importing Libraries to Plot a Confusion Matrix
15:20
Evaluating Our Best Models Predictions with a Confusion Matrix
06:54
Saving and Loading Our Best Performing Model
11:27
Recapping What We Have Covered Plus Exercises and Extra-Curriculum
06:01

PyTorch Custom Datasets

37 lectures
What Is a Custom Dataset and What We Are Going to Cover
09:53
Importing PyTorch and Setting Up Device Agnostic Code
05:54
Downloading a Custom Dataset of Pizza, Steak and Sushi Images
14:04
Becoming One With the Data (Part 1): Exploring the Data Format
08:41
Becoming One With the Data (Part 2): Visualizing a Random Image
11:40
Becoming One With the Data (Part 3): Visualizing a Random Image with Matplotlib
04:47
Transforming Data (Part 1): Turning Images Into Tensors
08:53
Transforming Data (Part 2): Visualizing Transformed Images
11:30
Loading All of Our Images and Turning Them Into Tensors With ImageFolder
09:17
Visualizing a Loaded Image From the Train Dataset
07:18
Turning Our Image Datasets into PyTorch Dataloaders
09:03
Creating a Custom Dataset Class in PyTorch High Level Overview
07:59
Creating a Helper Function to Get Class Names From a Directory
09:06
Writing a PyTorch Custom Dataset Class from Scratch to Load Our Images
17:46
Comparing Our Custom Dataset Class to the Original ImageFolder Class
07:13
Writing a Helper Function to Visualize Random Images from Our Custom Dataset
14:18
Turning Our Custom Datasets Into DataLoaders
06:58
Exploring State of the Art Data Augmentation With Torchvision Transforms
14:23
Building a Baseline Model (Part 1): Loading and Transforming Data
08:15
Building a Baseline Model (Part 2): Replicating Tiny VGG from Scratch
11:24
Building a Baseline Model (Part 3): Doing a Forward Pass to Test Our Model Shapes
08:09
Using the Torchinfo Package to Get a Summary of Our Model
06:38
Creating Training and Testing loop Functions
13:03
Creating a Train Function to Train and Evaluate Our Models
10:14
Training and Evaluating Model 0 With Our Training Functions
09:53
Plotting the Loss Curves of Model 0
09:02
The Balance Between Overfitting and Underfitting and How to Deal With Each
14:13
Creating Augmented Training Datasets and DataLoaders for Model 1
11:03
Constructing and Training Model 1
07:10
Plotting the Loss Curves of Model 1
03:22
Plotting the Loss Curves of All of Our Models Against Each Other
10:55
Predicting on Custom Data (Part 1): Downloading an Image
05:32
Predicting on Custom Data (Part 2): Loading In a Custom Image With PyTorch
07:00
Predicting on Custom Data (Part 3): Getting Our Custom Image Into the Right Format
14:06
Predicting on Custom Data (Part 4): Turning Our Models Raw Outputs Into Predictions
04:24
Predicting on Custom Data (Part 5): Putting It All Together
12:47
Summary of What We Have Covered Plus Exercises and Extra-Curriculum
06:04

PyTorch Going Modular

10 lectures
What Is Going Modular and What We Are Going to Cover
11:34
Going Modular Notebook (Part 1): Running It End to End
07:39
Downloading a Dataset
04:49
Writing the Outline for Our First Python Script to Setup the Data
13:50
Creating a Python Script to Create Our PyTorch DataLoaders
10:35
Turning Our Model Building Code into a Python Script
09:18
Turning Our Model Training Code into a Python Script
06:16
Turning Our Utility Function to Save a Model into a Python Script
06:06
Creating a Training Script to Train Our Model in One Line of Code
15:46
Going Modular: Summary, Exercises and Extra-Curriculum
05:59

PyTorch Transfer Learning

19 lectures
Introduction: What is Transfer Learning and Why Use It
10:05
Where Can You Find Pretrained Models and What We Are Going to Cover
05:12
Installing the Latest Versions of Torch and Torchvision
08:05
Downloading Our Previously Written Code from Going Modular
06:41
Downloading Pizza, Steak, Sushi Image Data from Github
08:00
Turning Our Data into DataLoaders with Manually Created Transforms
14:40
Turning Our Data into DataLoaders with Automatically Created Transforms
13:06
Which Pretrained Model Should You Use
12:15
Setting Up a Pretrained Model with Torchvision
10:56
Different Kinds of Transfer Learning
07:10
Getting a Summary of the Different Layers of Our Model
06:49
Freezing the Base Layers of Our Model and Updating the Classifier Head
13:26
Training Our First Transfer Learning Feature Extractor Model
07:54
Plotting the Loss curves of Our Transfer Learning Model
06:26
Outlining the Steps to Make Predictions on the Test Images
07:56
Creating a Function to Predict On and Plot Images
10:00
Making and Plotting Predictions on Test Images
07:23
Making a Prediction on a Custom Image
06:21
Main Takeaways, Exercises and Extra-Curriculum
03:20

PyTorch Experiment Tracking

22 lectures
What Is Experiment Tracking and Why Track Experiments
07:06
Getting Setup by Importing Torch Libraries and Going Modular Code
08:13
Creating a Function to Download Data
10:22
Turning Our Data into DataLoaders Using Manual Transforms
08:30
Turning Our Data into DataLoaders Using Automatic Transforms
07:47
Preparing a Pretrained Model for Our Own Problem
10:28
Setting Up a Way to Track a Single Model Experiment with TensorBoard
13:35
Training a Single Model and Saving the Results to TensorBoard
04:38
Exploring Our Single Models Results with TensorBoard
10:17
Creating a Function to Create SummaryWriter Instances
10:44
Adapting Our Train Function to Be Able to Track Multiple Experiments
04:57
What Experiments Should You Try
05:59
Discussing the Experiments We Are Going to Try
06:01
Downloading Datasets for Our Modelling Experiments
06:31
Turning Our Datasets into DataLoaders Ready for Experimentation
08:28
Creating Functions to Prepare Our Feature Extractor Models
15:54
Coding Out the Steps to Run a Series of Modelling Experiments
14:27
Running Eight Different Modelling Experiments in 5 Minutes
03:50
Viewing Our Modelling Experiments in TensorBoard
13:37
Loading the Best Model and Making Predictions on Random Images from the Test Set
10:32
Making a Prediction on Our Own Custom Image with the Best Model
03:44
Main Takeaways, Exercises and Extra-Curriculum
03:56

PyTorch Paper Replicating

50 lectures
What Is a Machine Learning Research Paper?
07:34
Why Replicate a Machine Learning Research Paper?
03:13
Where Can You Find Machine Learning Research Papers and Code?
08:18
What We Are Going to Cover
08:21
Getting Setup for Coding in Google Colab
08:21
Downloading Data for Food Vision Mini
04:02
Turning Our Food Vision Mini Images into PyTorch DataLoaders
09:47
Visualizing a Single Image
03:45
Replicating a Vision Transformer - High Level Overview
09:53
Breaking Down Figure 1 of the ViT Paper
11:12
Breaking Down the Four Equations Overview and a Trick for Reading Papers
10:55
Breaking Down Equation 1
08:14
Breaking Down Equation 2 and 3
10:03
Breaking Down Equation 4
07:27
Breaking Down Table 1
11:05
Calculating the Input and Output Shape of the Embedding Layer by Hand
15:41
Turning a Single Image into Patches (Part 1: Patching the Top Row)
15:03
Turning a Single Image into Patches (Part 2: Patching the Entire Image)
12:32
Creating Patch Embeddings with a Convolutional Layer
13:33
Exploring the Outputs of Our Convolutional Patch Embedding Layer
12:54
Flattening Our Convolutional Feature Maps into a Sequence of Patch Embeddings
09:59
Visualizing a Single Sequence Vector of Patch Embeddings
05:03
Creating the Patch Embedding Layer with PyTorch
17:01
Creating the Class Token Embedding
13:24
Creating the Class Token Embedding - Less Birds
13:24
Creating the Position Embedding
11:25
Equation 1: Putting it All Together
13:25
Equation 2: Multihead Attention Overview
14:30
Equation 2: Layernorm Overview
09:03
Turning Equation 2 into Code
14:33
Checking the Inputs and Outputs of Equation
05:40
Equation 3: Replication Overview
09:10
Turning Equation 3 into Code
11:25
Transformer Encoder Overview
08:50
Combining equation 2 and 3 to Create the Transformer Encoder
09:15
Creating a Transformer Encoder Layer with In-Built PyTorch Layer
15:54
Bringing Our Own Vision Transformer to Life - Part 1: Gathering the Pieces
18:19
Bringing Our Own Vision Transformer to Life - Part 2: The Forward Method
10:41
Getting a Visual Summary of Our Custom Vision Transformer
07:13
Creating a Loss Function and Optimizer from the ViT Paper
11:26
Training our Custom ViT on Food Vision Mini
04:29
Discussing what Our Training Setup Is Missing
09:08
Plotting a Loss Curve for Our ViT Model
06:13
Getting a Pretrained Vision Transformer from Torchvision and Setting it Up
14:37
Preparing Data to Be Used with a Pretrained ViT
05:53
Training a Pretrained ViT Feature Extractor Model for Food Vision Mini
07:15
Saving Our Pretrained ViT Model to File and Inspecting Its Size
05:13
Discussing the Trade-Offs Between Using a Larger Model for Deployments
03:46
Making Predictions on a Custom Image with Our Pretrained ViT
03:30
PyTorch Paper Replicating: Main Takeaways, Exercises and Extra-Curriculum
06:50

PyTorch Model Deployment

57 lectures
What is Machine Learning Model Deployment - Why Deploy a Machine Learning Model
09:35
Three Questions to Ask for Machine Learning Model Deployment
07:13
Where Is My Model Going to Go?
13:33
How Is My Model Going to Function?
07:59
Some Tools and Places to Deploy Machine Learning Models
05:48
What We Are Going to Cover
04:01
Getting Setup to Code
06:15
Downloading a Dataset for Food Vision Mini
03:23
Outlining Our Food Vision Mini Deployment Goals and Modelling Experiments
07:59
Creating an EffNetB2 Feature Extractor Model
09:45
Create a Function to Make an EffNetB2 Feature Extractor Model and Transforms
06:29
Creating DataLoaders for EffNetB2
03:31
Training Our EffNetB2 Feature Extractor and Inspecting the Loss Curves
09:15
Saving Our EffNetB2 Model to File
03:24
Getting the Size of Our EffNetB2 Model in Megabytes
05:51
Collecting Important Statistics and Performance Metrics for Our EffNetB2 Model
06:34
Creating a Vision Transformer Feature Extractor Model
07:51
Creating DataLoaders for Our ViT Feature Extractor Model
02:30
Training Our ViT Feature Extractor Model and Inspecting Its Loss Curves
06:19
Saving Our ViT Feature Extractor and Inspecting Its Size
05:08
Collecting Stats About Our ViT Feature Extractor
05:51
Outlining the Steps for Making and Timing Predictions for Our Models
11:15
Creating a Function to Make and Time Predictions with Our Models
16:20
Making and Timing Predictions with EffNetB2
10:43
Making and Timing Predictions with ViT
07:34
Comparing EffNetB2 and ViT Model Statistics
11:31
Visualizing the Performance vs Speed Trade-off
15:54
Gradio Overview and Installation
08:39
Gradio Function Outline
08:49
Creating a Predict Function to Map Our Food Vision Mini Inputs to Outputs
09:51
Creating a List of Examples to Pass to Our Gradio Demo
05:26
Bringing Food Vision Mini to Life in a Live Web Application
12:12
Getting Ready to Deploy Our App: Hugging Face Spaces Overview
06:26
Outlining the File Structure of Our Deployed App
08:11
Creating a Food Vision Mini Demo Directory to House Our App Files
04:11
Creating an Examples Directory with Example Food Vision Mini Images
09:13
Writing Code to Move Our Saved EffNetB2 Model File
07:42
Turning Our EffNetB2 Model Creation Function Into a Python Script
04:01
Turning Our Food Vision Mini Demo App Into a Python Script
13:27
Creating a Requirements File for Our Food Vision Mini App
04:11
Downloading Our Food Vision Mini App Files from Google Colab
11:30
Uploading Our Food Vision Mini App to Hugging Face Spaces Programmatically
13:36
Running Food Vision Mini on Hugging Face Spaces and Trying it Out
07:44
Food Vision Big Project Outline
04:17
Preparing an EffNetB2 Feature Extractor Model for Food Vision Big
09:37
Downloading the Food 101 Dataset
07:45
Creating a Function to Split Our Food 101 Dataset into Smaller Portions
13:36
Turning Our Food 101 Datasets into DataLoaders
07:23
Training Food Vision Big: Our Biggest Model Yet!
20:15
Outlining the File Structure for Our Food Vision Big
05:48
Downloading an Example Image and Moving Our Food Vision Big Model File
03:33
Saving Food 101 Class Names to a Text File and Reading them Back In
06:56
Turning Our EffNetB2 Feature Extractor Creation Function into a Python Script
02:20
Creating an App Script for Our Food Vision Big Model Gradio Demo
10:41
Zipping and Downloading Our Food Vision Big App Files
03:45
Deploying Food Vision Big to Hugging Face Spaces
13:34
PyTorch Model Deployment: Main Takeaways, Extra-Curriculum and Exercises
06:13

Introduction to PyTorch 2.0 and torch.compile

25 lectures
Introduction to PyTorch 2.0
06:01
What We Are Going to Cover and PyTorch 2 Reference Materials
01:21
Getting Started with PyTorch 2 in Google Colab
04:19
PyTorch 2.0 - 30 Second Intro
03:20
Getting Setup for PyTorch 2
02:22
Getting Info from Our GPUs and Seeing if They're Capable of Using PyTorch 2
06:49
Setting the Default Device in PyTorch 2
09:40
Discussing the Experiments We Are Going to Run for PyTorch 2
06:42
Introduction to PyTorch 2
06:01
Creating a Function to Setup Our Model and Transforms
10:17
Discussing How to Get Better Relative Speedups for Training Models
08:23
Setting the Batch Size and Data Size Programmatically
07:15
Getting More Potential Speedups with TensorFloat-32
09:53
Downloading the CIFAR10 Dataset
07:00
Creating Training and Test DataLoaders
07:38
Preparing Training and Testing Loops with Timing Steps for PyTorch 2.0
04:58
Experiment 1 - Single Run without torch.compile
08:22
Experiment 2 - Single Run with torch.compile
10:38
Comparing the Results of Experiment 1 and 2
11:19
Saving the Results of Experiment 1 and 2
04:38
Preparing Functions for Experiment 3 and 4
12:41
Experiment 3 - Training a Non-Compiled Model for Multiple Runs
12:44
Experiment 4 - Training a Compiled Model for Multiple Runs
09:57
Comparing the Results of Experiment 3 and 4
05:23
Potential Extensions and Resources to Learn More
05:50

Bonus Section

1 lecture
Special Bonus Lecture
00:16

Where To Go From Here?

4 lectures
Thank You!
01:17
Become An Alumni
00:37
Endorsements on LinkedIn
00:40
Learning Guideline
00:10
