Description
In this course, you will gain:
- Knowledge of the most popular deep learning models
- A solid understanding of the mathematics as well as the intuition behind the algorithms
- A good understanding of deep learning programming and PyTorch
- Familiarity with basic and intermediate concepts such as convolutional neural networks, recurrent neural networks, generative adversarial networks, and transformers
- A thorough understanding of the fundamental architectural components of deep learning
Syllabus:
1. Learn Deep Learning
- About this Course
- Why Learn Deep Learning?
- Overview of the Course
2. Neural Networks
- Linear Classifiers
- Optimization and Gradient Descent
- Neural Networks
- Backpropagation Algorithm
- Build a Neural Network with PyTorch
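Section 2 culminates in building a network with PyTorch. As a taste of what that involves, here is a minimal sketch (assuming the `torch` library; the layer sizes are illustrative, not the course's own):

```python
import torch
import torch.nn as nn

# A minimal two-layer network for 10-class classification
model = nn.Sequential(
    nn.Linear(784, 128),  # input layer: 784 features (e.g. a flattened 28x28 image)
    nn.ReLU(),            # nonlinearity between the layers
    nn.Linear(128, 10),   # output layer: 10 class scores
)

x = torch.randn(32, 784)  # a batch of 32 random inputs
logits = model(x)         # forward pass: one score vector per input
```

The course builds this up from linear classifiers and backpropagation before reaching the `nn.Module` abstractions used here.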
3. Training Neural Networks
- Optimization
- Popular Optimization Algorithms
- Activation Functions
- Training in PyTorch
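The training topics above (optimization, optimizers, loss functions) come together in a standard PyTorch training loop. A minimal sketch, assuming `torch` and toy random data:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)                              # toy model: 4 features, 2 classes
opt = torch.optim.SGD(model.parameters(), lr=0.1)    # one of the optimizers covered
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(16, 4)            # random inputs
y = torch.randint(0, 2, (16,))    # random class labels

for step in range(20):
    opt.zero_grad()               # clear gradients from the previous step
    loss = loss_fn(model(x), y)   # forward pass and loss
    loss.backward()               # backpropagation
    opt.step()                    # gradient-descent parameter update
```

Swapping `SGD` for `Adam` or another `torch.optim` algorithm changes only one line, which is why the course treats optimizers as interchangeable components.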
4. Convolutional Neural Networks
- The Principles of Convolution
- Convolution in Practice
- Build a Convolutional Network
- Batch Normalization and Dropout
- Skip Connections
- CNN Architectures
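Batch normalization, dropout, and skip connections, all listed in this section, often appear together in one convolutional block. A minimal sketch of such a block, assuming `torch` (the channel counts are illustrative):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A conv block with batch norm, dropout, and a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(channels)   # normalize activations per channel
        self.drop = nn.Dropout2d(p=0.1)      # randomly zero whole feature maps

    def forward(self, x):
        out = self.drop(torch.relu(self.bn(self.conv(x))))
        return x + out  # skip connection: add the input back to the output

block = ResidualBlock(8)
x = torch.randn(4, 8, 16, 16)  # batch of 4 feature maps, 8 channels, 16x16
out = block(x)                 # same shape as the input
```

The skip connection requires the block to preserve shape, which is why the convolution uses `padding=1` with a 3x3 kernel.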
5. Recurrent Neural Networks
- A Simple RNN Cell
- LSTM: Long Short-Term Memory Cells
- Writing a Custom LSTM Cell in PyTorch
- Connect LSTM Cells Across Time and Space
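Connecting an LSTM cell across time means feeding each time step's input together with the previous hidden and cell states. A minimal sketch using PyTorch's built-in `nn.LSTMCell` (the course also has you write the cell itself by hand; the sizes here are illustrative):

```python
import torch
import torch.nn as nn

cell = nn.LSTMCell(input_size=10, hidden_size=20)
h = torch.zeros(3, 20)  # hidden state for a batch of 3 sequences
c = torch.zeros(3, 20)  # cell state

seq = torch.randn(5, 3, 10)  # 5 time steps, batch of 3, 10 features each
for x_t in seq:              # unroll the cell across time
    h, c = cell(x_t, (h, c))  # each step consumes the previous states
```

Stacking several such cells, where one cell's hidden states become the next cell's inputs, is the "across space" part of this section.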
6. Autoencoders
- Generative Learning
- Basics of Autoencoders
- Variational Autoencoder: Theory
- Variational Autoencoder: Practice
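The key trick that takes the variational autoencoder from theory to practice is reparameterization: sampling the latent code in a way gradients can flow through. A minimal sketch, assuming `torch` (the class name and dimensions are illustrative):

```python
import torch
import torch.nn as nn

class VAEEncoder(nn.Module):
    """Encoder head producing a mean and log-variance, sampled via reparameterization."""
    def __init__(self, in_dim, z_dim):
        super().__init__()
        self.mu = nn.Linear(in_dim, z_dim)      # mean of the latent distribution
        self.logvar = nn.Linear(in_dim, z_dim)  # log-variance of the latent distribution

    def forward(self, x):
        mu, logvar = self.mu(x), self.logvar(x)
        std = torch.exp(0.5 * logvar)
        eps = torch.randn_like(std)    # noise sampled outside the graph
        return mu + eps * std          # reparameterization: differentiable sampling

enc = VAEEncoder(16, 4)
z = enc(torch.randn(8, 16))  # a batch of 8 latent codes of dimension 4
```

Because the randomness lives in `eps` rather than in the parameters, backpropagation through `mu` and `std` works as usual.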
7. Generative Adversarial Networks
- Generator and Discriminator
- Generative Adversarial Networks in Detail
- Develop a GAN with PyTorch
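The generator/discriminator pair at the heart of a GAN can be sketched in a few lines of PyTorch. A minimal illustration (the network sizes are arbitrary, and the training loop that pits the two against each other is omitted):

```python
import torch
import torch.nn as nn

# Generator: maps a noise vector to a fake sample.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
# Discriminator: maps a sample to a probability that it is real.
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

z = torch.randn(16, 8)  # a batch of 16 latent noise vectors
fake = G(z)             # generated (fake) samples
score = D(fake)         # discriminator's realness score for each sample
```

Training alternates between updating `D` to tell real from fake and updating `G` to fool `D`, which is the adversarial game this section covers in detail.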
8. Attention and Transformers
- Sequence to Sequence Models
- Attention
- Key Concepts of Transformers
- Self-Attention
- Multi-Head Self-Attention
- Transformers Building Blocks
- The Transformer's Encoder
- The Transformer's Decoder
- Build a Transformer Encoder
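The building blocks listed here (multi-head self-attention plus feed-forward sublayers with residuals) are packaged in PyTorch's `nn.TransformerEncoderLayer`; the course has you build the encoder yourself, but a minimal sketch with the built-in modules looks like this (dimensions are illustrative):

```python
import torch
import torch.nn as nn

# One encoder layer = multi-head self-attention + feed-forward, each with
# residual connections and layer normalization.
layer = nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)  # stack two such layers

x = torch.randn(8, 10, 32)  # batch of 8 sequences, 10 tokens, 32-dim embeddings
out = encoder(x)            # contextualized token representations, same shape
```

Note that `d_model` must be divisible by `nhead`, since each attention head operates on a `d_model / nhead`-dimensional slice of the embedding.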
9. Graph Neural Networks
- Basics of Graphs
- Mathematics for Graphs
- Graph Convolutional Networks
- Implementation of a GCN
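A graph convolutional layer computes H' = ReLU(Â H W), where Â is the adjacency matrix with self-loops, symmetrically normalized by node degree. A minimal sketch in plain `torch` (a dedicated library such as PyTorch Geometric is more common in practice; the graph here is a toy 3-node path):

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)  # the learned weight matrix W

    def forward(self, A, H):
        A_hat = A + torch.eye(A.size(0))           # add self-loops
        deg = A_hat.sum(dim=1)                     # node degrees of A_hat
        D_inv_sqrt = torch.diag(deg.pow(-0.5))     # D^{-1/2}
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalization
        return torch.relu(self.lin(A_norm @ H))    # aggregate neighbors, transform

A = torch.tensor([[0., 1., 0.],   # adjacency of a 3-node path graph
                  [1., 0., 1.],
                  [0., 1., 0.]])
H = torch.randn(3, 5)             # one 5-dim feature vector per node
out = GCNLayer(5, 4)(A, H)        # new 4-dim representation per node
```

Each node's new representation mixes its own features with its neighbors', which is the graph analogue of the spatial convolutions from section 4.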