Description
In this course, you will:
- Become familiar with the major technological trends driving the rise of deep learning.
- Build, train, and apply fully connected deep neural networks.
- Implement efficient (vectorized) neural networks.
- Identify key parameters in a neural network's architecture.
- Apply deep learning to your own applications.

The Deep Learning Specialization is our foundational program. It will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of cutting-edge AI technology. It paves the way for you to gain the knowledge and skills needed to apply machine learning to your work, advance your technical career, and take the next step in the world of AI.
Syllabus:
1. Introduction to Deep Learning
- What is a Neural Network?
- Supervised Learning with Neural Networks
- Why is Deep Learning taking off?
- About this Course
- Geoffrey Hinton Interview
2. Neural Networks Basics
- Binary Classification
- Logistic Regression
- Logistic Regression Cost Function
- Gradient Descent
- Derivatives
- More Derivative Examples
- Computation Graph
- Derivatives with a Computation Graph
- Logistic Regression Gradient Descent
- Gradient Descent on m Examples
- Vectorization
- More Vectorization Examples
- Vectorizing Logistic Regression
- Vectorizing Logistic Regression's Gradient Output (see the sketch after this section)
- Broadcasting in Python
- A Note on Python/Numpy Vectors
- Quick tour of Jupyter/iPython Notebooks
- Explanation of Logistic Regression Cost Function (Optional)
- Pieter Abbeel Interview
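
The vectorization topics above all come together in a single gradient-descent step for logistic regression. Below is a minimal NumPy sketch, not the course's own assignment code: the shapes (X as an (n_x, m) matrix of m column examples, Y as a (1, m) label row) follow the course's conventions, but the function names and learning rate are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # Elementwise logistic function: 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def logistic_step(w, b, X, Y, lr=0.01):
    # One vectorized gradient-descent step over all m examples.
    # w: (n_x, 1), b: scalar, X: (n_x, m), Y: (1, m) with 0/1 labels.
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                # (1, m); scalar b broadcasts
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dZ = A - Y                              # (1, m) derivative of cost wrt z
    dw = (X @ dZ.T) / m                     # (n_x, 1), no explicit loop
    db = np.sum(dZ) / m
    return w - lr * dw, b - lr * db, cost
```

The same computation written as an explicit loop over the m examples is dramatically slower; replacing such loops with matrix operations and broadcasting is exactly what the vectorization lectures are about.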
3. Shallow Neural Networks
- Neural Network Representation
- Computing a Neural Network's Output
- Vectorizing Across Multiple Examples
- Explanation for Vectorized Implementation
- Activation Functions
- Why do you need Non-Linear Activation Functions?
- Derivatives of Activation Functions
- Gradient Descent for Neural Networks
- Backpropagation Intuition (Optional)
- Random Initialization (see the sketch after this section)
- Ian Goodfellow Interview
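
As a companion to the topics above, here is a hedged sketch of a one-hidden-layer ("shallow") network: random initialization to break symmetry, a tanh hidden activation, and a sigmoid output. The layer sizes and the 0.01 scaling factor are illustrative assumptions, not prescribed values.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_x, n_h, n_y):
    # Small random weights break symmetry between hidden units;
    # zero biases are fine because the weights are already asymmetric.
    return {"W1": rng.standard_normal((n_h, n_x)) * 0.01,
            "b1": np.zeros((n_h, 1)),
            "W2": rng.standard_normal((n_y, n_h)) * 0.01,
            "b2": np.zeros((n_y, 1))}

def forward(params, X):
    # X: (n_x, m). Hidden layer uses tanh, output layer uses sigmoid.
    Z1 = params["W1"] @ X + params["b1"]
    A1 = np.tanh(Z1)
    Z2 = params["W2"] @ A1 + params["b2"]
    A2 = 1.0 / (1.0 + np.exp(-Z2))
    return A2                               # (n_y, m) predictions
```

If the weights were instead initialized to zero, every hidden unit would compute the same function and receive the same gradient, and the units would never differentiate; that failure mode is the point of the random-initialization lecture.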
4. Deep Neural Networks
- Deep L-layer Neural Network
- Forward Propagation in a Deep Network
- Getting your Matrix Dimensions Right (see the sketch after this section)
- Why Deep Representations?
- Building Blocks of Deep Neural Networks
- Forward and Backward Propagation
- Parameters vs Hyperparameters
- What does this have to do with the brain?
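
To tie together forward propagation and getting the matrix dimensions right, here is a minimal sketch of an L-layer forward pass. The parameter layout (keys "W1", "b1", ..., with W_l of shape (n_l, n_{l-1}) and b_l of shape (n_l, 1)) matches the course's convention; the ReLU-then-sigmoid choice and the assertion are illustrative assumptions.

```python
import numpy as np

def forward_deep(params, X, L):
    # params holds "W1", "b1", ..., f"W{L}", f"b{L}";
    # W_l: (n_l, n_{l-1}), b_l: (n_l, 1), X: (n_0, m).
    A = X
    for l in range(1, L + 1):
        W, b = params[f"W{l}"], params[f"b{l}"]
        # Dimension check: columns of W must match rows of A.
        assert W.shape[1] == A.shape[0], f"shape mismatch at layer {l}"
        Z = W @ A + b
        # ReLU in the hidden layers, sigmoid at the output layer.
        A = np.maximum(0, Z) if l < L else 1.0 / (1.0 + np.exp(-Z))
    return A                                # (n_L, m)
```

Writing out the shapes layer by layer, as the assertion does here, is the quickest way to catch the transposition bugs the dimensions lecture warns about.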