Description
This course is intended for students of machine learning or artificial intelligence, as well as software engineers who want to learn more about how NLP models work and how to apply them. By the end of this Specialization, you will have created NLP applications for question answering and sentiment analysis, tools for translating languages and summarizing text, and even a chatbot.
Syllabus:
1. Neural Machine Translation
- Seq2seq
- Alignment
- Attention
- Setup for Machine Translation
- Training an NMT with Attention
- Evaluation for Machine Translation
- Sampling and Decoding
- Andrew Ng with Oren Etzioni
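As a tiny illustration of the "Sampling and Decoding" topic above, here is a hypothetical NumPy sketch (not from the course materials) contrasting greedy decoding with temperature sampling over a decoder's output logits:

```python
import numpy as np

def sample_token(logits, temperature=1.0):
    """Pick the next token id from decoder output logits.

    temperature=0 reduces to greedy decoding; higher temperatures
    flatten the distribution and make sampling more random.
    """
    if temperature == 0.0:
        return int(np.argmax(logits))            # greedy decoding
    scaled = logits / temperature
    scaled = scaled - scaled.max()               # subtract max for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return int(np.random.choice(len(probs), p=probs))

# Toy logits over a 5-token vocabulary.
logits = np.array([1.0, 3.0, 0.5, 2.0, -1.0])
print(sample_token(logits, temperature=0.0))     # always token 1 (greedy)
print(sample_token(logits, temperature=1.0))     # usually 1, sometimes others
```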
2. Text Summarization
- Transformers vs RNNs
- Transformer Applications
- Dot-Product Attention
- Causal Attention
- Multi-head Attention
- Transformer Decoder
- Transformer Summarizer
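The dot-product and causal attention topics above can be sketched in a few lines of NumPy; this is an illustrative toy, not the course's own implementation:

```python
import numpy as np

def dot_product_attention(q, k, v, causal=False):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                # (L_q, L_k) similarity scores
    if causal:
        # Mask future positions so position i attends only to positions <= i.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ v

# Toy example: self-attention over 4 positions of dimension 8,
# with the causal mask a decoder would use.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = dot_product_attention(x, x, x, causal=True)
print(out.shape)   # (4, 8)
```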
3. Question Answering
- Transfer Learning in NLP
- ELMo, GPT, BERT, T5
- Bidirectional Encoder Representations from Transformers (BERT)
- BERT Objective
- Fine-tuning BERT
- Transformer: T5
- Multi-Task Training Strategy
- GLUE Benchmark
- Question Answering
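As a dependency-free illustration of the "BERT Objective" topic, here is a toy masked-language-model masking step. It is deliberately simplified (real BERT replaces 80% of the chosen tokens with [MASK], 10% with random tokens, and leaves 10% unchanged); the function name and rates here are hypothetical:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]"):
    """Toy masked-LM objective: hide ~15% of tokens; the model is
    trained to predict the original token at each masked position."""
    masked, labels = [], []
    for tok in tokens:
        if random.random() < mask_rate:
            masked.append(mask_token)
            labels.append(tok)        # target the model must recover
        else:
            masked.append(tok)
            labels.append(None)       # no loss on unmasked positions
    return masked, labels

random.seed(3)
print(mask_tokens("the cat sat on the mat".split()))
```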
4. Chatbot
- Tasks with Long Sequences
- Transformer Complexity
- LSH Attention
- Motivation for Reversible Layers: Memory!
- Reversible Residual Layers
- Reformer
- Andrew Ng with Quoc Le
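A rough sketch of the LSH idea behind the Reformer's attention: hash each position with random hyperplanes so that similar vectors tend to land in the same bucket, then attend only within buckets. This sign-based angular LSH is a simplification of the rotation-based scheme in the Reformer paper:

```python
import numpy as np

def lsh_buckets(vectors, n_hashes=4, seed=0):
    """Angular LSH in the spirit of the Reformer: project each vector onto
    random hyperplanes and pack the resulting sign bits into a bucket id.
    Nearby vectors tend to share a bucket, so attention can be restricted
    to bucket-mates instead of computing all O(L^2) pairs."""
    rng = np.random.default_rng(seed)
    planes = rng.normal(size=(vectors.shape[-1], n_hashes))
    signs = (vectors @ planes) > 0                          # (L, n_hashes) sign bits
    return signs.astype(int) @ (1 << np.arange(n_hashes))   # pack bits into an int id

rng = np.random.default_rng(1)
x = rng.normal(size=(8, 16))    # 8 sequence positions, model dimension 16
print(lsh_buckets(x))           # one bucket id per position
```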