Description
In this course, you will learn to:
- Create word embeddings, then use them to train a neural network that performs sentiment analysis on tweets.
- Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model.
- Train a recurrent neural network to perform named entity recognition (NER), using LSTMs with linear layers to extract important information from text.
- Use a Siamese network to compare questions in a text and identify duplicates: questions that are worded differently but have the same meaning.
Syllabus:
1. Neural Networks for Sentiment Analysis
- Trax: Neural Networks
- Why we recommend Trax
- Trax: Layers
- Dense and ReLU Layers
- Serial Layer
- Other Layers
- Training
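To make the Dense, ReLU, and Serial ideas above concrete, here is a minimal NumPy sketch of how a serial combinator chains layers into one model. This is an illustration of the concept only, not the Trax API; all names (`dense`, `relu`, `serial`) are hypothetical.

```python
import numpy as np

def dense(w, b):
    # Fully connected layer: x -> x @ w + b
    return lambda x: x @ w + b

def relu(x):
    # Element-wise rectified linear unit
    return np.maximum(x, 0)

def serial(*layers):
    # Compose layers left to right, like a Serial combinator
    def forward(x):
        for layer in layers:
            x = layer(x)
        return x
    return forward

# Toy two-layer model mapping 4 features -> 3 hidden units -> 2 outputs
rng = np.random.default_rng(0)
model = serial(dense(rng.normal(size=(4, 3)), np.zeros(3)),
               relu,
               dense(rng.normal(size=(3, 2)), np.zeros(2)))
out = model(np.ones((1, 4)))
print(out.shape)  # (1, 2)
```

In Trax the same pattern appears as a `Serial` layer wrapping `Dense` and `Relu` layers, with training handled by the library rather than by hand.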
2. Recurrent Neural Networks for Language Modeling
- Traditional Language models
- Recurrent Neural Networks
- Applications of RNNs
- Math in Simple RNNs
- Cost Function for RNNs
- Implementation Note
- Gated Recurrent Units
- Deep and Bi-directional RNNs
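The "Math in Simple RNNs" topic boils down to one recurrence, which can be sketched in NumPy. This is a generic vanilla-RNN step under standard notation (hypothetical variable names, not course-provided code): the hidden state is updated as h_t = tanh(W_hh h_{t-1} + W_hx x_t + b_h).

```python
import numpy as np

def rnn_step(h_prev, x_t, W_hh, W_hx, b_h):
    # One vanilla RNN step: h_t = tanh(W_hh @ h_{t-1} + W_hx @ x_t + b_h)
    return np.tanh(W_hh @ h_prev + W_hx @ x_t + b_h)

rng = np.random.default_rng(1)
hidden_size, emb_size = 5, 3
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden-to-hidden weights
W_hx = rng.normal(size=(hidden_size, emb_size))     # input-to-hidden weights
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)
for x_t in rng.normal(size=(4, emb_size)):  # a sequence of 4 embeddings
    h = rnn_step(h, x_t, W_hh, W_hx, b_h)
print(h.shape)  # (5,)
```

GRUs extend this step with update and reset gates so the network can keep or discard information over long sequences.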
3. LSTMs and Named Entity Recognition
- RNNs and Vanishing Gradients
- Introduction to LSTMs
- LSTM Architecture
- Introduction to Named Entity Recognition
- Training NERs: Data Processing
- Computing Accuracy
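One subtlety in "Computing Accuracy" for NER is that padded positions must be excluded, or the score is inflated by trivially correct pad predictions. A minimal sketch, assuming tag id 0 is the padding token (a hypothetical convention for this example):

```python
import numpy as np

PAD = 0  # hypothetical padding tag id

def masked_accuracy(predictions, labels, pad=PAD):
    # Token-level accuracy that ignores padded positions
    mask = labels != pad
    return (predictions[mask] == labels[mask]).mean()

preds  = np.array([1, 2, 2, 0, 0])
labels = np.array([1, 2, 3, 0, 0])
print(masked_accuracy(preds, labels))  # 2 correct out of 3 real tokens
```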
4. Siamese Networks
- Siamese Networks
- Architecture
- Cost Function
- Triplets
- Computing the Cost
- One Shot Learning
- Training / Testing
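The triplet cost listed above can be sketched with cosine similarity: the anchor should be more similar to the positive (a duplicate question) than to the negative (a different question) by some margin. This is a generic triplet-loss illustration, not the exact course formulation; the margin value here is an arbitrary choice.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.25):
    # max(0, sim(anchor, negative) - sim(anchor, positive) + margin)
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    return max(0.0, cos(anchor, negative) - cos(anchor, positive) + margin)

a = np.array([1.0, 0.0])   # anchor embedding
p = np.array([0.9, 0.1])   # positive: nearly parallel to the anchor
n = np.array([0.0, 1.0])   # negative: orthogonal to the anchor
print(triplet_loss(a, p, n))  # 0.0: the positive is already far closer
```

Driving this loss to zero for many triplets is what lets the trained network flag duplicate questions it has never seen, which is also the basis of one-shot learning.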