Description
This course extends the fundamental tools from "Machine Learning Foundations" into powerful and practical models along three directions: embedding numerous features, combining predictive features, and distilling hidden features.
Syllabus:
1. Lecture 1: Linear Support Vector Machine
- Large-Margin Separating Hyperplane
- Standard Large-Margin Problem
- Support Vector Machine
- Reasons behind Large-Margin Hyperplane
2. Lecture 2: Dual Support Vector Machine
- Motivation of Dual SVM
- Lagrange Dual SVM
- Solving Dual SVM
- Messages behind Dual SVM
3. Lecture 3: Kernel Support Vector Machine
- Kernel Trick
- Polynomial Kernel
- Gaussian Kernel
- Comparison of Kernels
4. Lecture 4: Soft-Margin Support Vector Machine
- Motivation and Primal Problem
- Dual Problem
- Messages behind Soft-Margin SVM
- Model Selection
5. Lecture 5: Kernel Logistic Regression
- Soft-Margin SVM as Regularized Model
- SVM versus Logistic Regression
- SVM for Soft Binary Classification
- Kernel Logistic Regression
6. Lecture 6: Support Vector Regression
- Kernel Ridge Regression
- Support Vector Regression Primal
- Support Vector Regression Dual
7. Lecture 7: Blending and Bagging
- Motivation of Aggregation
- Uniform Blending
- Linear and Any Blending
- Bagging (Bootstrap Aggregation)
8. Lecture 8: Adaptive Boosting
- Motivation of Boosting
- Diversity by Re-weighting
- Adaptive Boosting Algorithm
- Adaptive Boosting in Action
9. Lecture 9: Decision Tree
- Decision Tree Hypothesis
- Decision Tree Algorithm
- Decision Tree Heuristics in C&RT
- Decision Tree in Action
10. Lecture 10: Random Forest
- Random Forest Algorithm
- Out-Of-Bag Estimate
- Feature Selection
- Random Forest in Action
11. Lecture 11: Gradient Boosted Decision Tree
- Adaptive Boosted Decision Tree
- Optimization View of AdaBoost
- Gradient Boosting
- Summary of Aggregation Models
12. Lecture 12: Neural Network
- Neural Network Hypothesis
- Neural Network Learning
- Optimization and Regularization
13. Lecture 13: Deep Learning
- Deep Neural Network
- Autoencoder
- Denoising Autoencoder
- Principal Component Analysis
14. Lecture 14: Radial Basis Function Network
- RBF Network Hypothesis
- RBF Network Learning
- k-Means Algorithm
- k-Means and RBF Network in Action
15. Lecture 15: Matrix Factorization
- Linear Network Hypothesis
- Basic Matrix Factorization
- Stochastic Gradient Descent
16. Lecture 16: Finale
- Feature Exploitation Techniques
- Error Optimization Techniques
- Overfitting Elimination Techniques
- Machine Learning in Practice
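
As a small taste of the techniques above, here is a minimal sketch of kernel ridge regression (Lecture 6) with a Gaussian kernel (Lecture 3), using only NumPy. This is an illustration under assumed settings, not course material: the toy dataset, the ridge parameter `lam`, and the kernel width `gamma` are made up for the example.

```python
import numpy as np

def gaussian_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x1_i - x2_j||^2)."""
    sq_dist = (np.sum(X1 ** 2, axis=1)[:, None]
               + np.sum(X2 ** 2, axis=1)[None, :]
               - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * sq_dist)

def kernel_ridge_fit(X, y, lam=0.1, gamma=1.0):
    """Solve (lam * I + K) beta = y for the dual coefficients beta."""
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(lam * np.eye(len(X)) + K, y)

def kernel_ridge_predict(X_train, beta, X_test, gamma=1.0):
    """Predict f(x) = sum_n beta_n * K(x_n, x)."""
    return gaussian_kernel(X_test, X_train, gamma) @ beta

# Toy illustration (made-up data): fit a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)

beta = kernel_ridge_fit(X, y, lam=0.1, gamma=1.0)
pred = kernel_ridge_predict(X, beta, X, gamma=1.0)
```

Solving the regularized linear system in the dual gives a nonlinear fit without ever computing the (infinite-dimensional, for the Gaussian kernel) feature transform explicitly, which is exactly the kernel trick the course builds up in Lectures 3 and 6.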