Description
In this course, you will learn:
- Supervised learning techniques for regression and classification.
- Unsupervised learning techniques for data modeling and analysis.
- Probabilistic versus non-probabilistic viewpoints.
- Optimization and inference algorithms for model learning.
Syllabus:
- Maximum likelihood estimation, linear regression, least squares (see the first code sketch after this syllabus)
- Ridge regression, bias-variance, Bayes rule, maximum a posteriori inference
- Bayesian linear regression, sparsity, subset selection for linear regression
- Nearest-neighbor classification, Bayes classifiers, linear classifiers, perceptron
- Logistic regression, Laplace approximation, kernel methods, Gaussian processes
- Maximum margin, support vector machines, trees, random forests, boosting
- Clustering, k-means, EM algorithm, missing data (k-means is illustrated in the second code sketch after this syllabus)
- Mixtures of Gaussians, matrix factorization
- Non-negative matrix factorization, latent factor models, PCA and variations
- Markov models, hidden Markov models
- Continuous state-space models, association analysis
- Model selection, next steps
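
To give a concrete taste of the first weeks' topics (maximum likelihood, least squares, ridge regression), here is a minimal NumPy sketch. The synthetic data, problem dimensions, and regularization strength lam are illustrative assumptions, not part of the course materials.

```python
# Illustrative sketch: closed-form least squares and ridge regression.
# The data X, y and the regularization strength lam are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = X @ true_w + 0.1 * rng.normal(size=n)

# Least squares: w_ls = argmin_w ||y - Xw||^2, i.e. w_ls = (X^T X)^{-1} X^T y
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Ridge regression: w_ridge = argmin_w ||y - Xw||^2 + lam ||w||^2,
# i.e. w_ridge = (X^T X + lam I)^{-1} X^T y
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

print("least squares:", np.round(w_ls, 3))
print("ridge:        ", np.round(w_ridge, 3))
```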
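
Likewise, a bare-bones version of k-means (Lloyd's algorithm) from the clustering unit. The synthetic two-blob data, the choice k=2, and the iteration cap are arbitrary choices for this sketch.

```python
# Illustrative sketch: k-means clustering (Lloyd's algorithm) on synthetic 2-D data.
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct data points at random.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: each point joins its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its assigned points.
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Two well-separated blobs; k-means should recover them.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
labels, centroids = kmeans(X, k=2)
print(centroids)
```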