Description
This is an advanced course, the third in a series on Bayesian statistics at UC Santa Cruz, following Herbie Lee's "Bayesian Statistics: From Concept to Data Analysis" and Matthew Heiner's "Bayesian Statistics: Techniques and Models." To succeed in the course, you should be comfortable with calculus-based probability, the principles of maximum-likelihood estimation, and Bayesian estimation.
Syllabus:
1. Basic concepts of Mixture Models
- Welcome to Bayesian Statistics: Mixture Models
- Installing and using R
- Basic definitions
- Mixtures of Gaussians
- Zero-inflated mixtures
- Hierarchical representations
- Sampling from a mixture model (see the sketch after this module's list)
- The likelihood function
- Parameter identifiability
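
To give a concrete feel for the sampling topic above, here is a minimal R sketch (the course uses R). It is not the course's own code, and all parameter values are assumptions made for illustration; it draws from a two-component location mixture of Gaussians via the hierarchical representation covered in this module.

```r
# Minimal sketch, not course code: sample from a two-component location
# mixture of Gaussians using its hierarchical representation.
# All parameter values below are assumed for illustration.
set.seed(42)
n     <- 500
w     <- c(0.6, 0.4)   # mixture weights
mu    <- c(0, 4)       # component means
sigma <- 1             # common standard deviation

# Step 1: draw the latent component indicator for each observation
z <- sample(1:2, n, replace = TRUE, prob = w)
# Step 2: draw each observation from its assigned component
x <- rnorm(n, mean = mu[z], sd = sigma)
hist(x, breaks = 30, main = "Two-component Gaussian mixture")
```

Sampling the indicator first and the observation second is exactly the hierarchical representation: marginalizing out z recovers the weighted-sum-of-densities form of the mixture.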
2. Maximum likelihood estimation for Mixture Models
- EM for general mixtures
- EM for location mixtures of Gaussians (see the sketch below)
- EM example 1
- EM example 2
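
As a hedged preview of the EM material, the following R sketch (not course code) fits a two-component location mixture of Gaussians, assuming a known common variance of 1 so that only the weights and means are updated; `x` is assumed to be a numeric data vector such as the one simulated in the Module 1 sketch.

```r
# Minimal sketch, not course code: EM for a two-component location mixture
# of Gaussians with known common variance 1; only weights and means move.
em_location_mixture <- function(x, n_iter = 100) {
  w  <- c(0.5, 0.5)                             # initial weights
  mu <- as.numeric(quantile(x, c(0.25, 0.75)))  # initial means
  for (it in 1:n_iter) {
    # E-step: responsibility of each component for each observation
    d <- cbind(w[1] * dnorm(x, mu[1]), w[2] * dnorm(x, mu[2]))
    v <- d / rowSums(d)
    # M-step: re-estimate weights and means from the responsibilities
    w  <- colMeans(v)
    mu <- colSums(v * x) / colSums(v)
  }
  list(weights = w, means = mu)
}

# Usage on the simulated data from the Module 1 sketch:
# em_location_mixture(x)
```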
3. Bayesian estimation for Mixture Models
- Markov Chain Monte Carlo algorithms, part 1
- Markov Chain Monte Carlo algorithms, part 2
- MCMC for location mixtures of normals, part 1 (see the sketch below)
- MCMC for location mixtures of normals, part 2
- MCMC example 1
- MCMC example 2
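
The MCMC lectures can be previewed with a toy Gibbs sampler. The sketch below, again not the course's own code, targets a two-component location mixture of normals with known unit variance; the Beta(1, 1) prior on the weight and the N(0, 10^2) priors on the means are assumptions chosen for illustration.

```r
# Minimal sketch, not course code: Gibbs sampler for a two-component
# location mixture of normals with known unit variance.
# Assumed priors: w ~ Beta(1, 1), mu_k ~ N(0, 10^2).
gibbs_mixture <- function(x, n_iter = 1000) {
  n  <- length(x)
  mu <- as.numeric(quantile(x, c(0.25, 0.75)))  # initial means
  w  <- 0.5                                     # initial weight of component 1
  draws <- matrix(NA, n_iter, 3,
                  dimnames = list(NULL, c("w", "mu1", "mu2")))
  for (it in 1:n_iter) {
    # Indicators: full conditional proportional to w_k * N(x | mu_k, 1)
    p1 <- w * dnorm(x, mu[1])
    p2 <- (1 - w) * dnorm(x, mu[2])
    z  <- ifelse(runif(n) < p1 / (p1 + p2), 1, 2)
    # Weight: Beta full conditional under the Beta(1, 1) prior
    n1 <- sum(z == 1)
    w  <- rbeta(1, 1 + n1, 1 + n - n1)
    # Means: normal full conditionals under the N(0, 10^2) priors
    for (k in 1:2) {
      prec  <- sum(z == k) + 1 / 100            # posterior precision
      mu[k] <- rnorm(1, sum(x[z == k]) / prec, sqrt(1 / prec))
    }
    draws[it, ] <- c(w, mu)   # note: watch for label switching across draws
  }
  draws
}
```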
4. Applications of Mixture Models
- Density estimation using Mixture Models (see the sketch below)
- Density estimation example
- Mixture Models for clustering
- Clustering example
- Mixture Models and naive Bayes classifiers
- Linear and quadratic discriminant analysis in the context of Mixture Models
- Classification example
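
This module turns a fitted mixture into tools. As an illustration with assumed parameter values (not course code), the same two-component Gaussian fit yields a density estimate, by summing the weighted component densities, and a clustering, by assigning each observation to its most probable component.

```r
# Minimal sketch, not course code: density estimation and clustering from a
# fitted two-component Gaussian mixture. Parameter values are assumed, e.g.
# as returned by the EM sketch in Module 2.
w  <- c(0.6, 0.4); mu <- c(0, 4); sigma <- 1
x  <- c(rnorm(300, 0), rnorm(200, 4))        # illustrative data

# Density estimation: weighted sum of component densities over a grid
grid <- seq(min(x), max(x), length.out = 200)
dens <- w[1] * dnorm(grid, mu[1], sigma) + w[2] * dnorm(grid, mu[2], sigma)
plot(grid, dens, type = "l", main = "Mixture density estimate")

# Clustering: assign each observation to its most probable component (MAP)
resp    <- cbind(w[1] * dnorm(x, mu[1], sigma), w[2] * dnorm(x, mu[2], sigma))
cluster <- max.col(resp)                     # component label, 1 or 2
table(cluster)
```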
5. Practical considerations
- Numerical stability (see the sketch below)
- Computational issues associated with multimodality
- Bayesian Information Criterion (BIC)
- Bayesian Information Criterion example
- Estimating the number of components in Bayesian settings
- Estimating the full partition structure in Bayesian settings
- Example: Bayesian inference for the partition structure
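
Two of the practical points above can be illustrated together in one sketch (not course code): the log-sum-exp trick keeps the mixture log-likelihood numerically stable when component densities underflow, and the BIC built from that log-likelihood can be compared across models with different numbers of components. The sketch assumes K Gaussian components with a known common standard deviation and uses the -2 log L + p log n convention, under which smaller BIC is better.

```r
# Minimal sketch, not course code: numerically stable mixture log-likelihood
# via log-sum-exp, and the BIC derived from it. Arguments w, mu, sigma are
# assumed to come from a fitted mixture with a common standard deviation.
log_lik_mixture <- function(x, w, mu, sigma) {
  # n x K matrix of log(weight) + log component density
  lp <- sapply(seq_along(w),
               function(k) log(w[k]) + dnorm(x, mu[k], sigma, log = TRUE))
  # log-sum-exp across components: subtract each row's max before exp()
  m <- apply(lp, 1, max)
  sum(m + log(rowSums(exp(lp - m))))
}

# p counts the free parameters: (K - 1) weights, K means, 1 common sd
bic <- function(x, w, mu, sigma) {
  p <- (length(w) - 1) + length(mu) + 1
  -2 * log_lik_mixture(x, w, mu, sigma) + p * log(length(x))
}

# Usage: compare bic(x, ...) across fits with different numbers of components.
```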