
Unit  Topics/Comments  Reading/Dates 
Part I  Bayesian Models and Inference: In this part we develop the basic concepts of Bayesian modelling and reasoning that are used throughout the course.  
Background Reading  This is more of an overview than an introduction, but it is well worth reading; skim through without expecting to get all the details.  Chapter 1 
Informal Assignment 
Please familiarize yourself with a system that
facilitates programming with text processing and matrix algebra.
Matlab: There are many online tutorials for Matlab, for example Matlab tutorial UBC and Matlab tutorial MIT. SciPy: NumPy and SciPy provide such functionality using Python; see the SciPy Tutorial. Write at least one 10-line program that uses at least one function and makes at least one plot. 
No submission required. This will prepare you for the next programming assignments. 
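As a rough sketch of what such a warm-up program might look like in the NumPy/SciPy route (the function, values, and output filename here are illustrative, not part of the assignment):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Density of a univariate Gaussian, evaluated elementwise."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-4, 4, 200)
plt.plot(x, gaussian_pdf(x), label="N(0, 1)")
plt.legend()
plt.savefig("gaussian.png")  # the required plot
```

This exercises the three ingredients the assignment asks for: array operations, a user-defined function, and a plot.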
Lectures 9/6, 9/11  Maximum likelihood and Bayesian estimates for Beta, Bernoulli, and univariate Gaussian distributions  Sections 2.1, 2.2, 1.2.4
see also Inferring the complete form of probability distributions 
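A minimal numeric sketch of the contrast these lectures draw, on the conjugate Beta-Bernoulli case (the data and the Beta(2, 2) prior are illustrative choices, not from the course materials):

```python
import numpy as np

# Coin flips: 1 = heads. Compare the ML estimate of the heads
# probability with the posterior mean under a Beta(a, b) prior.
flips = np.array([1, 1, 1, 0, 1])
heads, n = flips.sum(), flips.size

mle = heads / n                              # maximum-likelihood estimate
a, b = 2.0, 2.0                              # illustrative Beta(2, 2) prior
posterior_mean = (a + heads) / (a + b + n)   # Bayesian point estimate
```

With little data the prior pulls the Bayesian estimate toward 1/2; as n grows the two estimates converge.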
Assignment 1  hw1.pdf  9/20

Programming Project 1 
pp1.pdf
Data for this programming project is in this zip file 
10/2 
Part II  Bayesian Linear Regression: In this part we develop much of the mathematical machinery of the course while covering the topic of linear regression.  
Lecture 9/13, 9/18  Linear Regression  Section 3.1 
Lecture 9/18  Linear Algebra Review  Any introductory linear algebra text; appendix C 
Lectures 9/20, 9/25  Multivariate Normal Distributions  Section 2.3
see also Handy formulas for MVN distributions 
Lectures 9/25, 9/27  Bayesian Linear Regression with prior for (w,eta)  Section 3.3
see also Slide Copies 
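A small sketch of the posterior computation at the heart of Section 3.3, simplified by conditioning on a known noise precision beta rather than placing the prior on (w, eta) as the lectures do; the synthetic data and the alpha, beta values are illustrative:

```python
import numpy as np

# Bayesian linear regression: posterior over weights w with a
# zero-mean isotropic Gaussian prior of precision alpha and
# Gaussian noise of precision beta (both fixed here).
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 30)
t = 0.5 + 2.0 * x + rng.normal(0, 0.2, x.size)   # synthetic targets

Phi = np.column_stack([np.ones_like(x), x])      # design matrix [1, x]
alpha, beta = 2.0, 25.0                          # prior / noise precision
S_N_inv = alpha * np.eye(2) + beta * Phi.T @ Phi # posterior precision
m_N = beta * np.linalg.solve(S_N_inv, Phi.T @ t) # posterior mean of w
```

The posterior mean m_N is a regularized least-squares solution; alpha controls the shrinkage toward zero.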
Lecture 10/2, 10/4  Model Selection  Sections 3.4–3.5 
Assignment 2  hw2.pdf  10/11 
Programming Project 2 
pp2.pdf
Data for this programming project is in this zip file 
10/23 
10/9 University Holiday No classes  
Part III  Bayesian Models for Classification and Prediction: In this part we extend the framework to handle other prediction tasks, mostly prediction of categories (discrete values).  
Lecture 10/11  Generative models: Classification with (mostly) Linear models  Sections 4.1–4.2 
Lecture 10/16  Logistic Regression  Section 4.3 
Lecture 10/18  Exponential Family Distributions  Section 2.4 
Lecture 10/23  Generalized Linear models  Section 4.3 
Lecture 10/25  Bayesian Logistic Regression  Sections 4.4–4.5, Section 1.5 
Assignment 3  hw3.pdf  11/1 
Programming Project 3 
pp3.pdf
Data for this programming project is in this zip file 
11/13 
Part IV  Graphical Models: In this part we abstract the framework to capture random variables and (in)dependencies among them, together with generic inference algorithms (or algorithmic templates) that work across such models.  
Lectures 10/30, 11/1  Overview of Graphical Models and Sampling Methods  Chapters 8 and 11
[RN] section 14.4 
Lecture 11/6  Topic models (Gibbs Sampling) 
lecture slides
Steyvers & Griffiths (2007) Probabilistic topic models 
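The Gibbs-sampling idea used for topic models can be illustrated on a much simpler target than LDA. The toy below (not the lecture's sampler) alternately samples each coordinate of a bivariate Gaussian from its full conditional; all values are illustrative:

```python
import numpy as np

# Gibbs sampling on a toy target: a bivariate standard Gaussian with
# correlation rho. Each full conditional is itself a 1-D Gaussian:
# x | y ~ N(rho * y, 1 - rho**2), and symmetrically for y | x.
rho = 0.8
rng = np.random.default_rng(0)
samples = np.empty((20000, 2))
x, y = 0.0, 0.0
for t in range(samples.shape[0]):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    samples[t] = x, y

burned = samples[2000:]   # discard burn-in before using the chain
```

After burn-in, the empirical correlation of the chain approximates rho; in LDA the same scheme resamples one topic assignment at a time from its full conditional.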
Additional Reading  The original LDA paper develops variational inference for the model (we cover variational inference in later lectures).  Blei, Ng & Jordan (2003) Latent Dirichlet Allocation 
Lectures 11/8, 11/13  The EM Algorithm  Chapter 9
lecture slides 
Lectures 11/15, 11/20  Variational Inference  Chapter 10
lecture slides 
Additional Reading 
The PPC paper discusses both sampling and variational approximation on a model which is partly directed and partly undirected. 
Lu & Leen (2007) Penalized Probabilistic Clustering 
Information for Project 
project.pdf
Some ideas for projects 
11/8, 11/13, 11/29, and 12/15 
Part V  Kernel Methods: In this part we use kernels for learning nonlinear models. This cuts across regression, classification, and unsupervised learning, as well as probabilistic and nonprobabilistic techniques.  
Lecture 11/27  Kernel functions and Kernel Methods. Perceptron, nearest neighbors, least squares.  Sections 6.1–6.3
Chapters 2,3 of [CST] 
Lecture 11/29  Gaussian processes.  Sections 6.4.1–6.4.4
Slide Copies 
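A compact sketch of Gaussian-process regression from Section 6.4: with kernel matrix K and noise variance sigma2, the predictive mean at test inputs X* is K(X*, X)(K(X, X) + sigma2 I)^{-1} t. The RBF length-scale, data, and noise level below are illustrative:

```python
import numpy as np

def rbf(a, b, length=0.5):
    """RBF (squared-exponential) kernel matrix between 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

X = np.linspace(-3, 3, 20)          # training inputs
t = np.sin(X)                       # (noise-free) training targets
sigma2 = 1e-4                       # small noise term, also stabilizes the solve
Xs = np.array([0.0, 1.5])           # test inputs

# GP predictive mean: K(X*, X) (K(X, X) + sigma2 I)^{-1} t
mean = rbf(Xs, X) @ np.linalg.solve(rbf(X, X) + sigma2 * np.eye(X.size), t)
```

With densely spaced training points the predictive mean closely interpolates sin at the test inputs; the same weights against (K + sigma2 I) also give the predictive variance, omitted here for brevity.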
Assignment 4  hw4.pdf  12/11 