This repository contains the lecture materials for EECS 545, a graduate course in Machine Learning, at the University of Michigan, Ann Arbor.
The link above gives a list of all available lecture materials, including links to IPython notebooks (via Jupyter's nbviewer), the slideshow view, and PDFs.
We will make references to the following textbooks throughout the course. The only required textbook is Bishop, PRML, but the others are very well-written and offer unique perspectives.
- Bishop 2006, Pattern Recognition and Machine Learning
- Murphy 2012, Machine Learning: A Probabilistic Perspective
Wednesday, January 6, 2016
No required reading.
Monday, January 11, 2016
- There are lots of places to look online for linear algebra help!
- Juan Klopper has a nice online review, based on Jupyter notebooks.
Wednesday, January 13, 2016 (Notebook Viewer, PDF File, Slide Viewer)
Required:
- Bishop, §1.2: Probability Theory
- Bishop, §2.1-2.3: Binary, Multinomial, and Normal Random Variables
Optional:
- Murphy, Chapter 2: Probability
Wednesday, January 20, 2016 (Notebook Viewer, PDF File, Slide Viewer)
Required:
- Bishop, §1.1: Polynomial Curve Fitting Example
- Bishop, §3.1: Linear Basis Function Models
Optional:
- Murphy, Chapter 7: Linear Regression
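As a quick companion to the polynomial curve fitting example in Bishop §1.1, here is a minimal NumPy sketch of least-squares regression with a polynomial basis (the data, noise level, and degree are made up for illustration):

```python
import numpy as np

# Toy version of Bishop's curve-fitting example: noisy samples of sin(2*pi*x),
# fit by least squares on a polynomial basis-expanded design matrix.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 20)
t = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(20)

degree = 3
Phi = np.vander(x, degree + 1, increasing=True)  # basis functions phi_j(x) = x**j

# Maximum-likelihood weights: solve the normal equations Phi^T Phi w = Phi^T t.
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)

pred = Phi @ w
rmse = np.sqrt(np.mean((pred - t) ** 2))
```

Increasing `degree` toward the number of data points reproduces the overfitting behavior discussed in §1.1.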
Monday, January 25, 2016 (Notebook Viewer, PDF File, Slide Viewer)
Required:
- Bishop, §3.2: The Bias-Variance Decomposition
- Bishop, §3.3: Bayesian Linear Regression
Optional:
- Murphy, Chapter 7: Linear Regression
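The posterior update for Bayesian linear regression (Bishop §3.3) is short enough to sketch directly; the prior precision `alpha`, noise precision `beta`, and toy data below are assumptions for the example:

```python
import numpy as np

# Bishop eq. 3.53-3.54: for prior N(w | 0, alpha^-1 I) and noise precision beta,
#   S_N^-1 = alpha*I + beta*Phi^T Phi,   m_N = beta * S_N Phi^T t
rng = np.random.default_rng(1)
alpha, beta = 2.0, 25.0
x = rng.uniform(-1, 1, 30)
t = -0.3 + 0.5 * x + rng.normal(0, 0.2, 30)   # true weights (-0.3, 0.5)

Phi = np.column_stack([np.ones_like(x), x])   # bias + identity basis
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t                  # posterior mean over weights
```

With 30 observations the posterior mean `m_N` lands close to the generating weights, and `S_N` quantifies the remaining uncertainty.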
Wednesday, January 27, 2016 (Notebook Viewer, PDF File, Slide Viewer)
Required:
- Bishop, §4.2: Probabilistic Generative Models
- Bishop, §4.3: Probabilistic Discriminative Models
Optional:
- Murphy, Chapter 8: Logistic Regression
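As a small illustration of the discriminative approach in Bishop §4.3, here is logistic regression trained by batch gradient descent on synthetic data (the clusters, step size, and iteration count are arbitrary choices for the sketch):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Two well-separated Gaussian blobs in 2-D.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]
Phi = np.column_stack([np.ones(100), X])      # prepend a bias feature

w = np.zeros(3)
for _ in range(500):
    grad = Phi.T @ (sigmoid(Phi @ w) - y)     # gradient of the negative log-likelihood
    w -= 0.01 * grad

acc = np.mean((sigmoid(Phi @ w) > 0.5) == (y == 1))
```

Bishop's treatment uses iterative reweighted least squares (Newton's method); plain gradient descent is used here only to keep the sketch short.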
Monday, February 1, 2016 (Notebook Viewer, PDF File, Slide Viewer)
Required:
- Bishop, §4.1: Discriminant Functions
Recommended:
- Murphy §3.5: Naive Bayes Classifiers
- Murphy §4.1: Gaussian Models
- Murphy §4.2: Gaussian Discriminant Analysis
Optional:
- CS 229: Notes on Generative Models
- Paper: Zhang, H., 2004. "The optimality of naive Bayes". AA, 1(2), p.3.
- Paper: Domingos, P. and Pazzani, M., 1997. "On the optimality of the simple Bayesian classifier under zero-one loss". Machine learning, 29(2-3), pp.103-130.
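To make the generative-model readings concrete, here is a hedged sketch of a Gaussian naive Bayes classifier in the spirit of Murphy §3.5 and §4.2: independent per-feature Gaussians per class, with a uniform class prior (data and priors are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(3)
X0 = rng.normal(0, 1, (100, 2))
X1 = rng.normal(3, 1, (100, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(100), np.ones(100)]

# Fit: per-class, per-feature mean and variance (the "naive" independence assumption).
mu = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
var = np.array([X[y == c].var(axis=0) for c in (0, 1)])

def log_lik(x):
    # log p(x | c) summed over independent features; uniform prior adds a constant.
    return np.array([
        -0.5 * np.sum(np.log(2 * np.pi * var[c]) + (x - mu[c]) ** 2 / var[c])
        for c in (0, 1)
    ])

pred = np.array([np.argmax(log_lik(x)) for x in X])
acc = np.mean(pred == y)
```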
Monday, February 8, 2016
Required:
- Bishop, §6.1: Dual Representation
- Bishop, §6.2: Constructing Kernels
- Bishop, §6.3: Radial Basis Function Networks
Optional:
- Murphy, §14.2: Kernel Functions
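A useful sanity check when constructing kernels (Bishop §6.2) is that a valid kernel's Gram matrix must be symmetric positive semidefinite. A minimal sketch with the RBF kernel (bandwidth chosen arbitrarily):

```python
import numpy as np

# Gram matrix for k(x, x') = exp(-||x - x'||^2 / (2*sigma^2)).
rng = np.random.default_rng(4)
X = rng.standard_normal((10, 3))
sigma = 1.0

sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists / (2 * sigma ** 2))

# Mercer's condition: all eigenvalues of K should be >= 0 (up to round-off).
eigvals = np.linalg.eigvalsh(K)
```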
Wednesday, February 10, 2016
Required:
- Bishop, §6.1: Dual Representation
- Bishop, §6.3: Radial Basis Function Networks
Monday, February 15, 2016
Required:
- Bishop, §7.1: Maximum Margin Classifiers
- Bishop, §2.3.0-2.3.1: Gaussian Distributions
Optional:
- CS 229: Support Vector Machines
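Bishop §7.1 develops the maximum-margin classifier through its dual QP; as a rough intuition-builder only, here is the primal soft-margin objective minimized by subgradient descent on toy data (regularization weight, step size, and data are all made-up choices, and this is a simplification of the dual treatment in the reading):

```python
import numpy as np

# Minimize (lam/2)||w||^2 + (1/n) * sum max(0, 1 - y_i (w.x_i + b)).
rng = np.random.default_rng(5)
X = np.vstack([rng.normal(-2, 0.5, (40, 2)), rng.normal(2, 0.5, (40, 2))])
y = np.r_[-np.ones(40), np.ones(40)]

w = np.zeros(2)
b = 0.0
lam = 0.01
for _ in range(200):
    margins = y * (X @ w + b)
    active = margins < 1                      # points inside or violating the margin
    # Subgradients of the objective with respect to w and b.
    w -= 0.1 * (lam * w - (y[active, None] * X[active]).sum(axis=0) / len(X))
    b -= 0.1 * (-(y[active]).sum() / len(X))

acc = np.mean(np.sign(X @ w + b) == y)
```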
Wednesday, February 17, 2016
Required:
- Bishop, §3.3: Bayesian Linear Regression
- Bishop, §6.4: Gaussian Processes
Recommended:
- Murphy, §7.6.1-7.6.2: Bayesian Linear Regression
- Murphy, §4.3: Inference in Jointly Gaussian Distributions
Further Reading:
- Rasmussen & Williams, Gaussian Processes for Machine Learning. (available free online)
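The GP regression equations in Bishop §6.4 (and Rasmussen & Williams, Chapter 2) reduce to a few lines of linear algebra. A hedged sketch, where the RBF length scale and noise level are assumptions rather than learned hyperparameters:

```python
import numpy as np

def rbf(A, B, ell=0.2):
    # k(x, x') = exp(-||x - x'||^2 / (2*ell^2)) between rows of A and B.
    d2 = np.sum((A[:, None] - B[None, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * ell ** 2))

rng = np.random.default_rng(6)
X = rng.uniform(0, 1, (15, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * rng.standard_normal(15)

noise = 0.05 ** 2
K = rbf(X, X) + noise * np.eye(15)            # training covariance + noise
Xs = np.linspace(0, 1, 50)[:, None]
Ks = rbf(Xs, X)                               # test-vs-train covariance

mean = Ks @ np.linalg.solve(K, y)                   # GP posterior mean
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)   # GP posterior covariance
```

In practice one uses a Cholesky factorization of `K` instead of repeated solves, and learns `ell` and `noise` by maximizing the marginal likelihood.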
Monday, February 22, 2016
No required reading.
Monday, March 7, 2016
Required:
- Bishop, §1.6: Information Theory
- Bishop, §2.4: The Exponential Family
Recommended:
- Murphy, §2.8: Information Theory
- Murphy, §9.2: Exponential Families
Further Reading:
- David Blei, Notes on Exponential Families, 2011.
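Two quantities from Bishop §1.6 are easy to verify numerically: entropy is maximized by the uniform distribution, and KL divergence is nonnegative with equality iff the distributions match. A small sketch (in nats):

```python
import numpy as np

def entropy(p):
    # H(p) = -sum p log p, with the convention 0 log 0 = 0.
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p, where=p > 0, out=np.zeros_like(p)))

def kl(p, q):
    # KL(p || q) = sum p log(p / q), again skipping zero-probability entries.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q, where=p > 0, out=np.zeros_like(p)))

uniform = np.array([0.25, 0.25, 0.25, 0.25])
peaked = np.array([0.7, 0.1, 0.1, 0.1])

h_u = entropy(uniform)      # equals log(4), the maximum over 4 outcomes
d = kl(peaked, uniform)     # strictly positive since the distributions differ
```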
Wednesday, March 9, 2016
Required:
- Bishop, §8.1: Bayesian Networks
- Bishop, §8.2: Conditional Independence
- Bishop, §8.3: Markov Random Fields
Recommended:
- Murphy, §10.1: Directed Graphical Models
- Murphy, §10.2: Examples of Directed Graphical Models
Monday, March 14, 2016
Required:
- Bishop, §8.2: Conditional Independence
- Bishop, §9.1: K-Means Clustering
Recommended:
- Murphy, §10.5: Conditional Independence Properties
- Murphy, §11.1: Latent Variable Models
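The K-means algorithm of Bishop §9.1 alternates assignment and re-estimation steps, each of which can only decrease the within-cluster sum of squares. A minimal sketch of Lloyd's algorithm (data, K, and initialization are made up for the example):

```python
import numpy as np

def inertia(X, centers):
    # Within-cluster sum of squared distances to the nearest center.
    return np.min(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1).sum()

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in (-2, 0, 2)])
K = 3
centers = X[rng.choice(len(X), K, replace=False)]   # random data points as init

start = inertia(X, centers)
for _ in range(20):
    # Assignment step: each point joins its nearest center.
    labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    # Update step: each center moves to the mean of its points
    # (keeping the old center if a cluster happens to empty out).
    centers = np.array([
        X[labels == k].mean(axis=0) if (labels == k).any() else centers[k]
        for k in range(K)
    ])
end = inertia(X, centers)
```

The monotone decrease of the objective (`end <= start`) is the property Bishop uses to argue convergence.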
Wednesday, March 16, 2016
Required:
- Lecture Notes, "Expectation Maximization" (see Lecture 16 folder)
- Bishop, §9.2: Mixtures of Gaussians
- Bishop, §9.3: An Alternative View of EM
- Bishop, §9.4: The EM Algorithm in General
Recommended:
- Murphy, §10.3: Inference in Bayesian Networks
- Murphy, §10.4: Learning in Bayesian Networks
- Murphy, §11.2: Mixture Models
- Murphy, §11.3: Parameter Estimation for Mixture Models
- Murphy, §11.4: The Expectation Maximization Algorithm
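The E and M steps for a Gaussian mixture (Bishop §9.2-9.4) are compact enough to write out. A sketch for a two-component 1-D mixture, where the initial parameter guesses are arbitrary and EM refines them:

```python
import numpy as np

rng = np.random.default_rng(8)
x = np.r_[rng.normal(-3, 1, 200), rng.normal(3, 1, 200)]

pi = np.array([0.5, 0.5])         # mixing weights (initial guesses)
mu = np.array([-1.0, 1.0])        # component means
var = np.array([1.0, 1.0])        # component variances
for _ in range(50):
    # E step: responsibilities gamma_nk = pi_k N(x_n | mu_k, var_k) / sum_j (...)
    dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    gamma = dens / dens.sum(axis=1, keepdims=True)
    # M step: re-estimate parameters from the expected assignments.
    Nk = gamma.sum(axis=0)
    pi = Nk / len(x)
    mu = (gamma * x[:, None]).sum(axis=0) / Nk
    var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
```

After a few iterations the means converge near the generating values of -3 and 3.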
Lecture 17: Markov & Hidden Markov Models
Monday, March 21, 2016
Required:
- Bishop, §13.1: Markov Models
- Bishop, §13.2: Hidden Markov Models
Recommended:
- Murphy, §17.2: Markov Models
- Murphy, §17.3: Hidden Markov Models
- Murphy, §17.4: Inference in HMMs
- Murphy, §17.5: Learning for HMMs
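The forward algorithm from Bishop §13.2 computes the observation likelihood of an HMM in O(T K^2) rather than summing over all K^T state paths. A sketch with made-up transition and emission tables:

```python
import numpy as np

A = np.array([[0.7, 0.3],       # transition probabilities p(z_t | z_{t-1})
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],       # emission probabilities p(x_t | z_t)
              [0.2, 0.8]])
pi0 = np.array([0.5, 0.5])      # initial state distribution

obs = [0, 0, 1, 0]              # an observed symbol sequence

# Forward recursion: alpha_t(k) = p(x_1..x_t, z_t = k).
alpha = pi0 * B[:, obs[0]]
for x in obs[1:]:
    alpha = (alpha @ A) * B[:, x]
likelihood = alpha.sum()        # p(x_1..x_T), summed over final states
```

For longer sequences the `alpha` values underflow, so implementations rescale at each step or work in log space.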
Wednesday, March 23, 2016
Required:
- Bishop, §10.1: Variational Inference
- Bishop, §11.2: Markov Chain Monte Carlo
Recommended:
- Murphy, §19.1-4: Markov Random Fields
- Murphy, §21.2: Variational Inference
- Murphy, §21.3: The Mean Field Method
- Murphy, §23.1-4: Monte Carlo Inference
- Murphy, §24.1-3: Markov Chain Monte Carlo
- Murphy, §27.3: Latent Dirichlet Allocation
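A minimal MCMC example in the spirit of Bishop §11.2: random-walk Metropolis sampling from a standard normal. The proposal width, chain length, and burn-in are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(9)

def log_target(x):
    return -0.5 * x ** 2          # unnormalized log-density of N(0, 1)

samples = []
x = 0.0
for _ in range(20000):
    prop = x + rng.normal(0, 1.0)                      # symmetric proposal
    # Metropolis acceptance: accept with prob min(1, p(prop) / p(x)).
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    samples.append(x)                                  # rejected moves repeat x

samples = np.array(samples[5000:])                     # discard burn-in
```

The retained samples have mean near 0 and standard deviation near 1, as the target requires; note the chain is correlated, so the effective sample size is well below 15,000.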
Monday, March 28, 2016
Required:
- Bishop, §12.1: Principal Components Analysis
- Bishop, §12.2: Probabilistic PCA
- Bishop, §12.3: Kernel PCA
- Bishop, §12.4: Nonlinear Latent Variable Models
Recommended:
- Murphy, §12.2: Principal Components Analysis
- Murphy, §12.4: PCA for Categorical Data
- Murphy, §12.6: Independent Component Analysis
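PCA (Bishop §12.1) amounts to an SVD of the centered data matrix. A sketch on synthetic data constructed to have one dominant direction of variance:

```python
import numpy as np

rng = np.random.default_rng(10)
latent = rng.normal(0, 3, 200)                        # strong 1-D signal
X = np.column_stack([latent, 0.5 * latent]) + rng.normal(0, 0.1, (200, 2))

Xc = X - X.mean(axis=0)                               # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)     # rows of Vt = principal axes
explained = s ** 2 / np.sum(s ** 2)                   # variance ratio per component
Z = Xc @ Vt[0]                                        # projection onto the first PC
```

Here nearly all the variance lies along the first principal component, mirroring the dimensionality-reduction use case in the reading.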