Schedule (9am)


Throughout the course, please consider any dates more than two weeks in the future (other than the date of the midterm and the date of the final exam) as somewhat tentative.

Jump to:

[Unit 1: Regression] - [Unit 2: Classification] - [Unit 3: Neural Nets]

[Unit 4: Explainability, Trees and Ensembles] - [Unit 5: Unsupervised Learning]

Course Introduction

Concepts: supervised learning, unsupervised learning, difference between ML and AI

Date | Assigned | Do Before Class | Class Content | Optional
Thu 1/15 day01
out:
- HW0
Readings:
- Install your Python environment

Course Overview

- J. VanderPlas' Whirlwind Tour of Python


Unit 1: Regression

Concepts: over-fitting, under-fitting, cross-validation

Methods: Linear regression, k-NN regression

Evaluation: mean squared error, mean absolute error
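The unit's methods and metrics can be previewed in a few lines of plain Python. This is an illustrative sketch only, not course starter code: `knn_regress` averages the targets of the k nearest training points (1-D features for simplicity), and `mse`/`mae` are the unit's two error metrics.

```python
# Minimal k-NN regression with MSE / MAE metrics (pure Python, 1-D features).
# Illustrative sketch only -- not course starter code.

def knn_regress(x_train, y_train, x_query, k=3):
    """Predict by averaging the targets of the k nearest training points."""
    nearest = sorted(zip(x_train, y_train), key=lambda p: abs(p[0] - x_query))[:k]
    return sum(y for _, y in nearest) / k

def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

x_train = [0.0, 1.0, 2.0, 3.0, 4.0]
y_train = [0.1, 1.1, 1.9, 3.2, 3.9]   # roughly y = x with noise

preds = [knn_regress(x_train, y_train, x, k=2) for x in [0.5, 2.5]]
print(preds)
print(mse([0.5, 2.5], preds), mae([0.5, 2.5], preds))
```

Varying `k` here is a first taste of over- vs. under-fitting: `k=1` memorizes the training data, while a very large `k` predicts the same global average everywhere.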

Date | Assigned | Do Before Class | Class Content | Optional
Tue 1/20 day02  
Readings:
- Read ISLP Ch. 1
  - Focus: 'Notation and Simple Matrix Algebra'
- Read ISLP Ch. 2: Sec. 2.1 & Sec. 2.2
  - Focus: 'Parametric Methods'
  - Focus: 'Assessing Model Accuracy'
  - Focus: 'K-Nearest Neighbors'

Regression basics

Lab Notebook:
Notes:
- Alt. intro to supervised ML: SML Sec. 2.1
- More on k-NN: SML Sec. 2.2: k-NN
Thu 1/22 day03
due:
- HW0
out:
- HW1
Readings:

Linear regression

Notes:
- Derivation: SML Sec. 3.A
- Read MML Textbook Ch. 9: Sec. 9.1-9.2
  - Derivation with probabilistic perspective
Tue 1/27 day04  
Readings:
- Skim ISLP Ch. 3
  - Focus: 3.3.2 Extensions of the Linear Model
  - Esp. pages 98-99 on 'Non-linear Relationships'
- Read ISLP Ch. 5
  - Focus: 5.1.1 Validation Set Approach
  - Focus: 5.1.3 k-fold Cross-Validation

Hyperparameter selection & cross validation

Lab Notebook:
Notes:
 
Thu 1/29 day05
due:
- HW1
out:
- HW2
Readings:

Regularization

Notes:
- Read ISLP Ch. 6: Sec. 6.2.1, 6.2.2, and 6.2.3
- Video by Prof. A. Ihler (UC-Irvine): Regularization for Linear Regression


Unit 2: Classification

Concepts: feature engineering, hyperparameter selection, gradient descent

Methods: Logistic regression, k-NN classification

Evaluation: ROC curves, confusion matrices, cross entropy
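Two of this unit's evaluation tools fit in a short pure-Python sketch (illustrative only, not course starter code): a binary confusion matrix, and the mean binary cross entropy of predicted probabilities.

```python
# Binary-classification evaluation: confusion matrix and cross entropy.
# Illustrative sketch only -- not course starter code.
import math

def confusion_matrix(y_true, y_pred):
    """Return counts (tp, fp, fn, tn) for hard labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def cross_entropy(y_true, probs, eps=1e-12):
    """Mean binary cross entropy of predicted probabilities P(y=1)."""
    total = 0.0
    for t, p in zip(y_true, probs):
        p = min(max(p, eps), 1.0 - eps)   # clip to avoid log(0)
        total += -(t * math.log(p) + (1 - t) * math.log(1.0 - p))
    return total / len(y_true)

y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 1]
print(confusion_matrix(y_true, y_pred))   # -> (2, 1, 1, 1)
print(cross_entropy([1, 0], [0.9, 0.2]))
```

Sweeping the decision threshold applied to the probabilities (rather than fixing it at 0.5) and recomputing the confusion matrix at each threshold is exactly how ROC curves are traced out.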

Date | Assigned | Do Before Class | Class Content | Optional
Tue 2/3 day06  
Readings:
- Focus: 'A Statistical View of the Classification Problem'
- Focus: 'The Logistic Regression Model for Binary Classification'
- Focus: Example 2.3 on k-NN classification

Classification basics

Lab Notebook:
Notes:
 
Thu 2/5 day07
due:
- HW2
out:
- HW3
Readings:
- Focus: 'The Confusion Matrix and the ROC Curve'
- Focus: 'The F1 Score and the Precision–Recall Curve'

Evaluating Classifiers

Notes:
 
Tue 2/10 day08  
Readings:
- Read SML Sec. 5.4
  - Focus: 'Gradient Descent' in Alg. 5.1

Gradient Descent

Notes:
- Read DL Textbook Sec. 4.3: Gradient Descent
Thu 2/12 day09
due:
- HW3
out:
Readings:
- Review: 'The Logistic Regression Model for Binary Classification'
- Focus: Training the Logistic Regression Model by Maximum Likelihood
- Focus: Predictions and Decision Boundaries

Logistic Regression

Notes:
 
Tue 2/17 day10  
Readings:
- Focus: 'Logistic Regression for More Than Two Classes'

Hyperparameter Search + Project A

Lab Notebook:
Notes:
 
Thu 2/19     --- no class (Tufts Monday Schedule) ---  


Unit 3: Neural Nets

Concepts: backpropagation, stochastic gradient descent

Methods: multi-layer perceptrons for regression and classification
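A one-hidden-layer perceptron is small enough to sketch in plain Python. This illustrative sketch (not course starter code) approximates gradients with finite differences for brevity; the course material instead derives exact gradients via backpropagation, which is what makes training practical at scale.

```python
# One-hidden-layer MLP (scalar input and output) trained by gradient descent.
# Gradients here are finite-difference approximations for brevity; the course
# derives the exact gradients via backpropagation. Illustrative sketch only.
import random

def forward(params, x, hidden=3):
    """ReLU MLP: x -> 3 hidden units -> scalar output."""
    w1 = params[0:hidden]; b1 = params[hidden:2 * hidden]
    w2 = params[2 * hidden:3 * hidden]; b2 = params[3 * hidden]
    h = [max(0.0, w1[j] * x + b1[j]) for j in range(hidden)]
    return sum(w2[j] * h[j] for j in range(hidden)) + b2

def loss(params, data):
    """Mean squared error over (x, y) pairs."""
    return sum((forward(params, x) - y) ** 2 for x, y in data) / len(data)

def train(data, steps=2000, lr=0.05, eps=1e-5, seed=0):
    rng = random.Random(seed)
    params = [rng.uniform(-1, 1) for _ in range(10)]  # 9 weights/biases + b2
    for _ in range(steps):
        grad = []
        for i in range(len(params)):       # finite-difference partial derivative
            bumped = list(params)
            bumped[i] += eps
            grad.append((loss(bumped, data) - loss(params, data)) / eps)
        params = [p - lr * g for p, g in zip(params, grad)]
    return params

data = [(-1.0, 1.0), (0.0, 0.0), (1.0, 1.0)]   # y = |x|, not linear
params = train(data)
print(loss(params, data))
```

The target `y = |x|` is deliberately non-linear: a linear model cannot fit it, but the hidden ReLU layer can.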

Date | Assigned | Do Before Class | Class Content | Optional
Tue 2/24 day11  
Readings:
- Focus: 'Two-Layer Neural Network'
- Focus: 'Deep Neural Network'
- Focus: 'Neural Networks for Classification'

Neural Net basics

Slides:
Math/Concept Exercises:
Notes:
 
Thu 2/26 day12
out:
- HW4
Readings:
- Focus: 'Backpropagation'
- Focus: 'Algorithm 6.1 and Example 6.2'
- Focus: 'Stochastic Gradient Descent'
- Skim: 'Learning Rate and Convergence for Stochastic Gradient Descent'
- Skim: 'Adaptive Methods'

Training Neural Nets

Notes:
 
Tue 3/3 day13  
Readings:
- Focus: 'Convolutional Layer'
- Focus: 'Pooling Layer'

Specialized Architectures

 
Thu 3/5 day14
due:
- HW4
due Friday:
Readings:
- Read SML Ch. 12: Ethics in ML
  - Focus: Sec. 12.1 'Fairness and Error Functions'
  - Focus: Sec. 12.2 'Misleading Claims about Performance'

Fairness

Slides:


Midterm Exam

Date | Assigned | Do Before Class | Class Content | Optional
Tue 3/10 day15  
Readings:
- n/a
Midterm Review  
Thu 3/12 day16
out:
- HW5
  MIDTERM EXAM  
Tue 3/17     --- no class (Spring Break!) ---  
Thu 3/19     --- no class (Spring Break!) ---  


Unit 4: Explainability, Trees and Ensembles

Concepts: greedy training, bagging, boosting

Methods: decision trees, random forests, XGBoost
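The "greedy training" concept above can be shown with a decision stump, the one-split building block that tree learners apply recursively. This is an illustrative pure-Python sketch, not course starter code: it scans every candidate threshold and keeps the one minimizing weighted Gini impurity.

```python
# Greedy decision stump: pick the single threshold split minimizing weighted
# Gini impurity -- the greedy step a decision tree repeats at every node.
# Illustrative sketch only -- not course starter code.

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)          # fraction of class 1
    return 2.0 * p * (1.0 - p)

def majority(labels):
    """Majority label (ties and empty lists default toward 0/1 convention)."""
    return int(sum(labels) * 2 >= len(labels)) if labels else 0

def best_stump(x, y):
    """Return (threshold, left_label, right_label) minimizing weighted Gini."""
    best = None
    for t in sorted(set(x)):               # every observed value is a candidate
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if best is None or score < best[0]:
            best = (score, t, majority(left), majority(right))
    _, t, left_label, right_label = best
    return t, left_label, right_label

x = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
y = [0, 0, 0, 1, 1, 1]
print(best_stump(x, y))   # -> (3.0, 0, 1): x <= 3.0 predicted 0, rest 1
```

Bagging and boosting, covered later in the unit, both build ensembles out of many such greedily trained trees: bagging averages trees fit to bootstrap resamples, while boosting fits each new tree to the current ensemble's mistakes.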

Date | Assigned | Do Before Class | Class Content | Optional
Tue 3/24 day17  
Readings:
- n/a

Explainability

Notes:
 
Thu 3/26 day18
due:
- HW5
out:
- HW6
Readings:
- n/a

Automatic Differentiation

Lab Notebook:
Notes:
- Focus on slides 1-19
Tue 3/31 day19
out:
Readings:
- Focus: Example 2.5
- Focus: Example 2.6

Decision Trees

Notes:
 
Thu 4/2 day20
due:
- HW6
out:
- HW7
Readings:
- Read SML Sec. 7.1 on Bagging
- Read SML Sec. 7.2 on Random Forest
- Skim SML Sec. 7.4 on Boosting

Ensembles

Notes:
 


Unit 5: Unsupervised Learning

Concepts: recommendation systems, dimensionality reduction, clustering

Methods: principal components analysis, collaborative filtering models, autoencoders, k-means
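Of the methods above, k-means clustering is the easiest to sketch in plain Python. This illustrative sketch (not course starter code) is Lloyd's algorithm on 2-D points: alternate assigning each point to its nearest center with recomputing each center as the mean of its assigned points.

```python
# Plain k-means (Lloyd's algorithm) on 2-D points.
# Illustrative sketch only -- not course starter code.

def kmeans(points, centers, iters=20):
    """Alternate assignment and mean-update steps from given initial centers."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:                   # assign each point to nearest center
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
            clusters[d.index(min(d))].append(p)
        centers = [                        # move each center to its cluster mean
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else c                   # keep a center with no points as-is
            for cl, c in zip(clusters, centers)
        ]
    return centers

points = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1),
          (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
print(kmeans(points, centers=[(0.0, 0.0), (1.0, 1.0)]))
```

In practice the result depends on the initial centers, which is why library implementations rerun from several random initializations and keep the best.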

Date | Assigned | Do Before Class | Class Content | Optional
Tue 4/7 day21  
Readings:

Recommender Systems

Notes:
Thu 4/9 day22
due:
- HW7
out:
- HW8
Readings:
- Skim: 'Auto-encoders'
- Focus: 'Principal Component Analysis' in SML Sec. 10.4.2

Principal Component Analysis (PCA)

Slides:
- PCA
Lab Notebook:
Notes:
 
Tue 4/14 day23  
Readings:
- n/a

Image Data and Autoencoders

 
Thu 4/16 day24
due:
- HW8
out:
- HW9
Readings:
- n/a

Clustering

Slides:
 


Wrap Up

Date | Assigned | Do Before Class | Class Content | Optional
Tue 4/21 day25  
Readings:
Project B work-day  
Thu 4/23 day26
due:
- HW9
Readings:
- n/a
Final Exam Review  
Tue 4/28
due:
  --- reading period ---  
Wed 5/6 (12:00-1:30pm)     FINAL EXAM