Comp 236: Computational Learning Theory
Department of Computer Science
Tufts University
Fall 2015

  • (9/19) Change in office hours for this week. Khardon's office hours on Wednesday 9/23 are canceled. Instead he will hold office hours on Monday 9/21, 2-3pm, and Thursday 9/24, 6-7pm.
  • (9/16) Assignment 1 posted.
  • (9/10) Starting week 2 Khardon's office hours will be on Wednesday 5-6pm.
  • (9/8) Class on 9/15 is canceled.
  • (9/6) Classroom location: the class is taught (at least for now) in Lane Hall 100A. If you are not sure where this is, use the online Campus Map.
  • (9/4) Khardon's office hours for the first week are Tuesday (9/8) 6-7pm, Thursday (9/10) 6-7pm.
  • (9/4) Course Information Posted.

What is this course about? Machine learning algorithms aim to "make sense of data", often developing some generalizations that are useful for future tasks, for example in classification, prediction, and control. Machine learning has matured and expanded over the last two decades and is now used extensively as a tool in many application areas. Increasingly, theory and analysis are used to guide the development of successful algorithms and systems.
Computational Learning Theory investigates such tasks and algorithms in a formal framework. The focus is on identifying performance guarantees for algorithms in terms of computational complexity (how long the algorithm takes to run), data complexity (how much data is required), and convergence properties (how well the output of the learning algorithm performs). Alternatively, when such guarantees do not exist, we might aim to show that learning is impossible by proving lower bounds on the resources required by any potential algorithm. Models vary from adversarial worst-case scenarios, to statistical settings where a random process generates the data, to interactive settings where the learner can control the flow of data.
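As one concrete instance of such a guarantee, the PAC (Probably Approximately Correct) model, the first formal model covered in the course, can be sketched as follows; the notation here is the standard one rather than taken from the course notes:

```latex
% PAC guarantee (informal sketch): an algorithm learns a concept class C
% if for every target c in C, every distribution D over examples, and all
% accuracy/confidence parameters epsilon, delta in (0,1), after seeing
% m = poly(1/epsilon, 1/delta, n) random examples it outputs h satisfying
\Pr\bigl[\operatorname{err}_D(h) \le \varepsilon\bigr] \ge 1 - \delta,
\qquad \text{where } \operatorname{err}_D(h) = \Pr_{x \sim D}\bigl[h(x) \neq c(x)\bigr],
% while running in time polynomial in the same parameters.
```

Here epsilon bounds the error of the output hypothesis (data complexity and running time both scale with 1/epsilon), and delta bounds the probability that the random sample was unrepresentative.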
The course will mix taught classes with seminar-style reading of research papers on recent topics.
This semester: In addition to a review of core topics, we will explore some of the following topics: online convex optimization, bandit problems, PAC-Bayes theorems, learning theory for pattern recognition models, matrix completion problems.

Syllabus: Probabilistic and adversarial models of machine learning. Development and analysis of machine learning principles and algorithms, their computational complexity, data complexity and convergence properties. Computational and cryptographic limitations on algorithms for machine learning. Core results and recent developments in this field.
Prerequisites: COMP 160; EE 104 or MATH 162; COMP 170 recommended but not required. Or permission of instructor.
Times and Location: (D+ Block) TR 10:30-11:45, Lane Hall 100A
Instructor: Roni Khardon
Office: Halligan 230
Office Hours: Wednesday 5-6pm
Phone: 1-617-627-5290

Activities & Grading

The course will mix taught classes with seminar-style reading of research papers. Graded course work will include three types of activities:

Policy on collaboration on homework assignments: You may discuss the problems and general ideas about their solutions with other students, and similarly you may consult other textbooks or the web. However, you must work out the details on your own and write the solution on your own. Every such collaboration (whether getting or giving help) and every use of printed or electronic sources must be clearly cited and acknowledged in the submitted homework. Failure to follow these guidelines will result in disciplinary action for all parties involved. For this and other questions concerning academic integrity, please consult the booklet available from the office of the Dean of Student Affairs.


We will sample from different sources, including some of the texts below (on reserve in the library) and research and survey articles.

Papers for Presentations and Discussion

Concrete paper suggestions will be posted, but I am open to selecting other papers according to students' interests (as long as they are on learning theory). In the meantime, here are some links (CoLT and ALT are devoted to learning theory; ICML and NIPS have some learning theory papers).
CoLT 2015: schedule and papers, CoLT 2014: schedule and papers, ALT 2014 schedule and papers, ICML 2015 papers, ICML 2014 papers, NIPS 2014 schedule and papers.

Schedule and Assignments

I will provide lecture notes for many of the lectures. PDFs will be added once they are ready (obviously after the lectures).

Type What Due Date
Reading [KV] Sections 1.1-1.3; [SB] Chapters 2,3  
Lecture 1 Introduction to CLT  
Lecture 2 Learning Intervals; PAC Learning Model  
Assignment 1 Assignment 1 9/24
Lecture 3-4 Learning conjunctions; Tools for Probabilistic Analysis  
Reading [KV] Chapter 9; [SB] Appendix B  
Reading [KV] Sections 4.1-2 (Optional: skim through 4.3.1-4.3.2)  
Lecture 4-5 Variations of PAC Learning; Proof of Hoeffding's Bound  
Assignment 2 Assignment 2 10/6
Reading [KV] Sections 1.4-1.5  
Lecture 6 Hardness Results for PAC learning  
Reading [KV] Chapter 2; see also Section 7.3 and Chapter 10 of [SB]  
Lecture 7 Occam's Razor; Compression schemes
The companion slides
and pages 12-13 of this paper
Reading [SB] Sections 4.1-2 and 7.2  
Lecture 8-9 Agnostic Learning and Structural Risk Minimization  
Reading [KV] Ch 3; [SB] Ch 6.  
Lecture 10-11 The VC dimension and Learnability (properties and lower bounds)
Lecture notes from previous offering of the course
Assignment 3 Assignment 3 10/27
Lecture 13-14 Convergence based on VC Dimension
Lecture notes from previous offering of the course
We also used companion slides
and for the Agnostic case we followed Avrim Blum's notes
Reading [SB] 26.1, 28.1  
Lecture 15-16 Convergence Based on Rademacher Averages
We used companion slides
Reading [SB] Chapters 12,13  
Lecture 16-17 Convergence for Convex Learning Problems
We used companion slides
Assignment 4 Assignment 4 12/3
Lecture 17-20 Online Learning and Online Convex Optimization
Lecture Slides
Online-to-batch conversion for convex losses is from Cesa-Bianchi et al. 2004 (see page 2052)
Lecture notes from previous offering of the course: L0, L1, L2, L3, L4
Sources for the Online Convex Optimization lectures: the lectures follow material from
Kakade and Tewari lecture notes L3 and L4,
and the survey article by Shalev-Shwartz
Lecture 21-24 Paper Presentations
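To give a flavor of the online convex optimization material (Lectures 17-20), here is a minimal online gradient descent sketch. The function names, step size, and toy data stream are illustrative choices, not taken from the course notes or slides; the squared loss stands in for an arbitrary convex per-round loss.

```python
import numpy as np

# Online gradient descent (OGD) sketch for online convex optimization,
# here on the squared loss f_t(w) = 0.5 * (w . x_t - y_t)^2.

def online_gradient_descent(stream, dim, eta=0.1):
    """Run OGD over a stream of (x_t, y_t) pairs; return total loss and final w."""
    w = np.zeros(dim)
    total_loss = 0.0
    for x, y in stream:
        pred = w @ x
        total_loss += 0.5 * (pred - y) ** 2   # suffer the loss f_t(w_t)
        grad = (pred - y) * x                 # gradient of f_t at w_t
        w = w - eta * grad                    # gradient step
    return total_loss, w

# Toy stream generated by a fixed target vector; against the best fixed
# predictor, OGD's regret grows sublinearly in T for a suitable step size.
rng = np.random.default_rng(0)
target = np.array([1.0, -2.0])
stream = [(x, target @ x) for x in rng.standard_normal((200, 2))]
loss, w = online_gradient_descent(stream, dim=2)
```

On this noiseless stream the iterate w drifts toward the target vector, so the cumulative loss stays bounded; the regret analyses covered in lecture make this precise without assuming anything about how the stream is generated.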