Comp 135: Introduction to Machine Learning
Department of Computer Science
Tufts University
Fall 2015

Course Web Page (redirects to current page):

Announcements:

  • (10/7) The slides for the Naive Bayes lecture were edited to remove typos that appeared in an earlier version. The posted version is the updated one.
  • (10/6) There was a typo in the previously posted version of assignment 2 (now corrected): in question 3 part (1), the bound was stated as p(1-p) <= 1/2, whereas the correct bound is p(1-p) <= 1/4 (a quick check appears below this list).
  • Please check the table below for readings, assignments, and copies of the slides.
  • Previous announcements
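
For reference, the corrected bound follows by completing the square: for any p,

    p(1 - p) = 1/4 - (p - 1/2)^2 <= 1/4,

with equality exactly at p = 1/2, so 1/4 is the tight constant (1/2 is also a valid bound, just a loose one).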

What is this course about?

Machine learning is the science of collecting and analyzing data and turning it into predictions, encapsulated knowledge, or actions. There are many ways in which data can be obtained, many different models and algorithms for analyzing it, and many potential applications. In recent years machine learning has attracted considerable attention due to commercial successes and widespread use.
The course gives a broad introduction to machine learning aimed at upper-level undergraduates and beginning graduate students. Some mathematical aptitude is required, but, generally speaking, we focus on baseline algorithms, practical aspects, and breadth, leaving more sophisticated material and detailed analysis to more advanced courses: Statistical Pattern Recognition (Fall 2015), Computational Learning Theory (Fall 2015), and Learning, Planning and Acting in Complex Environments (likely to be offered in Spring or Fall 2016).


Course Description:

An overview of methods whereby computers can learn from data or experience and make decisions accordingly. Topics include supervised learning, unsupervised learning, reinforcement learning, and knowledge extraction from large databases, with applications to science, engineering, and medicine.


Prerequisites:

Comp 15, and COMP/MATH 22 or 61, or consent of instructor. Comp 160 is highly recommended. You will also need a minimal amount of calculus.

Class Times:

(H+ Block) TR 1:30-2:45, Barnum/Dana Hall 104


Instructor:

Roni Khardon
Office Hours: Wednesday 6-7pm (Halligan 230)
Office: Halligan 230
Phone: 1-617-627-5290

Teaching Assistants:

Hao Cui, Email:
Office Hours: Monday 1-2, Tuesday 3-4, Thursday 7-8, Friday 1-2 (location Halligan 121).

Course Work and Marking

The course grade will be determined by a combination of:
  • Written homework assignments (20%): these assignments exercise and reinforce class material.
  • Experimental/Programming projects (30%): projects will include both programming assignments and the use of existing machine learning software.
    Rules for late submissions: all work must be turned in on the date specified. Unless there is a last-minute emergency, please notify Roni Khardon of special circumstances at least two days in advance. Otherwise, if you haven't finished an assignment, turn in what you have on the due date, and it will be evaluated for partial credit.
  • In-class midterm exam (20%), Thursday, October 22.
  • Final exam (30%), scheduled according to the Tufts exam schedule for the H Block: Tuesday, December 15, 3:30-5:30.
Note: If your final exam grade is higher than your midterm grade, the midterm is discounted and the final will count for 50%.
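
To make the weighting concrete, here is a minimal sketch of the grade computation under these rules (a hypothetical helper written for illustration; it assumes all component grades are on the same 0-100 scale):

    def course_grade(homework, projects, midterm, final):
        """Combine component grades (each 0-100) using the weights above.
        If the final exam grade is higher than the midterm grade, the
        midterm is discounted and the final counts for 50%."""
        if final > midterm:
            return 0.20 * homework + 0.30 * projects + 0.50 * final
        return 0.20 * homework + 0.30 * projects + 0.20 * midterm + 0.30 * final

    # Example: a final (90) stronger than the midterm (70) discounts the midterm.
    print(course_grade(homework=85, projects=80, midterm=70, final=90))  # prints 86.0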


Collaboration and Academic Integrity

On homework assignments and projects: you may discuss the problems and general ideas about their solutions with other students, and you may similarly consult other textbooks or the web. However, you must work out the details on your own and code/write up the solution on your own. Every such collaboration (either getting help or giving help) and every use of printed or electronic sources must be clearly cited and acknowledged in the submitted homework.
On exams: no collaboration is allowed.
Failure to follow these guidelines may result in disciplinary action for all parties involved. Any questions? For this and other issues concerning academic integrity, please consult the booklet available from the Office of the Dean of Student Affairs.

Tentative List of Topics

[We may not cover all sub-topics]

Textbooks and Material Covered

No single text covers all the material for this course at the right level. We have the following texts on reserve in the library. Other material will be selected from research and survey articles or other notes. Detailed reading assignments and links to material will be posted.


Reading, References, and Assignments

Lecture | Topic | Reading/Assignments/Notes | Due Date
L1 Introduction to Machine Learning Read the introductory chapter of [M], [WF], [F] or [A]
See also lecture slides.
  Supervised Learning Basics:    
L2 Instance based learning [M] Chapter 8 is closest to class material; or [RN] 18.8; or [DHS] 4.4-4.6. (A minimal nearest-neighbor sketch appears after the table below.)
See also lecture slides.
See also Andrew Moore's tutorial on kd-trees.
See also the original paper describing the Relief method.
L3-4 Decision Trees [M] Chapter 3; or [RN] 18.1-4; or [F] Chapter 5.
See also lecture slides.
  Optional Reading (decision tree learning and boosting): T. Dietterich, M. Kearns, and Y. Mansour, Applying the Weak Learning Framework to Understand and Improve C4.5, International Conference on Machine Learning, 1996.
  Written Assignment 1: Assignment 1 (due 9/24)
  Empirical/Programming Assignment 1: Project 1 and corresponding Data (due 9/29)
L5 Naive Bayes Algorithm [M] 6.1-6.2 and 6.9-6.10; [DHS] Section 2.9; [F] 9.2; [WF] 4.2. (A minimal Naive Bayes sketch appears after the table below.)
See also the new book chapter from [M].
See also lecture slides.
The lecture also provided a basic introduction to probability and working with random variables.
L6-7 Evaluating Machine Learning Outcomes [M] Ch 5; [F] Ch 12
See also lecture slides.
  Optional Reading: Foster Provost, Tom Fawcett, and Ron Kohavi, The Case Against Accuracy Estimation for Comparing Induction Algorithms, Proc. 15th International Conference on Machine Learning, 1998.
T. Dietterich, Approximate Statistical Tests for Comparing Supervised Classification Learning Algorithms, Neural Computation 10(7), 1998.
Stephen Salzberg, On Comparing Classifiers: Pitfalls to Avoid and a Recommended Approach, Data Mining and Knowledge Discovery, 1997.
L7-8 Features (selection, transformation, discretization) Relevant reading includes portions of [F] Chapter 10 and [A] Chapter 6 (there is good overlap with the lectures, but not a perfect match).
See also lecture slides.
  Optional Reading: Ron Kohavi and George H. John, Wrappers for Feature Subset Selection, Artificial Intelligence, 1997. (Read through section 3.2 inclusive.)
James Dougherty, Ron Kohavi, and Mehran Sahami, Supervised and Unsupervised Discretization of Continuous Features, International Conference on Machine Learning, 1995.
  Written Assignment 2: Assignment 2 (due 10/8)
  Empirical/Programming Assignment 2: Project 2 and corresponding Data (due 10/13)
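
As a companion to the instance-based learning reading (lecture L2 above), the following is a minimal nearest-neighbor classifier in Python. It is an illustrative sketch only, not code provided with the course; the function name and toy data are invented for this example:

    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, x, k=3):
        """Predict the label of x by majority vote among its k nearest
        training points, using Euclidean distance."""
        dists = np.linalg.norm(X_train - x, axis=1)  # distance from x to each training point
        nearest = np.argsort(dists)[:k]              # indices of the k closest points
        votes = Counter(y_train[i] for i in nearest)
        return votes.most_common(1)[0][0]

    # Toy usage: two well-separated clusters.
    X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([0, 0, 1, 1])
    print(knn_predict(X, y, np.array([0.2, 0.1])))  # expected: 0
    print(knn_predict(X, y, np.array([1.0, 0.9])))  # expected: 1

A practical implementation would speed up the neighbor search with a kd-tree, which is the role of the Andrew Moore tutorial linked in the table.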
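
Similarly, for the Naive Bayes lecture (L5 above), here is a minimal sketch of the algorithm for binary features with Laplace (add-one) smoothing; again, the function names and toy data are invented for illustration:

    import numpy as np

    def nb_train(X, y):
        """X: (n, d) binary feature matrix; y: (n,) labels. Returns class
        priors and smoothed per-class feature probabilities p(x_j = 1 | y)."""
        classes = np.unique(y)
        priors = {c: np.mean(y == c) for c in classes}
        # Laplace smoothing: (count + 1) / (class size + 2) per Bernoulli feature.
        probs = {c: (X[y == c].sum(axis=0) + 1.0) / ((y == c).sum() + 2.0)
                 for c in classes}
        return priors, probs

    def nb_predict(priors, probs, x):
        """Pick the class maximizing log p(y) + sum_j log p(x_j | y)."""
        def score(c):
            p = probs[c]
            return np.log(priors[c]) + np.sum(np.log(np.where(x == 1, p, 1.0 - p)))
        return max(priors, key=score)

    # Toy usage: feature 0 indicates class 1, feature 1 indicates class 0.
    X = np.array([[1, 0], [1, 0], [0, 1], [0, 1]])
    y = np.array([1, 1, 0, 0])
    priors, probs = nb_train(X, y)
    print(nb_predict(priors, probs, np.array([1, 0])))  # expected: 1

Working in log space avoids numerical underflow when probabilities for many features are multiplied together.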