
Lecture  Topic  Reading/Assignments/Notes  Due Date 
9/7  Introduction to Machine Learning 
Read the introductory chapter of [M].
See also lecture slides. Alternate reading: the introductory chapter of one of [WF], [F], or [A]. 

Supervised Learning Basics:  
9/12  Instance-based learning 
[M] Chapter 8.
See also lecture slides, Andrew Moore's tutorial on kd-trees, and the original paper describing the Relief method. Alternate reading: [RN] 18.8 or [DHS] 4.4-4.6. 
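
Illustrative sketch (not part of the assigned reading): a minimal k-nearest-neighbor classifier in Python. The toy data and the choice of k = 3 are arbitrary.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points.
    `train` is a list of (feature_vector, label) pairs."""
    # Euclidean distance from the query to every training point.
    dists = [(math.dist(x, query), y) for x, y in train]
    dists.sort(key=lambda d: d[0])
    # Majority vote over the k closest labels.
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D data: two clusters with labels 'a' and 'b'.
train = [((0, 0), 'a'), ((0, 1), 'a'), ((1, 0), 'a'),
         ((5, 5), 'b'), ((5, 6), 'b'), ((6, 5), 'b')]
print(knn_predict(train, (1, 1)))   # -> 'a'
print(knn_predict(train, (5, 4)))   # -> 'b'
```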

9/14; 9/19  Decision Trees 
[M] Chapter 3.
See also lecture slides. Alternate reading: [RN] 18.1-18.4 or [F] Chapter 5. 
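
Illustrative sketch (not part of the assigned reading): computing the information gain that ID3/C4.5-style learners use to choose a split attribute. The toy weather-style data is made up.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr):
    """Entropy reduction from splitting (example, label) rows on attribute `attr`."""
    labels = [y for _, y in rows]
    remainder = 0.0
    for value in {x[attr] for x, _ in rows}:
        subset = [y for x, y in rows if x[attr] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return entropy(labels) - remainder

# Toy data: the learner would split on the attribute with the highest gain.
rows = [({'outlook': 'sunny', 'windy': True},  'no'),
        ({'outlook': 'sunny', 'windy': False}, 'no'),
        ({'outlook': 'rain',  'windy': True},  'no'),
        ({'outlook': 'rain',  'windy': False}, 'yes'),
        ({'outlook': 'overcast', 'windy': False}, 'yes')]
# 'outlook' (gain ~0.57) beats 'windy' (gain ~0.42), so it would be split first.
print(information_gain(rows, 'outlook'), information_gain(rows, 'windy'))
```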

Optional Reading on decision tree learning and boosting: T. Dietterich, M. Kearns, and Y. Mansour. Applying the Weak Learning Framework to Understand and Improve C4.5. International Conference on Machine Learning, 1996. 
9/19  Probability Basics 
The lecture provides a brief introduction to probability theory
and to working with random variables.
Please review relevant material from your discrete math, algorithms, or probability and statistics course. 
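
Illustrative sketch (not part of the assigned reading): a quick simulation check of the closed-form mean and variance of a Bernoulli random variable; the choice of p and the sample size are arbitrary.

```python
import random

# Estimate E[X] and Var(X) for a Bernoulli(p) random variable by simulation
# and compare with the closed forms E[X] = p and Var(X) = p(1 - p).
p, n = 0.3, 100_000
samples = [1 if random.random() < p else 0 for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
print(f"empirical mean {mean:.3f} vs p = {p}")
print(f"empirical variance {var:.3f} vs p(1-p) = {p * (1 - p):.3f}")
```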

Written Assignment 1  Assignment 1  9/28  
Empirical/Programming Assignment 1  Project 1 and corresponding Data  9/28  
9/21, 9/26  Maximum Likelihood Estimation and the Naive Bayes Algorithm 
[M] 6.1-6.2 and 6.9-6.10.
See also the new book chapter from [M] and the lecture slides. Alternative reading: [DHS] Section 2.9; [F] 9.2; [WF] 4.2. 
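
Illustrative sketch (not part of the assigned reading): multinomial naive Bayes with add-one (Laplace) smoothed maximum-likelihood estimates. The toy documents and labels are made up.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Multinomial naive Bayes. `docs` is a list of (word_list, label) pairs."""
    class_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)          # label -> word -> count
    vocab = set()
    for words, label in docs:
        word_counts[label].update(words)
        vocab.update(words)
    return class_counts, word_counts, vocab, len(docs)

def predict_nb(model, words):
    class_counts, word_counts, vocab, n_docs = model
    best, best_score = None, -math.inf
    for label, n_c in class_counts.items():
        # log P(c) + sum_i log P(w_i | c), with Laplace-smoothed estimates.
        total = sum(word_counts[label].values())
        score = math.log(n_c / n_docs)
        for w in words:
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

docs = [("buy cheap pills now".split(), 'spam'),
        ("cheap pills cheap".split(), 'spam'),
        ("meeting agenda for monday".split(), 'ham'),
        ("monday project meeting".split(), 'ham')]
model = train_nb(docs)
print(predict_nb(model, "cheap meeting".split()))   # -> 'spam'
```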

Written Assignment 2  Assignment 2  10/5  
Empirical/Programming Assignment 2  Project 2 and corresponding Data  10/19  
9/28  Evaluating Machine Learning Outcomes 
[M] Chapter 5.
See also lecture slides. Alternative reading: [F] Chapter 12. 
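
Illustrative sketch (not part of the assigned reading): estimating accuracy with k-fold cross-validation. The toy data and the majority-class "learner" are stand-ins for a real dataset and algorithm.

```python
import random
from collections import Counter

def k_fold_cv(data, train_fn, predict_fn, k=5, seed=0):
    """Estimate accuracy by k-fold cross-validation.
    `train_fn(rows)` returns a model; `predict_fn(model, x)` returns a label."""
    rows = data[:]
    random.Random(seed).shuffle(rows)
    folds = [rows[i::k] for i in range(k)]     # round-robin split into k folds
    accuracies = []
    for i in range(k):
        test = folds[i]
        train = [r for j, f in enumerate(folds) if j != i for r in f]
        model = train_fn(train)
        correct = sum(predict_fn(model, x) == y for x, y in test)
        accuracies.append(correct / len(test))
    return sum(accuracies) / k

# Example with a trivial majority-class "learner".
data = [((i,), 'pos' if i % 3 else 'neg') for i in range(60)]
train_fn = lambda rows: Counter(y for _, y in rows).most_common(1)[0][0]
predict_fn = lambda model, x: model
print(k_fold_cv(data, train_fn, predict_fn))   # -> 2/3, the majority-class rate
```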

Additional Optional Reading 
Foster Provost, Tom Fawcett, and Ron Kohavi. The Case Against Accuracy Estimation for Comparing Induction Algorithms. Proc. 15th International Conf. on Machine Learning, 1998. 
T. Dietterich. Approximate Statistical Tests for Comparing Supervised Classification Learning Algorithms. Neural Computation 10(7), 1998. 
Stephen Salzberg. On Comparing Classifiers: Pitfalls to Avoid and a Recommended Approach. Data Mining and Knowledge Discovery, 1997. 

10/3 Class Canceled  
10/5  Features (selection, transformation, discretization) 
None of our sources is a perfect match for this lecture.
Relevant reading includes portions of [F] Chapter 10 and [A] Chapter 6. See also lecture slides. 
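
Illustrative sketch (not part of the assigned reading): unsupervised equal-width discretization of a continuous feature, one of the simplest schemes covered in the Dougherty et al. paper below. The bin count and values are arbitrary.

```python
def equal_width_bins(values, n_bins=4):
    """Unsupervised equal-width discretization of a continuous feature:
    map each value to a bin index in [0, n_bins - 1]."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0   # guard against all-equal values
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

values = [0.1, 0.4, 1.2, 2.8, 3.3, 3.9]
print(equal_width_bins(values))   # -> [0, 0, 1, 2, 3, 3]
```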

Additional Optional Reading 
Ron Kohavi and George H. John. Wrappers for Feature Subset Selection. Artificial Intelligence, 1996. (Read through Section 3.2 inclusive.) 
James Dougherty, Ron Kohavi, and Mehran Sahami. Supervised and Unsupervised Discretization of Continuous Features. International Conference on Machine Learning, 1995. 

Written Assignment 3  Assignment 3  10/19  
10/10  University Holiday (no classes)  
10/12 Class Canceled  
10/17; 10/19  Linear Threshold Units 
[M] 4.1-4.4.
See also the new book chapter from [M] and the lecture slides. Alternative reading: [DHS] 5.5. 
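
Illustrative sketch (not part of the assigned reading): training a linear threshold unit with the perceptron update rule on a linearly separable toy problem; the learning rate and epoch count are arbitrary.

```python
def perceptron(data, epochs=20, lr=1.0):
    """Train a linear threshold unit with the perceptron update rule.
    `data` is a list of (features, label) pairs with labels in {-1, +1}."""
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in data:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:            # misclassified: nudge the boundary
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Linearly separable toy problem (logical OR with {-1, +1} labels).
data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = perceptron(data)
for x, y in data:
    pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
    print(x, y, pred)   # predictions match the labels after training
```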

Written Assignment 4  Assignment 4  11/2  
Wednesday 10/26  Midterm Exam 
Material for the exam includes everything covered up to 10/19, excluding clustering.
Everything discussed in class is included in the exam.
The reading assignments are supporting materials that should be useful in review and study, but I will not hold you responsible for details in the reading that were not discussed in class.
The exam is closed book: no notes, books, calculators, or other machines of any sort are allowed. The exam will aim to test whether you have grasped the main concepts, problems, ideas, and algorithms that we have covered, including the intuition behind them. Generally speaking, the exam will not test your technical wizardry with overly long equations or calculations, but, on the other hand, it is sure to include some shorter ones. 

10/19; 10/24  Clustering 
[F] 8.4-8.5
See also lecture slides (TBA). Alternative reading: [DHS] 10.6-10.7 and 10.9. 
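
Illustrative sketch (not part of the assigned reading): k-means clustering via Lloyd's algorithm. The toy points, k = 2, and the fixed iteration count are arbitrary.

```python
import math
import random

def k_means(points, k=2, iters=20, seed=0):
    """Lloyd's algorithm: alternate assigning points to the nearest center
    and recomputing each center as the mean of its assigned points."""
    centers = random.Random(seed).sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centers[j]))
            clusters[i].append(p)
        # New center = coordinate-wise mean; keep the old center if a cluster empties.
        centers = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers

points = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
print(k_means(points))   # two centers near (0.33, 0.33) and (9.33, 9.33)
```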

10/24; 10/31  Unsupervised and Semi-Supervised Learning with EM 
[M] Section 6.12.
Text Classification from Labeled and Unlabeled Documents using EM. Nigam et al., Machine Learning, Volume 39, pages 103-134, 2000. (The entire paper is relevant; you can skip Section 5.3.) See also lecture slides (TBA). Alternative reading: [A] 7.4; [F] 9.4; [DHS] 3.9. 
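
Illustrative sketch (not part of the assigned reading): EM for a one-dimensional mixture of two Gaussians, simplified to known unit variances so only the means and the mixing weight are estimated. The data and the initialization are made up.

```python
import math

def em_two_gaussians(xs, iters=50):
    """EM for a 1-D mixture of two unit-variance Gaussians.
    E-step: soft-assign each point; M-step: re-estimate means and mixing weight."""
    mu = [min(xs), max(xs)]          # crude initialization
    pi = 0.5                         # mixing weight of component 0

    def density(x, m):
        return math.exp(-0.5 * (x - m) ** 2) / math.sqrt(2 * math.pi)

    for _ in range(iters):
        # E-step: responsibility of component 0 for each point.
        r = [pi * density(x, mu[0]) /
             (pi * density(x, mu[0]) + (1 - pi) * density(x, mu[1])) for x in xs]
        # M-step: responsibility-weighted means and mixing proportion.
        mu[0] = sum(ri * x for ri, x in zip(r, xs)) / sum(r)
        mu[1] = sum((1 - ri) * x for ri, x in zip(r, xs)) / (len(xs) - sum(r))
        pi = sum(r) / len(xs)
    return mu, pi

xs = [0.1, -0.3, 0.2, 0.0, 4.9, 5.2, 5.1, 4.8]
print(em_two_gaussians(xs))   # means near 0 and 5, pi near 0.5
```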

11/2  Association Rules 
[F] 6.3
Mining Association Rules between Sets of Items in Large Databases. Rakesh Agrawal, Tomasz Imielinski, and Arun Swami. Proceedings of the 1993 ACM SIGMOD International Conference on Management of Data, 1993. See also lecture slides (TBA). Alternative reading: [WF] 4.5. 
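
Illustrative sketch (not part of the assigned reading): Apriori-style frequent itemset mining using the pruning rule from the Agrawal et al. paper above, namely that a k-itemset can be frequent only if all of its (k-1)-subsets are frequent. The toy transactions and support threshold are arbitrary.

```python
from itertools import combinations

def apriori(transactions, min_support=2):
    """Level-wise frequent itemset mining with Apriori pruning."""
    def support(itemset):
        return sum(itemset <= t for t in transactions)

    items = {frozenset([i]) for t in transactions for i in t}
    frequent = []
    level = {s for s in items if support(s) >= min_support}
    while level:
        frequent.extend(sorted(level, key=sorted))
        # Candidate generation: union pairs of frequent k-itemsets into (k+1)-itemsets.
        candidates = {a | b for a in level for b in level if len(a | b) == len(a) + 1}
        # Keep candidates whose k-subsets are all frequent and which meet min support.
        level = {c for c in candidates
                 if all(frozenset(s) in level for s in combinations(c, len(c) - 1))
                 and support(c) >= min_support}
    return frequent

transactions = [frozenset(t) for t in
                [{'milk', 'bread'}, {'milk', 'bread', 'eggs'},
                 {'bread', 'eggs'}, {'milk', 'eggs'}]]
for s in apriori(transactions):
    print(sorted(s))   # all three singletons and all three pairs are frequent
```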

Optional Reading 
Real World Performance of Association Rule Algorithms. Zheng et al., KDD 2001. 
Mining the Most Interesting Rules. Bayardo et al., KDD 1999. 
Dynamic Itemset Counting and Implication Rules for Market Basket Data. Brin et al., SIGMOD 1997. 
Discovering All Most Specific Sentences. Gunopulos et al., TODS, 2003. 