Jump to: Software • Conferences and Workshops • Related Courses • Prerequisite Catch-up Resources • Deep Learning Self-study Resources
Software
For this course, we strongly recommend using a custom environment of Python packages, all installed and maintained via the free ['conda' package/environment manager from Anaconda, Inc.].
For detailed instructions, see the [Python Setup Instructions page].
High Performance Computing
For your final project, you have two options:

- Use the Tufts High Performance Computing (HPC) cluster
  - See the course-specific [Tufts HPC Setup] page.
- Use Amazon Web Services (AWS)
  - See the course-specific [Tufts AWS Setup] page.

If you are looking for basic workshops to get help on things like using Linux or the Tufts HPC cluster, check out the programs from the Tufts Data Lab: https://sites.tufts.edu/datalab/workshops/
Conferences and Workshops
Bayesian Deep Learning workshop at NIPS
A good place to browse for potential project ideas:
- Upcoming in Dec. 2018: http://bayesiandeeplearning.org
- 2017 workshop: http://bayesiandeeplearning.org/2017
- 2016 workshop: http://bayesiandeeplearning.org/2016
Related Courses
Summer School on Deep Learning and Bayesian Methods
August 27 – September 1, 2018, Moscow, Russia
Prerequisite Catch-up Resources
Here are some useful resources to help you catch up if you are missing some of the prerequisite knowledge. Please contribute new resources by starting a topic on the Canvas discussion forum.
We expect students to be familiar with the following topics:
Probability
- Key concepts:
  - Gaussian PDF (univariate and multivariate)
  - Bayes' theorem and associated algebra
- Litmus test:
  - Can you follow the derivation in Rasmussen & Williams (R&W) Ch. 2 that computes the analytic form of the posterior given a Gaussian likelihood and a Gaussian prior? (A worked univariate analogue follows this list.)
- Possible resources:
  - David MacKay's "The Humble Gaussian Distribution" tutorial: http://www.inference.org.uk/mackay/humble.pdf
  - Stanford CS229 notes on Gaussian distributions: http://cs229.stanford.edu/section/gaussians.pdf
  - Stanford CS229 notes on Gaussian processes: http://cs229.stanford.edu/section/cs229-gaussian_processes.pdf
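As a self-check for the litmus test above, here is the simplest univariate instance of that conjugate derivation, assuming a known observation variance (the multivariate version in R&W Ch. 2 follows the same pattern). Given a prior $w \sim \mathcal{N}(\mu_0, \sigma_0^2)$ and observations $y_i \mid w \sim \mathcal{N}(w, \sigma^2)$ for $i = 1, \dots, N$, the posterior is again Gaussian, $w \mid y_{1:N} \sim \mathcal{N}(\mu_N, \sigma_N^2)$, with

$$
\sigma_N^2 = \left(\frac{1}{\sigma_0^2} + \frac{N}{\sigma^2}\right)^{-1},
\qquad
\mu_N = \sigma_N^2 \left(\frac{\mu_0}{\sigma_0^2} + \frac{1}{\sigma^2}\sum_{i=1}^N y_i\right).
$$

Multiplying the Gaussian prior by the Gaussian likelihood and completing the square in $w$ yields these expressions directly.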
First-order gradient-based optimization
- Key concepts:
  - Gradient descent
  - Learning rates
  - The difference between convex and non-convex functions for minimization
- Litmus test:
  - Could you fit a linear regression model via gradient descent? (See the notebook below and the sketch after this list.)
- Possible resources:
  - Convex Optimization overview for Stanford CS229: http://cs229.stanford.edu/section/cs229-cvxopt.pdf
  - Jupyter notebook on 'Linear Regression with NumPy' (fits a linear model with gradient descent): https://www.cs.toronto.edu/~frossard/post/linear_regression/
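To make this litmus test concrete, here is a minimal NumPy sketch of fitting linear regression by full-batch gradient descent; the synthetic data, learning rate, and iteration count are illustrative choices, not course-specified values.

```python
import numpy as np

# Synthetic data for illustration: y = 3x + 1 plus noise
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x + 1.0 + 0.1 * rng.normal(size=100)

X = np.column_stack([x, np.ones_like(x)])  # design matrix with bias column
w = np.zeros(2)                            # parameters: [slope, intercept]
lr = 0.1                                   # learning rate (step size)

for step in range(500):
    resid = X @ w - y                    # predictions minus targets
    grad = (2.0 / len(y)) * X.T @ resid  # gradient of the mean squared error
    w = w - lr * grad                    # gradient descent update

print(w)  # should be close to [3.0, 1.0]
```

Because the mean squared error is convex in the parameters, gradient descent with a small enough step size converges to the unique global minimum; that guarantee is exactly what the convex vs. non-convex distinction in the key concepts above is about.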
Linear algebra
- Key concepts:
  - Matrix multiplication
  - Matrix inversion
- Litmus test:
  - Could you turn the pseudocode from R&W Chapter 2 on GPs for regression into Python code (as in HW1)? If so, you probably have all the necessary background. (A sketch follows this list.)
- Possible resources:
  - Goodfellow et al.'s chapter on Linear Algebra: http://www.deeplearningbook.org/contents/linear_algebra.html
  - Immersive Linear Algebra: http://immersivemath.com/ila/
  - Essence of Linear Algebra videos: https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab
  - 'Computational Linear Algebra for Coders' course by fast.ai: https://github.com/fastai/numerical-linear-algebra/blob/master/README.md
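If you want to test yourself on this litmus, below is a minimal NumPy sketch in the spirit of R&W Algorithm 2.1 (GP regression via a Cholesky factorization). The squared-exponential kernel and the fixed lengthscale and noise values are illustrative assumptions, not the HW1 specification.

```python
import numpy as np

def sqexp_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel: k(a, b) = exp(-||a - b||^2 / (2 l^2))."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def gp_posterior(X, y, X_star, noise_var=0.1):
    """Posterior mean and covariance at test inputs X_star (R&W Alg. 2.1 style)."""
    K = sqexp_kernel(X, X)
    L = np.linalg.cholesky(K + noise_var * np.eye(len(X)))  # K + sigma^2 I = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))     # (K + sigma^2 I)^{-1} y
    K_star = sqexp_kernel(X, X_star)
    mean = K_star.T @ alpha                                  # posterior mean
    v = np.linalg.solve(L, K_star)
    cov = sqexp_kernel(X_star, X_star) - v.T @ v             # posterior covariance
    return mean, cov

# Tiny usage example on synthetic 1-D data
X = np.linspace(-3, 3, 20)[:, None]
y = np.sin(X).ravel()
mean, cov = gp_posterior(X, y, np.linspace(-3, 3, 50)[:, None])
```

Solving two triangular systems with the Cholesky factor avoids forming an explicit matrix inverse, which is both faster and more numerically stable; this is why R&W present the algorithm that way.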
Basic supervised machine learning methods
- Key concepts:
  - Linear regression (see the gradient descent sketch above)
  - Logistic regression (a minimal sketch follows this list)
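For a quick self-check on logistic regression, here is a minimal NumPy version fit by gradient descent; the synthetic data and hyperparameters are illustrative only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic binary labels for illustration: class 1 when x0 + x1 > 0
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
lr = 0.5
for step in range(1000):
    p = sigmoid(X @ w)                # predicted class-1 probabilities
    grad = X.T @ (p - y) / len(y)     # gradient of the mean log loss
    w = w - lr * grad                 # gradient descent update

accuracy = np.mean((sigmoid(X @ w) > 0.5) == y)
print(w, accuracy)
```

Note the structural similarity to the linear regression sketch above: only the prediction function (sigmoid of a linear score) and the loss (log loss instead of squared error) change, while the gradient descent loop is identical.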
Deep Learning Self-study Resources
Here are some related free online courses that would be good introductions to standard deep learning methods:
- fast.ai's 'Deep Learning for Coders': http://course.fast.ai/
- Coursera/Andrew Ng's 'Neural Networks and Deep Learning': https://www.coursera.org/learn/neural-networks-deep-learning/home/welcome
- Stanford's CS231n "Convolutional Neural Networks for Visual Recognition": http://cs231n.stanford.edu/syllabus.html