CSCI 378: Deep Learning
REED COLLEGE, SPRING 2020
This course is an introduction to deep neural architectures and their training. Beginning with the fundamentals of regression, optimization, and regularization, the course will then survey a variety of architectures and their associated applications. Students will develop projects that implement deep learning systems to perform various tasks. Prerequisites: Mathematics 201 and Computer Science 121.
BASIC INFO
Professor: Mark Hopkins, hopkinsm@reed.edu
Class Schedule: MWF 9:00-9:50am, on Zoom
Office Hours: M 4-5pm, W 4-6pm, Th 10:30-11:30am, F 4-5pm (all by appointment; sign up here; the Zoom meeting link is here).
Syllabus: downloadable here
LECTURE NOTES
- Machine Learning: A Whirlwind Guide (pdf)
- Gradient Descent (pdf)
- Gradient Descent: Now in 2D! (pdf)
- A Quick Review of Probability (pdf)
- Bayesian Networks (pdf)
- Reading a Bayesian Network (pdf)
- Functional Causal Models (pdf)
- Exponential Distributions (pdf)
- Regression Problems (pdf)
- Argmin and Monotonic Functions (pdf)
- Linear Regression: MLE (pdf)
- Linear Regression: MAP (pdf)
- Logistic Regression: MLE (pdf)
- The Chain Rule of Partial Derivatives (pdf)
- Regression: A Neural View (pdf)
- Feature Discovery Networks (pdf)
- Training Feature Discovery Networks (pdf)
- Neural Networks and Backpropagation (pdf)
- Backpropagation: A Matrix Formulation (pdf)
- Minibatch Gradient Descent (pdf)
- Multiway Classification (pdf)
- Activation Functions (pdf)
- Convolutional Neural Networks (pdf)
- Padding and Pooling (pdf)
- CNNs for Images (pdf)
- Computing Convolutions (pdf)
- Recurrent Neural Networks (pdf)
- RNN Language Models (pdf)
- Word Vectors (pdf)
- Long Short-Term Memory Networks (not covered in lecture, pdf)
- Encoder-Decoder Networks (not covered in lecture, pdf)
LECTURE CODE/VIDEOS
- Tensors (Jan 29): https://classroom.github.com/a/h3tOU2km
- Training Feature Discovery Networks (Mar 30): (video1) (video2) (video3)
- Neural Networks and Backpropagation (Apr 1): (video1) (video2) (video3) (video4)
- Backpropagation: A Matrix Formulation (Apr 3): (video1) (video2)
- Minibatch Gradient Descent (Apr 6): (video1) (video2) (video3)
- Multiway Classification (Apr 8): (video1) (video2) (video3)
- Introduction to Autograd (Apr 10): https://classroom.github.com/a/dNYuHSx7
- Activation Functions (Apr 15): (video)
- Convolutional Neural Networks (Apr 15-17): (video1) (video2) (video3) (video4)
- Padding and Pooling (Apr 20): (video1) (video2) (video3)
- CNNs for Images (Apr 22): (video1) (video2) (video3)
- Computing Convolutions (Apr 24): (video1) (video2) (video3)
- Recurrent Neural Networks (Apr 27): (video1) (video2) (video3)
- RNN Language Models (Apr 29): (video1) (video2)
- Word Vectors (May 1): (notebook) (video1) (video2)
HOMEWORK AND PROJECT LINKS
- HW0 (Memoize): https://classroom.github.com/a/WqP7Oqmf
- HW1 (Rubik): https://classroom.github.com/a/yaOkb2-4
- Project 1 (Descent): https://classroom.github.com/a/zidPHJ4R
- HW2 (optional): (pdf)
- HW3: (pdf)
- HW4: (pdf)
- HW5: (pdf)
- HW6: (pdf)
- HW7: (pdf)
- HW8: (pdf)
- HW9: (pdf)
- Project 2 (“Lines”, due Monday, Mar. 9): https://classroom.github.com/a/Ewo2k4xc
- HW10: (pdf)
- HW11: (pdf)
- Project 3 (“Logistics”, due Wednesday, Mar. 18): https://classroom.github.com/a/EZrBrI8n
- HW14: (pdf)
- HW15: (pdf)
- HW16 (note that the PDF erroneously refers to itself as HW15): (pdf)
- Project 4, Part 1 (“Colonels”, due Friday, Apr. 10): https://classroom.github.com/a/veJR40RC
- HW17: (pdf)
- Project 4, Part 2 (“Basic Training”, due Friday, Apr. 17): https://classroom.github.com/a/o8q4BTjw
- HW19: (pdf)
- HW20: (pdf)
- Project 4, Part 3 (“Kernels”, due Wednesday, Apr 29): https://classroom.github.com/a/jX3RW4jL
- HW21: (pdf)
SCHEDULE
Jan 27: Introduction
- Reading: None
- Assignment:
- Sign up for a Github account if you don’t already have one (www.github.com).
- Go through the following setup, if desired (you don't need to use the Anaconda setup if you prefer another one, but make sure you install torch and rpyc): https://github.com/Mark-Hopkins-at-Reed/csci-378/blob/master/admin/setup.md
- Go through the Github Classroom tutorial: https://github.com/Mark-Hopkins-at-Reed/csci-378/blob/master/admin/github.md
- Bring your laptop on Wednesday!
Jan 29: Tensor Manipulation with Torch
- Reading: Go through the Tensors lecture code above on your own to review.
- Assignment (due Friday, Jan 31): Complete HW0 (assignment link above).
- Assignment (due Monday, Feb 3): Complete HW1 (assignment link above).
Jan 31: Machine Learning: A Whirlwind Guide
- Reading: Review the lecture notes.
- Assignment (due Monday, Feb 3): Complete HW1 (assignment link above).
Feb 3: Gradient Descent
- Reading: Review the lecture notes.
- Assignment (due Friday, Feb 7): Complete Project 1 (assignment link above).
Feb 5: Gradient Descent
- Reading: Review the complete lecture notes on Gradient Descent and Gradient Descent: Now in 2D!
- Assignment (extended deadline, now due Monday, Feb 10): Complete Project 1 (assignment link above).
Feb 7: Gradient Descent: Now in 2D!
- Reading: Review the lecture notes on A Quick Review of Probability.
- Assignment (due Monday, Feb 10): Complete Project 1.
- Assignment (optional): Complete HW2.
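To give a feel for what the gradient descent notes cover, here is a minimal sketch of 2D gradient descent on a simple bowl-shaped function. The function, step size, and step count are my own illustrative choices, not taken from the lecture notes or Project 1.

```python
# A minimal sketch of gradient descent in 2D on the bowl-shaped function
# f(x, y) = (x - 1)^2 + 2*(y + 2)^2, whose minimum is at (1, -2).

def grad_f(x, y):
    """Gradient of f at (x, y): the vector of partial derivatives."""
    return (2 * (x - 1), 4 * (y + 2))

def gradient_descent(start, learning_rate=0.1, num_steps=100):
    x, y = start
    for _ in range(num_steps):
        dx, dy = grad_f(x, y)
        x -= learning_rate * dx   # step against the gradient
        y -= learning_rate * dy
    return x, y

x, y = gradient_descent((0.0, 0.0))
print(round(x, 3), round(y, 3))  # converges toward (1, -2)
```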
Feb 10: Bayesian Networks
- Reading: Review the lecture notes on Bayesian Networks.
- Assignment (due Wednesday, Feb 12): Complete HW3.
Feb 12: Reading a Bayesian Network
- Reading: Review the lecture notes on Reading a Bayesian Network.
- Assignment (due Friday, Feb 14): Complete HW4.
Feb 14: Reading a Bayesian Network
- Reading: If you haven’t already, review the lecture notes on Reading a Bayesian Network, focusing especially on the definition for d-separation and the exercise that follows it.
- Assignment (due Monday, Feb 17): Complete HW5.
Feb 17: Causal Models
- Reading: Review the lecture notes on Functional Causal Models.
- Assignment (due Wednesday, Feb 19): Complete HW6.
Feb 19: Exponential Distributions
- Reading: Review the lecture notes on Exponential Distributions.
- Assignment (due Friday, Feb 21): Complete HW7.
Feb 21: Regression Problems
- Reading: Review the lecture notes on Regression Problems (at least up to slide 9).
- Assignment (due Monday, Feb 24): Complete HW8.
Feb 26: Regression Problems
- Reading: Review the lecture notes on Regression Problems (at least up to slide 16).
- Assignment (due Friday, Feb 28): Complete HW9.
Feb 28: Argmin and Monotonic Functions
- Reading: Review the lecture notes on Argmin and Monotonic Functions.
- Assignment (due Monday, Mar 2): Complete HW10.
Mar 2: Maximum Likelihood Estimation for Linear Regression
- Reading: Review the lecture notes on Linear Regression: MLE.
- Assignment (due Wednesday, Mar 4): Complete HW11.
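As a small illustration of the Linear Regression: MLE material: under a Gaussian noise model, maximum likelihood estimation for 1D linear regression reduces to ordinary least squares, which has a closed-form solution. The data below is made up for the example; the function name is my own.

```python
# Minimal sketch: MLE for 1D linear regression y = m*x + b under Gaussian
# noise reduces to least squares, solved here in closed form.

def fit_least_squares(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance(x, y) / variance(x); intercept from the means.
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - m * mean_x
    return m, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1
m, b = fit_least_squares(xs, ys)
print(m, b)  # → 2.0 1.0
```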
Mar 4: MAP Estimation for Linear Regression
- Reading: Review the lecture notes on Linear Regression: MAP.
Mar 6: Exam 1
Mar 9: Maximum Likelihood Estimation for Logistic Regression
- Reading: Review the lecture notes on Logistic Regression: MLE.
- Assignment (due Wednesday, Mar 11): Complete the exercise on Slide 9 of the notes on Logistic Regression: MLE.
Mar 11: The Chain Rule of Partial Derivatives
- Reading: Review the lecture notes on The Chain Rule of Partial Derivatives.
- Assignment (due Friday, Mar 13): Adapt the analysis from slides 10-14 to compute the partial derivative of A with respect to theta, but now assume that b changes while c (the length of the hypotenuse) stays fixed. This simulates how the area under a fixed-length ladder changes as we change its angle to the ground.
Mar 13: Regression: A Neural View
- Reading: Review the lecture notes on Regression: A Neural View.
Mar 16: Feature Discovery Networks
- Reading: Review the lecture notes on Feature Discovery Networks.
- Assignment (due Wednesday, Mar. 18): Complete HW14. Also remember that Project 3 is due Wednesday as well.
Mar 30: Training Feature Discovery Networks
- Reading: Review the lecture notes on Training Feature Discovery Networks.
- Assignment: None
Apr 1: Neural Networks and Backpropagation
- Reading: Review the lecture notes on Neural Networks and Backpropagation.
- Assignment (due Friday, April 3): Fill in the blanks on Slides 12 and 13 of the lecture notes for Neural Networks and Backpropagation. Please submit through Gradescope if possible; if not, just email me your solution.
Apr 3: Backpropagation: A Matrix Formulation
- Reading: Review the lecture notes on Backpropagation: A Matrix Formulation.
- Assignment (due Monday, April 6): Complete HW16 (“Backpropagation Practice”). Please submit through Gradescope if possible; if not, just email me your solution.
Apr 6: Minibatch Gradient Descent
- Reading: Review the lecture notes on Minibatch Gradient Descent.
- Assignment (due Wednesday, April 8): Complete HW17 (“Softmax”). Please submit through Gradescope if possible.
Apr 8: Multiway Classification
- Reading: Review the lecture notes on Multiway Classification.
- Assignment (due Friday, April 10): Complete Project 4, Part 1 (“Colonels”).
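The key ingredient in multiway classification is the softmax function, which turns a vector of real-valued scores into a probability distribution over classes. A minimal pure-Python sketch (the example scores are my own):

```python
import math

# Minimal sketch of softmax: map real-valued scores (logits) to a
# probability distribution over classes.

def softmax(scores):
    # Subtract the max score first for numerical stability; this doesn't
    # change the result because softmax is shift-invariant.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
print([round(p, 3) for p in probs])  # probabilities sum to 1
```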
Apr 10: Introduction to Autograd
- Reading: Go through the lecture code (“Introduction to Autograd”).
- Assignment: None, but the second exam is on Monday.
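For intuition about what autograd does under the hood, here is a toy sketch of reverse-mode automatic differentiation on scalars. The `Value` class and its methods are my own illustration of the idea, not torch's actual API.

```python
# Toy reverse-mode automatic differentiation ("autograd") on scalars.
# Each Value remembers which nodes it was computed from and the local
# derivative with respect to each; backward() applies the chain rule.

class Value:
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # nodes this value was computed from
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the graph so each node's gradient is
        # complete before being propagated to its parents.
        order, visited = [], set()
        def visit(node):
            if node not in visited:
                visited.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, local in zip(node._parents, node._local_grads):
                parent.grad += local * node.grad

x = Value(3.0)
y = Value(4.0)
z = x * y + x          # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # → 5.0 3.0
```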
Apr 13: Exam 2
Apr 15: Activation Functions/Intro to CNNs
- Reading: Review the lecture notes on Activation Functions and the videos on Convolutional Neural Networks.
- Assignment: Project 4, Part 2 (“Basic Training”) is due Friday.
Apr 17: Convolutional Neural Networks
- Reading: Review the lecture notes on Convolutional Neural Networks.
- Assignment: Complete the exercise on the final page of the CNN lecture notes. Please submit through Gradescope if possible.
Apr 20: Padding and Pooling
- Reading: Review the lecture notes on Padding and Pooling.
- Assignment: Complete HW19 (“Padding and Pooling”). Please submit through Gradescope if possible.
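A minimal 1D sketch of the two operations the Padding and Pooling notes cover: zero-padding a sequence and max pooling over it. The window/stride sizes and example values are my own choices.

```python
# Minimal sketch of 1D zero-padding and max pooling.

def pad(xs, amount):
    """Append `amount` zeros to each end of the sequence."""
    return [0.0] * amount + list(xs) + [0.0] * amount

def max_pool(xs, window, stride):
    """Slide a window over xs, keeping the max of each window."""
    return [max(xs[i:i + window])
            for i in range(0, len(xs) - window + 1, stride)]

xs = [1.0, 5.0, 2.0, 8.0, 3.0, 3.0]
print(pad(xs, 1))                        # → [0.0, 1.0, 5.0, 2.0, 8.0, 3.0, 3.0, 0.0]
print(max_pool(xs, window=2, stride=2))  # → [5.0, 8.0, 3.0]
```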
Apr 22: CNNs for Images
- Reading: Review the lecture notes on CNNs for Images, and preview the lecture notes for Computing Convolutions.
- Assignment: Complete HW20 (“CNNs for Images”). Please submit through Gradescope if possible. In addition, Project 4, Part 3 (“Kernels”) has been assigned and is due next Wednesday, April 29.
Apr 24: Computing Convolutions
- Reading: Review the lecture notes on Computing Convolutions.
- Assignment: None, but this would be a good time to start on Project 4, Part 3 (due Wednesday) if you haven’t already. This project is almost certainly the most difficult of the semester.
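For reference, the core computation in the Computing Convolutions notes can be sketched in a few lines of pure Python. Note that a "convolution" in the CNN sense is technically cross-correlation: the kernel slides across the input without being flipped. The signal and kernel below are my own examples.

```python
# Minimal sketch of a 1D "convolution" as used in CNNs (stride 1, no
# padding): slide the kernel over the input, taking a dot product at
# each position.

def conv1d(xs, kernel):
    k = len(kernel)
    return [sum(kernel[j] * xs[i + j] for j in range(k))
            for i in range(len(xs) - k + 1)]

signal = [1.0, 2.0, 3.0, 4.0]
kernel = [1.0, -1.0]           # a difference filter
print(conv1d(signal, kernel))  # each output is xs[i] - xs[i+1]
```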
Apr 27: Recurrent Neural Networks
- Reading: Review the lecture notes on Recurrent Neural Networks.
- Assignment: None, but Project 4, Part 3 is due Wednesday.
Apr 29: RNN Language Models
- Reading: Review the lecture notes on RNN Language Models.
- Assignment: Complete HW21 (“Cosine Similarity”). Please submit through Gradescope.
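Cosine similarity is the standard way to compare word vectors: the cosine of the angle between two vectors, i.e. their dot product divided by the product of their norms. A minimal sketch (the example vectors are my own):

```python
import math

# Minimal sketch of cosine similarity between two vectors.

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # → 1.0 (same direction)
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # → 0.0 (orthogonal)
```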
May 1: Word Vectors
- Reading: Review the lecture notes on Word Vectors.
- Assignment: Study for the final exam, which will be administered through Gradescope on Wednesday, May 13 from 1-3pm (Pacific Time). More details to come, but it will focus on material since the second exam (i.e. CNNs and RNNs).