Assignment 03: Backpropagation
Learning Goals
- Learn how the backpropagation algorithm works.
- Practice matrix calculus.
Grading Walk-Throughs
This assignment will be graded as “Nailed It” / “Not Yet” by a TA. To complete (“Nailed It”) the assignment, you must
- Complete the assignment and submit your work to Gradescope.
- Meet with a TA during their mentor session hours.
- Complete the walk-through with all group members. I prefer that all partners be present during the walk-through, but you can each meet with the TA separately if needed.
- Walk the TA through your answers. Do not expect to make corrections during the walk-through.
- The TA will then either
- mark your assignment as 100% on Gradescope, or
- inform you that you have some corrections to make.
- If corrections are needed, then you will need to complete them and conduct a new walk-through with a TA.
If you have concerns about the grading walk-through, you can meet with me after you have first met with a TA.
Overview
You must work on this assignment with a partner; please let me know if you want me to help you find a partner.
- Read the description of neural network forward and backward propagation in this PDF. An example derivation for a two-layer network using binary cross-entropy loss is also provided.
- Write your derived equations on Gradescope, starting at the output layer and working backward toward the input layer.
- Write some code that corresponds to your equations. Your code does not need to be “correct.” At this stage I just want you to practice converting equations into code. You’ll find this Jupyter Notebook helpful for the coding parts.
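As a rough illustration of the last step, here is a minimal sketch of forward and backward propagation for a two-layer network with sigmoid activations and binary cross-entropy loss, written in NumPy. This is one possible translation of the backprop equations, not a reference solution; the variable names (`W1`, `Z1`, `A1`, etc.) and shapes (rows are examples) are my own conventions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    # Layer 1: affine transform followed by a sigmoid activation
    Z1 = X @ W1 + b1
    A1 = sigmoid(Z1)
    # Layer 2 (output): a single sigmoid unit for binary classification
    Z2 = A1 @ W2 + b2
    A2 = sigmoid(Z2)
    return Z1, A1, Z2, A2

def backward(X, Y, W2, A1, A2):
    n = X.shape[0]
    # With a sigmoid output and binary cross-entropy (averaged over the
    # batch), dL/dZ2 simplifies to (A2 - Y) / n
    dZ2 = (A2 - Y) / n
    dW2 = A1.T @ dZ2
    db2 = dZ2.sum(axis=0)
    # Propagate through layer 2's weights, then through layer 1's sigmoid,
    # whose derivative is A1 * (1 - A1)
    dA1 = dZ2 @ W2.T
    dZ1 = dA1 * A1 * (1 - A1)
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(axis=0)
    return dW1, db1, dW2, db2
```

Note how each gradient has the same shape as the parameter it corresponds to; checking shapes like this is a quick way to catch errors when you convert your own equations to code.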
I highly recommend skimming this useful article: The matrix calculus you need for deep learning.
Submitting Your Assignment
You will submit your responses on Gradescope. Only one partner should submit; the submitter will add the other partner through the Gradescope interface.
Additional details for using Gradescope can be found here: