📗 -> Lecture Date: Name


[Lecture Slide Link]

🎤 Vocab

โ— Unit and Larger Context

Brief summary

✒️ -> Scratch Notes

Optimization

Gradient descent

  • Restrict to a convex function, so the method is guaranteed to reach the optimum
  • Compute the gradient at the current location
  • Take a step down the gradient (in the direction of steepest descent)
  • Repeat until convergence
  • For a convex function, a local optimum is the global optimum, and it occurs where the derivative == 0
    The step size (alpha) is very important
  • If the step is too big, the update overshoots the minimum and can diverge; if it is too small, convergence is slow (see the sketch below)
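A minimal sketch of the loop above, assuming Python/NumPy: each iteration applies x <- x - alpha * grad(x) until the gradient is near zero. The convex function f(x) = (x - 3)^2, the names gradient_descent and grad_f, and the parameters alpha, n_steps, and tol are illustrative assumptions, not from the lecture.

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, n_steps=100, tol=1e-8):
    """Repeatedly step down the gradient: x <- x - alpha * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = grad(x)
        # For a convex f, gradient ~ 0 means we are at the global optimum.
        if np.linalg.norm(g) < tol:
            break
        x = x - alpha * g  # take a step down the gradient
    return x

# Illustrative convex function f(x) = (x - 3)^2 with gradient 2 * (x - 3).
grad_f = lambda x: 2.0 * (x - 3.0)
print(gradient_descent(grad_f, x0=[0.0], alpha=0.1))  # converges near 3
```

With alpha = 0.1 the iterates converge to the minimum at x = 3; with a step like alpha = 1.5 each update overshoots and the iterates diverge, which is why the step size matters.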

🧪 -> Example

  • List examples of how the entry's contents fit into a larger context

Resources

  • Put useful links here

Connections

  • Link all related words