Lecture Date: Name
[Lecture Slide Link]
Vocab
Unit and Larger Context
Small summary
Scratch Notes
Optimization
Gradient descent
- Restricted here to convex functions
- Compute gradient at the current location
- Take a step down the gradient
- Repeat
- For a convex function, any local optimum is also the global optimum, and it is reached where the derivative (gradient) equals 0
Step size (alpha) is very important: if the step is too big, the update overshoots the minimum and can diverge; if it is too small, convergence is slow (see the sketch below)
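A minimal sketch of the update rule x_new = x - alpha * gradient, assuming a simple convex objective f(x) = x^2; the step size, starting point, and iteration count are illustrative choices, not values from the lecture:

```python
# Minimal gradient descent sketch on the convex function f(x) = x^2.
# alpha, x0, steps, and tol are illustrative assumptions, not lecture values.

def grad(x):
    """Gradient (derivative) of f(x) = x^2."""
    return 2 * x

def gradient_descent(x0, alpha=0.1, steps=100, tol=1e-8):
    x = x0
    for _ in range(steps):
        g = grad(x)           # compute gradient at the current location
        if abs(g) < tol:      # derivative ~ 0: at the (global) optimum
            break
        x = x - alpha * g     # take a step down the gradient
    return x

print(gradient_descent(x0=5.0))  # converges toward 0, the minimizer of x^2
```

For this particular f, the update is x_new = (1 - 2*alpha) * x, so any alpha above 1.0 makes |1 - 2*alpha| > 1 and the iterates oscillate and blow up, which illustrates the step-size warning above.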
Example
- List examples of where entry contents can fit in a larger context
Links
Resources
- Put useful links here
Connections
- Link all related words