πŸ“— -> 10/02/25: ECS171-L2


Lecture 2 slides
Also a lecture 3 slideshow that hasn't been posted yet

🎀 Vocab

❗ Unit and Larger Context

Small summary

βœ’οΈ -> Scratch Notes

Starting from: Solving the problem: Gradient Descent (Method 2) (page 17)
Parameters are updated according to the:

  1. Least Mean Squares (LMS) update rule:
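The standard form of this update, for a single sample $(x^{(i)}, y^{(i)})$, hypothesis $h_\theta(x) = \theta^T x$, and learning rate $\alpha$ (the usual textbook notation; the slides may write it differently):

$$\theta_j := \theta_j + \alpha \left( y^{(i)} - h_\theta(x^{(i)}) \right) x_j^{(i)}$$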

If we have m samples:

  1. Batch gradient descent
    β€’ Smooth but slow: every update takes all m samples into account
  2. Stochastic gradient descent
    β€’ Noisy: updates can be disparate, β€˜random’, since each step uses a single sample

The sweet spot is mini-batch gradient descent, where each update uses a small random subset of the samples (a sketch of all three variants follows).
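A minimal sketch of all three variants for linear regression with squared error, assuming NumPy; the names here (gradient, batch_gd, sgd, minibatch_gd, lr, n_epochs) are mine, not the lecture's:

```python
import numpy as np

def gradient(theta, X, y):
    # Gradient of (1/2) * sum((X @ theta - y)^2) with respect to theta
    return X.T @ (X @ theta - y)

def batch_gd(X, y, lr=0.01, n_epochs=100):
    theta = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        theta -= lr * gradient(theta, X, y) / len(y)   # all m samples per step
    return theta

def sgd(X, y, lr=0.01, n_epochs=100, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        for i in rng.permutation(len(y)):              # one sample per step
            theta -= lr * gradient(theta, X[i:i+1], y[i:i+1])
    return theta

def minibatch_gd(X, y, lr=0.01, n_epochs=100, batch_size=32, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        idx = rng.permutation(len(y))
        for start in range(0, len(y), batch_size):     # small random subsets
            b = idx[start:start + batch_size]
            theta -= lr * gradient(theta, X[b], y[b]) / len(b)
    return theta
```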

Bootstrapping - sampling with replacement (only done when not enough data, creates new data set)
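A quick NumPy sketch of a bootstrap sample (the data values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.array([2.1, 3.5, 1.8, 4.2, 2.9])

# Sampling *with* replacement: the same point can appear more than once,
# producing a "new" dataset of the same size as the original.
bootstrap_sample = rng.choice(data, size=len(data), replace=True)
print(bootstrap_sample)  # e.g. [3.5 2.1 2.1 4.2 2.9]
```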


Onto Lecture 3

Regression and Polynomial Fitting

Finding the sweet spot between data complexity and model complexity

  • (how many model parameters do we want? Depends on the underlying data function)
    • You’re example of data generated by third degree function, training error minimized by M=9 model, test error minimized by M=3/4/5 model.
    • However, if N is big enough even the M=9 model fits nearly perfectly (with N=100 instead of N=10, it fits the baseline function nearly perfectly)

Linear Regression

Two solutions for linear regression:

  1. Ordinary Least Squares (closed-form solution)
  2. Gradient descent

See the update equation we wrote above: weights are adjusted in proportion to the error, scaled by the learning rate (a sketch of the closed-form OLS alternative is below).
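For the closed-form side, a minimal sketch of OLS via the normal equations ($\theta = (X^T X)^{-1} X^T y$), using NumPy's least-squares solver; the toy data is mine:

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(50), rng.normal(size=50)])  # bias column + feature
y = 3.0 + 2.0 * X[:, 1] + 0.1 * rng.normal(size=50)      # y = 3 + 2x + noise

# lstsq solves the least-squares problem in a numerically stable way
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta)  # approximately [3.0, 2.0]
```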

Regularization

A method that adds a penalty on the weights to combat overfitting (L1 regularization in particular reduces the number of non-zero weights)

Ridge Regression

L2 regularization

Lasso Regression

L1 regularization

Elastic Net Regression

Combines the L1 and L2 penalties
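A quick sketch of all three with scikit-learn; the alpha and l1_ratio values are arbitrary placeholders, not values from the lecture:

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, -1.0, 0.0, 0.5]) + 0.1 * rng.normal(size=100)

# alpha sets penalty strength; l1_ratio mixes L1 vs L2 for Elastic Net
models = {
    "ridge (L2)": Ridge(alpha=1.0),
    "lasso (L1)": Lasso(alpha=0.1),
    "elastic net (L1+L2)": ElasticNet(alpha=0.1, l1_ratio=0.5),
}
for name, model in models.items():
    model.fit(X, y)
    print(name, np.round(model.coef_, 2))  # L1 penalties zero out some weights
```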

Conditional Probabilities

Assume the error is normally distributed:

…
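The usual formulation (standard textbook setup; the slide's notation may differ):

$$y^{(i)} = \theta^T x^{(i)} + \epsilon^{(i)}, \qquad \epsilon^{(i)} \sim \mathcal{N}(0, \sigma^2)$$

which gives the conditional probability

$$p(y^{(i)} \mid x^{(i)}; \theta) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left( -\frac{\left( y^{(i)} - \theta^T x^{(i)} \right)^2}{2\sigma^2} \right)$$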

Maximum Likelihood Estimator (MLE)

β€œincrease the likelihood (prob) that we observe the data given the model”

i.e., choose the model parameters that maximize the probability of the observed Y given X
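Under the normal-error assumption above, the log-likelihood of the data works out to (standard derivation):

$$\ell(\theta) = \sum_{i=1}^{m} \log p(y^{(i)} \mid x^{(i)}; \theta) = m \log \frac{1}{\sqrt{2\pi}\,\sigma} - \frac{1}{2\sigma^2} \sum_{i=1}^{m} \left( y^{(i)} - \theta^T x^{(i)} \right)^2$$

so maximizing $\ell(\theta)$ is exactly minimizing the sum of squared errors: the MLE for $\theta$ is the least-squares solution.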

πŸ§ͺ -> Refresh the Info

Did you generally find the overall content understandable, compelling, or relevant (or not), and why? Which aspects of the reading were most novel or challenging for you, and which were most familiar or straightforward?

Did a specific aspect of the reading raise questions for you or relate to other ideas and findings you’ve encountered? Are there other related issues you wish had been covered?

Resources

  • Put useful links here

Connections

  • Link all related words