10/02/25: ECS171-L2
Lecture 2 slides
There is also a lecture 3 slideshow that hasn't been posted yet
Vocab
Unit and Larger Context
Small summary
Scratch Notes
Starting from: Solving the problem: Gradient Descent (Method 2) (page 17)
Parameters are updated according to the Least Mean Squares (LMS) update rule:
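As usually stated (a sketch in standard notation, not copied from the slides: learning rate α, hypothesis h_θ, training sample i):

```latex
\theta_j := \theta_j + \alpha \left( y^{(i)} - h_\theta(x^{(i)}) \right) x_j^{(i)}
```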
If we have m samples:
- Batch gradient descent
- Averages the gradient over all m samples ("watered down"), so each step is smooth but expensive
- Stochastic gradient descent
- Updates on one sample at a time, so steps can be disparate/"random" (noisy)
Sweet spot is mini-batch gradient descent (update on small random subsets)
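A minimal sketch of the three variants on a toy one-parameter model y ≈ w·x (all names and data here are illustrative, not from the slides):

```python
import random

# Toy data generated from y = 2x, so the optimal weight is w = 2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def grad(w, batch):
    """Gradient of mean squared error (1/n) * sum (w*x - y)^2 over a batch."""
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def train(w, lr, batch_size, steps, seed=0):
    rng = random.Random(seed)
    data = list(zip(xs, ys))
    for _ in range(steps):
        batch = rng.sample(data, batch_size)  # random subset each step
        w -= lr * grad(w, batch)              # gradient step on that subset
    return w

w_batch = train(0.0, lr=0.05, batch_size=4, steps=200)  # batch GD: uses everything
w_sgd   = train(0.0, lr=0.05, batch_size=1, steps=200)  # stochastic: one sample
w_mini  = train(0.0, lr=0.05, batch_size=2, steps=200)  # mini-batch: the sweet spot
```

`batch_size` interpolates between the two extremes: `len(data)` gives batch gradient descent, `1` gives SGD, and anything in between is mini-batch.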
Bootstrapping - sampling with replacement (typically used when there isn't enough data; creates a new data set)
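Sampling with replacement is a one-liner; a sketch (the helper name is my own):

```python
import random

def bootstrap_sample(data, rng=None):
    """Draw len(data) items *with replacement*: duplicates are likely,
    and some original items are left out of the new data set."""
    rng = rng or random.Random()
    return [rng.choice(data) for _ in data]

original = [10, 20, 30, 40, 50]
resampled = bootstrap_sample(original, random.Random(42))
# Same size as the original, every value drawn from it, repeats allowed.
```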
Onto Lecture 3
Regression and Polynomial Fitting
Finding sweet spot between data complexity and model complexity
- (how many model parameters do we want? Depends on the underlying data function)
- The lecture's example: data generated by a third-degree function; training error is minimized by the M=9 model, while test error is minimized by the M=3/4/5 models.
- However, if N is big enough even the M=9 model fits nearly perfectly (with N=100 instead of N=10, it fits the baseline function nearly perfectly)
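Worth noting: a degree-M polynomial model is still linear regression, just on the expanded feature vector (1, x, …, x^M):

```latex
\hat{y}(x) = \sum_{j=0}^{M} w_j\, x^j = w^\top \phi(x),
\qquad \phi(x) = (1, x, x^2, \dots, x^M)
```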
Linear Regression
Two solutions for linear regression:
Ordinary Least Squares:
Gradient descent
See the equation we wrote above: update weights by a learning-rate step proportional to the error
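For the one-feature case, the ordinary-least-squares solution has a closed form (slope = covariance/variance); a sketch in pure Python, not the slides' derivation:

```python
def ols_fit(xs, ys):
    """Closed-form OLS for y ~ w*x + b (simple linear regression)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = sample covariance(x, y) / sample variance(x)
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - w * mx  # intercept makes the line pass through the mean point
    return w, b

# Data generated exactly from y = 2x + 1, so OLS recovers w=2, b=1.
w, b = ols_fit([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```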
Regularization
A method to shrink weights toward zero (L1 in particular reduces the number of non-zero weights), limiting model complexity
Ridge Regression
L2 regularization
Lasso Regression
L1 regularization
Elastic Net Regression
Combines the L1 and L2 regularization penalties
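The three objectives differ only in the penalty added to the squared-error loss (λ here is assumed to denote the regularization strength):

```latex
J_{\text{ridge}}(w)   = \sum_{i}\big(y^{(i)} - w^\top x^{(i)}\big)^2 + \lambda \lVert w \rVert_2^2
J_{\text{lasso}}(w)   = \sum_{i}\big(y^{(i)} - w^\top x^{(i)}\big)^2 + \lambda \lVert w \rVert_1
J_{\text{elastic}}(w) = \sum_{i}\big(y^{(i)} - w^\top x^{(i)}\big)^2 + \lambda_1 \lVert w \rVert_1 + \lambda_2 \lVert w \rVert_2^2
```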
Conditional Probabilities
Assume the error is normally distributed:
…
Maximum Likelihood Estimator (MLE)
"Increase the likelihood (probability) that we observe the data given the model"
Maximizing the probability of observing Y given X
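Assuming y = wᵀx + ε with ε ~ N(0, σ²) (the standard setup, stated here as a sketch), the log-likelihood separates so that maximizing it is exactly minimizing the squared error:

```latex
\ell(w) = \sum_{i=1}^{m} \log \frac{1}{\sqrt{2\pi}\,\sigma}
          \exp\!\left(-\frac{\big(y^{(i)} - w^\top x^{(i)}\big)^2}{2\sigma^2}\right)
        = m \log \frac{1}{\sqrt{2\pi}\,\sigma}
          - \frac{1}{2\sigma^2} \sum_{i=1}^{m} \big(y^{(i)} - w^\top x^{(i)}\big)^2
```

The first term is constant in w, so the MLE is the least-squares solution.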
Refresh the Info
Did you generally find the overall content understandable, compelling, or relevant (or not), and why? Which aspects of the reading were most novel or challenging for you, and which were most familiar or straightforward?
Did a specific aspect of the reading raise questions for you, or relate to other ideas and findings you've encountered? Are there other related issues you wish had been covered?
Links
Resources
- Put useful links here
Connections
- Link all related words