📗 -> 05/21/25: ECS170-L23
🎤 Vocab
❗ Unit and Larger Context
came in 30 mins late…
HMM Problems
- Likelihood - given the model and an observation sequence, how likely are the observations? (forward algorithm)
- Decoding - given the model and an observation sequence, what is the most likely sequence of hidden states? (Viterbi algorithm)
- Learning - finding the underlying HMM model from observations
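To make the three problems concrete, here is a minimal sketch of an HMM in Python. The states, observation symbols, and probabilities are a made-up toy example (not from lecture), just to give the later sketches something to run on:

```python
import numpy as np

# Hypothetical 2-state HMM (toy values, for illustration only)
states = ["Rainy", "Sunny"]           # hidden states
symbols = ["walk", "shop", "clean"]   # observable emissions

pi = np.array([0.6, 0.4])             # initial probability P(state_0)
A = np.array([[0.7, 0.3],             # transition probability P(s_t | s_{t-1})
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],        # emission probability P(obs | state)
              [0.6, 0.3, 0.1]])

obs = [0, 1, 2]                       # observed sequence: walk, shop, clean
```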
✒️ -> Scratch Notes
Viterbi Algorithm vs Forward Algorithm
Identical, except:
- Viterbi takes the max over the previous path probabilities, whereas the forward algorithm takes the sum.
- Viterbi algorithm has one component that the forward algorithm doesn’t have: backpointers.
- The forward algorithm only needs to produce an observation likelihood
- The Viterbi algorithm must produce both a probability and the most likely state sequence
- We compute this best state sequence by keeping track of the path of hidden states that led to each state, and then, at the end, backtracing the best path to the beginning (the Viterbi backtrace)
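A side-by-side sketch of the two recurrences, reusing the toy pi, A, B, and obs defined above (my own illustration, not lecture code): the forward algorithm sums over the previous path probabilities, while Viterbi takes the max and records a backpointer so the best state sequence can be recovered by backtracing.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Observation likelihood P(obs): SUM over all hidden-state paths."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]           # sum over previous states
    return alpha.sum()

def viterbi(pi, A, B, obs):
    """Probability of the best path AND the most likely state sequence."""
    v = pi * B[:, obs[0]]
    backptrs = []
    for o in obs[1:]:
        scores = v[:, None] * A                 # score of each (prev -> next) pair
        backptrs.append(scores.argmax(axis=0))  # backpointer: best previous state
        v = scores.max(axis=0) * B[:, o]        # MAX over previous states
    # Viterbi backtrace: follow backpointers from the best final state.
    path = [int(v.argmax())]
    for bp in reversed(backptrs):
        path.append(int(bp[path[-1]]))
    return v.max(), path[::-1]
```

`forward(pi, A, B, obs)` returns a single likelihood; `viterbi(pi, A, B, obs)` returns the best-path probability plus the decoded sequence of state indices.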
Problem 3. Learning
How do we find the underlying HMM model (initial, transition, and emission probabilities) that fits the data?
- Use the forward-backward algorithm, also called the Baum-Welch algorithm
- A special case of the expectation-maximization (EM) algorithm: an iterative algorithm that slowly improves the learned probabilities
- Each iteration raises the likelihood of the data, so it converges to a local maximum likelihood estimate (MLE)
Picture taken ->
Forward-Backward Algorithm
- Initialize A (transition probabilities) and B (emission probabilities)
- Iterate until convergence:
  - E-Step (expectation): compute the expected state occupancies and transition counts under the current A and B
  - M-Step (maximization): apply the expected counts calculated in the E-step to re-estimate A and B
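A compact sketch of one Baum-Welch iteration for the toy HMM above (didactic only: no numerical scaling and a single observation sequence, both of which real implementations need to handle):

```python
import numpy as np

def baum_welch_step(pi, A, B, obs):
    """One EM iteration: E-step computes expected counts, M-step re-estimates."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)

    # E-step: forward pass, alpha[t, i] = P(o_1..o_t, state_t = i)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # ...and backward pass, beta[t, i] = P(o_{t+1}..o_T | state_t = i)
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    likelihood = alpha[-1].sum()

    # Expected state occupancies gamma[t, i] and transitions xi[t, i, j]
    gamma = alpha * beta / likelihood
    xi = (alpha[:-1, :, None] * A[None, :, :] *
          (B[:, obs[1:]].T * beta[1:])[:, None, :]) / likelihood

    # M-step: re-estimate pi, A, B from the expected counts
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):
        new_B[:, k] = gamma[obs == k].sum(axis=0)
    new_B /= gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_B, likelihood
```

Looping this step until the likelihood stops improving is the "iterate until convergence" part; each iteration can only raise the likelihood, which is why Baum-Welch reaches a local MLE rather than a guaranteed global one.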
🧪 -> Refresh the Info
Did you generally find the overall content understandable, compelling, or relevant (or not), and why? Which aspects were most novel or challenging for you, and which were most familiar or straightforward?
Did a specific aspect raise questions for you or relate to other ideas and findings you've encountered? Are there other related issues you wish had been covered?
🔗 -> Links
Resources
- Put useful links here
Connections
- Link all related words