09/26/25: NPB287A-2-L1
[Lecture Slide Link]
Vocab
Unit and Larger Context
Mehlig, Machine Learning with Neural Networks - link
- Inspo from: Introduction to the Theory of Neural Computation
Scratch Notes
Mon: Draft pres and problem set
Tues: Meet with prof
Wed-Thurs: Keep working w group
Friday: Lead class discussions
Tips:
- Start big picture
- For each topic:
  - Give both a qualitative and a quantitative realization of the idea, and show the connection
  - Simulations/demos are helpful
  - Make sure to connect them to the ideas, not just do demos for their own sake
Intro
ANNs vs real NNs
- recurrent neural networks, not modular layers
Neurons
- Real: Have voltage traces, noisy and with spikes
- Artificial: Represented by a binary state (active/inactive) or a firing rate (a continuous number modeling the # of spikes in a window of time)
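A minimal sketch of the two artificial representations above, with toy numbers (the threshold, weights, and spike times are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Binary unit: active (1) or inactive (0) depending on a threshold
def binary_unit(inputs, weights, threshold=0.0):
    return 1 if weights @ inputs > threshold else 0

# Rate description: count spikes in a window of time (toy spike train over 1 s)
spike_times = rng.uniform(0.0, 1.0, size=40)
window = (0.2, 0.4)
n_spikes = np.sum((spike_times >= window[0]) & (spike_times < window[1]))
rate = n_spikes / (window[1] - window[0])  # spikes per second in the window

print(binary_unit(np.array([1.0, 0.5]), np.array([0.6, -0.2])))  # 1, since 0.5 > 0
print(rate)
```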
LinAlg
Product of vectors:
- Element by element
  - Hadamard product (least common)
  - `.*` in MATLAB
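A quick sketch of the element-wise (Hadamard) product in NumPy, the analogue of MATLAB's `.*`:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Element-wise (Hadamard) product: multiply matching entries
hadamard = a * b  # the NumPy analogue of a .* b in MATLAB
print(hadamard)   # [ 4. 10. 18.]
```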
- Inner product
  - Dot product: shapes must be 1xN and Nx1. The INNER dimensions (N) must agree, and the result is 1x1 (the outer dimensions)
  - Intuition: the "overlap" of two vectors (the projection of one vector on another); it collapses two similar vectors into a single scalar
  - Linear feed-forward network: r is the vector of input firing rates, w is the vector of weights; the activation of an output neuron is y = w . r
    - Intuition: output gets bigger as r and w get bigger, but ALSO gets bigger when the rates and weights are parallel/aligned
    - Receptive fields can be thought of this way
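The feed-forward dot-product picture can be sketched in a few lines (the particular weights and rates here are made up for illustration):

```python
import numpy as np

w = np.array([1.0, 0.5, -0.2])  # weights onto one output neuron
r = np.array([2.0, 1.0, 0.0])   # input firing rates

# Activation of the output neuron is the inner product w . r
y = w @ r
print(y)  # 1*2 + 0.5*1 + (-0.2)*0 = 2.5

# Alignment matters: for the same input norm, rates parallel to w
# drive the neuron at least as hard as any misaligned input
r_aligned = np.linalg.norm(r) * w / np.linalg.norm(w)
print(w @ r_aligned >= y)  # True, by Cauchy-Schwarz
```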
- Outer product
  - Hopfield network weight matrices are built from outer products
  - OUTER dimensions must match: Nx1 by 1xM -> NxM matrix
  - Each column/row is a multiple of the others
  - One vector direction, with a different scaling in each of the columns (the same applies for rows)
  - Gives a rank-1 matrix (SVD is built on this)
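A sketch of the outer product and its rank-1 structure, plus a Hopfield-style weight matrix built as a sum of outer products of +/-1 patterns (the patterns are toy examples; zeroing the diagonal is the usual no-self-connection convention):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([1.0, -1.0])

# Outer product: (3x1)(1x2) -> 3x2; every column is a multiple of u
M = np.outer(u, v)
print(M)
# [[ 1. -1.]
#  [ 2. -2.]
#  [ 3. -3.]]
print(np.linalg.matrix_rank(M))  # 1

# Hopfield-style weights: sum of outer products of stored patterns,
# diagonal zeroed so neurons do not connect to themselves
patterns = np.array([[1, -1, 1], [-1, -1, 1]], dtype=float)
W = sum(np.outer(p, p) for p in patterns)
np.fill_diagonal(W, 0.0)
print(np.allclose(W, W.T))  # True: the weight matrix is symmetric
```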
- Matrix times a vector (or matrix)
  - Row picture: y1 is the inner product of the first row of W with x
  - Column picture (outer-product view): y is a weighted sum of the columns of W, with the entries of x as the weights
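The row picture and the column picture of y = Wx give the same answer; a small sketch with toy numbers:

```python
import numpy as np

W = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

# Row picture: each output y_i is the inner product of row i of W with x
y_rows = np.array([W[i] @ x for i in range(W.shape[0])])

# Column picture: y is a weighted sum of the columns of W,
# with the entries of x as the weights
y_cols = x[0] * W[:, 0] + x[1] * W[:, 1]

print(y_rows)                       # [17. 39.]
print(np.allclose(y_rows, y_cols))  # True
```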
Refresh the Info
Did you generally find the overall content understandable, compelling, or relevant (or not), and why? Which aspects of the reading were most novel or challenging for you, and which were most familiar or straightforward?
Did a specific aspect of the reading raise questions for you, or relate to other ideas and findings you've encountered? Are there other related issues you wish had been covered?
Links
Resources
- Put useful links here
Connections
- Link all related words