📗 -> 09/26/25: NPB287A-2-L1


[Lecture Slide Link]

🎤 Vocab

❗ Unit and Larger Context

Mehlig, Machine Learning with Neural Networks - link

  • Inspo from: Introduction to the Theory of Neural Computation

✒️ -> Scratch Notes

Mon: Draft pres and problem set
Tues: Meet with prof
Wed-Thurs: Keep working w group
Friday: Lead class discussions

Tips:

  1. Start big picture
  2. Foreach topic:
    1. Give both a qualitative and a quantitative realization of the idea, and show the connection
  3. Simulations/demos are helpful
    1. Make sure to connect them to the concepts, not just run them for the demo's sake

Intro

ANNs vs real NNs

  • recurrent neural networks, not modular layers

Neurons

  • Real: Have voltage traces, noisy and with spikes
  • Artificial: Represented by binary (active/inactive) or firing rate (continuous number, modeling # of spikes in window of time)
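The two artificial abstractions above can be sketched in a few lines of NumPy (a hedged sketch; the spike probability, window length, and threshold are illustrative choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated spike train: 1 ms bins over a 100 ms window (illustrative numbers)
spikes = rng.random(100) < 0.05      # ~5% chance of a spike per bin

# Firing-rate abstraction: count spikes in the window, convert to spikes/second
rate = spikes.sum() / 0.1            # spikes per second over the 100 ms window

# Binary abstraction: active/inactive relative to some threshold (illustrative)
threshold_hz = 20.0
active = 1 if rate > threshold_hz else 0
```

Both abstractions discard the voltage trace entirely; the rate version keeps a graded number, the binary version keeps only one bit.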

LinAlg

Product of vectors:

  1. Element by element
    1. (Least common.) Hadamard product; .* in Matlab
  2. Inner product
    1. Dot product: collapses two vectors into a single scalar
    2. Must be 1xN and Nx1. The INNER dimensions (N) must agree, and the result has the outer dimensions, 1x1 (a scalar)
    3. Intuition: the "overlap" of 2 vectors (the projection of one vector onto another)
    4. Linear feed-forward network: r is the firing-rate vector, w is the weight vector. Activation of the output neuron is y = w · r
      1. Intuition: output gets bigger as r and w get bigger, but ALSO gets bigger when rates and weights are parallel/aligned
        1. Receptive fields can be thought of this way
  3. Outer product
    1. The Hopfield network's weight matrix is built from outer products
    2. OUTER dimensions must match: Nx1 by 1xM -> NxM matrix
      1. Each column/row is a multiple of the others
      2. 1 vector direction, with different scalings in each of the columns (same thing applies for rows)
      3. Gives rank 1 matrix (SVD is built on this)
  4. Matrix by a matrix
    1. Row view: y1 is the inner product of the first row of W with x (and so on for each row)
    2. There is also an outer-product (column) view: column 1 of W times x1, plus column 2 times x2, etc.
      1. A weighted sum of the columns of W, weighted by the elements of x
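The four products above can be checked side by side in NumPy (a sketch with small illustrative vectors; the variable names echo the notes, where r is a rate vector and w a weight vector):

```python
import numpy as np

r = np.array([1.0, 2.0, 3.0])   # e.g. firing rates
w = np.array([0.5, -1.0, 2.0])  # e.g. weights

# 1. Element-by-element (Hadamard) product: .* in Matlab, plain * in NumPy
hadamard = r * w                # [0.5, -2.0, 6.0]

# 2. Inner product: collapses two vectors into one scalar (their "overlap")
y = w @ r                       # 0.5*1 + (-1)*2 + 2*3 = 4.5

# 3. Outer product: (N x 1)(1 x M) -> N x M, always rank 1
M = np.outer(r, w)              # each column is a scalar multiple of r
assert np.linalg.matrix_rank(M) == 1

# 4. Matrix-vector product, two equivalent readings:
W = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])
x = np.array([1.0, 2.0, 3.0])
# (a) row view: each y_i is the inner product of row i of W with x
y_rows = np.array([W[i] @ x for i in range(W.shape[0])])
# (b) column view: weighted sum of the columns of W, weights x_j
y_cols = sum(x[j] * W[:, j] for j in range(W.shape[1]))
assert np.allclose(W @ x, y_rows) and np.allclose(W @ x, y_cols)
```

The two assertions make the notes' point concrete: an outer product is always rank 1, and the row and column interpretations of W @ x give the same answer.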

🧪 -> Refresh the Info

Did you generally find the overall content understandable, compelling, or relevant (or not), and why? Which aspects of the reading were most novel or challenging for you, and which were most familiar or straightforward?

Did a specific aspect of the reading raise questions for you or relate to other ideas and findings you've encountered? Are there other related issues you wish had been covered?

Resources

  • Put useful links here

Connections

  • Link all related words