11/1: Neural Network Functions
[Lecture Slide Link]
Vocab
Max Pool - Selecting the biggest value in a neighborhood as a representative, reducing the size of the output
- "Pooling" the values together and picking the max (see the sketch after this vocab list)
- Average pooling is also popular
One-Hot Encoding - Representing each label as a vector with exactly one active entry. In a perfect world the output layer matches this: only the correct class fires (e.g., cat active; dog and squirrel inactive).
Softmax - An activation function (like sigmoid or ReLU), typically applied at the output layer; it turns raw scores into probabilities that sum to 1 (sketch below)
Multi-Objective Optimization - Optimizing more than one objective at the same time (a more advanced topic)
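A minimal NumPy sketch of max pooling; the 4x4 input, 2x2 window, and stride 2 are illustrative choices, not the lecture's numbers:

```python
import numpy as np

def max_pool_2d(x, size=2, stride=2):
    """Slide a window over x and keep the largest value in each
    neighborhood, shrinking the output (max pooling)."""
    h, w = x.shape
    out_h = (h - size) // stride + 1
    out_w = (w - size) // stride + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            window = x[i*stride:i*stride + size, j*stride:j*stride + size]
            out[i, j] = window.max()  # .mean() here instead gives average pooling
    return out

x = np.array([[1, 3, 2, 4],
              [5, 6, 1, 2],
              [7, 2, 9, 1],
              [3, 4, 5, 6]], dtype=float)
print(max_pool_2d(x))  # [[6. 4.]
                       #  [7. 9.]]
```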
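And a sketch of softmax itself; the three scores standing in for cat/dog/squirrel are made up:

```python
import numpy as np

def softmax(z):
    """Exponentiate the scores and normalize so they sum to 1.
    Subtracting the max first is a standard numerical-stability trick."""
    e = np.exp(z - z.max())
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # raw output scores for cat, dog, squirrel
probs = softmax(scores)
print(probs)        # ~[0.659 0.242 0.099]
print(probs.sum())  # 1.0
```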
Unit and Larger Context
Conv Nets
Convolution -> Pooling -> Convolution -> Pooling -> Fully connected -> Fully Connected -> Output Predictions
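A minimal PyTorch sketch of that pipeline; the filter counts, kernel sizes, 32x32 input, and 3-class output (cat/dog/squirrel) are assumptions, not numbers from the slides:

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolution
    nn.ReLU(),
    nn.MaxPool2d(2),                              # pooling
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # convolution
    nn.ReLU(),
    nn.MaxPool2d(2),                              # pooling
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 128),                   # fully connected (assumes 32x32 input)
    nn.ReLU(),
    nn.Linear(128, 3),                            # fully connected -> output predictions
)
```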
Scratch Notes
ReLU makes things non-negative: ReLU(x) = max(0, x)
Loss: comparing the ground truth with the output.
If we're assuming one-hot encoding for the ground truth, the loss collapses to a single term per example.
The loss for a mini-batch of size N averages the per-example losses.
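A reconstruction of those loss equations, assuming the standard cross-entropy loss (the usual pairing with softmax outputs); here $\hat{y}$ is the network output and $C$ the number of classes:

```latex
% Cross-entropy between ground truth y and output \hat{y} over C classes:
L(y, \hat{y}) = -\sum_{c=1}^{C} y_c \log \hat{y}_c

% One-hot ground truth (y_k = 1 for the true class k, 0 elsewhere)
% leaves a single surviving term:
L(y, \hat{y}) = -\log \hat{y}_k

% Mini-batch of size N: average the per-example losses
L = -\frac{1}{N} \sum_{n=1}^{N} \log \hat{y}^{(n)}_{k_n}
```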
AlexNet, presented at NIPS 2012, was one of the biggest strides for CNNs, and for neural networks period. They showed that just expanding the number of filters, layers, and data (and having access to a GPU) helped the network scale very well.
Krizhevsky, Sutskever, and Hinton, "ImageNet Classification with Deep Convolutional Neural Networks," NIPS 2012