📗 -> 04/18/25: ECS189G-L9


sec_5_deep_learning_basic

🎀 Vocab

❗ Unit and Larger Context

Section 5 - DL Basic Summary

Background Knowledge

  • What is deep learning?
  • Why do we need deep learning?
  • A brief history of deep learning
  • What makes deep learning work?

Technical Details

  • Biological Neuron vs Artificial Neuron
  • Perceptron and its Weakness
  • Multi-layer perceptron and applications
  • How to train an MLP
  • Error Backpropagation Algorithm

✒️ -> Scratch Notes

Artificial Neurons

Artificial Neuron: McCulloch-Pitts (MP) Neuron

  • Input: signals x_1, ..., x_n received from n other neurons
  • Connection Weight: w_i on the connection from input neuron i
  • Inherent Activating Threshold of the current neuron: θ
  • Formally: the net input is z = Σ w_i x_i − θ
  • Output of the neuron will be y = f(z), where f is the activation function (see the sketch below)
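
A minimal sketch of an MP neuron in plain Python (the weights, threshold, and AND example are illustrative choices, not from the lecture):

# McCulloch-Pitts neuron: weighted sum of inputs minus a threshold,
# passed through a binary step activation.
def mp_neuron(x, w, theta):
    z = sum(wi * xi for wi, xi in zip(w, x)) - theta  # net input z
    return 1 if z > 0 else 0                          # binary step f(z)

# Example: with these (hypothetical) values the neuron computes logical AND
print(mp_neuron([1, 1], w=[1, 1], theta=1.5))  # -> 1
print(mp_neuron([1, 0], w=[1, 1], theta=1.5))  # -> 0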

Activation Functions:

  • Binary step function - f(z) = 1 if z > 0 else 0
  • Sigmoid Function - f(z) = 1 / (1 + e^(-z))
    • Range from 0 to 1. f(0) = 0.5
  • Tanh - f(z) = (e^z − e^(-z)) / (e^z + e^(-z))
    • Between -1 and 1. Smooth, f(0) = 0
  • ReLU (Rectified Linear Unit) - f(z) = max(0, z)
    • Outputs zero for z ≤ 0, and z itself otherwise
  • Softplus - f(z) = ln(1 + e^z)
    • A smooth approximation of ReLU that stays positive even when z < 0, so the unit can still be weakly activated there
      Each of these has an easy derivative calculation, except the binary step and ReLU, which are not differentiable at z = 0 (see the code sketch below)
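
A quick sketch of these activations in plain Python (nothing here comes from the lecture beyond the formulas above; the test values are my own):

import math

def binary_step(z): return 1.0 if z > 0 else 0.0
def sigmoid(z):     return 1.0 / (1.0 + math.exp(-z))
def relu(z):        return max(0.0, z)
def softplus(z):    return math.log(1.0 + math.exp(z))

print(sigmoid(0))         # 0.5 - midpoint of the (0, 1) range
print(math.tanh(0))       # 0.0 - midpoint of the (-1, 1) range
print(relu(-3), relu(3))  # 0.0 3 - zero or z
print(softplus(-2))       # ~0.127 - still positive below zero, unlike ReLU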

Perceptron and its weakness

A single-layer perceptron can only solve problems where the data is linearly separable.
Because of this, it can't handle datasets like XOR: no single line separates the positive points (0,1) and (1,0) from the negative points (0,0) and (1,1).

  • Takeaway: XOR is the classic weakness of single-layer perceptron models.

However, adding a hidden layer allows the network to learn complex nonlinear relationships, overcoming the XOR problem (see the sketch below).

# XOR Network (hidden layer only, sketched in PyTorch)

import torch
from torch import nn

x = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # the four XOR inputs
layer = nn.Linear(2, 2, bias=True)  # hidden layer: 2 inputs -> 2 hidden units
act_func = nn.Sigmoid()             # nonlinearity (sigmoid is one common choice)
h = act_func(layer(x))              # hidden representation, shape (4, 2)

How to Train an MLP

Backpropagation: run a forward pass to compute the loss, then apply the chain rule to propagate error gradients backward through the layers, and update the weights by gradient descent (sketched below).
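
A minimal training sketch in PyTorch, continuing the XOR example above (the 2-2-1 architecture, learning rate, and epoch count are illustrative choices, not from the lecture; a bad random initialization can occasionally get stuck, so it may need a rerun):

import torch
from torch import nn

x = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])  # XOR labels

# 2 -> 2 -> 1 MLP with sigmoid activations
model = nn.Sequential(nn.Linear(2, 2), nn.Sigmoid(),
                      nn.Linear(2, 1), nn.Sigmoid())
loss_fn = nn.BCELoss()
opt = torch.optim.SGD(model.parameters(), lr=1.0)

for epoch in range(5000):
    opt.zero_grad()              # clear gradients from the last step
    loss = loss_fn(model(x), y)  # forward pass + loss
    loss.backward()              # backpropagation via the chain rule
    opt.step()                   # gradient-descent weight update

print(model(x).round())          # ideally approximates [[0], [1], [1], [0]]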

🧪 -> Refresh the Info

Did you find the overall content understandable, compelling, or relevant, and why? Which aspects of the reading were most novel or challenging for you, and which were most familiar or straightforward?

Did a specific aspect of the reading raise questions for you or relate to other ideas and findings you've encountered? Are there other related issues you wish had been covered?

Resources

  • Put useful links here

Connections

  • Link all related words