Notes from when I took UCSB’s CMPSC 190i: Introduction to Deep Learning with Professor Shiyu Chang.


Lecture 1

  • LLMs
  • Diffusion Models
  • Dangers of AI

Lecture 2

  • Python Review Notebook
  • Numpy
  • Matplotlib

Lecture 3

  • Linear Regression
  • Ridge Regression
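
Ridge regression has a closed-form solution, w = (XᵀX + λI)⁻¹Xᵀy. A minimal NumPy sketch; the data and the regularization strength lam are invented for illustration:

```python
import numpy as np

# Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y.
# Toy data: 50 samples, 3 features, known weights plus a little noise.
rng = np.random.default_rng(5)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, 0.0, -1.0]) + 0.01 * rng.normal(size=50)

lam = 0.1  # regularization strength (invented for this sketch)
w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
# w recovers roughly [2, 0, -1], slightly shrunk toward zero by lam
```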

Lecture 4

  • Gradient Descent
  • Finding the best weights
  • How to “roll down” the loss surface
  • Stochastic Gradient Descent
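
A minimal sketch of gradient descent rolling down the mean-squared-error surface of a linear model; the data, learning rate, and iteration count are all invented for illustration:

```python
import numpy as np

# Gradient descent on least-squares linear regression.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w  # noiseless targets, so the optimum is exactly true_w

w = np.zeros(3)
lr = 0.1  # learning rate (invented)
for _ in range(500):
    grad = 2 / len(X) * X.T @ (X @ w - y)  # gradient of mean squared error
    w -= lr * grad  # step downhill

# w ends up very close to true_w
```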

Lecture 5

  • SGD
  • Sigmoid
  • Logistic Loss Function
  • Pseudocode for SGD
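
The SGD pseudocode listed above can be sketched for logistic regression: one randomly drawn example per step, sigmoid output, and the gradient of the logistic loss. Data and hyperparameters here are invented:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary data, linearly separable through the origin.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
lr = 0.5  # learning rate (invented)
for step in range(2000):
    i = rng.integers(len(X))          # pick one random example
    p = sigmoid(X[i] @ w)             # predicted probability
    w -= lr * (p - y[i]) * X[i]       # gradient of -[y log p + (1-y) log(1-p)]

acc = np.mean((sigmoid(X @ w) > 0.5) == y)  # training accuracy, near 1.0
```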

Lecture 6

  • Multi-Class Classification
  • Probabilistic Interpretation
  • Maximizing the Data Likelihood
  • Tailoring
  • Softmax Function
  • Training Objective for Multi-Class Classification
  • SGD with Cross Entropy Loss
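
Softmax and the cross-entropy objective combine neatly: for softmax probabilities p and a one-hot label y, the gradient of the loss with respect to the logits is p − y. A sketch with invented toy data (four classes, one per quadrant):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 2))
labels = (X[:, 0] > 0).astype(int) + 2 * (X[:, 1] > 0).astype(int)
Y = np.eye(4)[labels]  # one-hot encoding

W = np.zeros((2, 4))
lr = 0.5  # learning rate (invented)
for _ in range(300):
    P = softmax(X @ W)
    W -= lr * X.T @ (P - Y) / len(X)  # dL/dlogits = P - Y, chained through X

acc = np.mean(np.argmax(X @ W, axis=1) == labels)  # near-perfect on this toy set
```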

Lecture 7

  • Linear Classifier
  • Multi-Layer Neural Networks

Lecture 8

  • 10-Class Classification Deep Review

Lecture 9

  • Simple Nonlinear Prediction
  • Multi-layer Neural Network
  • Nonlinearity
  • XOR Problem
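
The XOR problem shows why nonlinearity matters: no linear classifier can compute XOR, but one hidden ReLU layer can. The weights below are hand-picked for illustration, not learned:

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

# Hidden units: h1 = ReLU(x1 + x2), h2 = ReLU(x1 + x2 - 1)
W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
b1 = np.array([0.0, -1.0])
h = np.maximum(0, X @ W1 + b1)

# Output: h1 - 2*h2 gives exactly XOR on the four inputs
out = h @ np.array([1.0, -2.0])  # [0. 1. 1. 0.]
```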

Lecture 10

  • Training Neural Networks
  • How to find Hyperparameters

Lecture 11

  • Multi-Layer Neural Networks
  • Backpropagation

Lecture 12

  • Backpropagation
  • Gates
  • Vector Derivatives
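
Backpropagation through simple gates can be sketched on the toy circuit f = (x + y) · z: the add gate distributes the incoming gradient unchanged, while the multiply gate routes each operand’s gradient through the other operand. The input values are arbitrary:

```python
x, y, z = -2.0, 5.0, -4.0

# Forward pass
q = x + y          # add gate: q = 3
f = q * z          # multiply gate: f = -12

# Backward pass (chain rule)
df_dq = z          # multiply gate: gradient is the other operand
df_dz = q
df_dx = df_dq * 1  # add gate: distributes the incoming gradient
df_dy = df_dq * 1

# df_dx = -4.0, df_dy = -4.0, df_dz = 3.0
```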

Lecture 13

  • MISSING

Lecture 14

  • MISSING

Lecture 15

  • CNN (Convolutional Neural Network)
  • 1x1 Convolution
  • ResNet
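
A 1x1 convolution mixes channels at each spatial position independently: it is just a matrix multiply over the channel axis, which is why ResNet bottleneck blocks use it to change channel counts cheaply. The shapes below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=(8, 8, 16))   # H x W x C_in feature map
W = rng.normal(size=(16, 4))      # 1x1 kernel = C_in x C_out matrix

y = x @ W                         # channel mixing at every pixel; shape (8, 8, 4)

# Equivalent explicit loop over spatial positions:
y_loop = np.empty((8, 8, 4))
for i in range(8):
    for j in range(8):
        y_loop[i, j] = W.T @ x[i, j]
assert np.allclose(y, y_loop)
```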

Lecture 16

  • MISSING

Lecture 17

  • Language Modeling
  • Improved RNN Architecture
  • Sequence to Sequence Architecture
  • Transformer
  • Scaled Dot-Product Attention

Lecture 18

  • Self-Attention Layer
  • Permuting
  • Positional Encoding
  • Masked Self-Attention Layer
  • Transformer Block
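
Scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, with an optional mask covering the masked self-attention case above. A sketch with invented shapes:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)   # block disallowed positions
    scores = scores - scores.max(axis=-1, keepdims=True)  # stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Causal (masked) self-attention: position t attends only to positions <= t.
rng = np.random.default_rng(4)
x = rng.normal(size=(5, 8))                     # 5 tokens, d_model = 8
causal = np.tril(np.ones((5, 5), dtype=bool))   # lower-triangular mask
out = scaled_dot_product_attention(x, x, x, mask=causal)
# out[0] equals x[0]: the first token can only attend to itself
```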