Notes from when I took UCSB’s CMPSC 190i: Introduction to Deep Learning with Professor Shiyu Chang.
- LLMs
- Diffusion Model
- Dangers of AI
- Python Review Notebook
- Numpy
- Matplotlib
- Linear Regression
- Ridge Regression
- Gradient Descent
- Finding the best weights
- How to roll down
- Stochastic Gradient Descent
- SGD
- Sigmoid
- Logistic Loss Function
- Pseudocode for SGD
- Multi-Class Classification
- Probabilistic Interpretation
- Maximizing the Data Likelihood
- Tailoring
- Softmax Function
- Training Objective for Multi-Class Classification
- SGD with Cross Entropy Loss
- Linear Classifier
- Multi-Layer Neural Networks
- 10-Class Classification Deep Review
- Simple Nonlinear Prediction
- Multi-layer Neural Network
- Nonlinearity
- XOR Problem
- Training Neural Networks
- How to find Hyperparameters
- Multi-Layer Neural Networks
- Backpropagation
- Gates
- Vector Derivatives
- MISSING
- MISSING
- CNN (Convolutional Neural Network)
- 1x1 Convolution
- ResNet
- MISSING
- Language Modeling
- Improved RNN Architecture
- Sequence to Sequence Architecture
- Transformer
- Scaled Dot-Product Attention
- Self-Attention Layer
- Permuting
- Positional Encoding
- Masked Self-Attention Layer
- Transformer Block