60.001 Applied Deep Learning
Definitions
- Artificial Intelligence
- Machine Learning
- Neural Networks
- Deep Learning
- Regression and Classification Models
Per Week
Week 1
Lecture 1
- Regression Models
- Classification Models
- Supervised ML
- ML Problem Elements
- Loss function
- Linear regression
- Gradient Descent (see the sketch after this list)
- Multi-parameter linear regression
- Polynomial regression
- Hyperplane
- Overfitting
- Underfitting
- Generalisation
- Train and test split
- Regularisation
- Ridge Regressor
- Unsupervised ML
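A minimal sketch of gradient descent on a ridge-regularised linear regression, tying several of the topics above together. The synthetic data, learning rate, and regularisation strength below are illustrative assumptions, not values from the lecture.

```python
import numpy as np

# Batch gradient descent on ridge-regularised linear regression, y ≈ Xw + b.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                # synthetic features (assumed)
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)  # synthetic noisy targets (assumed)

w, b = np.zeros(3), 0.0
lr, lam = 0.1, 0.01                          # learning rate, ridge strength (assumed)
for _ in range(500):
    err = X @ w + b - y
    # Gradient of mean squared error plus the L2 (ridge) penalty on w
    grad_w = 2 * X.T @ err / len(y) + 2 * lam * w
    grad_b = 2 * err.mean()
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # w approaches true_w, shrunk slightly toward zero by the penalty
```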
Lecture 2
- Sigmoid function
- Logistic function
- Expert systems
- Logistic regression (see the sketch after this list)
- Negative log-likelihood (cross-entropy) loss
- Confusion matrix
- Neuron (ML)
- Neural Networks#A Minimal NN
- Backpropagation
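A minimal sketch of logistic regression trained by gradient descent on the cross-entropy loss, under the same caveat: data and hyperparameters are assumed for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # synthetic binary labels (assumed)

w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(1000):
    p = sigmoid(X @ w + b)                 # predicted P(y = 1 | x)
    # The gradient of mean cross-entropy through a sigmoid reduces to (p - y)
    grad_w = X.T @ (p - y) / len(y)
    grad_b = (p - y).mean()
    w -= lr * grad_w
    b -= lr * grad_b

pred = (sigmoid(X @ w + b) > 0.5).astype(float)
print("train accuracy:", (pred == y).mean())
```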
Week 2
Lecture 1
- Neural Networks#Training Procedure
- Neural Networks#Symmetry
- No free lunch theorem
- Constant initialisation
- Exploding gradients
- Vanishing gradients
- Hyperparameters
- Neural Networks#Introducing non-linearity
- Sigmoid function
- tanh function
- ReLU function
- Leaky ReLU function
- Universal approximation theorem
- Neural Networks#Improving gradient descent
- Momentum
- Learning rate decay
- Gradient-based learning rate control
- AdaGrad
- RMSProp
- Adam optimiser (see the sketch after this list)
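A minimal sketch of the Adam update, which combines the momentum and RMSProp ideas listed above. The hyperparameter defaults are the commonly cited ones, assumed rather than taken from the lecture.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad           # momentum-style first moment
    v = beta2 * v + (1 - beta2) * grad**2        # RMSProp-style second moment
    m_hat = m / (1 - beta1**t)                   # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step
    return w, m, v

# Toy usage on f(w) = ||w||^2, whose gradient is 2w
w = np.array([1.0, -2.0])
m = v = np.zeros_like(w)
for t in range(1, 3001):
    w, m, v = adam_step(w, 2 * w, m, v, t)
print(w)  # approaches the minimiser at the origin
```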
Lecture 2
Week 3
Lecture 1
Lecture 2
Week 4
Lecture 1
- Image encoding
- Computer vision
- Spatial dependence
- Homophily
- Convolution (see the sketch after this list)
- Blur kernel
- Edge detection kernel
- Contrast enhancer kernel
- Convolutional Neural Networks
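A minimal sketch of 2D convolution (cross-correlation form, valid padding) applied with the blur and edge-detection kernels named above; the kernel values are standard textbook choices, assumed here.

```python
import numpy as np

def conv2d(img, kernel):
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1  # valid padding
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Weighted sum of the image patch under the kernel
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

blur = np.ones((3, 3)) / 9.0     # box blur: replaces each pixel by a local average
edge = np.array([[-1, -1, -1],
                 [-1,  8, -1],
                 [-1, -1, -1]])  # Laplacian-style edge detector

img = np.zeros((8, 8))
img[:, 4:] = 1.0                 # toy image: a vertical step edge
print(conv2d(img, edge))         # strong responses along the edge, zero elsewhere
```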
Lecture 2
- Pooling layer (see the sketch after this list)
- Dropout layer
- Batchnorm layer
- Data augmentation
- Convolutional Neural Networks#A quick history on remarkable CV models
- Skip connections
- Transfer learning
- Finetuning
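A minimal sketch tying the layers above into one small network; PyTorch and the 28×28 single-channel, 10-class input are assumptions, not taken from the lecture.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.BatchNorm2d(16),   # batchnorm layer
            nn.ReLU(),
            nn.MaxPool2d(2),      # pooling layer: 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(2),      # 14x14 -> 7x7
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),      # dropout layer for regularisation
            nn.Linear(32 * 7 * 7, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SmallCNN()
print(model(torch.randn(4, 1, 28, 28)).shape)  # torch.Size([4, 10])
```

For transfer learning and finetuning, the usual pattern is to load a pretrained backbone, freeze its feature layers, and train only a newly attached classifier head.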
Week 5
Lecture 1
Parent: DAI @ SUTD