In this lecture, we cover more advanced DL concepts such as convolutional neural networks, pooling, transfer learning, learning rate scheduling, and hyperparameter tuning.
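Below is a minimal sketch (not the lecture's exact code) of two of these ideas in PyTorch: a small convolutional network with pooling layers, and transfer learning by reusing a pretrained ResNet-18 and retraining only a new classification head. The class name, layer sizes, and number of classes are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torchvision

class SmallCNN(nn.Module):
    """Two conv blocks, each followed by max pooling, then a linear classifier."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),              # pooling halves the spatial resolution
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)  # assumes 32x32 inputs

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Transfer learning: start from ImageNet weights, freeze the backbone,
# and train only a freshly initialized final layer (a simple, common recipe).
resnet = torchvision.models.resnet18(weights="IMAGENET1K_V1")  # torchvision >= 0.13
for p in resnet.parameters():
    p.requires_grad = False
resnet.fc = nn.Linear(resnet.fc.in_features, 10)  # new head is trainable by default
```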
Many of the same papers as in DL101 are useful here as well:
- A disciplined approach to neural network hyper-parameters: Part 1 -- learning rate, batch size, momentum, and weight decay, Leslie N. Smith
- Bag of Tricks for Image Classification with Convolutional Neural Networks, Tong He, Zhi Zhang, Hang Zhang, Zhongyue Zhang, Junyuan Xie, Mu Li
- The 1Cycle Policy, Sylvain Gugger
- PyTorch documentation
- Deep Residual Learning for Image Recognition (ResNet), Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks, Mingxing Tan, Quoc V. Le
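For learning rate scheduling, the 1cycle policy described in the Smith and Gugger references is available in PyTorch as torch.optim.lr_scheduler.OneCycleLR. Here is a minimal sketch with a placeholder model and made-up hyperparameter values, not a full training recipe:

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

model = torch.nn.Linear(10, 2)                    # stand-in for a real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

epochs, steps_per_epoch = 5, 100                  # placeholder values
scheduler = OneCycleLR(
    optimizer,
    max_lr=0.1,                                   # peak LR reached partway through the cycle
    epochs=epochs,
    steps_per_epoch=steps_per_epoch,
)

for epoch in range(epochs):
    for step in range(steps_per_epoch):
        x = torch.randn(32, 10)                   # dummy batch
        loss = model(x).sum()                     # dummy loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        scheduler.step()                          # 1cycle steps once per batch, not per epoch
```

Note that OneCycleLR is stepped after every batch, and by default it also cycles momentum when the optimizer supports it.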