Here you will find all the Jupyter notebooks used in the course. They will be added as the course progresses.
| Notebook | Colab link |
|---|---|
| **1: Digit classification** Deep learning 101: Classifying handwritten numbers. This notebook serves as a quick intro to the Keras deep learning framework. | |
| **2: Fashion MNIST classification** In this notebook we go a little further in computer vision. | |
| **3: Advanced image classification** Time to do more advanced stuff, which requires a more powerful approach to constructing neural networks in Keras, namely the functional model API (a minimal functional-API sketch follows below the table). Here we also start using TensorFlow datasets, image augmentation, and look at fine-tuning of existing models. | |
| **4: Visualising ConvNets** Plot activations of the different filters in a convnet, to visualise how patterns are encoded from the first to the last layer. | |
| **5: Activations and initialisers** Plot different activation functions and try them out along with different parameter initialisation schemes. | |
| **6: Callbacks and schedulers** Dynamically modify the optimiser settings during training. | |
| **7: Optimisers** Compare and evaluate different optimisation methods. | |
| **8: Data loading in TensorFlow** Intro to efficient data loading with `tf.data.Dataset` (a short input-pipeline sketch follows below the table). | |
| **9: Augmentation** Test different augmentation methods for computer vision. | |
| **10: ResNets and modern convolutional networks** Implement modern, non-sequential network architectures. | |
| **11: Image segmentation** Train a model for semantic segmentation. | |
| **12: YOLO models** Optional exercise: Try out pre-trained segmentation models from the YOLO family. | |
| **13: Image denoising with autoencoders** Yet another computer vision task: Enhance images by removing noise. | |
| **14: Process sequences with RNNs** Test different methods for forecasting passenger numbers on public transport, following the approach in Ch 14. | |
| **15: Weather forecasting with RNNs** Another forecasting task, where we try out further deep learning approaches. | |
| **16: Translate languages with a sequence-to-sequence model** Here we train a recurrent network to predict entire sequences: for a given input sentence in English, the model predicts the French translation. | |
| **17: Anomaly detection** Train an autoencoder to detect anomalous data in time series. | |
| **18: Hacking CNNs with adversarial examples** Try to fool an advanced pre-trained convolutional network into making bad predictions by sprinkling some carefully crafted noise onto the input images. | |
| **19: Compare neural networks to tree-based models on tabular data** Compare the predictive performance of neural networks and decision trees when applied to tabular datasets. | |
| **20: Preprocessing tabular data with Keras** Try out the different preprocessing layers in Keras (a short sketch follows below the table). | |
| **21: Embeddings and modern networks for tabular data** Skip the one-hot encodings and replace them with embeddings. Then train a transformer model on the data 👩‍💻 | |
| **22: Text classification** Build an NLP model for sentiment analysis, in this case classifying film reviews. | |
| **23: Word embeddings** Train word embeddings and investigate the high-dimensional embedding space. | |
| **24: Fine-tune a model on pretrained embeddings** Benefit from the work others have done by downloading pretrained word embeddings. | |
| **25: Code classification** Can you tell the difference between Java, JavaScript, C# and Python? | |
| **26: Tokenisers** Modern language models rely on different tokenisation algorithms to split up the text into useful tokens. Here we try out a few of them. | |
| **27: Transformer encoder for classification** We continue doing text classification, but with a more powerful tool: the transformer. | |
| **28: Attention visualisation** Visualise the attention mechanism in pretrained large language models. | |
| **29: Machine translation with transformers** Translate to the language of your choice using an encoder-decoder transformer model. | |
| **30: Text generation** Try out different sampling strategies to generate text just like ChatGPT (or at least almost). | |
| **31: Run LLMs from Hugging Face Hub** Run state-of-the-art open-source models with almost zero effort. | |
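A few minimal, self-contained sketches of techniques mentioned above follow. They are illustrations only: every shape, layer size and value in them is a placeholder, not a setting taken from the notebooks. First, the Keras functional model API used from notebook 3 onwards, where layers are called on tensors so that a model no longer has to be a single sequential stack:

```python
from tensorflow import keras

# Placeholder input shape and layer sizes, chosen only for illustration.
inputs = keras.Input(shape=(32, 32, 3))                    # image input
x = keras.layers.Conv2D(32, 3, activation="relu")(inputs)  # layers are called on tensors
x = keras.layers.MaxPooling2D()(x)
x = keras.layers.Conv2D(64, 3, activation="relu")(x)
x = keras.layers.GlobalAveragePooling2D()(x)
outputs = keras.layers.Dense(10, activation="softmax")(x)  # 10 example classes

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Because the graph of layer calls is explicit, the same style also covers branching and merging architectures such as the ResNets in notebook 10.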
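Next, a sketch of a `tf.data.Dataset` input pipeline of the kind introduced in notebook 8, here fed with dummy NumPy arrays instead of a real dataset:

```python
import numpy as np
import tensorflow as tf

# Dummy data standing in for a real dataset.
images = np.random.rand(1000, 28, 28).astype("float32")
labels = np.random.randint(0, 10, size=1000)

dataset = (
    tf.data.Dataset.from_tensor_slices((images, labels))
    .shuffle(buffer_size=1000)      # shuffle before batching
    .batch(32)                      # group samples into batches
    .prefetch(tf.data.AUTOTUNE)     # overlap data preparation with training
)

for batch_images, batch_labels in dataset.take(1):
    print(batch_images.shape, batch_labels.shape)   # (32, 28, 28) (32,)
```

A dataset built this way can be passed directly to `model.fit`.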
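Finally, a small sketch of two of the Keras preprocessing layers explored in notebook 20, `Normalization` for numeric columns and `StringLookup` for categorical ones; the column values are made up for the example:

```python
import numpy as np
from tensorflow import keras

# Made-up columns: one numeric feature and one categorical feature.
numeric_data = np.array([[25.0], [32.0], [47.0], [51.0]], dtype="float32")
categories = np.array([["red"], ["green"], ["blue"], ["green"]])

normalizer = keras.layers.Normalization()
normalizer.adapt(numeric_data)      # learn mean and variance from the data

lookup = keras.layers.StringLookup(output_mode="one_hot")
lookup.adapt(categories)            # build the vocabulary from the data

print(normalizer(numeric_data))     # standardised numeric values
print(lookup(categories))           # one-hot encoded categories
```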