Prerequisites

There are certain requirements that form the basis for successful course participation. If you do not meet the mandatory requirements listed below, you should consider enrolling in one of our more basic courses, or bring yourself up to speed first. Under Additional Resources in the left sidebar you will find the necessary material. Since the course moves at a very high pace, it is absolutely necessary to cover these basic requirements before the course starts!

Mandatory

Python

Here is a refresher notebook:

Math

Linear Algebra, Probability Theory (at least the basics)

Machine Learning

Basics:

  • What is a neural network

  • What is forward/backpropagation

  • What is a loss

  • What is an activation function

  • What is train/val/test

Totally optional

You can set up your PC for local development. A guiding notebook is here:

Here, different IDEs are presented and compared:
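The mandatory ML basics (neural network, forward/backpropagation, loss) can be sketched in a few lines of PyTorch. This is a toy example with random data, purely illustrative:

```python
import torch
import torch.nn as nn

# A tiny neural network: one hidden layer with a ReLU activation.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
loss_fn = nn.CrossEntropyLoss()           # the "loss" measures prediction error
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(16, 4)                    # a batch of 16 random inputs
y = torch.randint(0, 2, (16,))            # random class labels 0 or 1

logits = model(x)                         # forward pass
loss = loss_fn(logits, y)                 # compute the loss
loss.backward()                           # backpropagation: compute gradients
optimizer.step()                          # update the weights
```

If these four lines at the end are not familiar, work through the refresher resources before the course.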

  • Hello and welcome😊

    It's great that you want to dive into the deep waters of Machine Learning. These are exciting times, with major advancements arriving on a quarterly basis, like ChatGPT, Whisper, Stable Diffusion and many more. Nevertheless, all these exciting models were developed using solid ML knowledge, which is what we aim to acquire in this course.

    This is a course that takes you from beginner to intermediate or even advanced. It is formally called Intermediate Machine Learning, but following HuggingFace🤗 terms (which we will use heavily in the course) I like to call it SmilingFace😊. This is meant ironically, because you will never laugh in the course😊. Okay, joking aside: the use of smileys while learning and practicing ML helps us remember to have fun, laugh about our mistakes and not take ourselves too seriously, as proposed by the HuggingFace🤗 community. Therefore we will use our 😊 heavily in this course.

    On the next pages you can see what the content of each course week will be, starting with what will happen during each of our course sessions. Then the SmilingFace😊 will lead you to what else to do in that week. I have divided the course work into three levels:

    😊

    The part after one 😊 is mandatory for each course participant for successful participation.

    😊😊

    The part after two 😊😊 is voluntary but recommended.

    😊😊😊

    The part after three 😊😊😊 is completely voluntary, for those who really want to know.

    Remember, the course instructor (me) is also fallible, so please question me if you see something that does not seem right to you. Also, always ask questions, especially if you don't fully understand something. This is really why we give this course after all: so that you understand everything😊

    Own contributions or suggestions for improving the course as well as feedback are always welcome😊

    Let's dive right in!

    Intermediate Machine Learning

    Hybrid Format - Every Thursday 18h00-19h45

    Have you taken your first steps in ML and are thirsty to become a fully equipped practitioner? Then this course is perfect for you! You will learn and build all relevant NN architectures with the best tools.

    The next pages serve as the course book. Go ahead and dive in!

    Week 2 - Recap ML Basics, Intro to PyTorch

    Course session

    Quiz

    ML Basics recap

    Solutions exercises

    Presentation from the participants of the tasks from PyTorch101

    Walk-through

    PyTorch 202 (Lab 02)

    To-do

    😊

    Go through the Colab notebook above (PyTorch 202) on your own and try to understand and repeat the steps.

    Watch the videos on the next page

    Go through the following notebooks and complete the second one (assignment notebook):

    The redundancy between some notebooks is intentional: it reintroduces the concepts in a different way and hence enriches your learning experience!

    😊😊

    Try to improve the accuracy in the PyTorch 202 notebook by tweaking the number of layers and neurons
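One convenient way to experiment with depth and width is a small builder function that turns a list of hidden-layer sizes into a model. This is a sketch; the 784-dimensional input and 10 classes are an assumption (MNIST-like data), so adapt them to the notebook's dataset:

```python
import torch
import torch.nn as nn

def build_mlp(in_dim, hidden_sizes, out_dim):
    """Build an MLP whose depth and width follow `hidden_sizes`."""
    layers, prev = [], in_dim
    for h in hidden_sizes:
        layers += [nn.Linear(prev, h), nn.ReLU()]
        prev = h
    layers.append(nn.Linear(prev, out_dim))
    return nn.Sequential(*layers)

# Two candidate architectures to compare:
small = build_mlp(784, [64], 10)          # 1 hidden layer, 64 neurons
large = build_mlp(784, [256, 128], 10)    # 2 hidden layers, more neurons

x = torch.randn(32, 784)                  # a dummy batch
assert small(x).shape == large(x).shape == (32, 10)
```

Train each variant for the same number of epochs and compare validation accuracy, changing only one thing at a time.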

    😊😊😊

    Familiarize yourself with basic PyTorch Tutorials:

    • (Second part)

    https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html
    https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial3/Activation_Functions.html
    https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial4/Optimization_and_Initialization.html

    Coursera Videos

    Watch them all😊

    1. Why Machine Learning is exciting

    2. What is Machine Learning?

    3. Logistic Regression

    4. Interpretation of Logistic Regression

    5. Motivation for Multilayer Perceptron

    6. Multilayer Perceptron Concepts

    7. Multilayer Perceptron Math Model

    8. Deep Learning

    9. Example: Document Analysis

    10. Interpretation of Multilayer Perceptron

    11. Transfer Learning

    12. Model Selection

    13. Early History of Neural Networks

    14. Hierarchical Structure of Images

    15. Convolutional Filters

    16. Convolutional Neural Networks

    17. CNN Math Model

    18. How the Model Learns

    19. Advantages of Hierarchical Features

    20. CNN on Real Images

    21. Applications and Use in Practice

    22. Deep Learning and Transfer Learning

    Done!

    Week 1 - Course Introduction

    Course session

    Welcome and Introduction round

    Introduction of the course, opencampus, the course instructor and the course participants

    Tool Set-Up

    • Colab

    • Editor (VSCode)

    • Virtual Environments

    • Git/Github

    Walk-through

    PyTorch 101 (Lab 01)

    You can get a visual overview of the workflow in the Colab notebook from the PyTorch diagram below:

    To-do

    😊

    1. Watch the following introduction video to the PyTorch framework

    2. Watch all the videos on the next page - they are derived from a former Coursera course

    3. Go through the Colab notebook above (PyTorch 101) on your own and try to understand and repeat the steps. You should also solve Tasks 1-3 in the notebook. To do so, you can create a copy of the notebook in your Drive, or download the notebook to work on it locally. If you choose to work locally, make sure you have sufficient computing resources available (i.e. a GPU).
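Whether you run the notebook in Colab or locally, it helps to check up front which device PyTorch will use. A minimal sketch:

```python
import torch

# Pick the best available device: a CUDA GPU if present, otherwise the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

# Move the model and each batch to the chosen device before training:
model = torch.nn.Linear(10, 2).to(device)
batch = torch.randn(4, 10).to(device)
output = model(batch)
```

In Colab, enable the GPU via Runtime → Change runtime type before running this check.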

    😊😊

    Try to improve the accuracy in the PyTorch 101 notebook by tweaking the number of layers and neurons

    😊😊😊

    Familiarize yourself with basic PyTorch Tutorials:

    • (First part)

    Week 5 - Recurrent Neural Networks

    RNNs

    Course session

    Faster Coding with ChatGPT, Stackoverflow and clever search

    Deep dive

    • What are Embeddings?

    • Reinforcements of and insights into RNNs beyond Coursera

    Solutions exercise RNN

    Presentation from the participants of the RNN assignment from Coursera

    Paddy challenge

    Logistic regression baseline for the Paddy Competition

    Walk-through

    Basic CNN in PyTorchLightning

    PyTorch 404 (Lab 04)

    To-do

    😊

    Watch the videos on the next page

    Build your own model(s) for the Paddy challenge and try to achieve the best accuracy. Log your results so that you can present them in class!

    😊😊

    Add the test functionality, create a submission.csv and upload it to the leaderboard
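Writing the submission file itself is a one-liner with pandas. A sketch, assuming the competition's sample submission uses `image_id` and `label` columns (check the actual sample_submission.csv on Kaggle); the two predictions here are made-up placeholders:

```python
import pandas as pd

# Hypothetical predictions: map each test image id to a predicted class label.
predictions = {"200001.jpg": "normal", "200002.jpg": "blast"}

submission = pd.DataFrame(
    {"image_id": list(predictions.keys()), "label": list(predictions.values())}
)
submission.to_csv("submission.csv", index=False)  # no extra index column
```

Upload the resulting submission.csv on the competition's Submit page to appear on the leaderboard.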

    Week 3 - Intro Kaggle competition - EDA and baseline models with PyTorch

    Learning and testing - a.k.a. don't do Bullshit Machine Learning

    Course session

    Walk-through

    Hyperparameter experiment

    The following notebook shows how to set up a hyperparameter experiment in plain PyTorch. More importantly, it gives you the results and enables you to analyze them and play around

    Kaggle

    • Introduction

    • Titanic

    Solutions exercise MLP

    Presentation from the participants of the MLP from Coursera

    To-do

    😊

    Watch the videos on the next page

    Go through the following notebooks and complete the second one (assignment notebook):

    The next task is to analyze the results of the hyperparameter experiment and create a small presentation on your findings (e.g. a batch size of 16 with lr=0.2 seems to be equivalent to a batch size of 1 with lr=0.01). Here is the notebook again:
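The shape of such an experiment is a simple grid loop over the hyperparameters, re-initializing the model identically for each configuration. A minimal sketch with a toy model and random data (the grid values are just examples):

```python
import itertools
import torch
import torch.nn as nn

batch_sizes = [1, 16]
learning_rates = [0.01, 0.2]

x_all = torch.randn(64, 4)                 # toy dataset
y_all = torch.randint(0, 2, (64,))

results = {}
for bs, lr in itertools.product(batch_sizes, learning_rates):
    torch.manual_seed(0)                   # identical init for fair comparison
    model = nn.Sequential(nn.Linear(4, 2))
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for i in range(0, 64, bs):             # one epoch of mini-batch SGD
        opt.zero_grad()
        loss = loss_fn(model(x_all[i:i + bs]), y_all[i:i + bs])
        loss.backward()
        opt.step()
    with torch.no_grad():
        results[(bs, lr)] = loss_fn(model(x_all), y_all).item()
```

Comparing `results` across (batch size, learning rate) pairs is exactly the kind of analysis the presentation should summarize.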

    😊😊

    Run your own hyperparameter experiment

    😊😊😊

    Do your own EDA on the Titanic Dataset and/or look at other EDA notebooks from competitors. Make a final presentable EDA notebook.
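Typical starting points for a Titanic EDA are missing-value counts and group-wise survival rates. A sketch using a tiny made-up stand-in for the Kaggle train.csv (the real file has these columns plus several more):

```python
import pandas as pd

# Toy stand-in for the Kaggle Titanic train.csv.
df = pd.DataFrame({
    "Survived": [0, 1, 1, 0, 1],
    "Pclass":   [3, 1, 3, 3, 2],
    "Sex":      ["male", "female", "female", "male", "female"],
    "Age":      [22.0, 38.0, 26.0, None, 35.0],
})

print(df.isna().sum())                        # missing values per column
print(df.groupby("Sex")["Survived"].mean())   # survival rate by sex
print(df.groupby("Pclass")["Survived"].mean())  # survival rate by class
```

Replace the toy frame with `pd.read_csv("train.csv")` and extend these group-bys with plots for a presentable notebook.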

    Familiarize yourself with these PyTorch tutorials:

    pytorch diagram.pdf (97 KB): https://drive.google.com/file/d/18VsrKSqNFaWeWsL24ULFrNnwpghpdwJZ/view?usp=sharing
    https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html
    https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial2/Introduction_to_PyTorch.html

    Week 4 - Convolutional Neural Networks

    CNNs

    Course session

    Kaggle - Our first challenge: Paddy

    Exploratory Data Analysis(EDA) for Paddy Disease Classification

    Solutions exercise CNN

    Presentation from the participants of the CNN assignment from Coursera

    Walk-through

    PyTorchLightning

    PyTorch 303 (Lab 03)

    To-do

    😊

    Go through the Colab notebook above (PyTorch 303) on your own and try to understand and repeat the steps.

    Watch the videos on the next page

    Go through the following notebooks and complete the second one (assignment notebook):

    Please register at kaggle.com and join the competition. Go through the Exploratory Data Analysis notebook session and create your own EDA. Here is the link to the competition:

    The main objective of this Kaggle competition is to develop a machine or deep learning-based model to classify the given paddy leaf images accurately. A training dataset of 10,407 (75%) labeled images across ten classes (nine disease categories and normal leaf) is provided. Moreover, the competition host also provides additional metadata for each image, such as the paddy variety and age. Your task is to classify each paddy image in the given test dataset of 3,469 (25%) images into one of the nine disease categories or a normal leaf.

    So that is where we will be heading in the next sessions, trying different tools and techniques to tackle this challenge.

    Here again my EDA Notebook:

    😊😊

    😊😊😊

    Coursera Videos

    Word Vectors

    Attention Mechanism

    Coursera Videos

    Watch them all😊

    1. Introduction to the Concept of Word Vectors

    2. Words to Vectors

    3. Example of Word Embeddings

    4. Neural Model of Text

    5. The Softmax Function

    6. Methods for Learning Model Parameters

    7. More Details on How to Learn Model Parameters

    8. The Recurrent Neural Network

    9. Long Short-Term Memory

    10. Long Short-Term Memory Review

    11. Use of LSTM for Text Synthesis

    12. Simple and Effective Alternative Methods for Neural NLP

    Done!

    Coursera Videos

    Sequence-to-Sequence Encoder and Decoder

    The Transformer Network

    Week 6 - CNN and RNN Applications

    Hands-on

    Course session

    Kaggle

    Presentation of experiments with the goal of improving the classification accuracy

    Transfer Learning

    Theory and Applications

    Walk-through

    PyTorch 505

    Transfer Learning CNN in PyTorchLightning:

    To-do

    😊

    Watch the videos on the next page

    Watch the following Seminar about Transformers:

    😊😊

    Use the ideas discussed in this session to keep improving the accuracy on the Paddy dataset

    Coursera Videos

    Watch them all😊

    1. How do we define learning?

    2. How do we evaluate our Networks?

    3. How do we learn our Network?

    4. How do we handle Big Data?

    5. Early Stopping

    Done!

    Week 7 - Transformers & Hugging Face

    Course session

    Explanatory Session Part 1

    Self-attention and multihead attention

    Hugging Face Introduction

    Library and use of HuggingFace for working with transformer models

    Explanatory Session Part 2

    Transformer Encoder and Positional Encoding

    Explanatory Session Part 3

    Vision Transformer

    Walk-through

    Finetuning Vision Transformer on Kaggle Paddy Dataset

    To-do

    😊

    Look at current Kaggle competitions and make proposals

    😊😊

    Go through this excellent site explaining Transformers:

    Do Chapter 1-3 of the HuggingFace NLP course

    😊😊😊

    Look closer at the PyTorch module nn.Transformer and go through a tutorial on how to use it for next token prediction.
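To get a feel for the module before the tutorial, here is a minimal next-token-style setup: a `TransformerEncoder` with a causal mask so each position can only attend to earlier positions (all sizes are arbitrary examples):

```python
import torch
import torch.nn as nn

d_model, seq_len, batch = 32, 10, 4

# Two stacked encoder layers with 4 attention heads each.
layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

x = torch.randn(batch, seq_len, d_model)  # stand-in for embedded tokens

# Causal mask: -inf above the diagonal blocks attention to future positions.
causal_mask = torch.triu(
    torch.full((seq_len, seq_len), float("-inf")), diagonal=1
)
out = encoder(x, mask=causal_mask)
```

For actual next-token prediction you would add token embeddings, positional encodings and a linear layer projecting `out` to vocabulary logits, which is what the tutorial walks through.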

    Watch this excellent "Build from Scratch" video from Andrej Karpathy

    documentation
    tutorial

    Week 11 - Final Presentations

    Final Session

    Coursera Videos

    Watch them all😊

    1. Motivation Diabetic Retinopathy

    1. Breakdown of the Convolution (1D and 2D)

    1. Core Components of the Convolutional Layer

    1. Activation Functions

    1. Pooling and Fully Connected Layers

    1. Training the Network

    1. Transfer Learning

    Done!

    Week 8-10 - Kaggle competition sessions

    Kaggle Competition

    Linked resources:

    • Google Colab notebooks for the labs and assignment notebooks (colab.research.google.com)
    • Paddy Doctor: Paddy Disease Classification (www.kaggle.com)
    • Opencampus Paddy EDA (Kaggle)
    • Opencampus Paddy PyTorch Logistic Regression (Kaggle)
    • Stanford CS224N NLP with Deep Learning | 2023 | PyTorch Tutorial, Drew Kaul (YouTube)
    • Stanford CS25: V2 I Introduction to Transformers w/ Andrej Karpathy (YouTube)
    • Tutorial 5: Inception, ResNet and DenseNet (UvA DL Notebooks)
    • Tutorial 6: Transformers and Multi-Head Attention (UvA DL Notebooks)
    • Tutorial 15: Vision Transformers (UvA DL Notebooks)
    • The Illustrated Transformer (jalammar.github.io)
    • Introduction - Hugging Face LLM Course (huggingface)

    Coursera video titles (Word Vectors & Attention):

    • Word vectors and their interpretation
    • Intuition Into Meaning of Inner Products of Word Vectors
    • Inner Products Between Word Vectors
    • Relationships Between Word Vectors
    • Introduction of Attention Mechanism
    • Queries, Keys, and Values of Attention Network
    • Self-Attention and Positional Encodings
    • Multi-Head Attention
    • Attention-Based Sequence Encoder
    • Cross Attention In the Sequence-to-Sequence Model
    • Coupling the Sequence Encoder and Decoder
    • The complete Transformer Network