Hybrid Format - Every Thursday 18h00-19h45
Taken your first steps in ML and eager to become a fully equipped practitioner? Then this course is right for you! You will learn and build all relevant NN architectures with the best tools.
The next pages serve as the course book. Go ahead and dive in!
Great that you want to dive into the deep waters of Machine Learning. These are exciting times, with major advancements arriving on a quarterly basis, such as ChatGPT, Whisper, StableDiffusion and many more. Still, all these exciting models were built on solid ML knowledge, which is exactly what we aim to acquire in this course.
This is a course which brings you from beginner to intermediate or even advanced level. It is formally called Intermediate Machine Learning, but following HuggingFace🤗 terms (which we will use heavily in the course) I like to call it SmilingFace😊. This is meant ironically, because you will never laugh in the course😊. Okay, joking aside: using smileys while learning and practicing ML helps us remember to have fun, laugh about our mistakes and not take ourselves too seriously, as proposed by the HuggingFace🤗 community. Therefore we will use our 😊 heavily in this course.
On the next pages you can see the content of each course week, starting with what will happen during each of our course sessions. After that, the SmilingFace😊 will lead you to what else to do during the week. I have divided the course work into three levels:
😊
The part after one 😊 is mandatory for every course participant for successful participation.
😊😊
The part after two 😊😊 is voluntary but recommended.
😊😊😊
The part after three 😊😊😊 is completely voluntary, for those who really want to know.
Remember that the course instructor (me) is also fallible, so please question me if you see something that does not seem quite right to you. Also, always ask questions, especially if you don't fully understand something. That is, after all, why we give this course: so that you understand everything😊
Your own contributions, suggestions for improving the course, and feedback are always welcome😊
Let's dive right in!
Quiz
ML Basics recap
Solutions exercises
Presentation from the participants of the tasks from PyTorch101
Walk-through
PyTorch 202 (Lab 02)
😊
Go through the Colab notebook above (PyTorch 202) on your own and try to understand and repeat the steps.
Watch the videos on the next page
Go through the following notebooks and complete the second one (assignment notebook):
The redundancy between some notebooks is intentional: the concepts are reintroduced in a different way, which enriches your learning experience!
😊😊
Try to improve the accuracy in the PyTorch 202 notebook by tweaking the number of layers and the number of neurons.
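If you need a starting point for this experiment, here is a minimal sketch of an MLP whose depth and width are easy to vary. The 28x28 input, the hidden sizes and the ten classes are illustrative assumptions and may differ from the notebook's setup.

```python
import torch.nn as nn

def make_mlp(input_dim=28 * 28, hidden_sizes=(128, 64), num_classes=10):
    """Builds an MLP; vary len(hidden_sizes) and its values to change depth and width."""
    layers, in_features = [], input_dim
    for h in hidden_sizes:
        layers += [nn.Linear(in_features, h), nn.ReLU()]
        in_features = h
    layers.append(nn.Linear(in_features, num_classes))
    return nn.Sequential(*layers)

# e.g. a deeper, wider variant to compare against the notebook's baseline
model = make_mlp(hidden_sizes=(256, 128, 64))
```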
😊😊😊
Familiarize yourself with basic PyTorch Tutorials:
Watch them all😊
How do we define learning?
How do we evaluate our Networks?
How do we learn our Network?
How do we handle Big Data?
Early Stopping
Done!
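Since early stopping is one of the topics above, here is a minimal, self-contained sketch of the idea. The toy data and the tiny model only exist to make the snippet runnable; the videos may implement the mechanism differently.

```python
import torch
import torch.nn as nn

# Toy data and model just to make the example self-contained;
# replace them with your own dataset and network.
torch.manual_seed(0)
x_train, y_train = torch.randn(256, 10), torch.randint(0, 2, (256,))
x_val, y_val = torch.randn(64, 10), torch.randint(0, 2, (64,))
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

best_val_loss, epochs_without_improvement, patience = float("inf"), 0, 3
for epoch in range(100):
    model.train()
    optimizer.zero_grad()
    loss_fn(model(x_train), y_train).backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(x_val), y_val).item()

    if val_loss < best_val_loss:
        best_val_loss, epochs_without_improvement = val_loss, 0
        # in practice you would also checkpoint the best weights here
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            print(f"Early stopping at epoch {epoch}")
            break
```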
There are certain requirements which form the basis for successful course participation. If you do not meet the mandatory requirements listed below, you should consider enrolling in one of our more basic courses. Alternatively, bring yourself up to speed: under Additional Resources in the left sidebar you will find the necessary resources. Since the course moves at a high pace, it is absolutely necessary to cover these basic requirements before the course starts!
Python
Here is a refresher notebook:
Math
Linear Algebra, Probability Theory (at least the basics)
Machine Learning
Basics:
What is a neural network
What is forward/backpropagation
What is a loss
What is an activation function
What is a train/val/test split
You can set up your PC for local development. A guiding notebook is here:
Here different IDEs are presented and compared:
Watch them all😊
Motivation Diabetic Retinopathy
Breakdown of the Convolution (1D and 2D)
Core Components of the Convolutional Layer
Activation Functions
Pooling and Fully Connected Layers
Training the Network
Transfer Learning
Done!
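To complement the convolution videos above, here is a tiny shape experiment you can run to see what a convolutional layer followed by pooling does to an image tensor. All sizes are arbitrary example values.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)  # a batch with one RGB image of 32x32 pixels
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
pool = nn.MaxPool2d(kernel_size=2)

activation = torch.relu(conv(x))
features = pool(activation)
print(activation.shape)  # torch.Size([1, 16, 32, 32]); padding=1 keeps the spatial size
print(features.shape)    # torch.Size([1, 16, 16, 16]); 2x2 pooling halves height and width
```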
CNNs
Kaggle - Our first challenge: Paddy
Exploratory Data Analysis (EDA) for Paddy Disease Classification
Solutions exercise CNN
Presentation from the participants of the CNN assignment from Coursera
Walk-through
PyTorchLightning
PyTorch 303 (Lab 03)
😊
Go through the Colab notebook above (PyTorch 303) on your own and try to understand and repeat the steps.
Do Week 4 of the Coursera Course
Please register at kaggle.com and join the competition. Go through the Exploratory Data Analysis notebook session and create your own EDA. Here is the link to the competition:
The main objective of this Kaggle competition is to develop a machine or deep learning-based model to classify the given paddy leaf images accurately. A training dataset of 10,407 (75%) labeled images across ten classes (nine disease categories and normal leaf) is provided. Moreover, the competition host also provides additional metadata for each image, such as the paddy variety and age. Your task is to classify each paddy image in the given test dataset of 3,469 (25%) images into one of the nine disease categories or a normal leaf.
So that is where we will be heading in the next sessions, trying different tools and techniques to tackle this challenge.
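If you want to poke at the data yourself while building your EDA, here is a minimal sketch. It assumes the layout of the competition download, i.e. a train.csv with a label column plus metadata and one subfolder per class under train_images; the paths are illustrative.

```python
import pandas as pd
from torchvision import datasets, transforms

# Metadata with label, paddy variety and age per image (path and column name assumed)
meta = pd.read_csv("paddy-disease-classification/train.csv")
print(meta["label"].value_counts())  # class balance across the ten classes

# Training images, assumed to be stored in one subfolder per class
train_ds = datasets.ImageFolder(
    "paddy-disease-classification/train_images",
    transform=transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()]),
)
print(len(train_ds), train_ds.classes)
```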
Here is my EDA notebook again:
😊😊
😊😊😊
RNNs
Faster Coding with ChatGPT, Stackoverflow and clever search
Deep dive
What are Embeddings?
Reinforcements of and insights into RNNs beyond Coursera
Solutions exercise RNN
Presentation from the participants of the RNN assignment from Coursera
Paddy challenge
Logistic regression baseline for the Paddy Competition
Walk-through
Basic CNN in PyTorchLightning
PyTorch 404 (Lab 04)
😊
Watch the videos on the next page
Build your own model(s) for the Paddy challenge and try to achieve the best accuracy. Log your results so that you can present them in class!
😊😊
Add the test functionality, create a submission.csv, and upload it to the leaderboard.
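A minimal sketch of what creating the submission file could look like once you have predictions for the test images. It assumes the format of the competition's sample_submission.csv (an image_id column and a label column; double-check that file), and the two example rows are placeholders for your own inference results.

```python
import pandas as pd

# Placeholders: fill these from your own inference loop over the test images.
test_filenames = ["200001.jpg", "200002.jpg"]
predicted_labels = ["normal", "blast"]

submission = pd.DataFrame({"image_id": test_filenames, "label": predicted_labels})
submission.to_csv("submission.csv", index=False)  # upload this file to the leaderboard
```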
Welcome and Introduction round
Introduction of the course, opencampus, the course instructor and the course participants
Tool Set-Up
Colab
Editor (VSCode)
Virtual Environments
Git/Github
Walk-through
PyTorch 101 (Lab 01)
You can get a visual overview of the workflow in the Colab notebook from the PyTorch diagram below:
😊
Watch the following introduction video to the PyTorch framework
Watch all the videos on the next page - they are derived from a former Coursera Course
Go through the Colab notebook above (PyTorch 101) on your own and try to understand and repeat the steps. In doing so, you should also solve Tasks 1-3 in the notebook. You can either create a copy of the notebook in your Drive or download it to work on locally. If you choose to work locally, ensure that you have sufficient computing resources available (i.e. a GPU).
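If you want to rehearse the core training loop outside the notebook, here is a minimal, self-contained sketch of the usual PyTorch pattern. The toy data and the tiny model are stand-ins; PyTorch 101 uses its own dataset and architecture.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in data: 1000 samples with 20 features, 3 classes
x, y = torch.randn(1000, 20), torch.randint(0, 3, (1000,))
loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for xb, yb in loader:
        optimizer.zero_grad()          # 1. reset gradients
        loss = loss_fn(model(xb), yb)  # 2. forward pass + loss
        loss.backward()                # 3. backward pass
        optimizer.step()               # 4. parameter update
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```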
😊😊
Try to improve the accuracy in the PyTorch 101 notebook by tweaking the number of layers and the number of neurons.
😊😊😊
Familiarize yourself with basic PyTorch Tutorials:
Watch them all😊
Why Machine Learning is exciting
What is Machine Learning?
Logistic Regression
Interpretation of Logistic Regression
Motivation for Multilayer Perceptron
Multilayer Perceptron Concepts
Multilayer Perceptron Math Model
Deep Learning
Example: Document Analysis
Interpretation of Multilayer Perceptron
Transfer Learning
Model Selection
Early History of Neural Networks
Hierarchical Structure of Images
Convolutional Filters
Convolutional Neural Networks
CNN Math Model
How the Model learns
Advantages of Hierarchical Features
CNN on Real Images
Applications and Use in Practice
Deep Learning and Transfer Learning
Done!
Sequence-to-Sequence Encoder and Decoder
The Transformer Network
Hands-on
Kaggle
Presentation of experiments with the goal of improving the classification accuracy
Transfer Learning
Theory and Applications
Walk-through
PyTorch 505
Transfer Learning CNN in PyTorchLightning:
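For orientation before the walk-through, here is a compressed sketch of what transfer learning in PyTorch Lightning typically looks like. The resnet18 backbone, the frozen feature extractor and the ten output classes are assumptions for illustration and may differ from the notebook.

```python
import pytorch_lightning as pl
import torch
import torch.nn as nn
from torchvision import models

class TransferModel(pl.LightningModule):
    def __init__(self, num_classes=10, lr=1e-3):
        super().__init__()
        # Pretrained backbone (torchvision >= 0.13 API); freeze its weights
        self.backbone = models.resnet18(weights="DEFAULT")
        for p in self.backbone.parameters():
            p.requires_grad = False
        # Replace the final layer with a new, trainable classification head
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)
        self.loss_fn = nn.CrossEntropyLoss()
        self.lr = lr

    def forward(self, x):
        return self.backbone(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.loss_fn(self(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        # Only the new head has requires_grad=True, so only it gets updated
        return torch.optim.Adam(filter(lambda p: p.requires_grad, self.parameters()), lr=self.lr)

# trainer = pl.Trainer(max_epochs=3)
# trainer.fit(TransferModel(), train_dataloaders=your_dataloader)  # your_dataloader: placeholder
```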
😊
Watch the videos on the next page
Watch the following Seminar about Transformers:
😊😊
Keep using the ideas discussed in this session and continue improving your accuracy on the Paddy dataset.
Word Vectors
Attention Mechanism
Explanatory Session Part 1
Self-attention and multihead attention
Hugging Face Introduction
Library and use of HuggingFace for working with transformer models
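As a first taste of the library, the pipeline API lets you run a pretrained transformer in a few lines. The sentiment-analysis task is just an illustrative default; the session covers the underlying classes in more detail.

```python
from transformers import pipeline

# Downloads a default pretrained model for the task on first use
classifier = pipeline("sentiment-analysis")
print(classifier("I really enjoy this SmilingFace course!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```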
Explanatory Session Part 2
Transformer Encoder and Positional Encoding
Explanatory Session Part 3
Vision Transformer
Walk-through
Finetuning Vision Transformer on Kaggle Paddy Dataset
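Before the walk-through, here is a heavily condensed sketch of the kind of setup used when fine-tuning a Vision Transformer with the HuggingFace Transformers library. The checkpoint name and the dummy forward pass are assumptions for illustration; the notebook itself may be set up differently.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, ViTForImageClassification

num_classes = 10  # nine paddy diseases + the normal class

processor = AutoImageProcessor.from_pretrained("google/vit-base-patch16-224")
model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224",
    num_labels=num_classes,
    ignore_mismatched_sizes=True,  # the original head predicts 1000 ImageNet classes
)

# Dummy forward pass just to check the plumbing; replace with a real paddy image
image = Image.new("RGB", (224, 224))
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 10])
```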
😊
Look at current Kaggle competitions and make proposals
😊😊
Go through this excellent site explaining Transformers:
Do Chapters 1-3 of the HuggingFace NLP course
😊😊😊
Watch this excellent "Build from Scratch" video from Andrej Karpathy
Look closer at the PyTorch module nn.Transformer and go through a tutorial on how to use it for next token prediction.
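A minimal sketch of instantiating nn.Transformer together with the causal mask that next token prediction requires; all dimensions are example values.

```python
import torch
import torch.nn as nn

model = nn.Transformer(d_model=512, nhead=8, num_encoder_layers=6, num_decoder_layers=6)

# Default layout is (sequence_length, batch_size, d_model); pass batch_first=True to change it
src = torch.randn(10, 32, 512)  # source sequence: length 10, batch of 32
tgt = torch.randn(20, 32, 512)  # target sequence: length 20

# Causal mask so position i in the target can only attend to positions <= i,
# which is exactly what next token prediction requires
tgt_mask = nn.Transformer.generate_square_subsequent_mask(20)

out = model(src, tgt, tgt_mask=tgt_mask)
print(out.shape)  # torch.Size([20, 32, 512])
```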
Final Session
Kaggle Competition
Learning and testing - a.k.a. don't do Bullshit Machine Learning
https://drive.google.com/file/d/18VsrKSqNFaWeWsL24ULFrNnwpghpdwJZ/view?usp=sharing
Walk-through
Hyperparameter experiment
The following notebook shows how to set up a hyperparameter experiment in plain PyTorch. More importantly, it gives you the results and lets you analyze them and play around.
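The shape of such an experiment is essentially a grid of configurations plus a results table. Here is a condensed, self-contained sketch in plain PyTorch; the toy data and the toy model stand in for the ones used in the notebook.

```python
import itertools
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)
data = TensorDataset(torch.randn(512, 20), torch.randint(0, 2, (512,)))

def run_config(lr, batch_size, epochs=5):
    """Train a small model with one hyperparameter configuration and return the final loss."""
    model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    loader = DataLoader(data, batch_size=batch_size, shuffle=True)
    for _ in range(epochs):
        for xb, yb in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(xb), yb)
            loss.backward()
            optimizer.step()
    return loss.item()

results = []
for lr, batch_size in itertools.product([0.01, 0.1, 0.2], [1, 16, 64]):
    results.append({"lr": lr, "batch_size": batch_size, "final_loss": run_config(lr, batch_size)})
    print(results[-1])
```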
Kaggle
Introduction
Titanic
Solutions exercise MLP
Presentation from the participants of the MLP from Coursera
😊
Watch the videos on the next page
Go through the following notebooks and complete the second one (assignment notebook):
The next task is to analyze the results of the hyperparameter experiment and create a small presentation on your findings (e.g. batch size of 16 with lr=0.2 seems to equal batch size of 1 with lr=0.01). Here is the notebook again:
😊😊
Run your own hyperparameter experiment
😊😊😊
Do your own EDA on the Titanic Dataset and/or look at other EDA notebooks from competitors. Make a final presentable EDA notebook.
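A few pandas lines are enough to get a first EDA going. This assumes the standard Kaggle Titanic train.csv with columns such as Survived, Pclass, Sex and Age.

```python
import pandas as pd

df = pd.read_csv("train.csv")  # Kaggle Titanic training data

df.info()                                            # column types and missing values (e.g. Age, Cabin)
print(df["Survived"].value_counts(normalize=True))   # overall survival rate
print(df.groupby("Sex")["Survived"].mean())          # survival rate by sex
print(df.groupby("Pclass")["Survived"].mean())       # survival rate by passenger class
```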
Familiarize yourself with these PyTorch tutorials: