
Preparation

Before the first class you should ...

  • Register yourself in the Opencampus Mattermost Chat

  • Register yourself in Coursera, enroll in the Deep Learning Specialization, and enroll at least in the first course, Neural Networks and Deep Learning.

Requirements for a Certificate of Achievement or ECTS

The conditions to be met in order to receive a Certificate of Achievement and the ECTS are:

  • Attendance of at least 80% of the classes (you may miss at most 2 sessions)

  • Delivery of the project with the required documentation.

Attendance:

Since the weekly session will be on Zoom, please use your full name when you join the Zoom meeting. The full name should be the same one you used to register at edu.opencampus.sh, because we run an automatic check.

Attendance is registered automatically.

When you join the Zoom session, please use the same name you have on the edu.opencampus.sh platform. You can change your name on the edu.opencampus.sh platform (click at the top right on your profile photo) and in Zoom (click at the top right of your video stream), so you should be able to use the same name during the weekly session.

If for any reason (no need to explain) you do not want to use the same name but still need to be registered, please contact me at the beginning of the course.

Projects:

Check the Projects section to learn more about the projects.

Coursera:

Each weekly session is complemented by the videos and homework from the Coursera courses. Going through the videos and doing the assignments helps you learn and understand each session, so it is required for the course.

However, Coursera is independent from us, and completing the Coursera assignments is NOT needed for the Opencampus certificate. Completing all assignments will give you the Coursera certificates (which are different).


Week 6 - Optimization Algorithms

This week you will..

See mini-batches, Momentum, RMSProp and the Adam optimizer: an overview of optimization algorithms to train neural networks faster.
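These optimizers all share the same shape: keep running statistics of the gradient and use them to scale the step. Below is a minimal numpy sketch of one Adam update, which combines Momentum's running mean of gradients with RMSProp's running mean of squared gradients; the function name and the toy quadratic are illustrative, not taken from the Coursera assignment.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: Momentum (m) and RMSProp-style scaling (v) combined."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: running mean of grads
    v = beta2 * v + (1 - beta2) * grad**2     # second moment: running mean of grad^2
    m_hat = m / (1 - beta1**t)                # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# toy usage: minimize f(w) = w^2, whose gradient is 2w
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
```

Note how the effective step size is roughly `lr` regardless of the raw gradient magnitude, which is what makes Adam forgiving to tune.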

Learning Resources

Slides

Deep Learning for Computer Vision

Opencampus Course about Deep Learning based on various Coursera Courses

The Deep Learning for Computer Vision course continues the journey into neural networks, going into more detail on Convolutional Neural Networks (CNNs). This course is intended as a follow-up to the Deep Learning from Scratch course and aims at understanding and recreating complex network architectures for fascinating and challenging projects.

The Objective of the Course

The aim of this course is to reach a deep level of understanding of how CNNs work and why they are so powerful. During the course you will complete assignments that give you an insight into what can be done, for example object detection, face recognition, and neural style transfer.

During the course, we will have weekly discussions to deepen our understanding of the subject and time to work on your own project.

Requirements and Motivation

We assume you have knowledge of Python, linear algebra and neural networks. Ideally, you took the first course, Deep Learning from Scratch (or a similar course), and have done a small project. Having some experience is extremely beneficial in order to keep up with the homework and the discussions; otherwise the course tends to require a rather large amount of time.

The estimated time is around 5 to 10 hours a week, distributed between watching videos, doing homework and working on your project. Try to allocate enough time to get through the whole course. If in doubt, ask us for advice.

The Project

Groups of students will be formed to work on a project. The project idea can come from any of the students, be picked from a template, be proposed by us, or be the continuation of a project from a past semester. The project is needed in order to finish the course, and a final presentation will be given in the last week of the course.

For more information about the projects, check out the Projects section and the sub-pages about requirements, possible and past projects!

The Course Material

The course will be held weekly and will consist of an online session of one and a half hours. The material and slides for each session will be uploaded on each week's page after the class.

ECTS

For further details about certificates and ECTS, please refer to the following page:

Requirements for a Certificate of Achievement or ECTS

Week 11 - Presentation of Final Projects, Part I

This session is entirely dedicated to the presentation of the final projects by the students. Schedule and timing will be decided and published during the course.

Preparation

Before the course starts, you should be familiar with:

  • our Opencampus.sh Mattermost Chat

  • the Coursera platform and the Deep Learning Specialization

  • the Tensorflow framework

What would be very beneficial for this course (but not mandatory, as you will learn it during the course anyway):

  • knowledge of the numpy and opencv python packages

  • knowing how to work with images in python

  • knowing what a convolution is

Requirements for a Certificate of Achievement or ECTS

The conditions to be met in order to receive a Certificate of Achievement and the ECTS are:

  1. Missing the weekly session at most 2 times.

  2. Presenting and uploading the project with the required documentation.

Attendance:

Attendance is registered automatically.

When you join the Zoom session, please use the same name you have on the edu.opencampus.sh platform. You can change your name on the edu.opencampus.sh platform (click at the top right on your profile photo) and in Zoom (click at the top right of your video stream), so you should be able to use the same name during the weekly session.

If for any reason (no need to explain) you do not want to use the same name but still need to be registered, please contact me at the beginning of the course.

Projects:

Check the Projects section to learn more about the projects.

Coursera:

Each weekly session is complemented by the videos and homework from the Coursera courses. Going through the videos and doing the assignments helps you learn and understand each session, so it is required for the course.

However, Coursera is independent from us, and completing the Coursera assignments is NOT needed for the Opencampus certificate. Completing all assignments will give you the Coursera certificates (which are different).

Week 2 - Foundations of Convolutional Neural Networks

This week you will..

  • have a short talk about how to structure your idea for the project

  • start to form the groups for the project

  • discuss advantages and limitations of CNNs

  • take a look at the first homework

Slides

Until next week you should

  • check additional material on convolutional models

  • finish the second homework of the first week
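As a refresher for the homework, the basic operation of a CNN can be sketched in a few lines of numpy. This is a "valid" 2-D convolution as deep learning frameworks implement it (technically cross-correlation); the vertical-edge kernel and the toy image are illustrative choices.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation (what DL frameworks call 'convolution')."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1          # output height shrinks by kh - 1
    ow = image.shape[1] - kw + 1          # output width shrinks by kw - 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # multiply the kernel with the window it covers, then sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# vertical-edge detector on a toy image: left half bright, right half dark
img = np.hstack([np.ones((4, 4)), np.zeros((4, 4))])
edge = conv2d(img, np.array([[1.0, 0.0, -1.0]] * 3))
```

The response is zero everywhere except at the brightness boundary, which is exactly what an edge-detecting filter should produce.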

Week 3 - Convolution Model Application

This week you will..

  • Discuss CNN models

  • Finalize groups and set objectives for the project

  • Check out the homework

Learning Resources

  • Convolution Model Application

Until next week you should

  • check out the videos for the second week of the coursera course

  • finish the homework on Residual Network (only this one)

Week 8 - Face Recognition

This week you will..

  • Check out the face recognition code

Learning Resources

Until next week you should

  • Finish the homework on Art Generation with Neural Style Transfer
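Face recognition networks of the FaceNet family are trained with a triplet loss on embeddings: pull an anchor toward an image of the same person, push it away from a different person. A minimal numpy sketch; the toy embeddings and the margin value are illustrative.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """FaceNet-style triplet loss: the anchor should be at least `alpha`
    closer (in squared distance) to the positive than to the negative."""
    pos_dist = np.sum((anchor - positive) ** 2)   # same-person distance
    neg_dist = np.sum((anchor - negative) ** 2)   # different-person distance
    return max(0.0, pos_dist - neg_dist + alpha)

# toy embeddings: anchor and positive close together, negative far away
a = np.array([1.0, 0.0])
p = np.array([0.9, 0.1])
n = np.array([-1.0, 0.0])
loss_good = triplet_loss(a, p, n)   # triplet already satisfied -> 0
loss_bad = triplet_loss(a, n, p)    # swapped roles -> positive loss
```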

Week 7 - Project Checkpoint | Image Segmentation

This week you will..

  • Follow a peer review process of the projects

  • Discuss the main differences with U-Net

  • Discuss segmentation

Learning Resources

Until next week you should

  • Check the videos of the fourth week of the coursera course

  • finish the homework on Face Recognition

Week 4 - Residual Networks

This week you will..

  • Discuss residual networks

  • Discuss training practices

Learning Resources

Until next week you should

  • Check additional material on Transfer Learning

  • finish the homework on Transfer Learning
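The defining trick of a residual network is the identity shortcut: the block computes a correction F(x) and adds it back to its input, so gradients can flow through the skip even when F is near zero. A minimal numpy sketch of one block; the two-layer residual branch and the weight shapes are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def residual_block(x, W1, W2):
    """Plain residual block: output = ReLU(x + F(x)),
    where F is a small two-layer branch."""
    f = relu(x @ W1) @ W2      # the residual branch F(x)
    return relu(x + f)         # identity shortcut added before the activation

# with zero weights the branch vanishes and the block is the identity
# (for non-negative inputs), which is why very deep ResNets stay trainable
x = np.array([[1.0, 2.0, 3.0]])
W_zero = np.zeros((3, 3))
y = residual_block(x, W_zero, W_zero)
```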

Week 1 - General Introduction

This week you will

  • get an introduction to the course and possible projects,

  • have a round of presentations

  • get an overview of the resources for the course

  • brainstorm ideas for the projects

Slides

Until next week you should

  • Check the videos of the first week of the Convolutional Neural Network Course

  • Finish the first homework, Convolutional Model, Step by Step (only the first for this week)

  • Write down and bring specific questions (only if you have some)

Week 5 - Transfer Learning

This week you will..

  • Discuss transfer learning

  • Have a small internal discussion about your project and present us some insights

Learning Resources

Until next week you should

  • Check out the videos for the third week of the coursera course

  • Finish the homework on Car detection with YOLO
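The core mechanics of transfer learning can be sketched in numpy: keep a "pretrained" feature extractor frozen and fit only a small new head on top. Here the frozen extractor is just a random ReLU projection and the dataset is a toy binary problem, both purely illustrative; in practice the frozen part would be a real pretrained CNN.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" feature extractor: these weights stay frozen during training.
W_frozen = rng.normal(size=(2, 8))

def features(x):
    return np.maximum(0, x @ W_frozen)   # frozen ReLU features

# Toy binary task: label = 1 if x0 + x1 > 0
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Train ONLY the new head (a logistic regression) on top of frozen features.
w, b = np.zeros(8), 0.0
F = features(X)                          # extractor runs but is never updated
for _ in range(500):
    p = 1 / (1 + np.exp(-(F @ w + b)))   # sigmoid
    grad_w = F.T @ (p - y) / len(y)      # gradients only for the head
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

acc = np.mean((p > 0.5) == (y == 1))
```

Because only the small head is trained, this needs far less data and compute than training the whole network, which is the point of transfer learning.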

Week 11 - Final Presentation of the Projects

Every group will present their project. All details about timing and format will be discussed in class and updated here later.

Week 6 - Detection Algorithms

This week you will..

  • Discuss the car detection algorithms

  • Think about other detection/localization applications

Learning Resources

Until next week you should

  • Prepare a small presentation about the state of your project.

  • Finish the homework about Image Segmentation using U-Net

The presentation should contain:

  • goal of the project

  • state of the art (was it already done? If yes, how? If no, why?)

  • your approach

  • your dataset

  • your problems (if any :D )

Week 9 - Art Generation with Neural Style Transfer

This week you will..

  • Discuss neural networks and art generation

  • Discuss cost functions

Learning Resources

Check out Phil Wang's Repositories for more cool projects:

Until next week you should

  • Work on your project - no fixed homework this week!

Week 10 - CNN Bonus

This week you will..

  • discuss CNNs, their advantages and limitations, now that the course is over

  • think about what can happen in the future

  • hear about transformers in computer vision

Learning Resources

Until next week you should

  • Prepare the presentation!

Week 8 - Autocomplete and Language Models

19-04-2023 - General Introduction

This week you will...

  • get a basic idea of what MLOps is about

  • get a first idea about possible projects you might want to work on throughout the course

Learning Resources

  • Paper on the definition of DevOps

  • Slides: 230419_MLOps Introduction Session.pdf

Until next week you should...

  • register for the opencampus.sh program at Coursera (you will get an invitation link from opencampus.sh to do so) and enroll for the course Introduction to Machine Learning in Production

  • complete week 1 and week 2 of the course Introduction to Machine Learning in Production, including the provided exercises/labs.

    Week 5 - Machine Translation and Document Search

For this week you should have gone through the lectures of week 4 of the first Coursera course on NLP and the assignment: https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/4

This week we learn about one of the most important applications of NLP: translating from one language to another.

    Week 3 - Introduction to Transformers

This week you will...

  • Master the basics of attention and transformers

  • Familiarize yourself with advanced models

Slides

  • 06_11_23_Introduction_to_transformers.pdf

Learning Resources

  • Stanford Lecture on transformers

  • Rasa series on Attention

Until next week you should...

  • complete week 1 of the LLM Coursera course (Audit Mode)

    TELeR: A General Taxonomy of LLM Prompts for Benchmarking Complex Tasks

In an interesting paper, the authors emphasize the importance of a standardized taxonomy for LLM prompts targeted at solving complex tasks. They then provide such a taxonomy, TELeR, which independent researchers can use to report the results of their studies in a single unified standard.

As developers, we can look at the highest level (level 6) and use all of the listed prompt details in the prompts we design for our task.

    Image Source: Santu et al. (2023)

    Week 7 - Fine-Tuning I

This week you will...

  • know how to prepare the data for training LLMs.

  • get a better technical understanding of how to train LLMs.

  • learn about different alignment approaches such as RLHF and RLAIF using PPO, or DPO.

Learning Resources

  • Training a causal language model from scratch, from the Hugging Face NLP course.

  • Video by Andrej Karpathy explaining how to train a GPT from scratch.

  • Slides: 231204_Fine-Tuning I.pdf

Until next week you should...

  • complete week 2 and week 3 of the LLM Coursera course
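The training objective behind all of this is next-token prediction. A minimal numpy sketch of the causal language-modeling loss, the average cross-entropy of each position predicting the following token; the tiny vocabulary and hand-set logits are illustrative.

```python
import numpy as np

def causal_lm_loss(logits, token_ids):
    """Average next-token cross-entropy: position t predicts token t+1.
    This is the core pretraining objective of GPT-style models."""
    logits = logits[:-1]                  # the last position has no target
    targets = token_ids[1:]               # shift the labels left by one
    logits = logits - logits.max(axis=1, keepdims=True)        # stable softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

# toy vocabulary of 4 tokens, sequence of 3 token ids
seq = np.array([0, 2, 1])
confident = np.full((3, 4), -10.0)
confident[0, 2] = 10.0    # position 0 strongly predicts the true token 2
confident[1, 1] = 10.0    # position 1 strongly predicts the true token 1
loss_confident = causal_lm_loss(confident, seq)   # near zero
loss_uniform = causal_lm_loss(np.zeros((3, 4)), seq)   # log(vocab size)
```

A model that knows nothing scores log(vocab size); training pushes the loss from there toward zero.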

    Week 4 - Deep Neural Networks

This week you will...

  • Explore deep neural networks.

  • See a first example of generalizing a neural network to L layers.

  • Discuss and choose the projects.

Learning Resources

Projects

Until next week you should...

  • Create a group for your project!

  • Finish the first week of the Improving Deep Neural Network Course

  • Do the Programming Assignments: Initialization, Regularization and Gradient Checking
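Generalizing to L layers mostly means looping over the per-layer parameters instead of writing each layer out by hand. A minimal numpy sketch of the forward pass with ReLU hidden layers and a sigmoid output; the layer sizes and random parameters are illustrative.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def forward(X, params):
    """Forward pass through L layers.
    params = [(W1, b1), ..., (WL, bL)]; ReLU for hidden, sigmoid for output."""
    A = X
    for W, b in params[:-1]:
        A = relu(W @ A + b)                     # hidden layers
    W, b = params[-1]
    return 1 / (1 + np.exp(-(W @ A + b)))       # sigmoid output layer

# a 3-layer net on 2 input features, batch of 4 examples (columns)
rng = np.random.default_rng(1)
params = [(rng.normal(size=(5, 2)), np.zeros((5, 1))),
          (rng.normal(size=(3, 5)), np.zeros((3, 1))),
          (rng.normal(size=(1, 3)), np.zeros((1, 1)))]
out = forward(rng.normal(size=(2, 4)), params)
```

Adding a layer is now just appending one more `(W, b)` pair with matching shapes.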

    Archive

    Week 2 - Introduction to Deep Learning and Neural Network Basics

This week you will...

  • Check that everything worked with the tools we started using

  • Have the first session with a small quiz and a round of discussion.

  • Discuss the python environment, and the dot product versus element-wise multiplication

  • Do your first exercise session, training a small neural network to recognize cats!
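The dot-product versus element-wise distinction from the discussion can be shown in a few lines of numpy:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

elementwise = a * b   # multiplies entry by entry, shape is preserved
dot = a @ b           # multiplies entry by entry, then SUMS: a scalar

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
hadamard = A * B      # element-wise (Hadamard) product
matmul = A @ B        # matrix product: rows of A dotted with columns of B
```

Mixing the two up is one of the most common sources of silent shape bugs when implementing forward- and back-propagation.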

Learning Resources

Slides

Useful (external) Resources

Until next week you should...

    Week 8 - Machine Learning Strategy 1 & 2

This week you will..

  • Get guidelines for the project presentations, with suggestions on what to put in them, and discuss baseline and human performance.

  • Talk about how to structure the training, test and validation sets, and more generally how to structure the whole project.

  • Consider transfer learning, end-to-end approaches, dividing the problem into smaller subproblems, or multi-tasking: sometimes the problem can be seen from another perspective.

  • Since there was no programming assignment for this week, walk through an LSTM tutorial on time series.

Learning Resources

Slides

    Week 1 - General Introduction

    A general introduction about the course structure and the participants

This week you will...

  • Receive an introduction to the course and the people in it: a short overview of the course, its contents and how it will work.

  • Get information about accounts, forum and contacts.

Learning Resources

Until next week you should...

  • Register in the Deep Learning Channel in the Mattermost Chat

  • Register on Coursera and start the course Neural Networks and Deep Learning

  • Finish the first two weeks of the course

  • Do the Programming Assignment on Logistic Regression

  • Do the Programming Assignment on Python Numpy

    Week 7 - Hyperparameter Tuning

This week you will..

  • Get a quick overview of the projects and a report from the Hackathon.

  • Discuss batch normalization and hyperparameter search.

  • Do the first assignment using Tensorflow to create a small neural network.

  • Have a small discussion about Tensorflow ideas and modalities, and the differences between the 1.0 and 2.0 versions.

Learning Resources

Slides

    Deep Learning from Scratch

    Opencampus Course about Deep Learning based on various Coursera Courses

The Deep Learning course will guide you through the mathematical background of machine learning approaches. We will start from a simple neural network and go through the different components of a network so that you understand them and are able to create your own project.

The Objective of the Course

The aim of this course is to develop a deeper understanding of how and why neural networks work. The first part will be devoted to understanding and implementing the basics behind most neural network approaches: forward- and back-propagation, loss functions, optimization, training, hyper-parameter tuning and analysis.

To gain a better understanding, we will implement those parts in python (mostly using the numpy library). These methods already exist in popular frameworks (like Tensorflow or Pytorch, to cite a few), but using them without this knowledge can be confusing.

During the course, we will have weekly discussions to deepen our understanding of the subject, and you will also work on your own project.
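The forward- and back-propagation loop described above can be sketched for a single logistic unit; the AND-style toy dataset and the learning rate are illustrative choices, and the gradients follow from the chain rule for the sigmoid plus cross-entropy combination.

```python
import numpy as np

def forward_backward(X, y, w, b):
    """One forward and backward pass for a logistic unit.
    Returns the loss and the gradients dw, db via the chain rule."""
    m = X.shape[1]
    z = w @ X + b                          # forward: linear part
    a = 1 / (1 + np.exp(-z))               # forward: sigmoid activation
    loss = -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))
    dz = a - y                             # backward: dL/dz (sigmoid + CE)
    dw = dz @ X.T / m                      # backward: through z = w.X + b
    db = np.mean(dz)
    return loss, dw, db

# gradient descent on a toy AND-like dataset (4 examples as columns)
X = np.array([[0.0, 0.0, 1.0, 1.0], [0.0, 1.0, 0.0, 1.0]])
y = np.array([0.0, 0.0, 0.0, 1.0])
w, b = np.zeros(2), 0.0
for _ in range(2000):
    loss, dw, db = forward_backward(X, y, w, b)
    w -= 1.0 * dw
    b -= 1.0 * db
```

The same three-step pattern (forward pass, gradient via the chain rule, parameter update) scales up to every network in the course.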

Requirements and Motivation

In order to get the best out of this course, some previous knowledge is required. We expect participants to have an understanding of the fundamentals of mathematics (not being afraid of derivatives), linear algebra (mostly matrix multiplication) and python programming.

Based on past semesters, the estimated time is around 5 hours a week, usually ranging from 3 to 8 depending on the week's material. The project will start after 4 weeks of the course and will take some additional hours. However, the assignments decrease towards the end of the course to leave you space for the project. Be sure to allocate enough time to get through the whole course. If in doubt, ask us for advice.

You do not have to be an expert, and sometimes enthusiasm and motivation may be enough. If you are unsure about some of the requirements, check out the Additional Resources or write us to discuss it.

The Project

Groups of students should be formed to work on a project. The project idea can come from the students, from a template, or be proposed by us. The project is needed in order to finish the course, and a final presentation will be given in the last week of the course.

For the complete requirements about the project, check out the Requirements page.

For some examples of projects from past years, check out the Past Projects page.

The Course Material

The course will be held weekly and will consist of an online session of one and a half hours. The material and slides for each session can be found on each week's page.

ECTS

For further details about certificates and ECTS, please refer to the following pages:

Learning Linear Algebra
Learning Python
How to Start, Complete, and Submit Your Project
Requirements for a Certificate of Achievement or ECTS

    Week 12 - Presentation of Final Projects, Part II

    Week 3 - Shallow Neural Networks

This week you will...

  • check weight initialization during training, with a notebook example of planar data classification, changing the number of hidden units in a shallow network (only 1 hidden layer).

Learning Resources

Slides

Useful (external) Resources

Until next week you should...

  • Finish the fourth week of the course

  • Finish both assignments

  • Think about your project: prepare an idea and find other people willing to collaborate (there is time also next week, but please start)

    Week 5 - Practical Aspects of Deep Learning

This week you will..

  • Get practical hints about initialization and regularization techniques to avoid overfitting and improve the training of a neural network.

Learning Resources

Slides

Until next week you should...

  • Form the groups, decide the project and communicate it to the teacher.

  • Finish the second week of the Improving Deep Neural Network Course

  • Do the Programming Assignment
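The two regularization techniques covered this week can be sketched in numpy: an L2 penalty added to the loss, and inverted dropout, which zeroes random units and rescales the survivors. The function names are illustrative.

```python
import numpy as np

def l2_penalty(weight_matrices, lam):
    """L2 ('weight decay') term added to the loss: lam/2 * sum of squared weights.
    Penalizing large weights discourages overfitting."""
    return 0.5 * lam * sum(np.sum(W ** 2) for W in weight_matrices)

def dropout_forward(A, keep_prob, rng):
    """Inverted dropout: drop each unit with probability 1 - keep_prob and
    scale the survivors by 1/keep_prob so the expected activation is unchanged."""
    mask = rng.random(A.shape) < keep_prob
    return A * mask / keep_prob, mask

rng = np.random.default_rng(0)
A = np.ones((1000, 10))                       # fake layer activations
A_drop, mask = dropout_forward(A, 0.8, rng)   # ~20% of units are zeroed
penalty = l2_penalty([np.array([[3.0, 4.0]])], 0.1)
```

At test time dropout is simply switched off; thanks to the 1/keep_prob scaling no further correction is needed.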

    Requirements for a Certificate of Achievement or ECTS

    The conditions to be met in order to receive a Certificate of Achievement (and ECTS) are:

Attendance:

If you attend via Zoom, please make sure to use your full name, which should be the same one you used to register at edu.opencampus.sh. Otherwise your attendance will not be recorded!

Projects:

Check the Projects section to learn more about the projects.

    Week 9 - Neural Networks Architecture | Project Checkpoint

This week you will..

  • Get an overview of the different architectures

  • Discuss the status of your project

    Week 10 - Bonus: most voted topic

    Week 5 - The Datasets Library

This week you will...

  • get insights into the benefits of the Hugging Face Datasets library

  • discuss the literature review on project tasks

  • get some ideas on how to visualize sequence data

Learning Resources

  • chapter 6 of the Hugging Face NLP course

  • Slides: 221123_The Datasets Library.pdf

Until next class you should...

  • create filter variables to group your input data according to special characteristics

  • consider the following questions:

    • What are potential biases in your training data?

    • Are there outliers in the dataset?

    • Are the classes balanced? (If you deal with a classification task.)
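The idea behind filter variables can be sketched in plain Python: derive a boolean flag per record, then filter by it and inspect the class balance. With the Hugging Face Datasets library the same pattern would use `dataset.map` and `dataset.filter`; the records below are made up for illustration.

```python
# Toy "dataset" of labeled text records.
records = [
    {"text": "short tweet", "label": 1},
    {"text": "a much longer example sentence for the model", "label": 0},
    {"text": "ok", "label": 1},
]

# Add a filter variable: is the example unusually short?
for r in records:
    r["is_short"] = len(r["text"].split()) < 3

# Group/filter by the new flag and check the label balance.
short = [r for r in records if r["is_short"]]
label_balance = sum(r["label"] for r in records) / len(records)
```

Flags like `is_short` make it easy to answer the questions above (biases, outliers, balance) subgroup by subgroup.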

    Week 3 - Introduction to Transformer Models

This week you will...

  • learn about the basic structure of transformer models

  • get to know what tokenization is about and why it is important.

Learning Resources

  • To get an idea of the number of tokens a text or word is made of in the GPT models, you may want to check out this site.

  • chapter 3 of the Hugging Face NLP course

  • Slides: 221109_Introduction to Transformers.pdf

Until next week you should...
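Why tokenization matters can be illustrated with a toy greedy longest-match subword tokenizer: an unseen word is split into known pieces instead of being discarded. This is a deliberately simplified sketch, not a real BPE or WordPiece implementation, and the vocabulary is made up.

```python
# Toy subword vocabulary mapping pieces to ids.
VOCAB = {"un": 0, "believ": 1, "able": 2, "token": 3, "[UNK]": 4}

def tokenize(word):
    """Greedy longest-match subword split; unmatchable words become [UNK]."""
    pieces, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):    # try the longest piece first
            if word[i:j] in VOCAB:
                pieces.append(word[i:j])
                i = j
                break
        else:                                # no piece matches at position i
            return ["[UNK]"]
    return pieces

ids = [VOCAB[p] for p in tokenize("unbelievable")]
```

Real tokenizers learn the piece inventory from data, but the effect is the same: a fixed-size vocabulary can still represent open-ended text.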

    Week 2 - Self-Attention and Prompt Design

This week you will...

  • get to know the mechanism underlying the self-attention approach.

  • get to know the basics of prompt design and how to apply them in a playground.

  • get to know examples of the application of transformer models.

Learning Resources

Additional resources explaining the Transformer model:

  • Text: The Illustrated Transformer by Jay Alammar

  • Video: Transformer Neural Networks - EXPLAINED!

  • Slides: 221102_Self-Attention and Prompt Design.pdf

Until next week you should...

  • video 3 and video 4

  • chapter 2
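The self-attention mechanism can be sketched in numpy as scaled dot-product attention for a single head: every position queries every other position, and the resulting weights form a row-stochastic matrix. Shapes and the random weights are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))   # numerically stable
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention for one head.
    Each position attends to all positions; weights sum to 1 per row."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))   # (seq, seq) attention matrix
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                      # 4 token vectors, dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

A full transformer runs several such heads in parallel and stacks many layers, but this is the entire core computation.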

Week 4 - Fine-Tuning Pretrained Models

This week you will...

  • learn how to fine-tune a Hugging Face model with your own data

Learning Resources

  • chapter 4 and chapter 5 of the Hugging Face NLP course

  • Slides: 221116_Fine-Tuning Transformers.pdf

Until next week you should...

  • Focus on the structure of the input and of the output

  • Are there pretrained models that you can use?

  • Which type of model is best suited? Do you need tokenization?

  • Do you need a type of embedding layer?

Preparation

Before the first class you should..

  • Register yourself in the Opencampus Mattermost Chat

  • Register yourself in Coursera, enroll for the Natural Language Processing Specialization, and enroll at least in the first course, Natural Language Processing with Classification and Vector Spaces.

    Week 7 - Main NLP Tasks

This week you will...

  • get insights into the preprocessing of different NLP and sequence classification tasks.

  • get an idea of plausible hyperparameters to fit transformer models for different tasks.

  • learn about different metrics to evaluate NLP models.

Learning Resources

  • Slides: 230104_Main NLP Tasks.pdf

Until next class you should...

  • prepare your final presentation as described in week 8.
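The evaluation metrics mentioned above can be computed by hand; below is a minimal sketch of precision, recall and F1 for a single positive class (the function name is illustrative).

```python
def f1_score(y_true, y_pred, positive=1):
    """Precision, recall and F1 for one class, common NLP evaluation metrics."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0   # how many flagged are right
    recall = tp / (tp + fn) if tp + fn else 0.0      # how many right are flagged
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)            # harmonic mean of the two
    return precision, recall, f1

p, r, f1 = f1_score([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```

F1 is preferred over plain accuracy whenever the classes are imbalanced, which is the normal case in NLP tasks.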

    Week 6 - The Tokenizers Library

This week you will...

  • get insights into the benefits of the Hugging Face Tokenizers library

  • discuss the characteristics of the data you use in your project

Learning Resources

  • chapter 7 of the Hugging Face NLP course

  • Slides: 221130_The Tokenizers Library.pdf

Until next class you should...


    Week 2 - Sentiment Analysis with Logistic Regression

For this week you should have gone through the lectures of week 1 of the first Coursera course on NLP, including the quiz and the assignment: https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/1

Let's start and dive into the fascinating world of NLP.

As an introduction, we will look at how NLP was done a few years ago, so that in the following courses we can appreciate how natural language processing has changed for the better.

As a start we will do positive and negative sentiment analysis on Twitter tweets. πŸ™‚ Analyzing tweets is a huge topic in NLP. Hedge funds analyze tweets to try to predict the movement of stock prices, and political campaign managers analyze tweets to see how the sentiment about their candidate evolves. So we dive right into the heart of one of the core uses of NLP.

This week we will count, for each word in the dictionary, how often it appears in positive and in negative tweets. We will use those counts to produce the input to a simple logistic regression model and train it. So we will not yet use the words themselves as the input to the model; that is what we will do in later weeks. See the course videos for more details.

For the next week you should go through all the course videos, the assignment and the quiz of week 2 of course 1 in the NLP specialization: https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/2 Take notes and note down any questions you have about the material. We will discuss these in the next meeting.

See you next week!
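The week's pipeline, word-frequency features fed into a logistic regression classifier, can be sketched in numpy. The four toy "tweets" are made up for illustration; the real assignment uses the NLTK Twitter corpus.

```python
import numpy as np

# Tiny labeled "tweets" (1 = positive). Counts of how often each word appears
# in positive vs. negative tweets become the two features per tweet.
tweets = [("great movie happy", 1), ("sad bad movie", 0),
          ("happy great day", 1), ("bad sad day", 0)]

pos_freq, neg_freq = {}, {}
for text, label in tweets:
    for w in text.split():
        d = pos_freq if label == 1 else neg_freq
        d[w] = d.get(w, 0) + 1

def extract(text):
    """Three features: bias, summed positive counts, summed negative counts."""
    words = text.split()
    return np.array([1.0,
                     sum(pos_freq.get(w, 0) for w in words),
                     sum(neg_freq.get(w, 0) for w in words)])

X = np.array([extract(t) for t, _ in tweets])
y = np.array([l for _, l in tweets], dtype=float)

theta = np.zeros(3)
for _ in range(1000):                             # plain gradient descent
    h = 1 / (1 + np.exp(-X @ theta))              # sigmoid predictions
    theta -= 0.1 * X.T @ (h - y) / len(y)

is_positive = 1 / (1 + np.exp(-extract("happy movie") @ theta)) > 0.5
```

Compressing every tweet into just two counts is lossy, but it makes the classifier tiny, which is the point of this first week.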

    Application of Transformer Models

    Week 3 - Sentiment Analysis with NaΓ―ve Bayes

For this week you should have gone through the lectures of week 2 of the first Coursera course on NLP, including the quiz and the assignment: https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/2

This week is very similar to the first, but instead of logistic regression we will use Naive Bayes.

If you are not familiar with this machine learning algorithm, these videos will give you a head start, since in the coursera course Naive Bayes is only covered on the fly.

At the end you will be able to test your Naive Bayes model with your own tweets or others that you source from the internet.

For the next week you should go through all the course videos, the assignment and the quiz of week 3 of course 1 in the NLP specialization: https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/3 Take notes and note down any questions you have about the material. We will discuss these in the next meeting.

Have fun and see you next week.
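The Naive Bayes classifier of this week can be sketched in plain Python with Laplace smoothing. The toy training tweets are made up, and uniform class priors are assumed since the two classes are balanced here.

```python
import math

# Word-level Naive Bayes training data (1 = positive, 0 = negative).
train = [("great happy fun", 1), ("sad awful bad", 0),
         ("fun and great", 1), ("bad and sad", 0)]

vocab = {w for text, _ in train for w in text.split()}
counts = {1: {}, 0: {}}     # per-class word counts
totals = {1: 0, 0: 0}       # per-class total word counts
for text, label in train:
    for w in text.split():
        counts[label][w] = counts[label].get(w, 0) + 1
        totals[label] += 1

def log_likelihood(word, label):
    """log P(word | label) with Laplace (+1) smoothing, so unseen
    words never produce a zero probability."""
    return math.log((counts[label].get(word, 0) + 1)
                    / (totals[label] + len(vocab)))

def classify(text):
    # Sum of word log-likelihoods per class; priors are equal here.
    scores = {label: sum(log_likelihood(w, label) for w in text.split())
              for label in (0, 1)}
    return max(scores, key=scores.get)
```

Working in log space turns the product of word probabilities into a sum, avoiding numerical underflow on long texts.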

Week 5 - Creative Writing

This week you will...

  • learn, by discussing last week's exercises, how your own texts can be created with tools such as Neuroflash.

  • get an insight into the scientific work of Vivian Gunser.

Learning Resources

  • Slides: ChatGPT-KI in der Lehre_Vivian Gunser.pptx

  • Slides: 230525_LuLmKI - kreatives Schreiben2.pdf

Until next week you should...

Here is the link to the blog post: https://blog.tcea.org/chatgpt-grading/. The task is to use the two prompts presented there:

  1. Create a rubric for a 5-paragraph opinion essay based on the Texas Essential Knowledge and Skills for grade 10. The rubric should include 3 categories: 1. organization, 2. development of ideas, and 3. use of language/conventions. Each category is worth a maximum of 4 points for Exemplary, 3 points for Meeting, 2 points for Approaching, and 1 point for Beginning.

  2. Act as an expert educator who is able to read information, analyze text and give supportive feedback based on a rubric that I will give you. Students were given the following assignment: [paste student assignment here] When you are ready, I will give you the rubric. I will then begin to give you the student work to evaluate and you will provide specific, constructive and meaningful feedback in a first-person, supportive voice. If the score given is less than [total points for the rubric] points, provide a brief paragraph on specific steps the student can do to improve the work and earn full credit based on the rubric. Do you understand?

    Week 4 - Vector Space Models

    For this week you should have gone through the lectures of week 3 of the first Coursera course on NLP and the assignment. https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/3arrow-up-right

This week we dive into representing our words as word vectors. Remember that any ML algorithm requires its input in mathematical form, i.e. as numbers. In the last two weeks we associated two numbers with each tweet, one for positive sentiment and one for negative sentiment, and fed that vector into our models. From now on we will try a different approach that is independent of sentiment analysis and thus more general.

For that we encode each word as a vector. Thus we need a dictionary mapping each word to its corresponding vector. In general we do not know which vector best represents a word, so we have to learn that as well. Luckily there are a lot of pretrained word embeddings available online, and we can normally use one of those 🙂
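As a toy illustration (with made-up 3-dimensional vectors; real pretrained embeddings such as GloVe or word2vec have hundreds of dimensions), such a dictionary lets us compare words via cosine similarity:

```python
import math

# Tiny hand-made "embedding" dictionary (made-up vectors for illustration).
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: close to 1 means "similar".
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Related words have similar vectors, so their cosine similarity is high.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```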

    For the next week you should go through all the course videos, the assignment and the quiz of week 4 of course 1 in the NLP specialization. Take notes and notice if you have any questions about the material. In the next meeting we will discuss these.

    To quote the NLP tutor: "And remember to have fun".

    See you next week πŸ™‚

Week 6 - Automatic Classification of Text Responses

    hashtag
This week you will...

• learn, through the discussion of last week's exercises, how results can be graded with AI.

• get an insight into how scientific research is conducted in this area.

    hashtag
Learning Resources

    https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/4arrow-up-right
    file-download
    2MB
    230601_LuLmKI - Automatische Klassifizierung von Textantworten.pptx
    arrow-up-right-from-squareOpen
    file-download
    3MB
    Automatisierte Annotation von Textaufgaben (3).pptx
    arrow-up-right-from-squareOpen

    Generative Adversarial Networks

This section is currently being rewritten. More details on the GAN course will follow here soon.

    Week 7 - Part of Speech Tagging and Hidden Markov Models

    Week 9 - Word embeddings with neural networks

    Week 6 - Autocorrect

Welcome to the second course, Natural Language Processing with Probabilistic Models! Congratulations on making it this far! 🙂

Let us start with a few words on what probabilistic models are. These are models based on the principle: given these words, what is the most likely next word? A pretty reasonable approach 🙂

This week we learn about autocorrect, minimum edit distance, and dynamic programming. At the end you build your own spellchecker to correct misspelled words!
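The dynamic-programming core of minimum edit distance fits in a few lines. This sketch uses one common cost convention (insert 1, delete 1, replace 2), which you may need to adapt to the assignment:

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
    # Dynamic-programming table: D[i][j] = cost of editing source[:i] into target[:j].
    m, n = len(source), len(target)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = i * del_cost
    for j in range(1, n + 1):
        D[0][j] = j * ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            replace = 0 if source[i - 1] == target[j - 1] else rep_cost
            # Each cell reuses already-computed subproblems: that is
            # exactly the dynamic-programming idea.
            D[i][j] = min(D[i - 1][j] + del_cost,       # delete
                          D[i][j - 1] + ins_cost,       # insert
                          D[i - 1][j - 1] + replace)    # replace (or keep)
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4: replace 'p'->'s' and 'l'->'t'
```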

    Week 1 - General Introduction to the course

We will talk about the general outline of the course, the final project, and other organizational material. You also get the chance to introduce yourself briefly and get to know your fellow students.

    For the next week we will finally dive into the material. You should go through all the course videos, the assignment and the quiz of week 1 of course 1 in the NLP specialization.

    https://www.coursera.org/learn/classification-vector-spaces-in-nlp/home/week/1arrow-up-right

Week 9 - Presentation of the Final Projects

    hashtag
This week you will...

• see the final presentations of the participants

• wrap up the course.

    hashtag
Learning Resource

    file-pdf
    393KB
    230622_LuLmKI - Projekte.pdf
    PDF
    arrow-up-right-from-squareOpen

    Lehren und Lernen mit KI

Week 1 - Introduction

    hashtag
This week you will...

• get to know the other course participants.

• learn about the structure, contents, and goals of the course.

• learn what access to chat applications such as ChatGPT, Bing, and the OpenAI API looks like and how they differ.

    hashtag
Learning Resources

    hashtag
Until next week you should...

    file-pdf
    4MB
    230420_LuLmKI - EinfΓΌhrung.pdf
    PDF
    arrow-up-right-from-squareOpen

    Week 8 - Presentation of the Final Projects

    hashtag
    This week you will...

    • present your project in the final presentation. :-)

    The presentation should take about 15 minutes and include the following aspects that correspond to the content of a model card for the Hugging Face library:

    • model description

    • intended uses & limitations

    • training data

    Check this section on for more details.

Additionally, please also include in your presentation anything that you tried out that didn't work, so we can all learn from your experiences.

    Motivation - Things you can do with NLP

This is a brief overview of NLP use cases. The goal is to show you what is possible with current NLP techniques and to inspire you to use some of these applications yourself. This guide does not attempt to be comprehensive, so if you know of other interesting applications, we would be happy to hear about them.

    hashtag
    1. Automatic text summarisation

    Take one or more text documents and create a summary that represents the most important/relevant information from the original text. These summaries can either be β€œgeneric” (a general overview of the original text) or β€œquery relevant” (a summary that only focuses on the text that is relevant to a picked topic). The summarisation process is either extractive (directly reproducing parts of the source text word-for-word) or abstractive (forming an internal semantic representation of the original content and using this to write the summary from scratch).
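To make the extractive flavour concrete, here is a deliberately naive frequency-based extractive summariser in plain Python. It is only a sketch of the idea; production systems use trained models:

```python
from collections import Counter
import re

def extractive_summary(text, n_sentences=1):
    # Score each sentence by the average document-wide frequency of its
    # words, then keep the top-scoring sentences (extractive: every kept
    # sentence is reproduced word-for-word from the source).
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"\w+", text.lower()))
    def score(sentence):
        tokens = re.findall(r"\w+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)
    ranked = sorted(sentences, key=score, reverse=True)
    chosen = set(ranked[:n_sentences])
    # Keep the selected sentences in their original order.
    return " ".join(s for s in sentences if s in chosen)

text = "Cats are great. Cats and dogs are great pets. The weather is nice."
print(extractive_summary(text))  # Cats are great.
```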

• training procedure

• variables and metrics

• evaluation results

• Building a model cardarrow-up-right

    Week 10 - Final Projects

    You can play around with a demo bot for this:

    hashtag
    2. Question answering

These are tools that can answer questions asked in normal (natural) language.

    It replies based on either text it saw during training or on some text you provide to it at the same time as asking the question. As with text summarisation, the answering process can either be extractive (directly quoting the source text) or abstractive (writing the answer based on an internal semantic representation of the original content).

    hashtag
    3. Information extraction

    The task of automatically extracting structured information from text documents.

    Information extraction can facilitate further computation to be done on the previously unstructured data. There are two main types of information extraction: Named entity recognition and Relation extraction.

    Named entity recognition allows you to identify all entities of a predefined category (e.g. Extract all cities; or extract all company names).

    Relation extraction builds on top of named entity recognition. In addition to finding the entities, it allows you to detect the semantic relationships between them (e.g. Extract all countries and their capital cities; or extract all companies and the year they were founded in).
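As a toy sketch of both tasks (real systems use trained models rather than word lists; all names and categories here are made-up examples):

```python
import re

# Hypothetical gazetteer: in practice a trained model predicts entity types.
CITIES = {"Berlin", "Paris", "Kiel"}
COUNTRIES = {"Germany", "France"}

def named_entities(text):
    # Named entity recognition: find all known entities and their category.
    tokens = re.findall(r"\w+", text)
    return [(t, "CITY") for t in tokens if t in CITIES] + \
           [(t, "COUNTRY") for t in tokens if t in COUNTRIES]

def capital_relations(text):
    # Relation extraction: pair entities using a simple "capital of" pattern.
    pattern = r"(\w+) is the capital of (\w+)"
    return [(city, country) for city, country in re.findall(pattern, text)
            if city in CITIES and country in COUNTRIES]

text = "Paris is the capital of France. Kiel lies in Germany."
print(named_entities(text))
print(capital_relations(text))   # [('Paris', 'France')]
```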

Here is a demo website where you can enter your text and see which subjects are extracted.

    hashtag
    4. Chat bots

    Normal chatbots can hold conversations, answer your questions and carry out simple tasks (e.g. changing a setting in your account, placing an order or scheduling a meeting for you).

    hashtag
    5. Text classification

    The process of sorting pieces of text into one or more predefined categories. Examples of how this can be used include:

    • Text sentiment classification;

    • Spam filters;

    • Determining whether the author is making a claim or not - as the first step in fact-checking;

    • Analysing trends in social media monitoring.

    hashtag
    6. Machine Translation

    Translate from one language to another or let your text be rewritten.

    Check out:

    hashtag
    7. Write Code using natural language

    Describe what you are trying to achieve, and let the AI draft the code for you (e.g. HTML, CSS, SQL query and Linux commands).

At present the tools that can do this are imperfect and can only really be used to write a first draft that you would need to review.

    Machine Learning Operations (MLOps)

    Course dates:

    • 19-04-2023 General Introduction

    • 26-04-2023 ML Lifecycle Overview and Model Selection

    • 03-05-2023 Data Definition and Collection

    • 10-05-2023 From Feature Engineering to Data Storage

    • 17-05-2023 Advanced Data Processing & Intro into Model Serving

    • 24-05-2023 Model Infrastructure & Delivery

    • 31-05-2023 Model Monitoring

    • 07-06-2023 Project Presentations

    17-05-2023 Advanced Data Processing & Intro into Model Serving

    hashtag
    Preparation

    hashtag
    This week you will...

    hashtag
    Learning Resources

    Reinforcement Learning

Week 7 - IQSH Guidance on ChatGPT

    hashtag
This week you will...

• learn what Philipp Möller from the IQSH thinks about the possible uses of ChatGPT and of AI in general.

    hashtag
Learning Resources

    file-download
    8MB
    230606_LuLmKI - IQSH.pptx
    arrow-up-right-from-squareOpen

Week 4 - Non-technical Introduction to AI

    hashtag
This week you will...

• learn, through the discussion of the prompts from last week's exercises, how prompts can (and must) be enriched so that they work better.

• get a non-technical introduction to how language models work.

    hashtag
Learning Resources

    hashtag
Until next week you should...

if you are working on a project:

everyone:

    file-pdf
    6MB
    230511_LuL mit KI_Nicht-technische EinfΓΌhrung.pdf
    PDF
    arrow-up-right-from-squareOpen
    Neuroflasharrow-up-right

    24-05-2023 Model Infrastructure & Delivery

    hashtag
    Preparation

    hashtag
    This week you will...

    hashtag
    Learning Resources

    10-05-2023 From Feature Engineering to Data Storage

    hashtag
    This week you will...

• learn about metadata in a machine learning pipeline

    • learn about managing evolving data

    • get an overview on enterprise storage solutions

    hashtag
    Learning Resources

    • : Week 2 and Week 3

    hashtag
Until next week you should...

• complete week 4 of the course and week 1 of the course , including the provided exercises/labs.

    Machine Learning Data Lifecycle in Productionarrow-up-right
    Machine Learning Data Lifecycle in Productionarrow-up-right
    Machine Learning Modeling Pipelines in Productionarrow-up-right

    26-04-2023 ML Lifecycle Overview and Model Selection

    hashtag
    Preparation

    hashtag
    This week you will...

    hashtag
    Learning Resources

    07-06-2023 Project Presentations

    hashtag
    Preparation

    hashtag
    This week you will...

    hashtag
    Learning Resources

    Week 1 - Introduction

    hashtag
    This week you will...

• get to know the course structure and your fellow course mates

    • get an idea about possible projects and the offered default projects

    hashtag
    Learning Resources

    hashtag
    Until next week you should...

    file-pdf
    2MB
    231023_General Introduction.pdf
    PDF
    arrow-up-right-from-squareOpen
    introduction to transformersarrow-up-right
    Contextual Word Representations Part 1arrow-up-right

    03-05-2023 Data Definition and Collection

    hashtag
    This week you will...

• get an overview of data problems that occur in machine learning applications.

• learn some techniques to address these problems.

• learn about data quality.

    hashtag
    Learning Resources

    hashtag
Until next week you should...

    • Complete the goal setting for your course project.

    • complete week 2 and week 3 of the course , including the provided exercises/labs.

    https://www.coursera.org/learn/introduction-to-machine-learning-in-production/home/week/3arrow-up-right
    https://www.coursera.org/learn/machine-learning-data-lifecycle-in-production/home/week/1arrow-up-right
    https://www.coursera.org/learn/machine-learning-data-lifecycle-in-production/home/week/2arrow-up-right

    Week 4 - Prompt Engineering

    hashtag
    This week you will...

    • Master Prompt Engineering

    • Familiarize yourself with different prompting frameworks

    hashtag
    Slides

    hashtag
    Until next week you should...

    • Go through the learning material below

    • Apply and try the learned prompt engineering techniques on your project and report on your findings in the next session

    hashtag
    Learning Resources

Prompt engineering is a relatively new discipline for developing and optimizing prompts (a.k.a. the text inputs) to get the best out of large language models (LLMs) for a wide variety of tasks. This means that we manipulate the text input to the model with the goal of getting the best or most desired output from the model.

Prompt engineering skills generally help us better understand the capabilities and limitations of LLMs, and they are very valuable for improving the performance of LLMs on a wide range of common and complex tasks such as question answering and arithmetic reasoning.

We can also view prompt engineering as the set of methods for communicating with an LLM to steer its behavior towards desired outcomes. One key point of these methods is that they don't touch or change the model weights: the LLM stays completely frozen, and the only change happens in the input values, the prompts.

    Prompt engineering is a very empirical science and the effect of specific prompt engineering methods can vary a lot among models, thus requiring heavy experimentation and heuristics.

We will now look at different prompt engineering methods.

    hashtag
    Basic Prompting

Zero-shot and few-shot prompting are the two most basic approaches for prompting a model, pioneered by many LLM papers and commonly used for benchmarking LLM performance.

    hashtag
    Zero-Shot

Zero-shot prompting simply feeds the task text to the model and asks for the result.

    Prompt:

    Output:

As you can see, for sophisticated LLMs and sufficiently easy tasks this is already enough to achieve the aim.

    hashtag
    Few-shot

Few-shot prompting presents a set of demonstrations for the target task, each consisting of an input and the desired output. Normally these are high-quality examples. Because the model first sees good examples, it can better understand human intention and the criteria for what kinds of answers are expected. Therefore, few-shot prompting often leads to better performance than zero-shot. However, it comes at the cost of more token consumption.

    Prompt:

Output:
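Programmatically, a few-shot prompt is usually just a formatted string assembled from demonstration pairs. A minimal sketch (the `Text:`/`Sentiment:` labels are an arbitrary choice, not a required format):

```python
def few_shot_prompt(demonstrations, query):
    # Each demonstration is an (input, output) pair; the query is appended
    # with the output left blank for the model to complete.
    lines = [f"Text: {text}\nSentiment: {label}" for text, label in demonstrations]
    lines.append(f"Text: {query}\nSentiment:")
    return "\n\n".join(lines)

demos = [("I think this course is amazing.", "positive"),
         ("The lecture was boring.", "negative")]
prompt = few_shot_prompt(demos, "What a great assignment!")
print(prompt)
```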

    For further improvements we first have to understand which are the elements of a prompt.

    hashtag
    Elements of a Prompt

A prompt can contain any of the following elements:

    Instruction - a specific task or instruction you want the model to perform

    Context - external information or additional context that can steer the model to better responses

    Input Data - the input or question that we are interested to find a response for

    Output Indicator - the type or format of the output.

You do not need all four elements in a prompt, and the format depends on the task at hand.
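A small sketch of how such a prompt might be assembled programmatically (the element labels are an illustrative choice, not a required format):

```python
def build_prompt(instruction, context=None, input_data=None, output_indicator=None):
    # Combine whichever of the four prompt elements are present.
    parts = [instruction]
    if context:
        parts.append(f"Context: {context}")
    if input_data:
        parts.append(f"Input: {input_data}")
    if output_indicator:
        parts.append(f"Output format: {output_indicator}")
    return "\n".join(parts)

prompt = build_prompt(
    instruction="Classify the sentiment of the input.",
    input_data="I think this course is amazing.",
    output_indicator="one word: positive, negative or neutral",
)
print(prompt)
```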

    hashtag
    Best Practices

    Usually, the more specific and relevant the context is to the task we are trying to perform, the better.

    We should be very specific about the instruction and task we want the model to perform. The more descriptive and detailed the prompt is, the better the results. This is particularly important when we have a desired outcome or style of generation we are seeking. There aren't specific tokens or keywords that lead to better results. It's more important to have a good format and descriptive prompt. In fact, providing examples in the prompt is very effective to get desired output in specific formats.

Here are the six golden rules for good prompting:

    1. Write as clearly and precisely as possible.

    2. Provide as much context as possible/necessary.

    3. Really use ChatGPT as a chat interaction.

4. Break down the task into individual steps.

5. Provide examples.

6. Iterate until you are satisfied with the results.

Next we will look at specific prompting frameworks that improve the outputs even more, especially for more complex tasks.

Resources:

• https://lilianweng.github.io/posts/2023-03-15-prompt-engineering/arrow-up-right
Zero-shot example (see above):

Classify the text into neutral, negative or positive.

Text: I think this course is amazing.
Sentiment:

positive

Few-shot example (see above):

A "whatpu" is a small, furry animal native to Tanzania. An example of a sentence that uses
the word whatpu is:
We were traveling in Africa and we saw these very cute whatpus.
To do a "farduddle" means to jump up and down really fast. An example of a sentence that uses
the word farduddle is:
When we won the game, we all started to farduddle in celebration.

    31-05-2023 Model Monitoring

    hashtag
    Preparation

    hashtag
    This week you will...

    hashtag
    Learning Resources

    Week 1 - General Introduction

    hashtag
    This week you will...

    • get all details about the structure and form of the course.

    • get to know your fellow course participants.

    • get a general introduction into the current use of transformer models and their basic idea.

    hashtag
    Learning Resources

    hashtag
    Until next week you should...

    file-pdf
    3MB
    221026_General Introduction.pdf
    PDF
    arrow-up-right-from-squareOpen
    Presentation Slides
    video 1arrow-up-right
    video 2arrow-up-right
    chapter 1arrow-up-right

    Mathematik fΓΌr maschinelles Lernen

    TensorFlow Course: Week 10 - Special Issues Considering Your Final Projects

    hashtag
    This week you will...

    • get some additional ideas on models or areas of machine learning that were not covered in this course.

    • get a quick overview of possible next steps in your machine learning journey and how to stay updated with the fast moving development in this field.

    hashtag
    Learning Resources

    hashtag
    Until next week you should...

    • focus fully on your project again and prepare the project presentation for next week.

    • consider the following requirements for the final presentation shown.

    file-pdf
    2MB
    210119_What's next.pdf
    PDF
    arrow-up-right-from-squareOpen
    Course Presentation
    here
    How to Start, Complete, and Submit Your Projectchevron-right

    Deep Dive into LLMs

    Chain of Thought

Chain of Thought is essentially a prompting approach that not only gives the model more or better context, like the previous methods (e.g. few-shot prompting, the six golden prompting rules), but lets the model think longer to get to the correct answer. More thinking here means more computation. As you can see in the figure below, without chain-of-thought thinking the model has basically just one forward pass through the transformer to arrive at the answer. That might be enough for simple tasks like sentiment analysis, but not for more complex tasks. With this think-step-by-step approach the model can use more computation to reach the correct result. This is actually not very different from how humans are instructed to work step by step through complex problems.

    Image Source:
    Kojima et al. (2022)arrow-up-right
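The simplest (zero-shot) variant from Kojima et al. just appends a trigger phrase to the question. A minimal sketch:

```python
def chain_of_thought_prompt(question):
    # Zero-shot chain-of-thought (Kojima et al., 2022): the trigger phrase
    # nudges the model to generate intermediate reasoning steps before
    # answering, i.e. to spend more computation per task.
    return f"Q: {question}\nA: Let's think step by step."

prompt = chain_of_thought_prompt(
    "A farmer has 17 sheep and buys 5 more. How many sheep are there now?")
print(prompt)
```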

Week 3 - AI Tools for Education

    hashtag
This week you will...

• get to know Doris Weßels and Moritz Larsen

• be shown a current selection of AI tools by them

    hashtag
Learning Resources

    hashtag
Until next week you should...

P.S. on the second task: feel free to also ask the language model for a learning resource on YouTube.

(2) Prepare for next week with the help of a language model, for example with the following initial prompt (then keep asking about any terms you don't understand):

I want to understand how current language models like ChatGPT work, but I have no knowledge of machine learning. Can you explain to me in a simple way how they work?

    file-pdf
    832KB
    230504_LuLmKI _ KI-Tools.pdf
    PDF
    arrow-up-right-from-squareOpen
    file-pdf
    2MB
    Weßels-Lehre-Lernen-mit-KI-Tools-opencampus-2023-05-04.pdf
    PDF
    arrow-up-right-from-squareOpen

Week 2 - Tokens & Embeddings revisited

    hashtag
    This week you will...

    • Gain a broad understanding of how language models have evolved, from early methods like n-grams to advanced transformer architectures.

    • Understand the significance and limitations of word embeddings and recurrent neural networks, including LSTMs.

    hashtag
    Slides

    hashtag
    Learning Resources

    • NLP with Deep Learning

    • : NLU

    hashtag
    Until next week you should...

    Complete the and play around with different embeddings

  • file-pdf
    851KB
    30_10_23_Token_and_embeddings_revisited.pdf
    PDF
    arrow-up-right-from-squareOpen
    Stanford CS224Narrow-up-right
    Stanford XCS224Uarrow-up-right
    Stanford lecturearrow-up-right
    Stanford lecturearrow-up-right
    Rasa Attention Seriesarrow-up-right
    Notebookarrow-up-right
    Stanford lecturearrow-up-right

    Week 10 - Project Presentations

    hashtag
    Week 10 - Project Presentations

    hashtag
    Week 5 - Agents

    hashtag
    This week you will...

    ...

    hashtag
    Until next week you should...

    Week 8 - Fine-Tuning II and Model Inference

    hashtag
    This week you will...

    • explore advanced training techniques designed to train large models efficiently, minimizing computational requirements.

    • gain comprehensive insights into the key hyperparameters for effective model inference.

    • discover the unique attributes of inference processes for streaming Large Language Models (LLMs).

    hashtag
    Learning Resources

    • and of the course Generative AI with Large Language Models

    hashtag
    Until next week you should...

    file-pdf
    3MB
    231211_Fine-Tuning II.pdf
    PDF
    arrow-up-right-from-squareOpen
    Week 2arrow-up-right
    week 3arrow-up-right

    Week 9 - Advisory Session

    hashtag
    This week you will...

    • ...

    hashtag
    Slides

    ...

    hashtag
    Learning Resources

    • ...

    hashtag
    Until next week you should...

Prerequisites

There are certain requirements that form the basis for successful course participation. If you do not meet the mandatory requirements listed below, consider enrolling in a more basic course from our offerings, or alternatively bring yourself up to speed. Under additional resources in the left sidebar you will find the necessary resources. Since the course moves at a really high pace, it is absolutely necessary to cover these basic requirements before the course!

    hashtag
    Mandatory

    Python

    Here is a refresher notebook:

    Math

    Linear Algebra, Probability Theory (at least the basics)

    Machine Learning

    Basics:

    • What is a neural network

• What is forward/backpropagation

    • What is a loss

    hashtag
    Totally optional

    You can set up your PC for local development. A guiding notebook is here:

    Here different IDEs are presented and compared:

• What is an activation function

Week 2 - Application Examples #twlz

    hashtag
This week you will...

• get to know Twitter in general as a great resource for current information.

• get to know the 'Twitterlehrerzimmer' specifically as a great source of German-language content on the topic

• get to know Jan Vedder as a 'Twitter teacher' and be introduced to his online course 'ChatGPT in der Schule'

    hashtag
Learning Resources

    hashtag
Until next week you should...

    file-pdf
    2MB
    230427_LuLmKI - Jan Fedder.pdf
    PDF
    arrow-up-right-from-squareOpen
    file-pdf
    46MB
    KI CHATGPT & SCHULE.pdf
    PDF
    arrow-up-right-from-squareOpen

Hello and welcome 😊

Great that you want to dive into the deep waters of Machine Learning. These are exciting times, with major advancements on a quarterly basis like ChatGPT, Whisper, Stable Diffusion and so many more. Nevertheless, all these exciting models were developed with solid ML knowledge, which is what we want to acquire in this course.

This is a course which brings you from beginner to intermediate or even advanced. It is formally called Intermediate Machine Learning but, following the HuggingFace🤗 terminology that we will use heavily in the course, I like to call it SmilingFace😊. This is meant ironically, because you will never laugh in this course😊. Okay, joking aside: the use of smileys while learning and practicing ML helps us remember to have fun, laugh about our mistakes, and not take ourselves too seriously, as proposed by the HuggingFace🤗 community. Therefore we will use our 😊 heavily in this course.

On the next pages you can see the content of each course week, starting with what will happen during each of our course sessions. Then the SmilingFace😊 will lead you to what else to do in the week. I have divided the course work into three levels:

😊

The part after one 😊 is mandatory for each course participant for successful participation

😊😊

The part after two 😊😊 is voluntary but recommended

😊😊😊

The part after three 😊😊😊 is completely voluntary, for those who really want to know

Remember, the course instructor (me) is also fallible, so please question me if you see something that doesn't seem right to you. Also, always ask questions, especially if you don't fully understand something. This is really why we give this course: so that you understand everything😊

Your own contributions or suggestions for improving the course, as well as feedback, are always welcome😊

    Let's dive right in!

    More techniques

    hashtag
    Self Consistency

    The idea is to sample multiple, diverse reasoning paths through few-shot CoT, and use the generations to select the most consistent answer.
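Given the final answers extracted from several sampled reasoning paths, the selection step is just a majority vote. A minimal sketch:

```python
from collections import Counter

def self_consistency(answers):
    # Given final answers extracted from several sampled chain-of-thought
    # completions, return the most consistent (most frequent) one.
    return Counter(answers).most_common(1)[0][0]

# e.g. five sampled reasoning paths ended in these final answers:
print(self_consistency(["42", "42", "41", "42", "40"]))  # 42
```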

    hashtag
    Generated Knowledge Prompting

First generate some knowledge given the question; then answer the question with that knowledge as context in a second step.

    hashtag
    Tree of Thoughts (ToT)

ToT maintains a tree of thoughts, where thoughts represent coherent language sequences that serve as intermediate steps toward solving a problem. In plain English this means that multiple solution approaches are tried, and a kind of log is kept of what works and what doesn't. If you have ever solved a Sudoku you are probably familiar with this back-and-forth exploration, and in one of the original ToT papers they reported huge success with this method for solving Sudokus.

The ToT approach essentially enables an LM to self-evaluate the progress intermediate thoughts make towards solving a problem through a deliberate reasoning process. The LM's ability to generate and evaluate thoughts is then combined with search algorithms (e.g., breadth-first search and depth-first search) to enable systematic exploration of thoughts with lookahead and backtracking. While this leads to much better complex task solving capabilities, it requires many more LM calls and takes quite some time, so if you want to build a chatbot this approach should not be used.
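To illustrate only the search skeleton (in a real ToT system the thought generator and the evaluator are LM calls, not the toy logic below), here is a depth-first search over partial "thoughts" with evaluation-based pruning and backtracking on a toy problem: finding digits that sum to a target.

```python
def tree_of_thoughts(partial, target, depth):
    # Toy deliberate search: "thoughts" are partial digit sequences; the
    # evaluator prunes branches whose sum already exceeds the target, and
    # depth-first search backtracks over the remaining candidates.
    total = sum(partial)
    if total == target:
        return partial
    if depth == 0 or total > target:   # evaluate: dead end, backtrack
        return None
    for digit in range(1, 10):         # generate candidate next thoughts
        found = tree_of_thoughts(partial + [digit], target, depth - 1)
        if found:
            return found
    return None

result = tree_of_thoughts([], target=7, depth=3)
print(result)  # a list of at most 3 digits summing to 7
```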

    Week 3 - Intro Kaggle competition - EDA and baseline models with PyTorch

    Learning and testing - a.k.a. don't do Bullshit Machine Learning

    hashtag
    Course session

    Kaggle

    • Introduction

    • Titanic

    • Paddy

    • Exploratory Data Analysis(EDA) for Paddy Disease Classification

    Solutions exercise MLP

    Presentation from the participants of the MLP from Coursera

    Walk-through

    PyTorchLightning

    PyTorch 303 (Lab 03)

    hashtag
    To-do

    😊

Go through the Colab notebook above (PyTorch303) on your own and try to understand and repeat the steps.

    Do Week 3 of the Coursera Course

Please register at kaggle.com and join the competition. Go through the Exploratory Data Analysis notebook from the session and then train a logistic regression as a baseline model!

    The main objective of this Kaggle competition is to develop a machine or deep learning-based model to classify the given paddy leaf images accurately. A training dataset of 10,407 (75%) labeled images across ten classes (nine disease categories and normal leaf) is provided. Moreover, the competition host also provides additional metadata for each image, such as the paddy variety and age. Your task is to classify each paddy image in the given test dataset of 3,469 (25%) images into one of the nine disease categories or a normal leaf.
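To get a feel for what such a baseline does, here is a minimal logistic-regression-from-scratch sketch on random stand-in data (the real notebook works on flattened image features from the paddy dataset instead):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: 100 "images" flattened to 20 features, binary labels.
# (The real competition has 10,407 training images and ten classes.)
X = rng.normal(size=(100, 20))
true_w = rng.normal(size=20)
y = (X @ true_w > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain gradient descent on the logistic (cross-entropy) loss.
w = np.zeros(20)
for _ in range(500):
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / len(y)
    w -= 0.5 * grad

accuracy = ((sigmoid(X @ w) > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```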

    So that is where we will be heading in the next session trying different tools and techniques.

    EDA Notebook

Logistic regression (try it first on your own, but if you're stuck look at the notebook below):

    😊😊

    Build an MLP in PyTorchLightning for Paddy Challenge on Kaggle

    😊😊😊

    Do your own EDA on the Paddy Challenge and/or look at other EDA notebooks from competitors. Make a final presentable EDA notebook

    Transfer the CNN from the Coursera assignment to our Kaggle competition

    Familiarize yourself with this PyTorch Tutorials:

    Week 2 - Recap ML Basics, Intro to PyTorch

    hashtag
    Course session

    Quiz

    ML Basics recap

    Solutions exercises

    Presentation from the participants of the tasks from PyTorch101

    Walk-through

    PyTorch 202 (Lab 02)

    hashtag
    To-do

    😊

Go through the Colab notebook above (PyTorch202) on your own and try to understand and repeat the steps.

    Do Week 2 of the Coursera Course

    The notebook from the Coursera Course of Week 2 can be accessed here:

    The redundancy between our notebooks and the Coursera notebooks is desired to reintroduce the concepts in a different way and hence enrich your learning experience!

    😊😊

    Try to improve the accuracy in the PyTorch 202 notebook by tweaking the amount of layers and number of neurons

    😊😊😊

    Familiarize yourself with basic PyTorch Tutorials:

    • (Second part)

    https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.htmlarrow-up-right
    https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial3/Activation_Functions.htmlarrow-up-right
    https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial4/Optimization_and_Initialization.htmlarrow-up-right

    Week 4 - Convolutional Neural Networks

    CNNs

    hashtag
    Course session

    Solutions exercise CNN

    Presentation from the participants of the CNN assignment from Coursera

    Kaggle

    • Homework presentation of Logistic Regression for Paddy Disease Classification

    Walk-through

    Basic CNN in PyTorch:

    PyTorch 404
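For orientation, a minimal example of what such a basic CNN looks like in plain PyTorch (the layer sizes and the 64x64 input are illustrative assumptions, not the notebook's exact architecture):

```python
import torch
import torch.nn as nn

class BasicCNN(nn.Module):
    # Two conv blocks (conv -> ReLU -> max-pool) followed by a linear classifier.
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)        # (N, 32, 16, 16) for a 64x64 input
        x = torch.flatten(x, 1)     # flatten everything except the batch dim
        return self.classifier(x)

model = BasicCNN(num_classes=10)
batch = torch.randn(4, 3, 64, 64)   # 4 random RGB "images", 64x64 pixels
logits = model(batch)
print(logits.shape)                 # torch.Size([4, 10])
```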

    Basic CNN in PyTorchLightning:

    hashtag
    To-do

    😊

Go through the Kaggle notebook and PyTorch404 above on your own and try to understand and repeat the steps.

    Do Week 4 of the Coursera Course

    😊😊

    Add the test functionality, create a submission.csv, and upload it to the leaderboard
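    Writing the submission file itself is a small step once you have predictions. A sketch — the image ids, label names, and column names below are made up, so mirror the competition's sample_submission.csv for the exact format:

    ```python
    import pandas as pd

    # One predicted label per test image (placeholder values).
    predictions = {"200001.jpg": "normal", "200002.jpg": "blast"}

    submission = pd.DataFrame(
        {"image_id": list(predictions), "label": list(predictions.values())}
    )
    submission.to_csv("submission.csv", index=False)  # file to upload to Kaggle
    ```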

    😊😊😊

    Week 5 - RAG and Agents

    hashtag
    This week you will...

    • Learn about retrieval-augmented generation (RAG)

    • Learn about agents
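    The core idea of retrieval-augmented generation: first retrieve the documents most relevant to a query, then let the language model answer grounded in them. A toy sketch of the retrieval step only, using bag-of-words counts in place of a learned embedding model:

    ```python
    import numpy as np

    # Toy corpus; a real RAG system would embed with a trained model.
    docs = [
        "paddy plants suffer from blast disease",
        "transformers use self attention",
        "retrieval augmented generation grounds answers in documents",
    ]
    vocab = sorted({w for d in docs for w in d.split()})

    def embed(text):
        """Bag-of-words counts over the shared vocabulary."""
        return np.array([text.split().count(w) for w in vocab], dtype=float)

    def retrieve(query):
        """Return the document with the highest cosine similarity to the query."""
        q = embed(query)
        sims = [q @ embed(d) / (np.linalg.norm(q) * np.linalg.norm(embed(d)) + 1e-9)
                for d in docs]
        return docs[int(np.argmax(sims))]

    best = retrieve("what is retrieval augmented generation")
    ```

    The retrieved text would then be placed into the language model's prompt as context.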

    hashtag
    Learning Resources

    • Watch OpenAI's tips and tricks on RAG and fine-tuning

    • Get to know OpenAI function calling

    hashtag
    Until next week you should...

    herearrow-up-right
    this short coursearrow-up-right
    Week 7 - Deep Learning from Scratch @ Opencampus - SoSe 2021deeplearning.freelab.orgchevron-right
    Week 3 - Deep Learning for Computer Vision @ Opencampusdeeplearning.freelab.orgchevron-right
    Week 2 - Deep Learning for Computer Vision @ Opencampusdeeplearning.freelab.orgchevron-right
    Week 4 - Deep Learning for Computer Vision @ Opencampusdeeplearning.freelab.orgchevron-right
    Week 1 - Deep Learning for Computer Vision @ Opencampus.shdeeplearning.freelab.orgchevron-right
    GitHub - lucidrains/deep-daze: Simple command line tool for text to image generation using OpenAI's CLIP and Siren (Implicit neural representation network). Technique was originally created by https://twitter.com/advadnounGitHubchevron-right
    A super cool command-line tool to let a network "imagine" stuff
    GitHub - lucidrains/DALLE-pytorch: Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in PytorchGitHubchevron-right
    An attempt of creating an open source version of DALL-E

    Week 1 - Course Introduction

    hashtag
    Course session

    Welcome and Introduction round

    Introduction of the course, opencampus, the course instructor and the course participants

    Tool Set-Up

    • Coursera

    • Colab

    • Editor (VSCode)

    • Virtual Environments

    • Git/Github

    Walk-through

    PyTorch 101 (Lab 01)

    You can get a visual overview of the workflow in the Colab notebook from the PyTorch diagram below:

    hashtag
    To-do

    😊

    1. Watch the following introduction video to the PyTorch framework

    2. Watch Week 1 of the Coursera Course

    3. Go through the Colab notebook above (PyTorch 101) on your own and try to understand and repeat the steps, solving Tasks 1-3 in the notebook. You can create a copy of the notebook in your Drive or download it to work locally. If you work locally, ensure that you have sufficient computing resources (i.e., a GPU) available.

    😊😊

    Try to improve the accuracy in the PyTorch 101 notebook by tweaking the number of layers and the number of neurons

    😊😊😊

    Familiarize yourself with basic PyTorch Tutorials:

    • (First part)

    file-pdf
    97KB
    pytorch diagram.pdf
    PDF
    arrow-up-right-from-squareOpen
    https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.htmlarrow-up-right
    https://uvadlc-notebooks.readthedocs.io/en/latest/tutorial_notebooks/tutorial2/Introduction_to_PyTorch.htmlarrow-up-right
    lucidrains - OverviewGitHubchevron-right
    GitHub - facebookresearch/deit: Official DeiT repositoryGitHubchevron-right
    Facebook's version of Transformers for Computer Vision

    Week 7 - Transformers Part 1

    Understanding the Transformer

    hashtag
    Course session

    Explanatory Session Part 1

    Self-attention and multihead attention

    Hugging Face Introduction

    Library and Walk-through of HuggingFace101

    Explanatory Session Part 2

    Transformer Encoder and Positional Encoding
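    The self-attention covered in Part 1 boils down to scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V; multi-head attention simply runs several of these in parallel. A minimal single-head sketch:

    ```python
    import math
    import torch

    def scaled_dot_product_attention(q, k, v):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
        d_k = q.size(-1)
        scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
        weights = torch.softmax(scores, dim=-1)  # each row sums to 1
        return weights @ v, weights

    # One head over a sequence of 5 tokens with 8-dimensional q/k/v:
    out, attn = scaled_dot_product_attention(
        torch.randn(5, 8), torch.randn(5, 8), torch.randn(5, 8)
    )
    ```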

    hashtag
    To-do

    😊

    Go through this excellent site explaining Transformers:

    Do Chapter 1 and Chapter 2 of the HuggingFace NLP course

    Go through the TransformerHW1

    😊😊

    😊😊😊

    Take a closer look at the PyTorch module nn.Transformer (documentation) and go through a tutorial on how to use it for next-token prediction.
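    As a first orientation, a transformer encoder with a causal mask for next-token prediction can be sketched like this (dimensions are illustrative; a real model would add token embeddings and positional encodings before the encoder):

    ```python
    import torch
    from torch import nn

    d_model, seq_len, vocab_size = 32, 6, 100
    layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
    encoder = nn.TransformerEncoder(layer, num_layers=2)
    to_logits = nn.Linear(d_model, vocab_size)

    x = torch.randn(1, seq_len, d_model)                            # (batch, seq, dim)
    mask = nn.Transformer.generate_square_subsequent_mask(seq_len)  # causal mask
    logits = to_logits(encoder(x, mask=mask))  # next-token logits per position
    ```

    The causal mask prevents each position from attending to later tokens, which is what makes the per-position outputs usable as next-token predictions.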

    documentationarrow-up-right
    tutorialarrow-up-right
    GitHub - lucidrains/vit-pytorch: Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PytorchGitHubchevron-right
    Again, Phil Wang's replication of the Vision Transformers (originally from Google Team)

    Week 6 - Model Evaluation

    hashtag
    This week you will...

    • get to know Weights & Biases, a popular platform for evaluating deep learning models.

    • understand different evaluation metrics.

    • get a high level introduction into training LLMs.
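    As a reminder of what the basic metrics measure, here is a small binary example computed by hand from the confusion-matrix counts:

    ```python
    # Binary toy example: compute confusion-matrix counts by hand.
    y_true = [1, 0, 1, 1, 0, 1, 0, 0]
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

    tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))  # true positives
    tn = sum(t == p == 0 for t, p in zip(y_true, y_pred))  # true negatives
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

    accuracy = (tp + tn) / len(y_true)  # fraction of correct predictions
    precision = tp / (tp + fp)          # how many predicted positives are real
    recall = tp / (tp + fn)             # how many real positives were found
    ```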

    hashtag
    Learning Resources

    • Short course by DeepLearning.AI and Weights & Biases on how to use the Weights & Biases framework to track and evaluate your model results

    hashtag
    Until next week you should...

    Week 6 - CNN and RNN Applications

    Hands-on

    hashtag
    Course session

    Kaggle Finetuning

    Presentation of experiments with the goal of improving the classification accuracy

    file-pdf
    6MB
    231127_Model Evaluation.pdf
    PDF
    arrow-up-right-from-squareOpen
    short coursearrow-up-right
    this videoarrow-up-right
    Training a causal language model from scratcharrow-up-right
    Transfer Learning

    Theory and Applications

    hashtag
    To-do

    😊

    Watch the second half of Week 5 of the Coursera Course

    Watch the following Seminar about Transformers:

    😊😊

    Keep using the ideas discussed in this session to further improve your accuracy on the Paddy dataset

    Week 5 - Recurrent Neural Networks

    RNNs

    hashtag
    Course session

    Faster Coding with ChatGPT, Stack Overflow and clever search

    Solutions exercise RNN

    Presentation from the participants of the RNN assignment from Coursera

    Deep dive

    • What are Embeddings?

    • Reinforcement of, and further insights into, RNNs beyond the Coursera material
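    To make the embedding idea concrete: an nn.Embedding layer maps integer token ids to dense vectors, which an RNN then consumes one time step at a time. A minimal sketch (vocabulary size and dimensions are arbitrary):

    ```python
    import torch
    from torch import nn

    vocab_size, emb_dim, hidden = 50, 16, 32
    embedding = nn.Embedding(vocab_size, emb_dim)     # token id -> dense vector
    lstm = nn.LSTM(emb_dim, hidden, batch_first=True)

    tokens = torch.tensor([[3, 17, 42, 8]])   # one sequence of 4 token ids
    vectors = embedding(tokens)               # dense vectors, shape (1, 4, 16)
    outputs, (h_n, c_n) = lstm(vectors)       # one hidden state per time step
    ```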

    Walk-through

    PyTorch 505

    Transfer Learning CNN in PyTorchLightning:

    hashtag
    To-do

    😊

    Watch first half of Week 5 of the Coursera Course

    !!!

    Using all the techniques you have learned, build the best model you can and achieve an accuracy of at least 70%. You can use transfer learning, augmentation, and other tricks, and you can also take inspiration from fellow notebooks on Kaggle. Good ideas will be rewarded with special achievement badges for the course. Have fun and push the accuracy! 😊

    !!!
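    One common recipe for pushing accuracy combines a pretrained backbone with a fresh classifier head plus training-time augmentation. A sketch — weights=None is used here only to avoid a download; for actual transfer learning load pretrained weights (e.g. weights="IMAGENET1K_V1"), and verify the class count against the competition data:

    ```python
    import torch
    from torch import nn
    from torchvision import models, transforms

    model = models.resnet18(weights=None)          # use pretrained weights in practice
    for p in model.parameters():
        p.requires_grad = False                    # freeze the backbone
    model.fc = nn.Linear(model.fc.in_features, 10) # new trainable classifier head

    # Typical training-time augmentation for the input pipeline:
    augment = transforms.Compose([
        transforms.RandomHorizontalFlip(),
        transforms.RandomRotation(15),
    ])

    logits = model(augment(torch.randn(2, 3, 224, 224)))
    ```

    Freezing the backbone keeps training fast; once the head converges, unfreezing the last blocks with a small learning rate often squeezes out extra accuracy.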

    Week 8 - Changes in Required Competencies

    hashtag
    This week you will...

    • once again gain insights into the research work of Jenni and Thorben (IPN).

    • The focus is on which competencies students and teachers will (no longer) need in the future.

    hashtag
    Learning Resources

    file-pdf
    1MB
    230615_LuLmKI - Kompetenzen.pdf
    PDF
    arrow-up-right-from-squareOpen
    file-download
    2MB
    Kompetenzen und LLM (3).pptx
    arrow-up-right-from-squareOpen

    Intermediate Machine Learning (Legacy SS2023)

    Hybrid Format - Every Wednesday 18h00

    GitHub - Mnpr/Art-Generation-GANs: :art: Series of progressive exploration and experimentation of Deep Generative Models on subset of WikiArt dataset to produce Realistic art Images.GitHubchevron-right
    Project from a participant of the last semester
    Week 4 - Deep Learning from Scratch @ Opencampusdeeplearning.freelab.orgchevron-right
    GitHub - opencampus-sh/bakery-sales-project: The template adapted with data and presentation for the bakery sales projectGitHubchevron-right
    Trending Papers - Hugging Facehuggingfacechevron-right
    KΓ‘roly Zsolnai-FehΓ©r - Research ScientistKΓ‘roly Zsolnai-FehΓ©r - Research Scientistchevron-right
    GitHub - opencampus-sh/project_template_folder: A template folder that you can download and fill with the necessary information to upload the project from the Courses belonging to the Opencampus Machine Learning DegreeGitHubchevron-right
    Lecturescs230.stanford.educhevron-right
    Week 2 - Deep Learning from Scratch @ Opencampus.shdeeplearning.freelab.orgchevron-right
    The most insightful stories about Deep Learning - MediumMediumchevron-right
    Week 9 - Deep Learning @ Opencampusdeeplearning.freelab.orgchevron-right
    Deep Learningwww.deeplearningbook.orgchevron-right
    Week 1 - Deep Learning @ Opencampus.shdeeplearning.freelab.orgchevron-right
    Kaggle: Your Machine Learning and Data Science CommunityKagglechevron-right
    Week 3 - Deep Learning from Scratch @ Opencampusdeeplearning.freelab.orgchevron-right
    https://unity3d.com/machine-learningunity3d.comchevron-right
    Unity toolkit for Machine Learning
    opencampus.sh machine learning projects | opencampus.sh machine learning projectsopencampus-sh.github.iochevron-right
    Projects from the last semesters
    Week 6 - Deep Learning from Scratch @ Opencampus - SoSe 2021deeplearning.freelab.orgchevron-right
    Stanford CS25: V2 I Introduction to Transformers w/ Andrej KarpathyYouTubechevron-right
    displaCy Named Entity Visualizer Β· ExplosionExplosionchevron-right
    Google Colabcolab.research.google.comchevron-right
    https://dandelion.eu/semantic-text/text-classification-demodandelion.euchevron-right
    Google Colabcolab.research.google.comchevron-right
    Google Colabcolab.research.google.comchevron-right
    https://www.summarizebot.com/text_api_demo.htmlwww.summarizebot.comchevron-right
    DeepL Translator | World's Most Accurate Translatorwww.deepl.comchevron-right
    Google Colabcolab.research.google.comchevron-right
    Paddy Doctor: Paddy Disease Classificationwww.kaggle.comchevron-right
    Google Colabcolab.research.google.comchevron-right
    Opencampus Paddy EDAKagglechevron-right
    Opencampus Paddy PyTorch Logistic RegressionKagglechevron-right
    Google Colabcolab.research.google.comchevron-right
    Google Colabcolab.research.google.comchevron-right
    The assignment notebook
    Google Colabcolab.research.google.comchevron-right
    The first notebook
    Opencampus Basic CNN in PyTorchKagglechevron-right
    Introduction to Machine LearningCourserachevron-right
    Tutorial 5: Inception, ResNet and DenseNet β€” UvA DL Notebooks v1.2 documentationuvadlc-notebooks.readthedocs.iochevron-right
    Google Colabcolab.research.google.comchevron-right
    Stanford CS224N NLP with Deep Learning | 2023 | PyTorch Tutorial, Drew KaulYouTubechevron-right
    Google Colabcolab.research.google.comchevron-right
    The Illustrated Transformerjalammar.github.iochevron-right
    Tutorial 6: Transformers and Multi-Head Attention β€” UvA DL Notebooks v1.2 documentationuvadlc-notebooks.readthedocs.iochevron-right
    Google Colabcolab.research.google.comchevron-right
    Introduction - Hugging Face LLM Coursehuggingfacechevron-right
    Google Colabcolab.research.google.comchevron-right