Week 7 - Transformers & Hugging Face

Course session

Explanatory Session Part 1

Self-attention and multi-head attention
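As a warm-up for the session, scaled dot-product attention can be sketched in a few lines of NumPy; multi-head attention simply runs several such attentions in parallel on learned projections. All names and sizes below are illustrative, not taken from the session material.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K: (seq_len, d_k), V: (seq_len, d_v).
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)  # each row is a distribution over tokens
    return weights @ V, weights         # weighted sum of value vectors

# Self-attention: Q, K, V are projections of the *same* input X.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))         # 4 tokens, model dimension 8
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out, w = scaled_dot_product_attention(X @ Wq, X @ Wk, X @ Wv)
```

Each output row is a mixture of all value vectors, with mixture weights given by query-key similarity.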

Hugging Face Introduction

The Hugging Face library and how to use it for working with transformer models

Explanatory Session Part 2

Transformer Encoder and Positional Encoding
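Since the encoder's attention is order-agnostic, position information is added to the token embeddings. A minimal NumPy sketch of the sinusoidal scheme from the original Transformer paper (sizes here are illustrative):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = np.arange(seq_len)[:, None]              # (seq_len, 1)
    i = np.arange(0, d_model, 2)[None, :]          # (1, d_model / 2)
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even dimensions
    pe[:, 1::2] = np.cos(angles)                   # odd dimensions
    return pe

# pe is added element-wise to the token embeddings before the encoder.
pe = sinusoidal_positional_encoding(50, 16)
```

Each position gets a unique pattern of sines and cosines at different frequencies, so nearby positions have similar encodings.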

Explanatory Session Part 3

Vision Transformer

Walk-through

Fine-tuning a Vision Transformer on the Kaggle Paddy dataset

To-do

😊

Look at current Kaggle competitions and make proposals

😊😊

Go through this excellent site explaining Transformers:

Do Chapter 1-3 of the HuggingFace NLP course

😊😊😊

Look closer at the PyTorch module nn.Transformer and go through a tutorial on how to use it for next-token prediction.

Watch this excellent "Build from Scratch" video from Andrej Karpathy
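For the nn.Transformer task above, the key ingredient for next-token prediction is a causal mask that prevents position t from attending to later positions. A minimal sketch; the model sizes and tensors are illustrative toy values, not from the course material:

```python
import torch
import torch.nn as nn

seq_len = 5

# 0 where attention is allowed, -inf where j > i (future positions).
mask = nn.Transformer.generate_square_subsequent_mask(seq_len)

# Tiny toy Transformer over 8-dimensional embeddings (illustrative sizes).
model = nn.Transformer(d_model=8, nhead=2,
                       num_encoder_layers=1, num_decoder_layers=1,
                       batch_first=True)
src = torch.randn(1, seq_len, 8)      # (batch, seq, d_model)
tgt = torch.randn(1, seq_len, 8)
out = model(src, tgt, tgt_mask=mask)  # causal mask on the decoder side
```

In a real next-token setup the target sequence is the input shifted by one, and a linear layer maps `out` to vocabulary logits.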

documentation
tutorial
Google Colab (colab.research.google.com)
The Illustrated Transformer (jalammar.github.io)
Tutorial 6: Transformers and Multi-Head Attention — UvA DL Notebooks v1.2 documentation (uvadlc-notebooks.readthedocs.io)
Introduction - Hugging Face LLM Course (huggingface.co)
Google Colab (colab.research.google.com)
Tutorial 15: Vision Transformers — UvA DL Notebooks v1.2 documentation (uvadlc-notebooks.readthedocs.io)