learn about the basic structure of transformer models
get to know what tokenization is and why it is important.
To get an idea of the number of tokens a text or word is made of in the GPT models, you may want to check out this site.
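To make the idea concrete, here is a toy sketch of subword tokenization as greedy longest-match against a tiny, made-up vocabulary (real GPT models use byte-pair encoding with vocabularies of tens of thousands of tokens, but the intuition is similar: frequent character sequences become single tokens):

```python
# Toy subword tokenizer: greedy longest-match against a tiny,
# invented vocabulary (for illustration only; not the actual
# BPE algorithm or vocabulary used by GPT models).
VOCAB = {"token", "iz", "ation", "trans", "form", "er", "s"}

def tokenize(word: str) -> list[str]:
    tokens = []
    i = 0
    while i < len(word):
        # take the longest vocabulary entry that matches at position i
        for j in range(len(word), i, -1):
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            # unknown character: emit it as its own token
            tokens.append(word[i])
            i += 1
    return tokens

print(tokenize("tokenization"))  # → ['token', 'iz', 'ation']
print(tokenize("transformers"))  # → ['trans', 'form', 'er', 's']
```

Note how a single word can map to several tokens; this is why token counts differ from word counts.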
The conditions to be met in order to receive a Certificate of Achievement (and ECTS) are:
If you attend via Zoom, please make sure to use your full name, the same name you used to register at edu.opencampus.sh. Otherwise your attendance will not be recorded!
Check the Projects section to learn more about the projects.
get to know the mechanism underlying the self-attention approach.
get to know the basics of prompt design and how to apply them in a playground.
get to know examples of applications of transformer models.
Additional resource explaining the Transformer model:
Text: The Illustrated Transformer by Jay Alammar
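To make the self-attention mechanism mentioned above concrete, here is a minimal NumPy sketch of scaled dot-product attention on random toy inputs (real transformer layers additionally apply learned projections for queries, keys, and values, and use multiple heads):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core of self-attention: every position attends to all positions.
    Q, K, V have shape (seq_len, d); returns output and attention weights."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # similarity of each query to each key
    # softmax over the key dimension (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))  # 4 tokens, embedding size 8
out, attn = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)          # (4, 8)
print(attn.sum(axis=-1))  # each row of attention weights sums to 1
```

Each output vector is a weighted average of all token values, with weights derived from query-key similarity; that is the mechanism the readings above illustrate in detail.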
get insights on the preprocessing of different NLP and sequence classification tasks.
get an idea of plausible hyperparameters to fit transformer models for different tasks.
learn about different metrics to evaluate NLP models.
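As an example of such metrics, precision, recall, and F1 for a binary classification task can be computed by hand (a plain-Python sketch; libraries such as scikit-learn provide these out of the box):

```python
def precision_recall_f1(y_true, y_pred):
    """Compute precision, recall, and F1 for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy predictions for six examples
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 1, 1]
print(precision_recall_f1(y_true, y_pred))  # → (0.75, 0.75, 0.75)
```

For imbalanced datasets, F1 is often more informative than plain accuracy, which is why it is a common choice for NLP classification tasks.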
prepare your final presentation as described in week 8.
present your project in the final presentation. :-)
The presentation should take about 15 minutes and cover the following aspects, which correspond to the contents of a model card for the Hugging Face library:
model description
intended uses & limitations
training data
training procedure
variables and metrics
evaluation results
Check this section on Building a model card for more details.
Additionally, please also include in your presentation anything you tried out that didn't work, so we can all learn from your experiences.
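The aspects listed above can be organized as a simple model card skeleton (a sketch of the section structure only; the headings follow the list above, and all details are placeholders to fill in with your project's information):

```markdown
# Model name

## Model description
What architecture the model is based on and what it does.

## Intended uses & limitations
What the model should (and should not) be used for.

## Training data
Which dataset(s) the model was fine-tuned on.

## Training procedure
Preprocessing steps and training hyperparameters.

## Variables and metrics
Which metrics were used for evaluation, and on which data.

## Evaluation results
The scores the model achieved on the evaluation data.
```

Using the same section order in your slides makes it easy to turn the presentation into an actual model card afterwards.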