Week 2 - Tokens & Embeddings revisited

This week you will...

  • Gain a broad understanding of how language models have evolved, from early methods like n-grams to advanced transformer architectures.

  • Understand the significance and limitations of word embeddings and recurrent neural networks, including LSTMs.

Slides

  • 30_10_23_Token_and_embeddings_revisited.pdf (PDF, 851 KB)

Learning Resources

  • NLP with Deep Learning: Stanford CS224N

  • NLU: Stanford XCS224U

  • Stanford lecture

  • Stanford lecture

  • Rasa Attention Series

Until next week you should...

Complete the Notebook and play around with different embeddings (see the sketch below).

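If you want a quick way to explore embeddings outside the notebook, here is a minimal sketch (not the course notebook itself) that loads a small pre-trained GloVe model through gensim's downloader; the library and model name are assumptions, the course material may use a different setup.

```python
# Minimal sketch for exploring pre-trained word embeddings with gensim.
# Assumes `pip install gensim`; "glove-wiki-gigaword-50" is one of the
# vector sets available from gensim's downloader (assumed choice here).
import gensim.downloader as api

# Download (once) and load 50-dimensional GloVe vectors as KeyedVectors.
vectors = api.load("glove-wiki-gigaword-50")

# Nearest neighbours in embedding space: words used in similar contexts.
print(vectors.most_similar("king", topn=5))

# Cosine similarity between two word vectors.
print(vectors.similarity("coffee", "tea"))

# The classic analogy: king - man + woman is close to queen.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```

Swapping in a larger model (e.g. a 300-dimensional set) or comparing neighbours across models is a good way to see how training data and dimensionality change what "similar" means.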