Week 2 - Tokens & Embeddings Revisited
This week you will...
Gain a broad understanding of how language models have evolved, from early methods like n-grams to advanced transformer architectures.
Understand the significance and limitations of word embeddings and recurrent neural networks, including LSTMs.
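As a taste of the n-gram methods mentioned above, here is a minimal sketch (not from the course materials) of a bigram language model: it counts adjacent word pairs in a toy corpus and turns the counts into maximum-likelihood next-word probabilities.

```python
from collections import Counter, defaultdict

# Toy corpus; a real model would be trained on far more text.
corpus = "the cat sat on the mat the cat ate".split()

# Count bigrams: how often each word follows each other word.
bigram_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    bigram_counts[current][nxt] += 1

def next_word_probs(word):
    """Maximum-likelihood estimate of P(next word | word)."""
    counts = bigram_counts[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.666..., 'mat': 0.333...}
```

This illustrates the core limitation the lectures discuss: the model only sees a fixed, short context window, which is what embeddings, RNNs, and eventually transformers were developed to overcome.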
Slides
Learning Resources
Stanford CS224N NLP with Deep Learning
Stanford XCS224U: NLU
Until next week you should...
Optional: Watch this Stanford lecture on Transformers