Week 2 - Tokens & Embeddings revisited
Last updated
Gain a broad understanding of how language models have evolved, from early methods like n-grams to advanced transformer architectures.
Understand the significance and limitations of word embeddings and of recurrent neural networks, including LSTMs.
Stanford CS224N NLP with Deep Learning
Stanford XCS224U: NLU
Optional: Watch the Stanford lecture on Transformers