Week 2 - Tokens & Embeddings Revisited
Gain a broad understanding of how language models have evolved, from early methods like n-grams to advanced transformer architectures.
Understand the significance and limitations of word embeddings and recurrent neural networks, including LSTMs.
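To make the two objectives above concrete, here is a minimal, self-contained Python sketch (not part of the assigned materials) of the core ideas they mention: estimating next-word probabilities from bigram counts, and comparing toy word-embedding vectors with cosine similarity. The corpus and vector values are invented purely for illustration.

```python
from collections import Counter, defaultdict

# Toy corpus; real n-gram models are trained on far larger text.
corpus = "the cat sat on the mat the cat ate".split()

# Count bigrams so we can estimate P(next | prev) = count(prev, next) / count(prev).
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def next_word_prob(prev: str, nxt: str) -> float:
    """Maximum-likelihood bigram probability estimate."""
    total = sum(bigram_counts[prev].values())
    return bigram_counts[prev][nxt] / total if total else 0.0

print(next_word_prob("the", "cat"))  # 2/3 in this toy corpus

# Toy word embeddings: each word maps to a small dense vector (values are made up).
embeddings = {
    "cat": [0.9, 0.1, 0.3],
    "dog": [0.8, 0.2, 0.4],
    "mat": [0.1, 0.9, 0.2],
}

def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity: higher means the vectors point in more similar directions."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda x: sum(a * a for a in x) ** 0.5
    return dot / (norm(u) * norm(v))

print(cosine(embeddings["cat"], embeddings["dog"]))  # high: related words
print(cosine(embeddings["cat"], embeddings["mat"]))  # lower: unrelated words
```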
NLP with Deep Learning: NLU
Optional: Watch this video on Transformers.