Gain a broad understanding of how language models have evolved, from early methods like n-grams to advanced transformer architectures.
Understand the significance and limitations of word embeddings and recurrent neural networks, including LSTMs.
NLP with Deep Learning: NLU
Watch the last ten minutes of this video on Attention
Watch the first 25 minutes of this video on Attention
Watch all 4 videos of the
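The videos above cover the attention mechanism at the heart of transformers. As a quick reference while watching, here is a minimal NumPy sketch of scaled dot-product attention (the core formula softmax(QKᵀ/√d_k)·V); the toy matrices are made up for illustration:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted mix of value vectors

# Toy self-attention: 3 tokens, embedding dimension 4 (random stand-in vectors)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # one context vector per token
```

In a real transformer layer, Q, K, and V are separate learned linear projections of the token embeddings rather than the raw embeddings themselves.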
Complete the notebook and play around with different embeddings
The linked repo contains the notebook as well as two exemplary vector stores to play around with
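When playing with embeddings, the most common way to compare two word vectors is cosine similarity. A minimal sketch with hypothetical toy vectors (the notebook's actual embeddings and vector stores will differ):

```python
import numpy as np

# Hypothetical 3-dimensional toy embeddings for illustration only;
# real word embeddings typically have hundreds of dimensions.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: near 1 means similar direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_related = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_related, sim_unrelated)  # semantically related words score higher
```

Vector stores like the two in the repo use exactly this kind of similarity search to retrieve the nearest neighbors of a query embedding.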
Optional: Watch this Stanford lecture on Transformers