Background: We started out tracking the FastAI course, but because it lacked depth we decided instead to dissect blogs and tutorials.
Introductions!
What does your current work look like? How do you plan to use NLP?
- What is NLP?
- A changing field
- Resources
- Tools
- Python libraries
- Example applications
- Ethics issues
Notebook: What is NLP?
Video(s):
- FastAI: What is NLP?
- SOTA 2019
Video(s):
- FastAI: Topic Modeling with SVD & NMF
Notebook:
Resources:
- A Tutorial on PCA (closely related to SVD)
- SVD article, explaining intuition behind factorizing to 3 different matrices
Video(s):
Notebook:
Resources:
Questions:
- Why does the sorted document-term matrix show those wave patterns when plotted?
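To make the question above concrete, a minimal sketch of the matrix being plotted, with a toy corpus standing in for the notebook's dataset (the exact cause of the wave pattern is left open here):

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

# Toy stand-in for the notebook's corpus (illustrative only).
docs = [
    "the cat sat on the mat",
    "the dog ate the bone",
    "stocks fell as markets closed",
]

vectorizer = CountVectorizer()
dtm = vectorizer.fit_transform(docs).toarray()

# The plot in question shows matrix values directly; sorting each row
# by count is what produces a repeating visual structure.
sorted_rows = np.sort(dtm, axis=1)[:, ::-1]
print(dtm.shape)          # (n_documents, n_vocabulary_terms)
print(sorted_rows)
```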
Skipped
Video(s):
- FastAI: Transfer Learning
Notebook:
Resources:
Questions:
- When we pick a learning rate off the LR vs. loss graph, we pick it halfway down the descent to the lowest point, rather than at the minimum itself. Why is that?
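The LR finder behind that graph sweeps the learning rate upward during training and records the loss. A minimal sketch on a 1-D quadratic with plain SGD (all numbers illustrative; fastai's actual finder operates on the real model):

```python
import numpy as np

# Minimal LR range test: train on loss(w) = w^2 while the learning
# rate grows exponentially, recording the loss after each step.
def lr_range_test(lr_start=1e-4, lr_end=10.0, steps=100):
    w = 5.0
    lrs = np.geomspace(lr_start, lr_end, steps)
    losses = []
    for lr in lrs:
        grad = 2 * w          # d/dw of w^2
        w = w - lr * grad     # one SGD step
        losses.append(w ** 2)
    return lrs, np.array(losses)

lrs, losses = lr_range_test()

# At the minimum of the curve the loss is about to stall or diverge;
# a value partway down the descent still cuts loss fast with a safety
# margin, which is one reading of the "pick it lower" heuristic.
best = lrs[np.argmin(losses)]
chosen = best / 10            # illustrative order-of-magnitude back-off
print(best, chosen)
```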
Blogs:
Optional:
- The Unreasonable Effectiveness of RNNs
- Illustrated Guide to LSTMs and GRUs
- Understanding LSTM Networks
- Tutorial on LSTMs: A Computational Perspective
- Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention)
- Seq2Seq with Attention and Beam Search
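The LSTM readings above all walk through the same gate equations. A minimal NumPy sketch of a single LSTM cell step, with random weights and dimensions chosen purely for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4h, d) input weights, U: (4h, h) recurrent
    weights, b: (4h,) biases, stacked as [input, forget, cell, output]."""
    z = W @ x + U @ h + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # gates in (0, 1)
    g = np.tanh(g)                                # candidate cell values
    c_new = f * c + i * g                         # gated memory update
    h_new = o * np.tanh(c_new)                    # exposed hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
d, hdim = 3, 4
W = rng.normal(size=(4 * hdim, d))
U = rng.normal(size=(4 * hdim, hdim))
b = np.zeros(4 * hdim)
h, c = np.zeros(hdim), np.zeros(hdim)
x = rng.normal(size=d)
h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

The forget gate `f` multiplying the old cell state `c` is the mechanism the readings credit for carrying gradients across long sequences.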
Optional:
- Intuitive Understanding of Attention Mechanism in Deep Learning
- NLP From Scratch: Translation with a Sequence to Sequence Network and Attention
- Neural Machine Translation by Jointly Learning to Align and Translate
- Linear Algebra with code examples
- Linear Algebra Class (MIT OpenCourseWare)
- FastAI: Computational Linear Algebra
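The attention posts above center on one computation: score each query against the keys, normalize, and take a weighted sum of values. A minimal NumPy sketch of the scaled dot-product variant (the seq2seq papers listed use an additive score, but the structure is the same; shapes here are illustrative):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # query-key similarities
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))  # 2 queries, dim 4
K = rng.normal(size=(3, 4))  # 3 keys, dim 4
V = rng.normal(size=(3, 5))  # 3 values, dim 5
out, w = attention(Q, K, V)
print(out.shape)             # one 5-dim output per query
```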