Merge pull request #20 from ezioauditore-tech/master
Added Generative AI resources
Showing 1 changed file with 23 additions and 0 deletions.
## Generative AI
This is a comprehensive guide to understanding and navigating the realm of Generative AI. The field has gained significant traction in recent years thanks to its wide range of applications across domains: from generating realistic images to aiding in natural language processing tasks, Generative AI has changed how we interact with and create content.
### Module 1 - Introduction to Generative AI
| Topic | References |
| --------------------------------------------------------- |:---------- |
| Introduction to Generative AI, Importance and Applications | [Intro to Generative AI - Google Cloud Tech ▶️](https://www.youtube.com/watch?v=G2fqAlgmoPo&pp=ygUdaW50cm9kdWN0aW9uIHRvIGdlbmVyYXRpdmUgYWk%3D) |
| Autoencoders and Variational Autoencoders (VAEs) | [Variational Autoencoders - ArxivInsights ▶️](https://www.youtube.com/watch?v=9zKuYvjFFS8&t=346s&pp=ygUXdmFyaWF0aW9uYWwgYXV0b2VuY29kZXI%3D)<br>[Autoencoders Explained Easily ▶️](https://www.youtube.com/watch?v=xwrzh4e8DLs&t=2s)<br>[Autoencoders - Jeremy Jordan 🧾](https://www.jeremyjordan.me/autoencoders/) |
| Generative Adversarial Networks (GANs) | [A Friendly Introduction to Generative Adversarial Networks (GANs) - Serrano.Academy ▶️](https://www.youtube.com/watch?v=8L11aMN5KY8&t=1076s&pp=ygUER2Fucw%3D%3D)<br>[6 GAN Architectures You Really Should Know - neptune.ai 🧾](https://neptune.ai/blog/6-gan-architectures) |
| Autoregressive Models and RBMs | [Guide to Autoregressive Models - Turing 🧾](https://www.turing.com/kb/guide-to-autoregressive-models)<br>[Autoregressive Diffusion Models - Yannic Kilcher ▶️](https://www.youtube.com/watch?v=2h4tRsQzipQ)<br>[Restricted Boltzmann Machines (RBM) - Serrano.Academy ▶️](https://www.youtube.com/watch?v=Fkw0_aAtwIw) |
| Text Generation and Language Modeling | [Text Generation - Hugging Face 🧾](https://huggingface.co/tasks/text-generation) |
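To make Module 1's autoencoder material concrete, here is a minimal linear autoencoder trained with plain NumPy gradient descent. This is an illustrative sketch only: toy 2-D data, a single linear layer each way, and hand-derived gradients, rather than the deep networks used in the VAE and GAN tutorials linked above.

```python
# Minimal linear autoencoder: compress 2-D points to a 1-D code and back.
# Toy sketch, not how production autoencoders are built.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: points that lie (almost) on the line y = 2x, so a
# 1-unit bottleneck can reconstruct them well.
t = rng.uniform(-1, 1, size=(200, 1))
X = np.hstack([t, 2 * t]) + 0.01 * rng.normal(size=(200, 2))

W_enc = rng.normal(scale=0.5, size=(2, 1))  # encoder: 2 -> 1
W_dec = rng.normal(scale=0.5, size=(1, 2))  # decoder: 1 -> 2
lr = 0.1

for _ in range(2000):
    Z = X @ W_enc                  # latent codes (200, 1)
    err = Z @ W_dec - X            # reconstruction error (200, 2)
    # Gradients of the mean squared reconstruction error
    grad_dec = Z.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse = np.mean((X @ W_enc @ W_dec - X) ** 2)
print(f"reconstruction MSE: {mse:.4f}")
```

Because the bottleneck forces a 1-D code, the network must discover the direction the data actually varies along; a VAE adds a probabilistic latent space on top of this same encode-decode idea.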
### Module 2 - Deep Learning Based Natural Language Processing
| Topic | References |
| ----- |:---------- |
| Word Embedding | [Word Embedding and Word2Vec - StatQuest ▶️](https://www.youtube.com/watch?v=viZrOnJclY0)<br>[Word2Vec, GloVe, FastText - CodeEmporium ▶️](https://www.youtube.com/watch?v=9S0-OC4LFNo&t=386s&pp=ygUQV29yZCBFbWJlZGRpbmdzIA%3D%3D) |
| Representation Learning | [Representation Learning Complete Guide - AIM 🧾](https://analyticsindiamag.com/a-comprehensive-guide-to-representation-learning-for-beginners/#:~:text=Representation%20learning%20is%20a%20class,them%20to%20a%20given%20activity.) |
| Sequence-to-Sequence Models, Encoder-Decoder Architectures | [Sequence-to-Sequence (seq2seq) Encoder-Decoder Neural Networks - StatQuest ▶️](https://www.youtube.com/watch?v=L8HKweZIOmg&pp=ygUbU2VxdWVuY2UtdG8tU2VxdWVuY2UgTW9kZWxz)<br>[Encoder-Decoder Seq2Seq Models - Kriz Moses 🧾](https://medium.com/analytics-vidhya/encoder-decoder-seq2seq-models-clearly-explained-c34186fbf49b) |
| seq2seq with Attention | [Sequence to Sequence (seq2seq) and Attention - Lena Voita 🧾](https://lena-voita.github.io/nlp_course/seq2seq_and_attention.html)<br>[Attention for Neural Networks - StatQuest ▶️](https://www.youtube.com/watch?v=PSs6nxngL6k&pp=ygUsU2VxdWVuY2UgdG8gU2VxdWVuY2UgKHNlcTJzZXEpIGFuZCBBdHRlbnRpb24%3D) |
| Self-Attention, Transformers | [Introduction to Transformers - Andrej Karpathy ▶️](https://www.youtube.com/watch?v=XfpMkf4rD6E)<br>[Attention for Neural Networks - StatQuest ▶️](https://www.youtube.com/watch?v=PSs6nxngL6k)<br>[Self-attention - H2O.ai 🧾](https://h2o.ai/wiki/self-attention/)<br>[What are Transformer Models and how do they work? - Serrano.Academy ▶️](https://www.youtube.com/watch?v=qaWMOYf4ri8) |
| Self-Supervised Learning | [Self-Supervised Learning: The Dark Matter of Intelligence - Yannic Kilcher ▶️](https://www.youtube.com/watch?v=Ag1bw8MfHGQ)<br>[Self-Supervised Learning and Its Applications - neptune.ai 🧾](https://neptune.ai/blog/self-supervised-learning) |
| Advanced NLP | [Stanford CS224N: NLP with Deep Learning ▶️](https://www.youtube.com/watch?v=rmVRLeJRkl4&list=PLoROMvodv4rMFqRtEuo6SGjY4XbRIVRd4&pp=iAQB)<br>[Natural Language Processing: Advance Techniques 🧾](https://medium.com/analytics-vidhya/natural-language-processing-advance-techniques-in-depth-analysis-b67bca5db432) |
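The self-attention rows above can be grounded with a short runnable sketch of scaled dot-product attention, the core operation the Transformer references explain. This is a single-head toy version with random weights; real implementations add learned per-head projections, masking, and multi-head concatenation.

```python
# Scaled dot-product self-attention for one head, in plain NumPy.
# Illustrative only: weights are random, not learned.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """X: (seq_len, d_model) token embeddings -> (seq_len, d_v) outputs."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # pairwise query-key similarities
    weights = softmax(scores, axis=-1)  # each row is a distribution over tokens
    return weights @ V                  # each output is a weighted mix of values

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))             # 4 tokens, d_model = 8
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)                        # one output vector per input token
```

Each token's output is a convex combination of every token's value vector, which is why attention lets a model relate positions regardless of their distance in the sequence.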