Paper-Implementations

My implementation of Machine Learning and Deep Learning papers from scratch.

| Paper Name | Link to Paper | Year Published | GitHub Folder |
| --- | --- | --- | --- |
| Improving Language Understanding by Generative Pre-Training | GPT Paper | 2018 | GPT Implementation |
| BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | BERT Paper | 2019 | BERT Implementation |
| Language Models are Unsupervised Multitask Learners | GPT2 Paper | 2019 | GPT2 Implementation |
| LoRA: Low-Rank Adaptation of Large Language Models | LoRA Paper | 2021 | LoRA Implementation |
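To give a flavour of the kind of idea implemented here, the core of LoRA can be sketched in a few lines: the frozen base weight `W` is augmented by a low-rank update `(alpha / r) * B @ A`. This is a minimal illustrative sketch in plain Python (not the code from this repository, and the function names are hypothetical):

```python
# Hypothetical LoRA sketch: the effective weight is
#   W_eff = W + (alpha / r) * (B @ A)
# where A is (r x d_in) and B is (d_out x r), so the update has rank at most r.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def lora_weight(W, A, B, alpha, r):
    """Return W + (alpha / r) * B @ A, leaving the frozen base W untouched."""
    scale = alpha / r
    BA = matmul(B, A)  # low-rank update, shape (d_out x d_in)
    return [[w + scale * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, BA)]

# Tiny example: 2x2 identity base weight with a rank-1 update.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 2.0]]    # r x d_in = 1 x 2 (trainable)
B = [[1.0], [0.0]]  # d_out x r = 2 x 1 (trainable, often initialized to zero)
W_eff = lora_weight(W, A, B, alpha=1.0, r=1)
# W_eff == [[2.0, 2.0], [0.0, 1.0]]
```

During fine-tuning only `A` and `B` are trained, which is why the number of trainable parameters shrinks from `d_out * d_in` to `r * (d_out + d_in)`.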

Some useful resources

A list of resources I found helpful while understanding and coding the concepts:

  1. Attention Is All You Need (Transformer) - Model explanation (including math), inference and training by Umar Jamil: YouTube.

  2. Coding a Transformer from scratch on PyTorch, with full explanation, training and inference by Umar Jamil: YouTube.

  3. Let's build GPT: from scratch, in code, spelled out by Andrej Karpathy: YouTube.

  4. Formal Algorithms for Transformers: arXiv.
