ereverter/micrograd-rnn

Extension of micrograd for RNNs

A simple extension of micrograd that adds a recurrent neural network (RNN) architecture. The RNN definition lives in extension.py. Built in collaboration with @amartorell98.

The only modification to the original code is the addition of the extension.py script. The playground.ipynb notebook contains the functions that compute the BPTT (Backpropagation Through Time) loss and run the training loop. The training data is generated artificially. You can fine-tune training by adjusting the number of epochs, the gradient clipping value, the learning rate, and its decay.
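The training procedure described above (a BPTT loss over an unrolled sequence, gradient clipping, and learning-rate decay) can be sketched as follows. Note that the `Value` and `RNNCell` classes below are a minimal, self-contained stand-in for micrograd's engine and the RNN in extension.py; the names, parameter shapes, and toy task are assumptions for illustration, not the repository's actual code:

```python
import math
import random

class Value:
    """Minimal micrograd-style scalar autograd node (a sketch of the
    interface micrograd provides, not this repository's code)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def __neg__(self):
        return self * -1.0

    def __sub__(self, other):
        return self + (-other)

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1.0 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topological sort, then apply the chain rule in reverse order.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

class RNNCell:
    """Single-unit RNN: h_t = tanh(wx*x_t + wh*h_{t-1} + b), y_t = wy*h_t.
    (Hypothetical shape; the real definition is in extension.py.)"""
    def __init__(self):
        self.wx = Value(random.uniform(-0.5, 0.5))
        self.wh = Value(random.uniform(-0.5, 0.5))
        self.wy = Value(random.uniform(-0.5, 0.5))
        self.b = Value(0.0)

    def parameters(self):
        return [self.wx, self.wh, self.wy, self.b]

    def step(self, x, h):
        h = (self.wx * x + self.wh * h + self.b).tanh()
        return h, self.wy * h

def bptt_loss(cell, xs, ys):
    """Unroll the RNN over the whole sequence; summing per-step squared
    errors and calling backward() on the total is exactly BPTT."""
    h, loss = Value(0.0), Value(0.0)
    for x, y in zip(xs, ys):
        h, pred = cell.step(Value(x), h)
        diff = pred - y
        loss = loss + diff * diff
    return loss

# Artificial data: echo the previous input (a 1-step memory toy task).
random.seed(0)
xs = [random.choice([-1.0, 1.0]) for _ in range(8)]
ys = [0.0] + xs[:-1]

cell = RNNCell()
lr, lr_decay, clip = 0.1, 0.99, 1.0
loss_before = bptt_loss(cell, xs, ys).data

for epoch in range(200):
    for p in cell.parameters():
        p.grad = 0.0                        # zero grads each epoch
    loss = bptt_loss(cell, xs, ys)
    loss.backward()
    for p in cell.parameters():
        g = max(-clip, min(clip, p.grad))   # gradient clipping
        p.data -= lr * g
    lr *= lr_decay                          # learning-rate decay

loss_after = bptt_loss(cell, xs, ys).data
```

The epoch count, clipping threshold, learning rate, and decay factor above are exactly the knobs the notebook exposes for tuning.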
