
An improved autograd engine that expands Andrej Karpathy's micrograd. Supports distributed tensors, enhanced operations, graph optimization, and visualization. Ideal for exploring advanced machine learning concepts.


SubhasmitaSw/picograd


PicoGrad: The "Tiny" Autograd Engine

Because size doesn't always matter in ML

PicoGrad is a tiny autograd engine that implements backpropagation (reverse-mode autodiff) over a dynamically built DAG, with a focus on speed and support for distributed tensors. It's a supercharged version of micrograd with a few extra bells and whistles.

We called it "pico" for the same reason you might call your gaming PC a "little setup" – pure, delightful understatement.
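The core idea — backpropagation as reverse-mode autodiff over a dynamically built DAG — can be illustrated with a minimal scalar value class. This is a simplified sketch in the spirit of micrograd, not PicoGrad's actual implementation:

```python
# Minimal sketch of reverse-mode autodiff over a dynamic DAG.
# Illustrative only -- names and structure are simplified, not PicoGrad's API.

class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)   # the DAG is recorded as ops are applied

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order guarantees each node's grad is complete
        # before it is propagated to the node's children
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = Value(4.0)
z = x * y + x        # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)
```

Building the graph dynamically (as a by-product of doing the forward arithmetic) is what lets the engine differentiate arbitrary Python control flow.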

Features

  • Implements a general-purpose Tensor class that supports distributed computing
  • Supports dynamic computational graph construction
  • Provides automatic differentiation (autograd) capabilities
  • Includes basic neural network building blocks
  • Offers graph optimization for improved performance

Extending to Neural Networks

PicoGrad can be used to build and train neural networks, and to visualize the resulting computational graph.

(figure: trained model)
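What the engine automates can be seen by writing one training step with the gradient derived by hand. This is a toy sketch independent of PicoGrad's API — with autograd, the `grad` line below is computed for you:

```python
# Toy gradient descent on loss = mean((w*x - y)^2), gradient derived by hand.
# Illustrative only -- an autograd engine computes this gradient automatically.

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # target relation: y = 2x
w = 0.0
lr = 0.05

for _ in range(100):
    # d(loss)/dw = mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

print(w)  # converges toward 2.0
```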

Distributed Tensor Support

PicoGrad goes beyond micrograd by supporting distributed tensors, allowing for efficient computation on large datasets:

from picograd.engine import DistributedTensor
import numpy as np

# wrap a large array so that work on it can be split across workers
data = DistributedTensor(np.random.rand(1_000_000, 1_000))
print(data)
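The idea behind a distributed tensor can be sketched in plain NumPy: shard the array across workers, reduce each shard locally, then combine the partial results. This is a conceptual sketch of row-wise sharding, not `DistributedTensor`'s actual mechanics:

```python
import numpy as np

# Conceptual sketch of the sharding idea behind a distributed tensor.
# Illustrative only -- not DistributedTensor's implementation.

data = np.arange(12.0).reshape(6, 2)
shards = np.array_split(data, 3, axis=0)   # 3 "workers", 2 rows each

# each worker reduces its shard locally, then the results are combined
partial_sums = [shard.sum() for shard in shards]
total = sum(partial_sums)

print(total)  # same result as a single-machine data.sum()
```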

Graph Optimization

PicoGrad includes graph optimization techniques to improve computational efficiency:

Initial Graph

(figure: initial computational graph)

Optimized Graph

(figure: optimized computational graph)
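One common graph optimization is constant folding: any subgraph whose inputs are all constants is evaluated once at build time, shrinking the graph before any forward or backward pass runs. A minimal sketch over a nested-tuple expression graph (illustrative only; PicoGrad's optimizer operates on its own graph type):

```python
import operator

# Minimal constant-folding pass over a nested-tuple expression graph.
# Illustrative only -- not PicoGrad's optimizer.

OPS = {'add': operator.add, 'mul': operator.mul}

def fold(node):
    """Recursively replace constant subtrees with their computed value."""
    if not isinstance(node, tuple):        # leaf: a number or a variable name
        return node
    op, left, right = node
    left, right = fold(left), fold(right)
    if isinstance(left, (int, float)) and isinstance(right, (int, float)):
        return OPS[op](left, right)        # both operands constant: fold now
    return (op, left, right)

# x * (2 + 3)  simplifies to  x * 5
expr = ('mul', 'x', ('add', 2, 3))
print(fold(expr))  # ('mul', 'x', 5)
```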
