Custom Adam Optimizer vs PyTorch Adam on MNIST

This project benchmarks a custom implementation of the Adam optimizer against PyTorch's built-in Adam optimizer on the MNIST dataset. The comparison is visualized as test accuracy over training epochs.


πŸ“Œ Features

  • Custom implementation of the Adam optimization algorithm
  • Training on the MNIST handwritten digits dataset
  • Comparison with PyTorch's built-in Adam
  • Visualization of model accuracy over time
  • Lightweight training loop built on PyTorch and NumPy

πŸš€ Project Structure

.
β”œβ”€β”€ main.py              # Main script for training and comparison
β”œβ”€β”€ adam.png             # Output accuracy plot (auto-generated)
β”œβ”€β”€ requirements.txt     # Python dependencies
└── README.md            # Project documentation

🧠 Model Architecture

Input:        28 x 28 (flattened)
Dropout(0.4)
Linear:       784 -> 1200
Dropout(0.4)
Linear:       1200 -> 10
LogSoftmax
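
The architecture above can be sketched as follows. This is an illustrative reconstruction, not the code from main.py; the use of nn.Sequential and nn.Flatten is an assumption.

```python
import torch
import torch.nn as nn

# Sketch of the described architecture (illustrative, not taken from main.py)
model = nn.Sequential(
    nn.Flatten(),          # 28 x 28 image -> 784-dim vector
    nn.Dropout(0.4),
    nn.Linear(784, 1200),
    nn.Dropout(0.4),
    nn.Linear(1200, 10),
    nn.LogSoftmax(dim=1),  # log-probabilities for the 10 digit classes
)
```

Because the final layer is LogSoftmax, the model pairs naturally with nn.NLLLoss during training.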

πŸ§ͺ Optimizers Compared

  1. PyTorch's built-in torch.optim.Adam

  2. A custom Adam class that:

    • Manually updates weights using first and second moment estimates, bias correction, and learning rate decay
    • Is fully vectorized using PyTorch
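
The custom update rule can be sketched like this. This is a minimal illustration of the Adam algorithm itself, not the project's actual class (names and signatures are assumptions), and it omits the learning rate decay mentioned above for brevity.

```python
import torch

class Adam:
    """Minimal sketch of the Adam update rule (illustrative only)."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
        self.params = list(params)
        self.lr, self.b1, self.b2, self.eps = lr, betas[0], betas[1], eps
        self.m = [torch.zeros_like(p) for p in self.params]  # first moments
        self.v = [torch.zeros_like(p) for p in self.params]  # second moments
        self.t = 0  # time step, used for bias correction

    @torch.no_grad()
    def step(self):
        self.t += 1
        for p, m, v in zip(self.params, self.m, self.v):
            if p.grad is None:
                continue
            g = p.grad
            m.mul_(self.b1).add_(g, alpha=1 - self.b1)         # update first moment
            v.mul_(self.b2).addcmul_(g, g, value=1 - self.b2)  # update second moment
            m_hat = m / (1 - self.b1 ** self.t)                # bias-corrected moments
            v_hat = v / (1 - self.b2 ** self.t)
            p.sub_(self.lr * m_hat / (v_hat.sqrt() + self.eps))

    def zero_grad(self):
        for p in self.params:
            p.grad = None
```

All updates are expressed as vectorized tensor operations, which is what makes the custom loop competitive with the built-in optimizer.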


πŸ“Š Output

A plot (adam.png) comparing the test accuracy of both optimizers, sampled every 100 epochs of training.



βš™οΈ Setup & Installation

1. Clone the repository

git clone https://github.com/happybear-21/adam.py
cd adam.py

2. Create a virtual environment (optional but recommended)

python -m venv venv
source venv/bin/activate  # or venv\Scripts\activate on Windows

3. Install dependencies

pip install -r requirements.txt

🏁 Running the Project

python main.py

This will:

  • Train two models using both optimizers
  • Plot their testing accuracy during training
  • Save the results to adam.png

About

Adam: A Method for Stochastic Optimization (Kingma & Ba, 2015)
