
Gradient Work - Neural Network Analysis Framework for Research

This repository contains a standalone research framework for experiments ranging from short runs to long ones (20+ hours), targeted at iterative algorithms such as gradient descent. It also includes three different algorithms for training shallow ReLU networks.

To set up an experiment, specify what should be saved and how often, when the experiment should stop, and what to log in real time. Each experiment is stored in a file along with all of its parameters for later analysis and exploitation.
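
For a rough picture, a configuration might look like the sketch below. Every key name here is hypothetical; the real options live in config.py.

    # Hypothetical sketch of a configuration -- key names are assumptions,
    # not the actual schema used by config.py.
    config = {
        "algo": "GD_torch",          # which algorithm to run
        "data": {"n": 100, "d": 2},  # data setup
        "lr": 1e-2,                  # hyperparameters
        "save_every": 100,           # what to save and how often
        "max_hours": 20,             # when to stop the experiment
        "log": ["loss"],             # what to log in real time
    }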

Example

Once an experiment has finished, we can inspect the file it created.

Run plot.py without arguments to create plots for the latest experiment.
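
For example (assuming a standard Python invocation):

    python plot.py    # plots the most recent experiment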

Experiments and helper files

  • config.py: configurations are Python files (algorithm choice, data setup, hyperparameters, ...)
  • runner.py: the different experiment loops (animation, live loss display, ...)
  • postprocess.py: computes indicators after a run
  • utils.py: helper functions

This project uses PyTorch and CVXPY but does not require them; the only hard dependency is NumPy.
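
Assuming a standard pip setup (inferred from the dependencies named above, not from a pinned requirements file):

    pip install numpy         # required
    pip install torch cvxpy   # optional, for the PyTorch and CVXPY algorithms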

Implemented Algorithms

Gradient Descent

  • algo_GD_torch.py: PyTorch implementation of gradient descent on a two-layer ReLU network (see the sketch below)
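
For orientation, here is a minimal sketch of the setting this file targets. It is an illustration, not the repository's code; the sizes and learning rate are arbitrary.

    import torch

    # Illustrative sketch: full-batch gradient descent on a two-layer
    # ReLU network. Not the repository's code; sizes are arbitrary.
    torch.manual_seed(0)
    n, d, m = 100, 2, 50                         # samples, input dim, width
    X, y = torch.randn(n, d), torch.randn(n, 1)

    W1 = torch.randn(d, m, requires_grad=True)   # hidden-layer weights
    W2 = torch.randn(m, 1, requires_grad=True)   # output weights

    lr = 1e-2
    for step in range(1000):
        loss = ((torch.relu(X @ W1) @ W2 - y) ** 2).mean()
        loss.backward()
        with torch.no_grad():
            for W in (W1, W2):
                W -= lr * W.grad
                W.grad.zero_()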

Convex Reformulation

  • algo_convex_cvxpy.py: convex solver for the two-layer ReLU training problem, via its convex reformulation (see the sketch below)
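
As a rough sketch of the general technique (the convex reformulation of two-layer ReLU training due to Pilanci & Ergen, 2020), not the repository's code: enumerate a subset of ReLU activation patterns and solve a group-regularized convex program under pattern-consistency constraints. The toy sizes and random pattern sampling below are assumptions.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n, d, beta = 30, 2, 1e-3                    # toy problem sizes (assumptions)
    X = rng.standard_normal((n, d))
    y = rng.standard_normal(n)

    # sample candidate ReLU activation patterns from random hyperplanes
    G = rng.standard_normal((d, 50))
    D = np.unique((X @ G >= 0).astype(float), axis=1)   # n x P pattern matrix
    P = D.shape[1]

    V = cp.Variable((d, P))                     # "positive" neurons
    W = cp.Variable((d, P))                     # "negative" neurons
    pred = cp.sum(cp.multiply(D, X @ (V - W)), axis=1)
    reg = cp.sum(cp.norm(V, axis=0)) + cp.sum(cp.norm(W, axis=0))
    cons = []
    for i in range(P):
        S = (2 * np.diag(D[:, i]) - np.eye(n)) @ X      # pattern consistency
        cons += [S @ V[:, i] >= 0, S @ W[:, i] >= 0]
    prob = cp.Problem(cp.Minimize(cp.sum_squares(pred - y) + beta * reg), cons)
    prob.solve()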

Wasserstein Gradient Flow Simulation

Proximal and Wasserstein Descent

  • algo_prox.py: proximal point algorithm
  • proxdistance.py: Frobenius, Wasserstein, and sliced Wasserstein distances (see the sketch below)
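
To illustrate one of these distances (a sketch of the standard construction, not this file's actual API): the sliced Wasserstein distance projects both point clouds onto random directions and averages the resulting one-dimensional Wasserstein distances, which reduce to sorted differences.

    import numpy as np

    def sliced_wasserstein(a, b, n_proj=64, seed=0):
        """Sketch of SW_1 between point clouds a, b of shape (n, d);
        assumes both clouds have the same number of points."""
        rng = np.random.default_rng(seed)
        proj = rng.standard_normal((a.shape[1], n_proj))
        proj /= np.linalg.norm(proj, axis=0)    # unit-norm directions
        pa = np.sort(a @ proj, axis=0)          # 1-D projections, sorted
        pb = np.sort(b @ proj, axis=0)
        return np.abs(pa - pb).mean()           # average of 1-D W1 distances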

JKO-step and Proximal Solvers

  • algo_jko.py: mean-field discretization using the JKO scheme; replaces the Wasserstein proxf with a kl_div proxf (see the sketch below)
  • jko_proxf_scipy.py: proximal solver using SciPy
  • jko_proxf_cvxpy.py: proximal solver using CVXPY
  • jko_proxf_pytorch.py: proximal solver using PyTorch gradient descent
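
To give a feel for a KL proximal step (a sketch under an assumed grid discretization and potential, not the repository's solver): one JKO-like step minimizes an energy plus a KL divergence to the previous density.

    import numpy as np
    from scipy.optimize import minimize

    grid = np.linspace(-3.0, 3.0, 64)
    V = 0.5 * grid**2                             # assumed potential: F(rho) = <V, rho>
    rho_k = np.full(grid.size, 1.0 / grid.size)   # previous density
    tau = 0.1                                     # step size

    def objective(logits):
        # softmax keeps rho a probability vector on the grid
        rho = np.exp(logits - logits.max())
        rho /= rho.sum()
        kl = np.sum(rho * (np.log(rho + 1e-12) - np.log(rho_k)))
        return rho @ V + kl / tau

    res = minimize(objective, np.zeros(grid.size), method="L-BFGS-B")
    rho_next = np.exp(res.x - res.x.max())
    rho_next /= rho_next.sum()                    # the updated density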
