Numpy implementation of Neural Networks with SGDM, ADAM and BFGS solvers, suitable for surface fitting

S0852306/Numpy-Implementation-of-Neural-Nets


Numpy-Implementation-of-Neural-Net

Numpy implementation of Neural Networks with various solvers.

  • Capable of handling multivariate function approximation tasks ( $\mathbb{R}^{N} \rightarrow \mathbb{R}$ ).
  • This repository implements a two-stage optimization method, popular in the scientific machine learning community, that outperforms SGD-based methods such as Adam and SGDM on various scientific computing tasks.
  • A more general and sophisticated version ( $\mathbb{R}^{N} \rightarrow \mathbb{R}^{M}$ ) can be found on my MATLAB File Exchange.
  • "CompareWithTorch" provides a comparison between a pure SGD-based method (Adam) and the two-stage optimization strategy.
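The two-stage idea can be sketched in a few lines of NumPy. The snippet below is a minimal illustration, not this repository's implementation: it fits a tiny one-hidden-layer network to a surface $f(x, y) = \sin x \cos y$, runs a few hundred Adam steps to reach a good basin, then hands the flattened parameter vector to a quasi-Newton refinement (SciPy's L-BFGS-B here, standing in for the repo's BFGS solver). The target function, layer sizes, and hyperparameters are all illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Example surface-fitting data (R^2 -> R): f(x, y) = sin(x) * cos(y).
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) * np.cos(X[:, 1])

# One hidden layer of tanh units; parameters live in one flat vector so
# the quasi-Newton stage can treat training as generic unconstrained
# minimization over theta.
H = 16
shapes = [(2, H), (H,), (H, 1), (1,)]

def unpack(theta):
    parts, i = [], 0
    for s in shapes:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def loss_and_grad(theta):
    """Half-MSE loss and its gradient via manual backprop."""
    W1, b1, W2, b2 = unpack(theta)
    A = np.tanh(X @ W1 + b1)
    r = (A @ W2 + b2).ravel() - y          # residuals
    loss = 0.5 * np.mean(r ** 2)
    dpred = r[:, None] / len(y)            # d(loss)/d(pred)
    dW2 = A.T @ dpred
    db2 = dpred.sum(axis=0)
    dZ = (dpred @ W2.T) * (1 - A ** 2)     # tanh' = 1 - tanh^2
    dW1 = X.T @ dZ
    db1 = dZ.sum(axis=0)
    return loss, np.concatenate([g.ravel() for g in (dW1, db1, dW2, db2)])

theta = rng.normal(scale=0.5, size=sum(int(np.prod(s)) for s in shapes))

# Stage 1: Adam, to get cheaply into a good basin of attraction.
m = np.zeros_like(theta)
v = np.zeros_like(theta)
lr, beta1, beta2, eps = 1e-2, 0.9, 0.999, 1e-8
for t in range(1, 301):
    _, g = loss_and_grad(theta)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    theta -= lr * (m / (1 - beta1 ** t)) / (np.sqrt(v / (1 - beta2 ** t)) + eps)

# Stage 2: quasi-Newton refinement of the Adam solution.
res = minimize(loss_and_grad, theta, jac=True, method="L-BFGS-B",
               options={"maxiter": 500})
print(res.fun)  # refined loss, well below the Adam-only value
```

The division of labor mirrors the repository's claim: first-order methods are cheap per step and robust far from a minimum, while (quasi-)Newton methods converge much faster once the iterate is near one, so chaining them gets both benefits.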

Reference

  1. J. Nocedal and S. J. Wright, Numerical Optimization.
  2. D. Goldfarb et al., "Practical Quasi-Newton Methods for Training Deep Neural Networks."
  3. Y. Ren et al., "Kronecker-factored Quasi-Newton Methods for Deep Learning."
