This package is a general-purpose tool that helps users implement gradient descent methods for function optimization. Two methods are currently implemented: Steepest 2-Groups Gradient Descent and Adaptive Moment Estimation (Adam). More methods will be added in the future.
This package should be considered experimental at this stage of development.
Install the development version from GitHub using the 'remotes' package:
install.packages("remotes")
remotes::install_github("vthorrf/optimg")
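
Once installed, a minimal usage sketch might look like the following. This is an assumption-laden example, not confirmed API: the function name optimg(), its optim()-style arguments, and the method labels "STGD" and "ADAM" are inferred from the package description, so check the package documentation (?optimg) after installing.

library(optimg)

## A simple convex objective: squared distance from the point (1, 2)
fn <- function(par) sum((par - c(1, 2))^2)

## Steepest 2-Groups Gradient Descent (method label "STGD" is an assumption)
fit_stgd <- optimg(par = c(0, 0), fn = fn, method = "STGD")

## Adaptive Moment Estimation (method label "ADAM" is an assumption)
fit_adam <- optimg(par = c(0, 0), fn = fn, method = "ADAM")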