Feature description
Compute the observed Fisher Information Matrix (FIM) for the given inputs. The FIM is computed from the outer product of the gradient with itself (the "square" of the gradient), divided by the number of data points.
Motivation
It is often necessary to know how sensitive the optimization problem is to changes in the parameters. Acquiring Hessian information can be challenging, but the FIM can be calculated from the gradient alone, which is already implemented for the JAX cost functions in PyBOP.
Possible implementation
No response
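As a minimal sketch only (not an existing PyBOP API; the function name and the assumption that per-sample gradients are available as an array are hypothetical), the empirical/observed FIM described above could be computed like this:

```python
import numpy as np

def observed_fim(per_sample_grads):
    """Observed (empirical) Fisher Information Matrix.

    per_sample_grads: array of shape (n_data, n_params) holding the gradient
    of the per-data-point cost/log-likelihood w.r.t. each parameter.
    Returns the (n_params, n_params) matrix (1/N) * sum_i g_i g_i^T,
    i.e. the averaged outer product ("square") of the gradients.
    """
    g = np.asarray(per_sample_grads, dtype=float)
    n_data = g.shape[0]
    # g.T @ g accumulates sum_i g_i g_i^T; divide by the number of data points
    return g.T @ g / n_data

# Hypothetical example: 2 parameters, 3 data points
grads = np.array([[1.0, 0.0],
                  [0.0, 2.0],
                  [1.0, 1.0]])
fim = observed_fim(grads)  # symmetric 2x2 matrix
```

The result is symmetric positive semi-definite by construction, which is one reason this gradient-based estimate is attractive when the Hessian is hard to obtain.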
Additional context
No response