Casadi vertcat is slow if using a larger number of time steps #1996
Comments
From the CasADi documentation I found: "Concatenation means stacking matrices horizontally or vertically. Due to the column-major way of storing elements in CasADi, it is most efficient to stack matrices horizontally." Someone in this forum also suggested (in their second point) a possible way to accelerate this instead of using vertcat. I have no experience with CasADi whatsoever.
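To illustrate the documentation's point, here is a minimal sketch (the block sizes and counts are made up for illustration, not taken from PyBaMM): instead of vertically concatenating many column blocks, stack their transposes horizontally and transpose once at the end.

```python
import casadi
import numpy as np

# Illustrative only: many small per-time-step column blocks.
blocks = [casadi.DM(np.random.rand(10, 1)) for _ in range(5000)]

# Straightforward approach: one vertical concatenation of the column blocks.
stacked_v = casadi.vertcat(*blocks)

# Alternative hinted at by the CasADi docs: storage is column-major, so
# horizontal concatenation is cheaper; stack the transposed blocks
# horizontally and transpose the result once at the end.
stacked_h = casadi.horzcat(*[b.T for b in blocks]).T

assert stacked_v.shape == stacked_h.shape
```

Whether this actually helps in the PyBaMM solution-assembly step would need to be profiled; this is only the pattern the documentation describes.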
@dion-w I don't really understand the solutions above ... One idea I had is to create an object
@MarcBerliner is this still a bottleneck? Not sure if your recent changes cover this, or if it will be fixed when we switch to
@rtimms the timings still look about the same for the
Updated code for timing:

```python
import numpy as np
import pybamm

# `model` and `param` (a pybamm.ParameterValues object) are assumed to be
# defined earlier in the script, as in the original issue.
solver = pybamm.IDAKLUSolver()
simulation = pybamm.Simulation(model, parameter_values=param, solver=solver)
t_max = 60 * 60 * 10
t_eval = np.linspace(0, t_max, num=(t_max // 2))
simulation.solve([0, t_max], t_interp=t_eval)
```
Great, thanks! I'll leave this open for now, but it's not a priority since we will eventually drop the casadi solver anyway.
Sure. I don't think it's a difficult fix if we ever want/need to patch this.
When using many time steps, integrating the model itself doesn't take too long, but creating the solution takes a long time.
Timings are as follows:
Most of the additional solve time is being spent in this vertcat call:
PyBaMM/pybamm/solvers/casadi_solver.py, line 697 (commit 23157ae)
It's strange because the solution arrays are created by the solver very quickly, but the vertcat alone then takes a long time. Maybe there is a faster way to do the vertcat?
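One possible direction, sketched here only as an assumption (the variable names are hypothetical and this is not the code PyBaMM uses): if the per-step outputs are already purely numeric DM matrices at this point, the stacking could be done in NumPy, which concatenates dense arrays in a single contiguous copy, and the result wrapped back into a DM only if downstream code needs one.

```python
import casadi
import numpy as np

# Hypothetical per-step outputs: dense DM column vectors standing in for
# whatever the integrator returns at each time step.
step_outputs = [casadi.DM(np.random.rand(200, 1)) for _ in range(10000)]

# Pattern from the linked line: one big CasADi vertcat over all steps.
y_casadi = casadi.vertcat(*step_outputs)

# Possible alternative: convert each DM to a NumPy array, let NumPy do the
# stacking, then wrap the result back into a DM if a DM is required.
y_numpy = np.vstack([np.array(o) for o in step_outputs])
y_dm = casadi.DM(y_numpy)

assert y_casadi.shape == y_dm.shape
```

Whether this is faster in practice would need to be measured against the actual objects produced by the CasADi solver.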