It sounds like you've probably hooked up the C++ ODE correctly via the Dymos ODE's compute method, considering that it appears to be converging to the correct solution. I agree that it shouldn't be taking 900 seconds to compute a sparsity pattern, but it's difficult to say why without doing some profiling to see where the problem is. Is there much overhead in your C++ connection? Are you interacting with a C++ library that's persistent in memory, or repeatedly launching a C++ executable on every call to compute? How large is your Jacobian, and what coloring does OpenMDAO come up with? You can find the total_coloring.html file in the reports directory generated by OpenMDAO.

If your sparsity pattern is really big, it's entirely possible that having report generation turned on is itself the bottleneck. Perhaps try instantiating your problem with report generation disabled and see if that changes the timing.

Lastly, there was a pull request merged in the past few days that improved the coloring performance. We haven't released that version yet, but you can try installing the master branch of OpenMDAO and see if that helps:

python -m pip install git+https://github.com/OpenMDAO/OpenMDAO

If the problem persists, we might have to start profiling to see where the bottleneck is.
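As a rough sketch of the "reports off" suggestion above (untested, with the model-building details elided):

```python
import openmdao.api as om

# Build the problem with report generation disabled, so that writing the
# coloring/scaling reports can't be the thing slowing you down.
p = om.Problem(reports=False)

# ... add the Dymos trajectory/phases to p.model as usual ...

p.setup()
```

I believe reports can also be switched off globally through the OPENMDAO_REPORTS environment variable if you'd rather not modify the script.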
Hi everyone, I'm a fairly new Dymos user and am looking for some guidance on how to diagnose a problem I'm running into.
I have a problem setup that calls a C++ simulation (through a Python wrapper) in its compute function for its EOMs. I'm using complex-step partials and Gauss-Lobatto transcription (at least for now, due to the way I'm sampling my initial guess). I'm using IPOPT, and I generate my initial guess from preliminary runs of the C++ simulation.
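For reference, the ODE component is wired up roughly like the sketch below (the wrapper module, variable names, and states are placeholders rather than my actual model):

```python
import openmdao.api as om

# Hypothetical wrapper module around the C++ simulation; it accepts complex
# inputs so that complex-step partials are valid.
from my_sim_wrapper import run_eom


class SimEOM(om.ExplicitComponent):
    def initialize(self):
        self.options.declare('num_nodes', types=int)

    def setup(self):
        nn = self.options['num_nodes']
        self.add_input('v', shape=(nn,), units='m/s')
        self.add_input('gamma', shape=(nn,), units='rad')
        self.add_output('v_dot', shape=(nn,), units='m/s**2')
        self.add_output('gamma_dot', shape=(nn,), units='rad/s')

        # Complex-step all of this component's partials.
        self.declare_partials('*', '*', method='cs')

    def compute(self, inputs, outputs):
        # Call into the persistent C++ library through its Python wrapper.
        outputs['v_dot'], outputs['gamma_dot'] = run_eom(inputs['v'], inputs['gamma'])
```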
I've been able to verify that the external simulation is hooked up correctly and that my problem is set up correctly by running test cases that match existing closed-form optimal solutions (proportional navigation, genex), and I'm hoping to move on to some more complex design questions. Before I do that, however, I want to fix a problem I'm seeing.
Although IPOPT correctly converges to optimal solutions, the time to compute the sparsity of my Jacobian is much longer than in any of the examples I've seen in the Dymos resources. For a 7-8 second trajectory it takes 883 seconds, and the problem compounds for longer trajectories: when I run full-length trajectories of about 60-70 seconds, the sparsity computation takes almost an hour and a half. I can sidestep this when I'm extremely confident in the rest of my setup by just using fixed coloring (roughly as sketched below), but I want to fix the underlying problem because I'm getting into problems that don't have known solution cases.
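By "fixed coloring" I mean reusing a previously saved coloring instead of recomputing the sparsity on every run, roughly like this (if I'm reading the OpenMDAO docs right; p is my already-configured Problem):

```python
# First run: let the driver compute the coloring dynamically.  OpenMDAO
# saves the result (by default under the coloring_files/ directory).
p.driver.declare_coloring()
p.run_driver()

# Subsequent runs of the same problem structure: load the saved coloring
# instead of recomputing the sparsity from scratch.
p.driver.use_fixed_coloring()
p.run_driver()
```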
I'd like some advice on where to start looking. I've tried directly changing my IPOPT settings, but it seems like there should be a way to explicitly give Dymos more information about my Jacobian. I'd love some pointers on (A) how to go about doing that in the Dymos framework, and (B) whether there's anything theory-wise I need to brush up on in case I'm missing something obvious (I'm straight out of school and some deeper understanding would go a long way!).
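To make (A) concrete, the kind of thing I imagine is declaring the node-wise sparsity of the partials at the component level rather than letting the Jacobian be treated as dense, along these lines (placeholder names and dynamics, and I'm not sure this is the intended route):

```python
import numpy as np
import openmdao.api as om


class SparseEOM(om.ExplicitComponent):
    def initialize(self):
        self.options.declare('num_nodes', types=int)

    def setup(self):
        nn = self.options['num_nodes']
        self.add_input('v', shape=(nn,))
        self.add_output('v_dot', shape=(nn,))

        # Each node's output depends only on that node's inputs, so the
        # Jacobian block is diagonal: declare only the nonzero entries.
        ar = np.arange(nn)
        self.declare_partials('v_dot', 'v', rows=ar, cols=ar)

    def compute(self, inputs, outputs):
        outputs['v_dot'] = -0.1 * inputs['v'] ** 2  # placeholder dynamics

    def compute_partials(self, inputs, partials):
        # Analytic derivative of the placeholder dynamics above.
        partials['v_dot', 'v'] = -0.2 * inputs['v']
```

Or perhaps there's a per-component declare_coloring option that figures this out automatically; I'm not sure which approach is preferred.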