Extremely high memory usage when generating fast graph #22
Comments
My bad. I did some more digging and found I was accidentally inserting some weights near the upper limit of a u64. Once I fixed that, memory usage dropped significantly.
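For anyone hitting the same symptom: weights near `u64::MAX` overflow as soon as anything sums them along a path. A minimal, self-contained sketch of the failure mode and a safer alternative, in plain Rust (this is not fast_paths API, just standard integer arithmetic):

```rust
fn main() {
    // A weight near the upper limit of a u64, like the ones accidentally
    // inserted in the bug described above.
    let near_max: u64 = u64::MAX - 10;

    // Summing two such weights overflows; checked_add surfaces this as None
    // instead of wrapping silently (or panicking in debug builds):
    assert_eq!(near_max.checked_add(100), None);

    // saturating_add clamps at u64::MAX, a safer choice when accumulating
    // weights whose magnitude you do not fully control:
    assert_eq!(near_max.saturating_add(100), u64::MAX);

    // Reasonably sized weights combine as expected:
    let a: u64 = 1_500;
    let b: u64 = 2_250;
    assert_eq!(a.checked_add(b), Some(3_750));

    println!("ok");
}
```

Validating that input weights stay far below the integer maximum before inserting them avoids this class of bug entirely.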
Glad you figured out your issue already. Can you give some insight into what kind of graph you are working with and how long the preparation takes?
Sure. Here is my code: https://git.scd31.com/stephen/rustic-roads

I am using OpenStreetMap as my input. I've been experimenting with New Brunswick and Canada as my inputs. I'll try to get some better benchmarks on the weekend, but I can provide rough path calculations now. These numbers are the time it took to create a path calculator and run it between one source point and 20 thousand randomly distributed destination points:

- New Brunswick: ~1.5 seconds
- Canada: ~22 seconds
Ah interesting, and thanks for the pointer to your routing engine; this is quite interesting indeed.
Hello, my case is somewhat similar to this. I built a graph out of OpenStreetMap data for Madinah city, SA. Can you point out in which part the allocation happens and why, @easbar? Maybe users will then be able to work around it, or a usage note could be added to this library.
I think I understand the strange allocation better now, CMIIW. First, the number of nodes is calculated from the highest node ID added, e.g. adding nodes 1, 2, 3, 999 makes the node count 999. Then comes a vector allocation that reserves memory based on the number of nodes squared. So, for everyone on this issue: build an ordered node catalog and add the dense index to the graph instead of the sparse ID.
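The remapping described above can be sketched in a few lines of plain Rust. `NodeMap` and its method names here are illustrative, not part of the fast_paths API; the idea is just to hand the graph library dense 0-based indices instead of sparse OSM node IDs:

```rust
use std::collections::HashMap;

/// Maps sparse external IDs (e.g. OSM node IDs) to dense 0-based indices,
/// so a library that allocates based on the maximum node ID stays compact.
struct NodeMap {
    to_dense: HashMap<u64, usize>,
    to_sparse: Vec<u64>,
}

impl NodeMap {
    fn new() -> Self {
        NodeMap { to_dense: HashMap::new(), to_sparse: Vec::new() }
    }

    /// Returns the dense index for a sparse ID, assigning the next free
    /// index the first time an ID is seen.
    fn get_or_insert(&mut self, sparse: u64) -> usize {
        if let Some(&dense) = self.to_dense.get(&sparse) {
            return dense;
        }
        let dense = self.to_sparse.len();
        self.to_dense.insert(sparse, dense);
        self.to_sparse.push(sparse);
        dense
    }

    /// Translates a dense index back to the original sparse ID,
    /// e.g. for turning a computed path back into OSM node IDs.
    fn sparse_id(&self, dense: usize) -> u64 {
        self.to_sparse[dense]
    }
}

fn main() {
    let mut map = NodeMap::new();
    // Sparse IDs 1, 2, 3, 999 get dense indices 0..=3,
    // so the resulting node count is 4 instead of 999.
    for id in [1u64, 2, 3, 999] {
        map.get_or_insert(id);
    }
    assert_eq!(map.get_or_insert(999), 3); // already known, same index
    assert_eq!(map.sparse_id(3), 999);
    println!("node count: {}", map.to_sparse.len());
}
```

Edges would then be added as `add_edge(map.get_or_insert(from_osm_id), map.get_or_insert(to_osm_id), weight)`, keeping the node-count-squared allocation proportional to the nodes you actually use.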
https://github.com/a-b-street/abstreet/blob/master/map_model/src/pathfind/node_map.rs has some simple mappings and helper functions for working with fast_paths. If I ever get more time, I'll clean up this module and upstream it into this crate, since likely everybody needs to solve this sort of problem.
Thank you for pointing that out, but I ended up using a general-purpose one.
I am facing extremely high memory usage (>25 GB) when generating a fast graph.
I have 873k nodes and 900k edges. My code looks like this:

It starts swallowing memory at `let fast_graph = fast_paths::prepare(&input_graph);`. I tried creating new params with a ratio of 0.01 and 10.0, which didn't seem to help. (I'm not entirely sure what this value does, which is why I tried a small and a large value.) I looked through the benchmark code, but it doesn't look like I'm doing anything fundamentally different.