
[question] Why are build_requires version ranges of dependencies resolved? #17463

mpll-d commented Dec 13, 2024

What is your question?

Hello Conan people,
I'm currently experiencing (Conan v2.10.2) quite slow conan install runs for my application (~8 min per run), with most of the time spent resolving version ranges rather than downloading binaries.
I noticed in particular that some of the libs I consume declare cmake and ninja as tool_requires, while my package does not.

I would have expected that, when building my app, which does not list them as tool_requires, these packages would simply be skipped entirely, without any time being spent resolving their revisions across the various remotes.

I cannot see why Conan needs to actually look them up; could you clarify this? Otherwise, it would be very nice to optimize away this tool_requires version-range resolution when conan install is not called with --build=missing or similar options.

Thanks for your help,
Milan

Have you read the CONTRIBUTING guide?

  • I've read the CONTRIBUTING guide
@memsharded memsharded self-assigned this Dec 13, 2024
@memsharded
Member

Hi @mpll-d

Thanks for your question.

Conan always expands the full dependency graph. That means tool_requires recipes are always evaluated: the recipes might be downloaded, and then the existence of binaries is checked. Most of the time, the binaries for those tool_requires are not necessary; they are marked as "Skip" and their download is avoided completely. But the graph is still fully computed. This is the default because it avoids many issues, such as incomplete lockfiles or later problems reproducing a build because of missing tool_requires. There are also situations in which users want their tool_requires to affect the binaries produced by the packages that use them.
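As a rough illustration (the zlib reference is just a placeholder, and the exact binary statuses depend on your cache state and remotes), the skipped nodes can be inspected in the JSON graph output:

```shell
# Expand the full graph, including tool_requires, and dump it as JSON
conan graph info --requires="zlib/1.3.1" --format=json > graph.json
# Nodes whose binaries are unnecessary carry "binary": "Skip";
# this may print nothing if no node happens to be skipped in this graph
grep '"binary": "Skip"' graph.json
```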

For very large dependency graphs (many thousands of nodes), it is possible that some of the checks above take some time. Still, 8 minutes seems excessive, even for very large graphs. Can you please report your graph size, that is, the number of nodes when you dump it with conan graph info --format=json > graph.json?
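A minimal sketch of measuring the graph size, assuming the usual Conan 2 JSON layout with the nodes stored under "graph" → "nodes" (this layout may vary between Conan versions):

```shell
# Dump the dependency graph of the current recipe as JSON
conan graph info . --format=json > graph.json
# Count the nodes; assumes the Conan 2 layout {"graph": {"nodes": {...}}}
python -c "import json; print(len(json.load(open('graph.json'))['graph']['nodes']))"
```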

If you can also share your full output, that might help us understand what is happening.
It would also help to report the timings when things are already installed in the cache: what is the time of a second conan install?

Then, there are mechanisms to avoid this, like:

```shell
$ conan config list skip
...
tools.graph:skip_build: (Experimental) Do not expand build/tool_requires
```

This tools.graph:skip_build conf allows completely pruning the tool_requires. But I wouldn't consider it yet; it would be important to first understand what is happening.
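As a sketch of how this conf could be applied (it is experimental, as the listing above notes, and may change):

```shell
# Per command: prune build/tool_requires expansion entirely
conan install . -c tools.graph:skip_build=True
# Or persistently, by appending it to global.conf in the Conan home folder
echo "tools.graph:skip_build=True" >> "$(conan config home)/global.conf"
```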

@mpll-d
Author

mpll-d commented Dec 13, 2024

First of all, thanks for the swift answer.
I'll try to address the different questions:

  • I have 89 nodes in that graph, which is not as huge as the thousands you mention.
  • Of the 8 minutes I mentioned, the time seems to be split about 50/50 between Computing dependency graph and downloading the dependencies.
  • A second run is almost instantaneous regarding the Computing dependency graph step.
  • During the Computing dependency graph step, most of the time is spent "thinking" (~1 min) before displaying:
```
ninja/1.12.1: Not found in local cache, looking in remotes...
ninja/1.12.1: Checking remote: conancenter
ninja/1.12.1: Downloaded recipe revision fd583651bf0c6a901943495d49878803
```

where ninja is typically a tool_requires package defined with a revision range.
So I guess the time is spent trying to resolve this version/revision for ninja.

I think I have also found a fix while reading through the documentation here: I should store the revisions of the ConanCenter recipes I use on my own server, so that I only store what I need, which should be much faster.
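A sketch of what that mirroring could look like, assuming a hypothetical remote named myremote that points at my own server:

```shell
# Fetch only the recipe revisions actually used from ConanCenter...
conan download "ninja/1.12.1" -r conancenter
# ...and push them to your own server, so resolution only queries it
conan upload "ninja/1.12.1" -r myremote
```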
Also, adding a lockfile makes it way faster. The issue with that is that I'm not sure how I should handle and maintain lockfiles: I'm building multiple packages, and it seems a bit complicated to maintain consistent lockfiles across different repos if I'm to commit them there.
Is there a best practice on how to store, sync and use lockfiles for a poly-repo codebase?
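For reference, a minimal lockfile round-trip might look like this (file names are illustrative):

```shell
# Capture the currently resolved versions and revisions into a lockfile
conan lock create conanfile.py --lockfile-out=conan.lock
# Later installs reuse the locked graph, skipping version-range resolution
conan install . --lockfile=conan.lock
```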

Thanks for your help!

@memsharded
Member

> I have 89 nodes in that graph, which is not as huge as the thousands you mention.
> Of the 8 minutes I mentioned, the time seems to be split about 50/50 between Computing dependency graph and downloading the dependencies.

That is definitely not large; there are users with thousands of nodes. Four minutes just resolving the dependency graph is very weird.
A way to measure the performance without the installation part would be to run a conan graph info command with the same arguments on a blank cache. I can typically resolve and download all binaries for the "tensorflow-lite" example (around 50 deps, some of them large ones) in a few seconds. For example:

```shell
conan graph info --requires="opencv/[*]" --requires="tensorflow-lite/[*]" --format=json > graph.json
```

It takes around 30 seconds with a blank cache, using only ConanCenter as remote. I guess that you are using only ConanCenter? If you have your own server, can you test using just your server (pre-uploading the packages there)? I often run tests with a local Artifactory CE (from our downloads page) on my machine to investigate issues and rule out client- or server-side problems.

It is true that ConanCenter seems to be slower than usual, and we are investigating it, but this still seems like a different order of magnitude.

> During the Computing dependency graph step, most of the time is spent "thinking" (~1 min) before displaying:

For a single dependency? Just for the recipe revision? That looks weird; maybe it is related to the ConanCenter issues, affecting some packages more than others. I'll let the team know and keep you updated.

Regarding the lockfile questions: let's keep this issue focused on the performance problem until we understand it properly, because it shouldn't be that slow. If you want, you can open new tickets for the lockfiles and any other questions you might have. Thanks for your feedback.
