Size of spacy-transformers on x86_64 amd is huge. #13411
-
Hello, I'm using a custom transformer model … Thanks.
Replies: 6 comments 1 reply
-
Hi!
-
Hi, thanks for the prompt reply. Please have a look at the libraries below.
-
Can you paste (not screenshot) the full console output, including the command(s) you ran?
-
I'm running in a Docker container with a multi-stage build.
Requirements.txt
-
Thanks! That clarifies things for me, as I don't get these nvidia libraries installed on my machine. So this must be due to how PyTorch gets installed - it looks like it's installing the latest PyTorch 2.2.2 for CUDA 12.1. Instead of relying on `spacy[transformers]` to pull in `torch`, I would suggest installing it beforehand, so you can customize how that happens. Afterwards, spaCy will skip it if it's already present in the environment. If you look at the instructions over at https://pytorch.org/, there is a CPU version of the `pip install` command that might be relevant to you; for 2.2.2 on Linux with pip this should be:
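For reference, a sketch of what that CPU-only install typically looks like - the `--index-url` value is taken from PyTorch's own install instructions and should be double-checked against the selector on https://pytorch.org/ for your exact platform:

```shell
# Install a CPU-only build of PyTorch first, so that the later
# spacy[transformers] install finds torch already present and
# skips pulling in the CUDA build with its large nvidia-* wheels.
pip install torch==2.2.2 --index-url https://download.pytorch.org/whl/cpu

# Now spaCy (and spacy-transformers) will reuse the existing torch.
pip install 'spacy[transformers]'
```

In a multi-stage Docker build, running these two steps in this order in the build stage is what keeps the CUDA libraries out of the final image.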
-
Hi, this solution worked for me. Thank you very much - it reduced my Docker image size significantly, by about 50%!