Hello! When I use 16 * A10 (16 * 23 GB) to run inference on llama2-70b, an error occurs.
I have asked many people about this problem, but it is still unsolved.
I know 8 GPUs can run it, but I need to increase the prompt length for llama2, and 8 GPUs are not enough!
Do you have any ideas? Thanks!
babytdream changed the title from "a buig to use 16 * A10(16 * 23g) to inference llama2-70b" to "a bug to use 16 * A10(16 * 23g) to inference llama2-70b" on Aug 25, 2023.
I have 16 GPUs in one machine.
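Since the original error text is not included above, this is only a guess, but a common cause is that most tensor-parallel inference engines require the model's head counts to be divisible by the tensor parallel size: llama2-70b uses grouped-query attention with 64 attention heads but only 8 KV heads, so a tensor parallel size of 16 cannot split the KV heads evenly, while 8 can. One way to still use the memory of all 16 cards is to shard the model layer-wise instead. Below is a minimal sketch assuming Hugging Face transformers with accelerate's `device_map="auto"`; the framework and the `meta-llama/Llama-2-70b-hf` model path are assumptions, since the original report does not name them.

```python
# Minimal sketch (assumption: transformers + accelerate are installed and the
# model path is available locally or via the Hub). device_map="auto" shards
# the layers across all visible GPUs, so the ~140 GB of fp16 weights plus the
# KV cache spread over the 16 * 23 GB A10s instead of requiring tensor
# parallelism with tp=16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-70b-hf"  # hypothetical model path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # layer-wise sharding across all 16 GPUs
)

prompt = "Explain tensor parallelism in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Layer sharding like this trades generation speed for capacity, but it avoids the head-divisibility constraint entirely and leaves more room for the KV cache of longer prompts.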