Using without NVIDIA libraries, or how to make it work with them using ZLUDA? #14484
Here is a detailed breakdown of potential workarounds for your situation:

**1. ZLUDA Limitations and Missing NVIDIA Libraries**

ZLUDA implements a subset of the CUDA runtime and libraries on non-NVIDIA hardware, but it does not ship every NVIDIA library. paddlepaddle-gpu links against additional DLLs, such as nvblas, that ZLUDA does not provide, which is why loading it fails with a missing-library error.
**2. Possible Workarounds**

Below are some approaches you can try:

**a. Use CPU Mode for PaddlePaddle**

If GPU acceleration is unavailable due to hardware or library constraints, one option is to use the CPU version of PaddlePaddle (the `paddlepaddle` package instead of `paddlepaddle-gpu`).

Steps:
1. Uninstall the GPU build: `pip uninstall paddlepaddle-gpu`
2. Install the CPU build: `pip install paddlepaddle`
3. Verify that the installation runs on the CPU (see the sketch below).
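A minimal sketch of that verification step, assuming a PaddlePaddle 2.x CPU wheel is installed; it uses the standard `paddle.device.set_device` and `paddle.utils.run_check` helpers:

```python
# Minimal check that PaddlePaddle works in CPU mode after swapping
# paddlepaddle-gpu for the plain paddlepaddle wheel.
import paddle

paddle.device.set_device("cpu")   # force CPU execution
paddle.utils.run_check()          # built-in installation sanity check

x = paddle.randn([2, 3])
y = paddle.matmul(x, x, transpose_y=True)  # small op to confirm kernels run
print(y.numpy())
```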
**b. Explore ROCm (AMD GPU Support)**

AMD provides the ROCm (Radeon Open Compute) platform for GPU computation, which includes libraries similar to NVIDIA's CUDA and cuBLAS. However, PaddlePaddle does not natively support ROCm at the moment, so this solution would require significant development effort.

Steps (if you want to explore custom development):
1. Install the ROCm drivers and toolkit for your AMD GPU.
2. Attempt to build PaddlePaddle from source with HIP/ROCm support; this is an advanced and largely unsupported path (a quick way to check what your current wheel was built with is sketched below).
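Before investing in that route, it is worth checking which backends the wheel you already have was compiled against. A minimal sketch, assuming a PaddlePaddle 2.x install where `paddle.is_compiled_with_rocm()` is available:

```python
# Report which GPU backends the installed PaddlePaddle wheel was built with.
# The standard pip GPU wheels are CUDA-only, so the ROCm check is expected
# to print False for them.
import paddle

print("CUDA build:", paddle.is_compiled_with_cuda())
print("ROCm build:", paddle.is_compiled_with_rocm())
```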
**c. Wait for ZLUDA Updates**

If ZLUDA extends its AMD GPU support in the future or implements replacements for the missing NVIDIA libraries, it may eventually become possible to use PaddlePaddle with AMD GPUs via ZLUDA. Monitor the ZLUDA GitHub repository for updates.

**d. Use Alternative Frameworks**

If PaddlePaddle is not a strict requirement, consider frameworks like PyTorch or TensorFlow, which have broader support for heterogeneous hardware. PyTorch, for example, has native ROCm support for AMD GPUs (see the sketch below).
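A minimal sketch of that PyTorch route, assuming a ROCm build of PyTorch is installed (for example, a wheel from the ROCm index on pytorch.org); on ROCm builds the AMD GPU is exposed through the regular `torch.cuda` API:

```python
# Confirm that a ROCm build of PyTorch can see and use the AMD GPU.
import torch

print("HIP version:", torch.version.hip)          # None on CUDA-only builds
print("GPU visible:", torch.cuda.is_available())  # True if ROCm sees the GPU

if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))          # AMD GPU model name
    x = torch.randn(2, 3, device="cuda")          # "cuda" maps to the ROCm device
    print((x @ x.T).cpu())
```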
**3. Error Reference**

The linked GitHub discussion mentions missing NVIDIA DLLs such as nvblas, which ZLUDA does not currently provide; this is what causes paddlepaddle-gpu to fail at load time.

**4. Conclusion**

At this time, there is no straightforward way to make paddlepaddle-gpu run on an AMD GPU through ZLUDA. The most practical options are running PaddlePaddle on the CPU or switching to a framework with ROCm support.

If you need GPU acceleration immediately and have access to an NVIDIA GPU, using that hardware with the official CUDA libraries remains the most reliable path.

Response generated by feifei-bot | chatgpt-4o-latest
I'm trying to use ZLUDA with paddlepaddle-gpu so that it runs on an AMD GPU. PyTorch, Transformers, and Diffusers all work, but paddlepaddle-gpu doesn't, because it depends on NVIDIA libraries. It needs DLLs that are not included with ZLUDA: not only cublas but also nvblas, which ZLUDA doesn't ship, so I get an error. Is there a workaround? I need the speed of the GPU.
lshqqytiger/ZLUDA#55 (comment)
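For reference, a minimal sketch (Windows-only, since the problem is about DLLs) that probes which NVIDIA-named libraries can currently be loaded; the DLL names below are version-specific examples (CUDA 11.x / cuDNN 8 naming) and may differ on another setup:

```python
# Probe which NVIDIA-named DLLs are resolvable on the current DLL search path.
# With ZLUDA installed, some (e.g. cublas) may resolve while others (e.g. nvblas) do not.
import ctypes

for name in ["cublas64_11", "nvblas64_11", "cudnn64_8"]:
    try:
        ctypes.WinDLL(name)   # raises OSError if the DLL cannot be found or loaded
        print(f"{name}: found")
    except OSError:
        print(f"{name}: missing")
```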