
cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' #182

Open

SoftologyPro opened this issue Jul 31, 2024 · 1 comment

SoftologyPro commented Jul 31, 2024

This is under Windows. I created a new venv for this test.

git clone https://github.com/Tencent/HunyuanDiT
cd HunyuanDiT
pip install -r requirements.txt

I downloaded all the models with:

huggingface-cli download Tencent-Hunyuan/HunyuanDiT-v1.2 --local-dir ./ckpts

Then when I run

python app/hydit_app.py --lang en --infer-mode fa

I get:

Traceback (most recent call last):
  File "D:\Tests\HunyuanDiT\HunyuanDiT\app\hydit_app.py", line 9, in <module>
    from sample_t2i import inferencer
  File "D:\Tests\HunyuanDiT\HunyuanDiT\sample_t2i.py", line 5, in <module>
    from mllm.dialoggen_demo import DialogGen
  File "D:\Tests\HunyuanDiT\HunyuanDiT\mllm\dialoggen_demo.py", line 9, in <module>
    from llava.constants import (
  File "D:\Tests\HunyuanDiT\HunyuanDiT/mllm\llava\__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (D:\Tests\HunyuanDiT\HunyuanDiT/mllm\llava\model\__init__.py)

What am I missing?
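One way to narrow this down (my own diagnostic sketch, not something from the repo) is to check which `llava` package the interpreter actually resolves. A common cause of this kind of ImportError is a separately pip-installed `llava` package shadowing the copy bundled under `mllm/llava`, so the `__init__.py` being imported is not the one that defines `LlavaLlamaForCausalLM`:

```python
# Diagnostic sketch: report which 'llava' package Python would import.
# Run this from the same venv / working directory used to launch hydit_app.py.
import importlib.util

spec = importlib.util.find_spec("llava")
if spec is None:
    print("no 'llava' package found on sys.path")
else:
    # spec.origin is the __init__.py file that will actually be imported
    print("llava resolves to:", spec.origin)
```

If the printed path points somewhere other than `mllm\llava\__init__.py` inside the repo (e.g. into the venv's `site-packages`), uninstalling that stray package or adjusting `sys.path` ordering would be the next thing to try.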

collect_env output

sys.platform: win32
Python: 3.10.11 (tags/v3.10.11:7d4cc5a, Apr  5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
CUDA available: True
MUSA available: False
numpy_random_seed: 2147483648
GPU 0: NVIDIA GeForce RTX 4090
CUDA_HOME: C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.4
NVCC: Cuda compilation tools, release 12.4, V12.4.99
MSVC: Microsoft (R) C/C++ Optimizing Compiler Version 19.40.33812 for x64
GCC: n/a
PyTorch: 2.3.1+cu118
PyTorch compiling details: PyTorch built with:
  - C++ Version: 201703
  - MSVC 192930154
  - Intel(R) oneAPI Math Kernel Library Version 2021.4-Product Build 20210904 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v3.3.6 (Git Hash 86e6af5974177e513fd3fee58425e1063e7f1361)
  - OpenMP 2019
  - LAPACK is enabled (usually provided by MKL)
  - CPU capability usage: AVX2
  - CUDA Runtime 11.8
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_90,code=sm_90;-gencode;arch=compute_37,code=compute_37
  - CuDNN 8.7
  - Magma 2.5.4
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.8, CUDNN_VERSION=8.7.0, CXX_COMPILER=C:/actions-runner/_work/pytorch/pytorch/builder/windows/tmp_bin/sccache-cl.exe, CXX_FLAGS=/DWIN32 /D_WINDOWS /GR /EHsc /Zc:__cplusplus /bigobj /FS /utf-8 -DUSE_PTHREADPOOL -DNDEBUG -DUSE_KINETO -DLIBKINETO_NOCUPTI -DLIBKINETO_NOROCTRACER -DUSE_FBGEMM -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE /wd4624 /wd4068 /wd4067 /wd4267 /wd4661 /wd4717 /wd4244 /wd4804 /wd4273, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=2.3.1, USE_CUDA=ON, USE_CUDNN=ON, USE_CUSPARSELT=OFF, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_GLOO=ON, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=OFF, USE_NNPACK=OFF, USE_OPENMP=ON, USE_ROCM=OFF, USE_ROCM_KERNEL_ASSERT=OFF,

TorchVision: 0.18.1+cu118

SoftologyPro (Author) commented

I get the same error when I run the CLI version:

(venv) D:\Tests\HunyuanDiT\HunyuanDiT>python sample_t2i.py --prompt "渔舟唱晚"
Traceback (most recent call last):
  File "D:\Tests\HunyuanDiT\HunyuanDiT\sample_t2i.py", line 5, in <module>
    from mllm.dialoggen_demo import DialogGen
  File "D:\Tests\HunyuanDiT\HunyuanDiT\mllm\dialoggen_demo.py", line 9, in <module>
    from llava.constants import (
  File "D:\Tests\HunyuanDiT\HunyuanDiT/mllm\llava\__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (D:\Tests\HunyuanDiT\HunyuanDiT/mllm\llava\model\__init__.py)
