
[Doc]: minicpmv2 inference #7

Open
xiaohuihui52309 opened this issue Jul 25, 2024 · 1 comment
Labels
documentation Improvements or additions to documentation

Comments

@xiaohuihui52309

📚 The doc issue

How can I use vLLM to run inference with a fine-tuned MiniCPM-V 2 model?
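
For reference, a minimal sketch of what such documentation might cover, assuming the fine-tuned checkpoint has been saved (or, for LoRA adapters, merged) into a local directory and that the installed vLLM build supports MiniCPM-V; the model path, image file, and prompt template below are placeholders, not the project's documented usage:

```python
from PIL import Image
from vllm import LLM, SamplingParams

# Hypothetical path to the fine-tuned (merged) MiniCPM-V 2 checkpoint.
MODEL_PATH = "/path/to/finetuned-minicpm-v-2"

# MiniCPM-V ships custom model code, so trust_remote_code is required.
llm = LLM(model=MODEL_PATH, trust_remote_code=True, max_model_len=4096)

sampling_params = SamplingParams(temperature=0.7, max_tokens=256)

image = Image.open("example.jpg").convert("RGB")

# The exact prompt template (image placeholder tokens, role tags) is
# model-specific; it should match the template used during fine-tuning.
prompt = "<image>\nWhat is shown in this image?"

outputs = llm.generate(
    {"prompt": prompt, "multi_modal_data": {"image": image}},
    sampling_params,
)
print(outputs[0].outputs[0].text)
```

If fine-tuning produced LoRA adapters rather than full weights, they would typically be merged into the base model first (or served through vLLM's LoRA support) before pointing `model=` at the result.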

Suggest a potential alternative/fix

No response

@xiaohuihui52309 xiaohuihui52309 added the documentation Improvements or additions to documentation label Jul 25, 2024

This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
