I made a version with local LLM translation and summarization — it runs on 12 GB of VRAM. Come try it! #96
Comments
Thanks a lot! Colab's free GPU has a 24-hour limit — I was just wondering how to run this locally, and you've already built it.
It seems I forgot to document a few things, and there are still quite a few issues — it'll do for now. If this project is meant to be beginner-friendly, it probably still needs some work.
Great work! I think supporting local LLM models will be a major trend for open-source AI applications. For example, this project https://github.com/openai-translator/openai-translator also just announced local model support — it could serve as a good reference.
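As a rough illustration of the local-LLM translation idea discussed here, the sketch below batches Whisper subtitle lines so they fit a small local model's context window and builds an OpenAI-compatible chat-completion payload for each batch (the request shape many local servers, such as llama.cpp's server, accept). The model name, chunk size, and prompt wording are illustrative assumptions, not the project's actual implementation.

```python
def chunk_lines(lines, max_chars=1500):
    """Group subtitle lines into batches that fit a small local model's context.

    max_chars is an assumed budget; tune it to the model you actually run.
    """
    batches, batch, size = [], [], 0
    for line in lines:
        if batch and size + len(line) > max_chars:
            batches.append(batch)
            batch, size = [], 0
        batch.append(line)
        size += len(line)
    if batch:
        batches.append(batch)
    return batches


def build_request(batch, model="local-model", target_lang="Chinese"):
    """Assemble one chat-completion payload for a batch of subtitle lines."""
    return {
        "model": model,  # hypothetical name; whatever the local server exposes
        "messages": [
            {
                "role": "system",
                "content": (
                    f"Translate the following subtitles into {target_lang}. "
                    "Keep one line of output per line of input."
                ),
            },
            {"role": "user", "content": "\n".join(batch)},
        ],
        "temperature": 0.3,  # low temperature for more literal translation
    }
```

Each payload would then be POSTed to the local server's chat-completions endpoint (the URL depends on which server you run), and the replies reassembled in order.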
That was a code bug — thanks for the report; it has been fixed. Could you share the details of the translation error?
Thanks. Right now I'm thinking about how to build a simple GUI for it — that part is actually a bit hard.
How about PyQt?
Already forked. Here are a few pitfalls a beginner runs into starting from scratch (going local means the Windows platform here):
https://github.com/sanbuphy/WhisperTranslator