
Commit

Merge pull request #815 from irexyc/typo
0xJacky authored Jan 12, 2025
2 parents 0701124 + f1ed7e1 commit 8cf7884
Showing 13 changed files with 18 additions and 18 deletions.
4 changes: 2 additions & 2 deletions app/src/language/ar/app.po
@@ -2505,11 +2505,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"لاستخدام نموذج كبير محلي، قم بنشره باستخدام vllm أو imdeploy. فهي توفر نقطة "
+"لاستخدام نموذج كبير محلي، قم بنشره باستخدام vllm أو lmdeploy. فهي توفر نقطة "
 "نهاية API متوافقة مع OpenAI، لذا قم فقط بتعيين baseUrl إلىAPI المحلية الخاصة "
 "بك."

4 changes: 2 additions & 2 deletions app/src/language/de_DE/app.po
@@ -2657,12 +2657,12 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
 "Um ein lokales großes Modell zu verwenden, implementiere es mit ollama, vllm "
-"oder imdeploy. Sie bieten einen OpenAI-kompatiblen API-Endpunkt, also setze "
+"oder lmdeploy. Sie bieten einen OpenAI-kompatiblen API-Endpunkt, also setze "
 "die baseUrl auf deine lokale API."
 
 #: src/views/preference/OpenAISettings.vue:72

2 changes: 1 addition & 1 deletion app/src/language/en/app.po
@@ -2598,7 +2598,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

4 changes: 2 additions & 2 deletions app/src/language/es/app.po
@@ -2579,11 +2579,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"Para utilizar un modelo local grande, impleméntelo con vllm o imdeploy. "
+"Para utilizar un modelo local grande, impleméntelo con vllm o lmdeploy. "
 "Estos proporcionan un API endpoint compatible con OpenAI, por lo que solo "
 "debe configurar la baseUrl en su API local."

2 changes: 1 addition & 1 deletion app/src/language/fr_FR/app.po
@@ -2621,7 +2621,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

2 changes: 1 addition & 1 deletion app/src/language/ko_KR/app.po
@@ -2585,7 +2585,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

2 changes: 1 addition & 1 deletion app/src/language/messages.pot
@@ -2400,7 +2400,7 @@ msgid "To make sure the certification auto-renewal can work normally, we need to
 msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
-msgid "To use a local large model, deploy it with ollama, vllm or imdeploy. They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API."
+msgid "To use a local large model, deploy it with ollama, vllm or lmdeploy. They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API."
 msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:72

2 changes: 1 addition & 1 deletion app/src/language/ru_RU/app.po
@@ -2565,7 +2565,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

4 changes: 2 additions & 2 deletions app/src/language/tr_TR/app.po
@@ -2779,11 +2779,11 @@ msgstr ""
 #: src/views/preference/OpenAISettings.vue:48
 #, fuzzy
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"Yerel bir büyük model kullanmak için, vllm veya imdeploy ile dağıtın. OpenAI "
+"Yerel bir büyük model kullanmak için, vllm veya lmdeploy ile dağıtın. OpenAI "
 "uyumlu bir API uç noktası sağlarlar, bu nedenle baseUrl'yi yerel API'nize "
 "ayarlamanız yeterlidir."

2 changes: 1 addition & 1 deletion app/src/language/vi_VN/app.po
@@ -2619,7 +2619,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

4 changes: 2 additions & 2 deletions app/src/language/zh_CN/app.po
@@ -2453,11 +2453,11 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""
-"要使用本地大型模型,可使用 ollama、vllm 或 imdeploy 进行部署。它们提供了与 "
+"要使用本地大型模型,可使用 ollama、vllm 或 lmdeploy 进行部署。它们提供了与 "
 "OpenAI 兼容的 API 端点,因此只需将 baseUrl 设置为本地 API 即可。"
 
 #: src/views/preference/OpenAISettings.vue:72

2 changes: 1 addition & 1 deletion app/src/language/zh_TW/app.po
@@ -2504,7 +2504,7 @@ msgstr ""
 
 #: src/views/preference/OpenAISettings.vue:48
 msgid ""
-"To use a local large model, deploy it with ollama, vllm or imdeploy. They "
+"To use a local large model, deploy it with ollama, vllm or lmdeploy. They "
 "provide an OpenAI-compatible API endpoint, so just set the baseUrl to your "
 "local API."
 msgstr ""

2 changes: 1 addition & 1 deletion app/src/views/preference/OpenAISettings.vue
@@ -45,7 +45,7 @@ const models = shallowRef([
 :validate-status="errors?.openai?.base_url ? 'error' : ''"
 :help="errors?.openai?.base_url === 'url'
 ? $gettext('The url is invalid.')
-: $gettext('To use a local large model, deploy it with ollama, vllm or imdeploy. '
+: $gettext('To use a local large model, deploy it with ollama, vllm or lmdeploy. '
 + 'They provide an OpenAI-compatible API endpoint, so just set the baseUrl to your local API.')"
 >
 <AInput

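The help text corrected here tells users to point the baseUrl at a local OpenAI-compatible server. A minimal sketch of what that means in practice, assuming the common default ports for each tool (ollama 11434, vllm 8000, lmdeploy 23333; the model name is a hypothetical placeholder -- check your own deployment):

```python
import json

# Assumed default local endpoints (verify against your setup):
#   ollama:   http://localhost:11434/v1
#   vllm:     http://localhost:8000/v1
#   lmdeploy: http://localhost:23333/v1
base_url = "http://localhost:11434/v1"

# All three servers accept the same OpenAI-style chat completion request,
# so switching backends only means changing base_url.
payload = {
    "model": "llama3",  # hypothetical model name registered with the server
    "messages": [{"role": "user", "content": "Hello"}],
}

url = f"{base_url}/chat/completions"   # endpoint the request would be POSTed to
body = json.dumps(payload)             # JSON body of that request
print(url)
```

Actually sending the request requires one of these servers to be running; the snippet only illustrates how the baseUrl from the settings form slots into the shared API shape.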
