
MODELS=`[ variable problem when I docker run #1436

Closed

avirgos opened this issue Aug 26, 2024 · 5 comments
Labels
support A request for help setting things up

Comments

@avirgos

avirgos commented Aug 26, 2024

Hello,

I want to use Ollama with the Mistral model, and I followed the documentation below: https://huggingface.co/docs/chat-ui/configuration/models/providers/ollama

deploy.sh:

#!/bin/bash

sudo docker compose down
sudo docker rm -f mongodb && sudo docker rm -f chat-ui

# nginx and ollama
sudo docker compose up -d

# mongodb
sudo docker run -d -p 27017:27017 -v mongodb-data:/data/db --name mongodb --network backend mongo:latest

# chat-ui
sudo docker run -d -p 3000:3000 --env-file .env.local -v chat-ui:/data --name chat-ui --network proxy ghcr.io/huggingface/chat-ui-db && sudo docker network connect backend chat-ui

docker-compose.yml:

services:
  nginx:
    image: nginx:latest
    container_name: nginx
    ports:
      - 80:80
      - 443:443
    networks:
      - proxy
    volumes:
      - ./nginx:/etc/nginx/conf.d
      - ./ssl:/etc/ssl
    restart: unless-stopped

  ollama:
    build:
      context: ./ollama
      dockerfile: Dockerfile
    image: ollama-with-ca
    container_name: ollama
    ports:
      - 11434:11434
    networks:
      - backend
    environment:
      - HTTPS_PROXY=http://<username>:<password>@proxy.test.fr:8090
    volumes:
      - ollama-data:/data
    restart: unless-stopped
    entrypoint: ["/bin/bash", "start-mistral.sh"]

networks:
  backend:
  proxy:
    external: true

volumes:
  ollama-data:

.env.local:

MONGODB_URL=mongodb://mongodb:27017
HF_TOKEN=hf_*****

MODELS=`[
  {
    "name": "Ollama Mistral",
    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{

{/if}

}{

{/if}

} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}",
    "parameters": {
      "temperature": 0.1,
      "top_p": 0.95,
      "repetition_penalty": 1.2,
      "top_k": 50,
      "truncate": 3072,
      "max_new_tokens": 1024,
      "stop": ["</s>"]
    },
    "endpoints": [
      {
        "type": "ollama",
        "url" : "ollama://ollama:11434",
        "ollamaName" : "mistral"
      }
    ]
  }
]`

When I run my script, at the end of the execution the chat-ui container doesn't launch, and I get the following error:

docker: poorly formatted environment: variable '"name": "Ollama Mistral",' contains whitespaces.
See 'docker run --help'.
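As far as I can tell, docker's --env-file parses the file line by line as plain KEY=VALUE pairs and has no support for multi-line values, so everything after MODELS=`[ is read as separate, invalid variables, which matches the '"name": "Ollama Mistral",' line in the error. A rough way to spot offending lines (just a sketch, assuming bash and that every valid entry is a single-line KEY=VALUE):

# print lines that are neither blank/comments nor single-line KEY=VALUE pairs
grep -vE '^[[:space:]]*(#|$)' .env.local | grep -vE '^[A-Za-z_][A-Za-z0-9_]*=' && echo "the lines above will not parse with --env-file"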

I already tried putting the chat-ui and mongodb containers in docker-compose.yml, but that doesn't work either, same as in this issue: #614

Any solutions?

Thanks in advance.

@avirgos avirgos added the support label (A request for help setting things up) on Aug 26, 2024
@nsarrazin
Collaborator

Docker doesn't like the formatting of .env.local.

Try passing its content in a DOTENV_LOCAL var, so something like this:

DOTENV_LOCAL=$(<.env.local) sudo docker run -d -p 3000:3000 --env-file /dev/null -e DOTENV_LOCAL -v chat-ui:/data --name chat-ui --network proxy ghcr.io/huggingface/chat-ui-db && sudo docker network connect backend chat-ui

that should work?
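Assuming the chat-ui-db image reads DOTENV_LOCAL at startup and writes it out as its /app/.env.local (the same path used in the bind-mount option below), you can check what the container actually received with:

sudo docker exec chat-ui cat /app/.env.local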

@nsarrazin
Collaborator

nsarrazin commented Aug 26, 2024

You can also bind-mount the .env.local and read it at runtime (though I prefer the DOTENV_LOCAL approach):

docker run --mount type=bind,source="$(pwd)/.env.local",target=/app/.env.local -p 3000:3000 chat-ui

something like this

@huggingface huggingface deleted a comment Aug 26, 2024
@avirgos
Author

avirgos commented Aug 27, 2024

@nsarrazin

Docker doesn't like the formatting of .env.local.

Try passing its content in a DOTENV_LOCAL var, so something like this:

DOTENV_LOCAL=$(<.env.local) sudo docker run -d -p 3000:3000 --env-file /dev/null -e DOTENV_LOCAL -v chat-ui:/data --name chat-ui --network proxy ghcr.io/huggingface/chat-ui-db && sudo docker network connect backend chat-ui

that should work?

I tried this solution, but none of my changes are applied, and when I access the chat-ui container with sudo docker exec -it chat-ui /bin/bash, I see that my .env.local file is empty.

I also tried this solution, using the --preserve-env option:

DOTENV_LOCAL=$(<.env.local) sudo --preserve-env=DOTENV_LOCAL docker run -d -p 3000:3000 --env-file /dev/null -e DOTENV_LOCAL -v chat-ui:/data --name chat-ui --network proxy ghcr.io/huggingface/chat-ui-db && sudo docker network connect backend chat-ui

The .env.local file is not empty and contains all the changes I want to apply, but I get a 500 error "An error occured" ❌. (I think this is a bad solution because when I run the env command, I find all my changes in the DOTENV_LOCAL variable, and I don't think this is the expected behaviour.)
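For what it's worth, the empty file in the first attempt is most likely sudo resetting the environment: DOTENV_LOCAL=$(<.env.local) is set in sudo's environment, which sudo strips by default, so docker never sees a value. A variant that sidesteps this (a sketch, assuming bash so the $(<file) substitution works) is to expand the value directly on the docker command line:

# the command substitution happens in the calling shell, so sudo's env reset doesn't matter
sudo docker run -d -p 3000:3000 --env-file /dev/null -e DOTENV_LOCAL="$(<.env.local)" -v chat-ui:/data --name chat-ui --network proxy ghcr.io/huggingface/chat-ui-db && sudo docker network connect backend chat-ui

Seeing DOTENV_LOCAL in the output of env is expected with this approach: the whole configuration travels as a single environment variable instead of going through docker's env-file parser.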

you can also bind mount the .env.local and read it at runtime (though I prefer the DOTENV_LOCAL)

docker run --mount type=bind,source="$(pwd)/.env.local",target=/app/.env.local -p 3000:3000 chat-ui

something like this

I also tried this solution: my .env.local is not empty and contains all the changes I want to apply, but I get a 500 error "An error occured" ❌.

Using this solution, when I remove the MODELS variable it works well, so I think the problem is related to the Ollama configuration.

@avirgos
Author

avirgos commented Aug 27, 2024

Here's an update on the situation.

In the documentation, https://huggingface.co/docs/chat-ui/configuration/models/providers/ollama, this part causes the error:

    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{

{/if}

}{

{/if}

} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}",

I changed this part using this documentation https://github.com/huggingface/chat-ui/blob/main/docs/source/configuration/models/providers/ollama.md and I no longer get the 500 error "An error occured" ❌:

"chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}",

But now I get a "fetch error" ❌ when I ask a question to my AI model. Here is the error when I run the sudo docker logs -f chat-ui command:

{"level":50,"time":1724751972409,"pid":30,"hostname":"********","err":{"type":"TypeError","message":"fetch failed: unknown scheme","stack":"TypeError: fetch failed\n    at node:internal/deps/undici/undici:13178:13\n    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n    at async file:///app/build/server/chunks/models-DCHXW_8X.js:4222:25\n    at async generate (file:///app/build/server/chunks/_server.ts-Z5_xg7ds.js:432:30)\n    at async textGenerationWithoutTitle (file:///app/build/server/chunks/_server.ts-Z5_xg7ds.js:497:3)\ncaused by: Error: unknown scheme\n    at makeNetworkError (node:internal/deps/undici/undici:8961:35)\n    at schemeFetch (node:internal/deps/undici/undici:10363:34)\n    at node:internal/deps/undici/undici:10205:26\n    at mainFetch (node:internal/deps/undici/undici:10224:11)\n    at fetching (node:internal/deps/undici/undici:10172:7)\n    at fetch (node:internal/deps/undici/undici:10041:20)\n    at fetch (node:internal/deps/undici/undici:13176:10)\n    at fetch (node:internal/bootstrap/web/exposed-window-or-worker:72:12)\n    at file:///app/build/server/chunks/models-DCHXW_8X.js:4222:31\n    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)"},"msg":"fetch failed"}

I don't know what that means exactly. I don't think I have an Ollama endpoint problem, because there is no error indicating that the chat-ui container cannot find the ollama container. Maybe it's due to my proxy? Should I create a new issue?

@avirgos
Author

avirgos commented Aug 27, 2024

Everything now works on my side; it was due to the Ollama configuration in the MODELS variable. The endpoint url had to be http://ollama:11434 instead of ollama://ollama:11434, which presumably explains the "unknown scheme" fetch error.

MODELS=`[
  {
    "name": "Ollama Mistral",
    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}",
    "parameters": {
      "temperature": 0.1,
      "top_p": 0.95,
      "repetition_penalty": 1.2,
      "top_k": 50,
      "truncate": 3072,
      "max_new_tokens": 1024,
      "stop": ["</s>"]
    },
    "endpoints": [
      {
        "type": "ollama",
        "url" : "http://ollama:11434",
        "ollamaName" : "mistral"
      }
    ]
  }
]`

Thanks a lot!

@avirgos avirgos closed this as completed Aug 27, 2024