Unable to build via Docker: "An Error Occurred" #1306

Open
adhishthite opened this issue Jun 25, 2024 · 2 comments

@adhishthite
Contributor

Context

I have made some custom changes in the repository, and whenever I do npm run dev, chat-ui works completely fine.

However, when I build it with Docker, chat-ui builds successfully, but I get the following when I visit localhost:3000:

[Screenshot: "An Error Occurred" page at localhost:3000]

Here are my environment variables:

ALL ENV VARIABLES {
  HOSTNAME: 'c0329fabc523',
  YARN_VERSION: '1.22.22',
  PWD: '/app',
  HOME: '/home/user',
  INCLUDE_DB: 'false',
  SHLVL: '1',
  PATH: '/home/user/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin',
  NODE_VERSION: '20.15.0',
  _: '/usr/local/bin/dotenv',
  MONGODB_URL: 'mongodb+srv://username:password@dev-azure-mongo-cluster-v1.mongocluster.cosmos.azure.com/',
  MONGODB_DB_NAME: 'chat-ui',
  HF_TOKEN: '1234567',
  OPENID_CONFIG: '{\n' +
    '  "PROVIDER_URL": "https://xyz.okta.com/oauth2/default",\n' +
    '  "CLIENT_ID": "client_id",\n' +
    '  "CLIENT_SECRET": "token",\n' +
    '  "SCOPES": "openid profile email",\n' +
    '  "NAME_CLAIM": ""\n' +
    '}',
  LLM_SUMMERIZATION: 'true',
  LOG_LEVEL: 'info',
  METRICS_PORT: '5566',
  COOKIE_NAME: 'chat-ui',
  ENABLE_ASSISTANTS: 'false',
  ENABLE_ASSISTANTS_RAG: 'false',
  USE_LOCAL_WEBSEARCH: 'false',
  WEBSEARCH_BLOCKLIST: '["youtube.com","twitter.com"]',
  MESSAGES_BEFORE_LOGIN: '0',
  MODELS: '[\n' +
    '    {\n' +
    '        "id":"gpt-35-turbo-16k-0613",\n' +
    '        "name":"Azure OpenAI GPT-3.5-16k",\n' +
    '        "displayName":"Azure OpenAI GPT-3.5-16k",\n' +
    '        "websiteUrl":"https://openai.com/",\n' +
    '        "logoUrl":"https://storage.googleapis.com/openai/dev/openai-icon.png",\n' +
    '        "modelUrl":"https://platform.openai.com/docs/models/gpt-3-5-turbo",\n' +
    '        "description":"The latest GPT-3.5 Turbo model with higher accuracy at responding in requested formats and a fix for a bug which caused a text encoding issue for non-English language function calls. Returns a maximum of 16k output tokens.",\n' +
    '        "parameters":{\n' +
    '            "temperature":0.8,\n' +
    '            "max_new_tokens":8192\n' +
    '        },\n' +
    '        "endpoints":[\n' +
    '            {\n' +
    '                "type":"openai",\n' +
    '                "baseURL":"https://host-dev.openai.azure.com/openai/deployments/host-gpt35",\n' +
    '                "defaultHeaders":{\n' +
    '                    "api-key":"token"\n' +
    '                },\n' +
    '                "defaultQuery":{\n' +
    '                    "api-version":"2024-02-15-preview"\n' +
    '                }\n' +
    '            }\n' +
    '        ],\n' +
    '        "promptExamples":[\n' +
    '            {\n' +
    '                "title":"Write an Email to a Customer",\n' +
    '                "prompt":"Write a concise professional email to a customer informing them about new OpenAI features."\n' +
    '            },\n' +
    '            {\n' +
    '                "title":"Marketing Copy",\n' +
    '                "prompt":"Write a short, captivating marketing copy for the new Google GenAI product launch."\n' +
    '            }\n' +
    '        ]\n' +
    '    },\n' +
    ']',
  TEXT_EMBEDDING_MODELS: '[\n' +
    '    {\n' +
    '        "name":"azure-openai-text-embedding-3-large",\n' +
    '        "displayName":"Azure OpenAI Text Embedding 3 Large",\n' +
    '        "description":"Azure OpenAI Text Embedding 3 Large model hosted on Azure. This model is capable of generating embeddings for text data and can be used for a variety of tasks such as semantic search, clustering, and more.",\n' +
    '        "chunkCharLength":2048,\n' +
    '        "endpoints":[\n' +
    '            {\n' +
    '                "type":"openai",\n' +
    '                "url":"https://host-dev.openai.azure.com/openai/deployments/embed/embeddings?api-version=2024-02-01",\n' +
    '                "apiKey":"token"\n' +
    '            }\n' +
    '        ]\n' +
    '    }\n' +
    ']',
  MONGODB_DIRECT_CONNECTION: 'false',
  HF_API_ROOT: 'https://api-inference.huggingface.co/models',
  OPENAI_API_KEY: '',
  ANTHROPIC_API_KEY: '',
  CLOUDFLARE_ACCOUNT_ID: '',
  CLOUDFLARE_API_TOKEN: '',
  COHERE_API_TOKEN: '',
  HF_ACCESS_TOKEN: '',
  YDC_API_KEY: '',
  SERPER_API_KEY: '',
  SERPAPI_KEY: '',
  SERPSTACK_API_KEY: '',
  SEARXNG_QUERY_URL: '',
  WEBSEARCH_ALLOWLIST: '[]',
  OPENID_CLIENT_ID: '',
  OPENID_CLIENT_SECRET: '',
  OPENID_SCOPES: 'openid profile',
  OPENID_NAME_CLAIM: 'name',
  OPENID_PROVIDER_URL: 'https://huggingface.co',
  OPENID_TOLERANCE: '',
  OPENID_RESOURCE: '',
  USE_CLIENT_CERTIFICATE: 'false',
  CERT_PATH: '',
  KEY_PATH: '',
  CA_PATH: '',
  CLIENT_KEY_PASSWORD: '',
  REJECT_UNAUTHORIZED: 'true',
  OLD_MODELS: '[]',
  TASK_MODEL: '',
  PARQUET_EXPORT_DATASET: '',
  PARQUET_EXPORT_HF_TOKEN: '',
  ADMIN_API_SECRET: '',
  PARQUET_EXPORT_SECRET: '',
  RATE_LIMIT: '',
  APP_BASE: '',
  EXPOSE_API: 'true',
  REQUIRE_FEATURED_ASSISTANTS: 'false',
  ENABLE_LOCAL_FETCH: 'false',
  ALTERNATIVE_REDIRECT_URLS: '[]',
  WEBHOOK_URL_REPORT_ASSISTANT: '',
  ALLOWED_USER_EMAILS: '[]',
  USAGE_LIMITS: '{}'
}

Listening on 0.0.0.0:3000

Can you please help me with this?

@nsarrazin

@nsarrazin
Collaborator

nsarrazin commented Jun 25, 2024

How do you start the Docker container? 👀 Any logs there?
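
(For reference, if the container is running detached or its output has scrolled away, the logs can be pulled with standard Docker commands; nothing here is chat-ui specific.)

docker ps                      # find the ID of the running chat-ui container
docker logs -f <container-id>  # follow its stdout/stderr for startup errors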

@adhishthite
Contributor Author

adhishthite commented Jun 25, 2024

@nsarrazin:

docker build -t chatui-build-dev .

docker run -p 3000:3000 chatui-build-dev

In the Dockerfile, I have added this line:

COPY --chown=1000 .env.local /app/.env.local
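
(For reference, the same variables could also be injected at container start instead of being baked into the image; --env-file is a standard Docker flag, though whether the app reads them from the container environment rather than from the copied /app/.env.local is an assumption here.)

docker run -p 3000:3000 --env-file .env.local chatui-build-dev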

Also, I do not see anything problematic in the logs:

{"log.level":"info","@timestamp":"2024-06-25T13:31:33.605Z","log.logger":"elastic-apm-node","ecs.version":"8.10.0","agentVersion":"4.5.3","env":{"pid":22,"proctitle":"node","os":"linux 6.6.26-linuxkit","arch":"arm64","host":"c0329fabc523","timezone":"UTC+00","runtime":"Node.js v20.15.0"},"config":{"environment":{"source":"start","value":"development"},"logLevel":{"source":"default","value":"info","commonName":"log_level"},"serverUrl":{"source":"start","value":"https://host.apm.asia-south1.gcp.elastic-cloud.com/","commonName":"server_url"},"secretToken":{"source":"start","value":"[REDACTED]","commonName":"secret_token"},"serviceName":{"source":"start","value":"development","commonName":"service_name"},"serviceVersion":{"source":"default","value":"0.0.1","commonName":"service_version"}},"activationMethod":"import","message":"Elastic APM Node.js Agent v4.5.3"}

Connected to Elasticsearch successfully!
Listening on 0.0.0.0:3000

I have added an Elasticsearch client on top of this for APM, but I do not see any issues there either. It also works locally without problems.
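
(For completeness, a sketch of how to get more detail out of the container, assuming the LOG_LEVEL entry in .env.local controls the app's log verbosity and accepts debug:)

# set LOG_LEVEL=debug in .env.local, then rebuild and rerun
docker build -t chatui-build-dev .
docker run -p 3000:3000 chatui-build-dev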

Please help @nsarrazin
