
dependency failed to start: container cognita-postgres is unhealthy #421

Open
Amitt1412 opened this issue Jan 16, 2025 · 0 comments
Amitt1412 commented Jan 16, 2025

Dear Team,

I have been trying to deploy the app locally on my Windows system with Docker Desktop, and the terminal output is shown below. Could you please suggest how I can resolve this issue?

I'm trying to bring the services up with: docker-compose --env-file compose.env up

cognita-postgres  | fixing permissions on existing directory /var/lib/postgresql/data ... ok                                                                                       
cognita-postgres  | creating subdirectories ... ok
cognita-postgres  | selecting dynamic shared memory implementation ... posix
cognita-postgres  | selecting default max_connections ... 20                                                                                                                       
cognita-postgres  | selecting default shared_buffers ... 400kB
cognita-postgres  | selecting default time zone ... Etc/UTC
cognita-postgres  | creating configuration files ... ok                                                                                                                            
cognita-postgres  | 2025-01-16 10:59:37.027 UTC [83] FATAL:  data directory "/var/lib/postgresql/data" has invalid permissions
cognita-postgres  | 2025-01-16 10:59:37.027 UTC [83] DETAIL:  Permissions should be u=rwx (0700) or u=rwx,g=rx (0750).
cognita-postgres  | child process exited with exit code 1
cognita-postgres  | initdb: removing contents of data directory "/var/lib/postgresql/data"
cognita-postgres  | running bootstrap script ...
cognita-postgres exited with code 1
Gracefully stopping... (press Ctrl+C again to force)
dependency failed to start: container cognita-postgres is unhealthy
PS C:\Users\AmitTiwari\PycharmProjects\cognita> 
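
From what I can tell, the FATAL line about invalid permissions points at how the Postgres data directory is mounted on Windows: an NTFS bind mount cannot hold the u=rwx (0700) mode that initdb checks for. Below is a rough sketch of the kind of change I think this implies; the service and volume names are illustrative, not copied from the repo's actual docker-compose.yaml.

services:
  cognita-postgres:
    image: postgres:16              # image tag assumed for illustration
    env_file: compose.env
    volumes:
      # Bind-mounting a Windows host folder as the data directory, e.g.
      #   - ./volumes/postgres-data:/var/lib/postgresql/data
      # cannot satisfy the 0700/0750 mode requirement from the FATAL message above.
      # A named Docker volume keeps the data inside the Linux VM instead:
      - postgres-data:/var/lib/postgresql/data

volumes:
  postgres-data: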

Here is the configuration in models_config.yaml that I have been trying to deploy with:

model_providers:
  ############################ Local ############################################
  #   Uncomment this provider if you want to use local models providers         #
  #   using ollama and infinity model server                                    #
  ###############################################################################

#  - provider_name: local-ollama
#    api_format: openai
#    base_url: http://ollama-server:11434/v1/
#    api_key_env_var: ""
#    llm_model_ids:
#      - "qwen2:1.5b"
#    embedding_model_ids: []
#    reranking_model_ids: []
#    default_headers: {}

#  - provider_name: local-infinity
#    api_format: openai
#    base_url: http://infinity-server:7997/
#    api_key_env_var: INFINITY_API_KEY
#    llm_model_ids: []
#    embedding_model_ids:
#      - "mixedbread-ai/mxbai-embed-large-v1"
#    reranking_model_ids:
#      - "mixedbread-ai/mxbai-rerank-xsmall-v1"
#    default_headers: {}

#  - provider_name: faster-whisper
#    api_format: openai
#    base_url: http://faster-whisper:8000
#    api_key_env_var: ""
#    llm_model_ids: []
#    embedding_model_ids: []
#    reranking_model_ids: []
#    audio_model_ids:
#      - "Systran/faster-distil-whisper-large-v3"
#    default_headers: {}
############################ OpenAI ###########################################
#   Uncomment this provider if you want to use OpenAI as a models provider    #
#   Remember to set `OPENAI_API_KEY` in container environment                 #
###############################################################################

 - provider_name: openai
   api_format: openai
   api_key_env_var: OPENAI_API_KEY
   llm_model_ids:
     - "gpt-3.5-turbo"
     - "gpt-4o"
   embedding_model_ids:
     - "text-embedding-3-small"
     - "text-embedding-ada-002"
   reranking_model_ids: []
   default_headers: {}

############################ TrueFoundry ###########################################
#   Uncomment this provider if you want to use TrueFoundry as a models provider    #
#   Remember to set `TFY_API_KEY` in container environment                         #
####################################################################################

# - provider_name: truefoundry
#   api_format: openai
#   base_url: https://llm-gateway.truefoundry.com/api/inference/openai
#   api_key_env_var: TFY_API_KEY
#   llm_model_ids:
#     - "openai-main/gpt-4o-mini"
#     - "openai-main/gpt-4-turbo"
#     - "openai-main/gpt-3-5-turbo"
#   embedding_model_ids:
#     - "openai-main/text-embedding-3-small"
#     - "openai-main/text-embedding-ada-002"
#   reranking_model_ids: []
#   default_headers: {}
