
Failed to connect to the server. Please try again later. #572

Open
yipy0005 opened this issue Jan 25, 2025 · 26 comments

Comments

@yipy0005

Hi, I have followed the instructions and here's my config.toml:

[GENERAL]
PORT = 3001 # Port to run the server on
SIMILARITY_MEASURE = "cosine" # "cosine" or "dot"
KEEP_ALIVE = "5m" # How long to keep Ollama models loaded into memory. (Instead of using -1 use "-1m")

[API_ENDPOINTS]
SEARXNG = "http://localhost:32768" # SearxNG API URL
OLLAMA = "http://host.docker.internal:11434" # Ollama API URL - http://host.docker.internal:11434

When I run docker compose up -d, I am able to access the frontend at http://localhost:3000. However, the following error appeared: Failed to connect to the server. Please try again later.

I'd appreciate some advice on this. :)

Best,
YM
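Before digging further, a quick connectivity check can show which piece is unreachable. This is a hedged sketch: the URLs and ports are taken from the config above (SearxNG on 32768, Ollama on 11434, backend on 3001); adjust them if yours differ.

```shell
# Probe each service from the host. "HTTP 000" means the connection
# itself failed (refused/unreachable), i.e. that service is the problem.
for url in http://localhost:32768 http://localhost:11434 http://localhost:3001; do
  code=$(curl -s -o /dev/null -w "%{http_code}" --max-time 2 "$url")
  echo "$url -> HTTP $code"
done
```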

@yuval-kahan

Check this:
*Note: before anything else, check that ports 3000 and 3001 are not taken by another process. If they are, either close those processes or change Perplexica's ports. If you choose to change Perplexica's ports, do it in these files:
.env
docker-compose.yaml
and config.toml, if needed

### This is what I did to solve the same problem (after I freed ports 3000 and 3001):

  1. Run Ollama only in Docker.
  2. Check that Ollama is running and has models. You can run "ollama list" in the container's Exec tab, or use "docker exec -it <container id> /bin/sh" (the container id is shown under ollama in the screenshot; I hid it with the red mark).
  3. Check config.toml; here is mine:

    [API_KEYS]
    OPENAI = ""
    GROQ = ""
    ANTHROPIC = ""
    GEMINI = ""

    [API_ENDPOINTS]
    OLLAMA = "http://host.docker.internal:11434" # make sure Ollama runs in Docker on this port
    SEARXNG = "http://localhost:32768"

    [GENERAL]
    PORT = 3001
    SIMILARITY_MEASURE = "cosine"
    KEEP_ALIVE = "5m"

  4. Go to the ui folder and rename the env file so it is just .env (see the screenshot).

Image

Image

It should work.
Let me know, I'm here to help and I respond fast, brother!
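Step 2 above can be sketched as a couple of shell commands. This is only a sketch: the container name `ollama` is an assumption, so check yours with `docker ps` first.

```shell
# Confirm the Ollama container is up, then confirm it actually has models.
docker ps --filter "name=ollama" --format "{{.Names}}: {{.Status}}"
docker exec ollama ollama list               # should list at least one model
curl -s http://localhost:11434/api/tags      # same check via Ollama's HTTP API
```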

@yuval-kahan

yuval-kahan commented Jan 25, 2025

Also, in Perplexica make sure of this:
1. Go to Settings and check:
OLLAMA API URL: http://host.docker.internal:11434
Chat model: <your Ollama model>
Chat model provider: OLLAMA

Image

@hieutrn1205

Hello @yuval-kahan, I did install Ollama in Docker; it is running, but its model list is empty. And Perplexica is still failing to connect to the server.

Image

@yuval-kahan

Looks like a backend problem.
Check whether something is running on port 3001:

Linux / macOS:
sudo netstat -tuln | grep :3001
Windows:
netstat -ano | findstr :3001

If something is running on port 3001, I recommend closing it or changing Perplexica's backend port.
Also, can you please share these files:
* config.toml
* .env
and your Perplexica settings? The problem is with the backend.
It would also be best to see the logs: click the name of the compose stack (the first item in the screenshot, Perplexica only), then click the backend container (it will be highlighted in orange on the side). Press run; it will crash and show the crash logs on the other side.

@yuval-kahan

I'm pretty sure you have something running on port 3001, and you need to close it to free the port.

@yuval-kahan

By the way, where is the frontend container? I only see the backend in the screenshot.

@hieutrn1205

I was trying to rename sample.config.toml, but it wouldn't let me rename it.
Port 3001 is reserved only for Perplexica; the frontend container is below.

@yuval-kahan

You should have both a backend and a frontend:
Image

@hieutrn1205

These are the settings on the frontend:

Image

@yuval-kahan

Wait, but I don't see the frontend container in your Docker...

@hieutrn1205

hieutrn1205 commented Jan 26, 2025

It was cut off:
Image

@hieutrn1205

This is the log of the failed container:

Image

@yuval-kahan

...Those are not the error logs. You should see an error when you try to run it.
Click the Perplexica name (just the name; it will expand all the containers created by the compose file), then run the backend,
like this:

Image

Image

@yuval-kahan

Also make sure it accepts connections from all interfaces (0.0.0.0), but let's try the port thing first, as I'm pretty sure that is the problem.

@hieutrn1205

Thanks for the suggestion. But how do I set the config to accept connections from all interfaces?
This is what I ran to find whether anything is assigned to :3001:
Image

@yuval-kahan

Looks like you need to kill whatever is running on port 3001, and then the backend should work.
Try killing the process on 3001 and run Perplexica.
The 0.0.0.0 thing is probably not the problem; you shouldn't change localhost to 0.0.0.0.

@yuval-kahan

But double-check it.
Kill the process on port 3001, then check again: some software, such as AnyDesk, runs on port 3001, and after you close it, something may bind to 3001 again, so always double-check after killing a process on a port.
So:
kill the process on port 3001
run netstat on port 3001 again
kill again if something else is still running on 3001
run Perplexica
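The kill-and-recheck loop above can be sketched like this (Linux/macOS, assuming `lsof` is installed; 3001 is Perplexica's default backend port):

```shell
# Try up to three times: find the listener on the port, kill it, re-check.
PORT=3001
for attempt in 1 2 3; do
  PID=$(lsof -t -iTCP:"$PORT" -sTCP:LISTEN 2>/dev/null | head -n 1)
  if [ -z "$PID" ]; then
    echo "Port $PORT is free"
    break
  fi
  echo "Attempt $attempt: killing PID $PID on port $PORT"
  kill "$PID"
  sleep 1
done
```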

@chandujr

chandujr commented Feb 1, 2025

I'm also facing the same issue. I ran sudo netstat -tuln | grep :3001 and there are no processes running on it. I added the OLLAMA_HOST environment variable in the service file and restarted the ollama service and the Perplexica container. I'm still getting the screen "Failed to connect to the server. Please try again later." Why is that?

@chandujr

chandujr commented Feb 1, 2025

The issue seemed to be multi-fold. First, my port 3001 was already used by Open WebUI, so I had to change all the port numbers in the Perplexica configs. Second, I had to change all 127.0.0.1 entries to localhost because of CORS errors. Third, since I'm on Linux, the Ollama API URL should be http://<local ip address>:11434, not the default http://host.docker.internal:11434. Fourth, the Ollama service file needs OLLAMA_HOST set to 0.0.0.0.

Summary of changes

config.toml

PORT = 3002
.
.
[API_ENDPOINTS]
SEARXNG = "http://<local ip addr>:32768" # SearxNG API URL
OLLAMA = "http://<local ip addr>:11434" # Ollama API URL

docker-compose.yaml

perplexica-backend:
.
.
ports:
      - 3002:3002
.
.
perplexica-frontend:
.
.
args:
        - NEXT_PUBLIC_API_URL=http://localhost:3002/api
        - NEXT_PUBLIC_WS_URL=ws://localhost:3002
.
.
ports:
      - 3003:3000

ui/.env file

Create the file if it doesn't exist.

NEXT_PUBLIC_WS_URL=ws://localhost:3002
NEXT_PUBLIC_API_URL=http://localhost:3002/api

/etc/systemd/system/ollama.service

.
.
[Service]
Environment="OLLAMA_HOST=0.0.0.0"

Then run:

systemctl daemon-reload
systemctl restart ollama

After making all these changes:

docker compose down
docker compose build
docker compose up -d

Now I can access Perplexica in http://localhost:3003.
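As a follow-up check (a sketch, assuming the ports from the summary above and that the backend exposes an /api/models route), each piece can be probed with curl after the rebuild:

```shell
# "HTTP 000" means the connection was refused, i.e. that piece is still down.
curl -s -o /dev/null -w "backend  -> HTTP %{http_code}\n" --max-time 2 http://localhost:3002/api/models
curl -s -o /dev/null -w "frontend -> HTTP %{http_code}\n" --max-time 2 http://localhost:3003
curl -s --max-time 2 http://localhost:11434   # Ollama normally replies "Ollama is running"
```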

@morozoff-dev

I've done all of this, but nothing helped :(
Also (I have the same issue), I only see the error in the UI; there are no errors in the docker-compose logs for Perplexica.

@chandujr

chandujr commented Feb 5, 2025

> I've done all of this but nothing helped( Also (I have the same issue), I see error only in UI. In docker-compose logs of perplexica there are no errors

Are you using Linux? Try changing the port numbers used. Can you post the log that you see? If you are using a browser to access Perplexica, open the console and see if there are any errors there.

@qinjinghub2

I deployed on an Oracle server, opened the firewall and the VPS port rules, and performed the steps above, yet it still reports the same error.

@gngglobetech

Remove the host network line and use the direct URL instead:

version: "3.9" # Specify a version for best compatibility

services:
  searxng:
    image: docker.io/searxng/searxng:latest
    volumes:
      - ./searxng:/etc/searxng:rw
    ports:
      - 4000:8080
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-backend:
    build:
      context: .
      dockerfile: backend.dockerfile
    image: itzcrazykns1337/perplexica-backend:main
    environment:
      - SEARXNG_API_URL=http://searxng:8080 # Keep this as searxng, Docker will resolve it
    depends_on:
      - searxng
    ports:
      - 3001:3001
    volumes:
      - backend-dbstore:/home/perplexica/data
      - uploads:/home/perplexica/uploads
      - ./config.toml:/home/perplexica/config.toml
    # Remove extra_hosts - not needed when using the IP directly in the frontend
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-frontend:
    build:
      context: .
      dockerfile: app.dockerfile
      args:
        - NEXT_PUBLIC_API_URL=http://192.168.1.69:3001/api # Use the remote IP
        - NEXT_PUBLIC_WS_URL=ws://192.168.1.69:3001 # Use the remote IP
    image: itzcrazykns1337/perplexica-frontend:main
    depends_on:
      - perplexica-backend
    ports:
      - 3000:3000
    networks:
      - perplexica-network
    restart: unless-stopped

networks:
  perplexica-network:

volumes:
  backend-dbstore:
  uploads:

@gngglobetech

Image

@gngglobetech

Image

@trinilopez99

In my case it was the ufw firewall. I had to allow access to port 11434 from the Docker IP 172.x.x.x, which I got using docker inspect perplexica-backend.
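That fix can be sketched as two commands (the container name `perplexica-backend` follows the compose file above; run with care, since it changes firewall rules):

```shell
# Look up the backend container's bridge IP, then allow it through ufw
# to Ollama's port on the host.
BACKEND_IP=$(docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' perplexica-backend)
echo "backend container IP: $BACKEND_IP"
sudo ufw allow from "$BACKEND_IP" to any port 11434 proto tcp
```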
