Open WebUI Notes
NOTE: This guide assumes that you have already installed Ollama. If not, please refer to the Ollama Notes guide.
In a new terminal, run the following (see the GPU alternative below if this doesn't work):
docker run -it --name [INSERT CONTAINER NAME] -p 8080:8080 southerncrossai/modelworks:modelworks
or, alternatively, to run the container with GPU support:
docker run -it --gpus all --runtime=nvidia --name [INSERT CONTAINER NAME] -p 8080:8080 southerncrossai/modelworks:modelworks /bin/bash
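If you are unsure whether the GPU is actually visible inside the container, one quick sanity check (assuming the NVIDIA drivers and the NVIDIA Container Toolkit are installed on the host) is:

nvidia-smi

This should list the available GPU(s); if it fails, fall back to the CPU-only command above.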
Use cd to move to the directory you want to work in, then run:
python3 -m venv ./app/venv
source ./app/venv/bin/activate
pip install open-webui
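To confirm the package landed in the virtual environment, you can optionally check it with pip (purely a sanity check, safe to skip):

pip show open-webui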
Once the installation finishes, run:
open-webui serve
Then open a new terminal and attach to your container again:
docker exec -it "[INSERT CONTAINER NAME]" bash
Inside the container, run the following:
source ./app/venv/bin/activate
ollama serve
After this, you can open a browser and go to http://localhost:8080
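If the page does not load, you can check from inside the container that the server is actually listening on port 8080 (this assumes curl is available in the image; install it first if it isn't):

curl -I http://localhost:8080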
If this is your first time, you will need to create an admin account to access the rest of Open WebUI.

You can do so by entering your email, name, and any password you'd like.

Otherwise, just log in with your pre-existing credentials.

After this, you should be greeted by the Open WebUI landing page.

If the appropriate instances are running in the background, you can access your models from the model selector near the top-left corner of the chat window.

- Again, this guide assumes you have installed Ollama in your container. If you haven't done so already, pull the models you'd like to use (see the example below).
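For example, assuming Ollama is already running inside the container, a model can be pulled by name (llama3 here is only a placeholder; substitute whichever model you actually want):

ollama pull llama3   # placeholder model name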
Another important tab to visit is the Workspace tab, located near the top-left corner of the screen.

From there you can locate your models, knowledge, prompts, and tools, which can help with customising and improving your UI.

When a model sends a response, Open WebUI gives the user the option to rate it.
These ratings can be used for features that assess confidence in, or satisfaction with, the model's responses.