Set up an Ollama stack on macOS.

Prerequisites:

- A Metal-capable Mac.
- Mods: AI for the command line, built for pipelines.
- Docker: the fastest way to containerize applications.
```mermaid
graph LR;
    subgraph Host
        subgraph CLI
            B(Mods)
            G(Collector)
        end
        subgraph Server
            C(Ollama)
            D[Metal]
        end
    end
    subgraph Container
        E(LiteLLM Proxy)
        F(Ollama Web UI)
        H(Prometheus)
        I(Grafana)
    end
    A(User) --> |Terminal|B;
    A --> |Browser|F;
    B --> |OpenAI API|E;
    E --> |REST API|C;
    F --> |REST API|C;
    C -. Link .-> D;
    H --> |Client API|G;
    I --> |Data Source|H;
```
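In the diagram, Prometheus (in the container stack) pulls metrics from the Collector running on the host, and Grafana reads from Prometheus as a data source. A minimal Prometheus scrape configuration for that arrow might look like the sketch below; the job name and port are assumptions for illustration, and the actual values live in the repository's own Prometheus config:

```yaml
scrape_configs:
  - job_name: "collector"        # hypothetical job name
    scrape_interval: 15s
    static_configs:
      # host.docker.internal lets the Prometheus container reach a
      # process on the macOS host; the port here is an assumption.
      - targets: ["host.docker.internal:8888"]
```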
```sh
# Clone the repository and bring the stack up
$ git clone https://github.com/yeahdongcn/OllamaStack.git
$ cd OllamaStack
$ ./start.sh

# Tear the stack down when finished
$ ./stop.sh
```
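Once the stack is running, Mods talks to the LiteLLM proxy over the OpenAI-compatible API, as shown in the diagram. The sketch below builds such a request by hand so you can see its shape; the proxy port (`4000`) and the model name (`ollama/llama2`) are assumptions for illustration — check the repository's compose file for the real values:

```python
import json
import urllib.request

def build_chat_request(prompt, model="ollama/llama2",
                       base_url="http://localhost:4000"):
    """Build (but do not send) an OpenAI-style chat completion request
    aimed at the LiteLLM proxy's /v1/chat/completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Why is the sky blue?")
print(req.full_url)  # http://localhost:4000/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` (or pointing any OpenAI client library at the same base URL) would return a standard chat-completion response served by Ollama behind the proxy.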