👨💻 Developed by Matheus Ramalho de Oliveira
🏗️ Brazilian Software Engineer
✉️ [email protected]
🦫 LinkedIn • Instagram
This application is a frontend for Ollama, a tool that makes it easy to run large language models (LLMs), such as Meta's Llama 3, on your own machine.
- Craco
- Flux Architecture
- React.js
- React Hooks Global State
- React Router
- React Transition Group
- Styled Components
- TypeScript
- You need to have the Ollama server installed on your machine, or configure the app to use an external URL;
- Clone this repository;
- Open the project folder in a terminal;
- Run `yarn` to install the dependencies;
- Run `yarn start` to launch the app at `http://localhost:3000`.
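The setup steps above can be run as a single sequence; the repository URL is an assumption inferred from the GitHub Pages address shown below:

```shell
# Clone the project (URL assumed from the kastorcode.github.io page)
git clone https://github.com/kastorcode/ollama-gui-reactjs.git
cd ollama-gui-reactjs

# Install dependencies and start the development server
yarn
yarn start   # serves the app at http://localhost:3000
```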
- By default, the app uses the llama3 model; you can install it with the command `ollama run llama3`;
- If you are running the server locally, start it with CORS released for GitHub Pages: `export OLLAMA_ORIGINS=https://*.github.io && ollama serve`;
- Access it at: kastorcode.github.io/ollama-gui-reactjs.
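Under the hood, a frontend like this talks to Ollama's REST API, which by default listens on `http://localhost:11434`. A minimal sketch of such a request, assuming the `/api/generate` endpoint and the llama3 model (the helper names here are illustrative, not taken from this app's source):

```typescript
// Shape of the JSON body sent to Ollama's /api/generate endpoint.
type GenerateRequest = { model: string; prompt: string; stream: boolean };

// Build the request body; stream: false asks for a single JSON response
// instead of a stream of partial chunks.
function buildGenerateRequest(prompt: string, model = "llama3"): GenerateRequest {
  return { model, prompt, stream: false };
}

// Send a prompt to a local Ollama server and return the generated text.
// Ollama places the completion in the `response` field of the JSON reply.
async function generate(
  prompt: string,
  baseUrl = "http://localhost:11434"
): Promise<string> {
  const res = await fetch(`${baseUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(prompt)),
  });
  const data = await res.json();
  return data.response;
}
```

Pointing `baseUrl` at an external server is how the "external URL" configuration mentioned above would work.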
<kastor.code/>