
Ollama LLM Graphical User Interface

👨‍💻 Developed by Matheus Ramalho de Oliveira
🏗️ Brazilian Software Engineer
✉️ [email protected]
🦫 LinkedIn · Instagram


This application is a frontend for Ollama, a tool that makes it easy to run large language models (LLMs), such as Meta's Llama family, locally on your machine.
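As a sketch of how a frontend like this one talks to Ollama, the snippet below builds and sends a request to Ollama's REST endpoint `POST /api/generate` on the default port 11434. The helper names are hypothetical illustrations, not the app's actual code; `llama3` mirrors the default model mentioned later in this README.

```typescript
// Hypothetical helpers illustrating a call to Ollama's REST API
// (POST /api/generate, default port 11434); not this app's actual code.
const OLLAMA_URL = 'http://localhost:11434'

interface GenerateRequest {
  model: string
  prompt: string
  stream: boolean
}

// Build the JSON body Ollama expects; llama3 is the app's default model.
function buildGenerateRequest (prompt: string, model = 'llama3'): GenerateRequest {
  return { model, prompt, stream: false }
}

// Send a prompt and return the model's full (non-streamed) answer.
async function generate (prompt: string): Promise<string> {
  const response = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildGenerateRequest(prompt))
  })
  const data = await response.json()
  return data.response
}
```

With `stream: false`, Ollama returns a single JSON object whose `response` field holds the complete answer; setting `stream: true` instead yields a stream of partial JSON chunks, which is what a chat UI would typically consume.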


Screenshots


Technologies

Craco
Flux Architecture
React.js
React Hooks Global State
React Router
React Transition Group
Styled Components
TypeScript


Installation and execution

  1. You need to have the Ollama server installed on your machine, or configure the app to use an external URL;
  2. Clone this repository;
  3. Open the project folder in a terminal;
  4. Run `yarn` to install the dependencies;
  5. Run `yarn start` to launch the app at http://localhost:3000.
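The steps above can be run as follows (the clone URL is inferred from the repository name; assumes `git` and `yarn` are already installed):

```shell
# Clone the repository and enter the project folder
git clone https://github.com/kastorcode/ollama-gui-reactjs.git
cd ollama-gui-reactjs

# Install dependencies, then start the dev server on http://localhost:3000
yarn
yarn start
```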

Running from GitHub Pages

  1. You need to have the Ollama server installed on your machine, or configure the app to use an external URL;
  2. By default, the app uses the llama3 model; you can install it with the command `ollama run llama3`;
  3. If you are running the server locally, start it with CORS allowed for GitHub Pages: `export OLLAMA_ORIGINS=https://*.github.io && ollama serve`;
  4. Access the app at kastorcode.github.io/ollama-gui-reactjs.
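Putting steps 2 and 3 together, a local session backing the GitHub Pages frontend might look like this (assumes Ollama is installed; the origin pattern is quoted only to keep the shell from expanding the `*`):

```shell
# Install and test the default model used by the app (step 2)
ollama run llama3

# Allow the GitHub Pages origin to call the local server (CORS),
# then start the Ollama server (step 3)
export OLLAMA_ORIGINS="https://*.github.io"
ollama serve
```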

<kastor.code/>