macOS app for interacting with local LLMs, including Ollama. Focused on superprompting and accessing local data.
📜 A quest will be assigned to you by an LLM.
Simpler than simple: with Ollama, easily run an LLM on your computer, no GPU required.
LLMX: the easiest third-party local LLM UI for the web!
Describe images using LLaVA.
A fun project using Ollama, Streamlit & PyShark to chat with PCAP/PCAPNG files locally and privately!
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
A client library that makes it easy to connect microcontrollers running MicroPython to the Ollama server
Frontend for the Ollama LLM, built with React.js and Flux architecture.
Elia+ 👉 An experimental, snappy, and keyboard-centric UI for interacting with AI agents and augmenting humans with AI! Chat about anything with any agent. ⚡
Ollama client for Swift
🤖 Discord bot for users to create and interact with locally hosted AI chat models. Powered by Ollama.
This repo demonstrates AI capabilities with Spring Boot.
Go package and example utilities for using Ollama / LLMs
Chat with local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]
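Most of the clients listed above wrap Ollama's local HTTP API. As a minimal sketch of what such a client does, the following sends a single non-streaming completion request to Ollama's `/api/generate` endpoint (assuming a server started with `ollama serve` on the default port 11434, and a model name such as `llama3` already pulled):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks the server for one complete JSON response
    instead of a stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """POST a prompt to a local Ollama server and return the completion text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # A non-streaming reply carries the full completion in "response".
        return json.loads(resp.read())["response"]
```

Usage: `generate("llama3", "Why is the sky blue?")` returns the model's answer as a string; it raises `URLError` if no Ollama server is running locally.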