A tiny browser‑based chat UI packaged as a single static HTML file. It is intended for lightweight, one-off usage.
- Install dependencies: `bun install`
- Dev server: `bun dev`
- Build: `bun build-dist`
- Click the Settings icon in the header.
- Enter your OpenAI‑compatible base URL (e.g. `https://api.openai.com/v1`) and API key, then click "Save".
- Click "Sync from API" to fetch the model list (a fetch sketch follows this list).
- Enter a prompt and press Enter to send (Shift+Enter for newline).
- Hover over a message to reveal actions: copy, edit, delete (removes the node and reconnects its children to the parent), or split (detaches the node from its parent to start a new thread); a sketch of these tree edits follows this list.
- Drop plaintext files into the message area to append their contents to the input (see the drop-handler sketch below).
- Import or export the full conversation tree via the header buttons; exported JSON captures every branch, not just the active chat path.
- Snapshots use the current tree-based format; older graph-format exports are not supported.
- The API key is stored locally in the browser (IndexedDB) and requests are sent directly from the client. Treat the key as accessible to any JavaScript code running in the page; a minimal storage sketch follows.
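
The "Sync from API" step maps to a single `GET {baseUrl}/models` call against the configured endpoint. A minimal sketch, assuming the base URL already includes the `/v1` prefix; `baseUrl` and `apiKey` stand in for the values saved in Settings and the actual client code may differ:

```ts
// Fetch the model list from an OpenAI-compatible endpoint.
// `baseUrl` and `apiKey` are placeholders for the values saved in Settings.
async function fetchModels(baseUrl: string, apiKey: string): Promise<string[]> {
  const res = await fetch(`${baseUrl}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`Model sync failed: ${res.status}`);
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}
```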
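The delete and split actions described above are plain tree edits. The sketch below illustrates the idea; the node shape (`id`, `parentId`, `role`, `content`) and map-based store are assumptions for illustration, not the app's actual data model:

```ts
// Illustrative message-tree node; field names are assumptions, not the app's schema.
interface MessageNode {
  id: string;
  parentId: string | null; // null for a thread root
  role: "user" | "assistant";
  content: string;
}

type Tree = Map<string, MessageNode>;

// Delete: remove the node and reconnect its children to its parent.
function deleteNode(tree: Tree, id: string): void {
  const node = tree.get(id);
  if (!node) return;
  for (const child of tree.values()) {
    if (child.parentId === id) child.parentId = node.parentId;
  }
  tree.delete(id);
}

// Split: detach the node from its parent so it starts a new thread.
function splitNode(tree: Tree, id: string): void {
  const node = tree.get(id);
  if (node) node.parentId = null;
}
```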
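File drop can be handled with the standard drag-and-drop API. A minimal sketch, assuming dropped text is simply appended to the prompt input; the `promptInput` selector is hypothetical and the app's React handler may look different:

```ts
// Append the text of dropped files to the prompt input.
// `promptInput` is a hypothetical reference to the app's textarea.
const promptInput = document.querySelector<HTMLTextAreaElement>("textarea")!;

promptInput.addEventListener("dragover", (e) => e.preventDefault());
promptInput.addEventListener("drop", async (e) => {
  e.preventDefault();
  for (const file of Array.from(e.dataTransfer?.files ?? [])) {
    const text = await file.text(); // plaintext files only
    promptInput.value += (promptInput.value ? "\n" : "") + text;
  }
});
```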
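Persisting the key needs only browser APIs. A bare-bones sketch of the idea; the database and store names are made up, and the app may use a wrapper library instead of raw IndexedDB:

```ts
// Persist the API key in IndexedDB; names are illustrative, not the app's actual schema.
function saveApiKey(key: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const open = indexedDB.open("chat-settings", 1);
    open.onupgradeneeded = () => open.result.createObjectStore("kv");
    open.onerror = () => reject(open.error);
    open.onsuccess = () => {
      const tx = open.result.transaction("kv", "readwrite");
      tx.objectStore("kv").put(key, "apiKey");
      tx.oncomplete = () => resolve();
      tx.onerror = () => reject(tx.error);
    };
  });
}
```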
- React 18, Mantine UI, Tailwind CSS
- Vercel AI SDK
- Rsbuild (Rspack)
- Credits: lmg-anon/mikupad, an LLM frontend in a single HTML file
Licensed under the MIT License.