
Intelligent programs powered by local AI. Autonomous software that builds, runs, and improves itself.

Gods from the Machine

Inspired by the Gods from the Machine raid in Star Wars: The Old Republic — a pantheon of ancient, sentient super-weapons worshiped as deities on the mechanical Dyson sphere planet Iokath.

Vision

We build autonomous, AI-powered software — "gods in the machine" — that operates entirely on local, private, offline AI models. No cloud dependencies. No API keys to external services. Just raw local inference powering intelligent programs.

Every project is designed to run on consumer hardware using small, capable models (Qwen 3.5 family: 0.8B, 2B, 4B, 9B) served via llama.cpp.
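Concretely, "served via llama.cpp" means talking to llama-server's HTTP API. A minimal sketch in Go of building a request for its POST /completion endpoint — the model filename and port below are placeholders, not values taken from any of the projects:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// completionRequest mirrors the JSON body accepted by llama-server's
// POST /completion endpoint (field names follow the llama.cpp server API).
type completionRequest struct {
	Prompt      string  `json:"prompt"`
	NPredict    int     `json:"n_predict"`
	Temperature float64 `json:"temperature"`
}

// buildRequest serializes a completion request for a local llama-server.
func buildRequest(prompt string, nPredict int, temp float64) string {
	body, err := json.Marshal(completionRequest{prompt, nPredict, temp})
	if err != nil {
		panic(err)
	}
	return string(body)
}

func main() {
	// Hypothetical local server, e.g. started with:
	//   llama-server -m qwen-2b-q4_k_m.gguf --port 8080
	payload := buildRequest("Write a haiku about local inference.", 64, 0.7)
	fmt.Println(payload)
	// POST payload to http://localhost:8080/completion to get a completion.
}
```

Because the server speaks plain HTTP and JSON, any of the gods — in Go, Rust, or Zig — can drive the same local model without language-specific bindings.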

Projects

  • gilgamesh (Go): TDD-driven local AI coding agent (v0.6). CLI + MCP + HTTP API. 7 tools, 7 built-in skills, streaming markdown rendering, graceful Ctrl+C, error classification, config validation, env var overrides, memory persistence, conversation history, 243 tests. Includes a Go benchmark suite for model trialing.
  • zeus (Zig): Zig-as-a-build-system for llama.cpp-powered local GGUF inference. (incomplete)
  • raijin (Rust): ONNX CPU inference engine for DeepSeek-R1-Distill-Qwen-1.5B. (incomplete)
  • garuda (Zig): Directory tree viewer modeled on PowerShell's tree command. (incomplete)
  • godsfromthemachine.github.io (Hugo): Official project website — architecture, roadmap, and documentation.

Principles

  • Local-first: All intelligence comes from free, local, private AI models
  • Blazing fast: Built in Go, Rust, Zig, and other high-performance languages
  • Lean: Minimal token overhead for CPU inference — first response in seconds, not minutes
  • Open source: MIT/GPL licensed — free to use, learn from, and contribute to
  • CLI + MCP + API: Every god has CLI, MCP, and HTTP/API interfaces
  • Mythological: Projects named after gods and legends — each one a specialized autonomous entity

The CLI / MCP / API Duality

Every god exposes the same capabilities through three interfaces — because every MCP tool functions just as well as a CLI command given to an agent via a shell, and just as well as an HTTP endpoint called by another program. We build all three so gods can interface with human users, other agents, external programs, and each other.

(Diagram: CLI / MCP / API architecture)
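A sketch of this duality in Go, using a toy capability (none of these names come from the actual projects): one function backs both a CLI flag and an HTTP endpoint, and an MCP tool handler would delegate to the same function.

```go
package main

import (
	"flag"
	"fmt"
	"net/http"
)

// shout is the single capability; each interface below is a thin
// wrapper around it. An MCP tool handler would call it the same way.
func shout(msg string) string {
	return "[shout] " + msg
}

// runCLI exposes the capability as a command-line flag:
//
//	./god -msg hello
func runCLI() {
	msg := flag.String("msg", "hello", "message to shout")
	flag.Parse()
	fmt.Println(shout(*msg))
}

// runHTTP exposes the same capability as an HTTP endpoint:
//
//	GET http://localhost:8080/shout?msg=hello
func runHTTP() {
	http.HandleFunc("/shout", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, shout(r.URL.Query().Get("msg")))
	})
	http.ListenAndServe(":8080", nil)
}

func main() {
	runCLI() // swap for runHTTP() to serve the API instead
}
```

Keeping every interface as a thin wrapper means the capability is written and tested once, and a shell-wielding agent, a human at a terminal, and a remote program all get identical behavior.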

Research

We run controlled model trials on consumer hardware — no GPU, just CPU inference. Our benchmark suite tests models across 6 stages from raw inference to full agent edit tasks.

Key findings:

  • Qwen3.5 2B Q4_K_M: speed sweet spot (19 tok/s TG on CPU)
  • Qwen3.5 4B Q4_K_M: quality ceiling (2 tool calls vs 9 for same task)
  • 12 threads optimal on a 16-core EPYC (16 threads degrade TG by 30%)
  • KV cache q4_0 quantization saves 5-7% RAM with no quality loss
  • Token budget under 1,600 tokens — competitors use 10,000-40,000

See the full research page and trial methodology.

Get involved

All projects are open source. Browse the repos, open issues, submit PRs. We're building the future of autonomous local AI software.

godsfromthemachine.github.io — Visit the website for architecture docs, roadmap, and more.
