OpenAI-compatible Assistants API backed by Bee Agent Framework
> **Tip:** 🚀 The fastest way to set up Bee (UI + API) is through Bee Stack.
- Create `.env` (from `.env.example`) and fill in values.
- Run `pnpm install` to install dependencies.
- Start the server with `pnpm start:dev`
- Fastify as the web framework
- MikroORM backed by MongoDB as the database layer
- BullMQ backed by Redis as the job executor
- Bee Agent Framework as the agent execution engine
The Assistants API consists mostly of CRUDL endpoints for managing API resources such as assistants, threads, runs, and more. Furthermore, some resources are asynchronous in the sense that they contain a `status` that changes over time as the background execution progresses. Clients use polling or streaming to watch for status updates on such resources.
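For example, a client can poll a run until it reaches a terminal status. The sketch below is only an illustration: the base URL, the API key environment variable, and the terminal status values (which follow the OpenAI Assistants API that this server mirrors) are assumptions.

```ts
// Minimal polling sketch; base URL, API key variable, and status list are assumptions.
const BASE_URL = "http://localhost:4000/v1"; // assumed local address of the API server

async function waitForRun(threadId: string, runId: string) {
  for (;;) {
    const res = await fetch(`${BASE_URL}/threads/${threadId}/runs/${runId}`, {
      headers: { Authorization: `Bearer ${process.env.API_KEY}` },
    });
    const run = await res.json();

    // Stop polling once the background execution reaches a terminal status.
    if (["completed", "failed", "cancelled", "expired"].includes(run.status)) {
      return run;
    }
    await new Promise((resolve) => setTimeout(resolve, 1000)); // poll roughly once per second
  }
}
```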
The infrastructure consists of:
- REST API server
- MongoDB
- Redis
The REST API server stores resources in the MongoDB database. Redis is used by BullMQ, by the rate limiter, and as a pub/sub broker for event streaming. Agent execution is performed by the Bee Agent Framework using various adapters for inference and embeddings.
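As an illustration of the Redis roles, the sketch below (not the project's actual code; the queue and channel names are assumptions) shows a BullMQ queue backed by Redis alongside a pub/sub pair used to stream run events, using the `bullmq` and `ioredis` packages.

```ts
import { Queue } from "bullmq";
import Redis from "ioredis";

// Shared Redis connection settings (assumed local defaults).
const connection = { host: "localhost", port: 6379 };

// BullMQ queue for background execution (the queue name is illustrative).
const runsQueue = new Queue("runs", { connection });

// Pub/sub for streaming run events; publisher and subscriber need separate connections.
const pub = new Redis(connection);
const sub = new Redis(connection);

// A worker could publish status updates as a run progresses (channel name is an assumption).
async function publishRunEvent(runId: string, event: object) {
  await pub.publish(`run:${runId}`, JSON.stringify(event));
}

// The API server could subscribe and forward events to clients, e.g. over SSE.
async function streamRunEvents(runId: string) {
  await sub.subscribe(`run:${runId}`);
  sub.on("message", (_channel, message) => {
    console.log("run event", JSON.parse(message));
  });
}
```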
The codebase contains several types of modules:
- `*.modules.ts` containing endpoint handlers
- `*.services.ts` containing services for the handlers
- `dtos/*.ts` containing JSON schema definitions for resources
- `*.entity.ts` containing ORM definitions for database entities
- `*.queue.ts` containing BullMQ queues and workers for asynchronous execution
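To make these roles concrete, here is a condensed, hypothetical sketch of a dto, a service, and an endpoint handler for a single resource. The resource name, fields, and routes are illustrative assumptions, not the actual project code.

```ts
import type { FastifyInstance } from "fastify";

// dtos/*.ts — JSON schema definition for the resource (fields are illustrative)
export const assistantSchema = {
  type: "object",
  required: ["id", "model"],
  properties: {
    id: { type: "string" },
    model: { type: "string" },
    name: { type: "string" },
  },
} as const;

// *.service.ts — service used by the handler; the real one would load the entity via MikroORM
export async function readAssistant(id: string) {
  return { id, model: "placeholder-model", name: "placeholder" };
}

// *.module.ts — endpoint handler validated and serialized against the dto
export async function assistantsModule(app: FastifyInstance) {
  app.get(
    "/assistants/:id",
    { schema: { response: { 200: assistantSchema } } },
    async (request) => {
      const { id } = request.params as { id: string };
      return readAssistant(id);
    },
  );
}
```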
These modules are connected in the following manner:

```
module ---> dto
       ---> service ---> entity
                    ---> queue ---> entity
```
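A `*.queue.ts` module pairs a BullMQ queue with a worker that performs the background execution and updates the entity. The sketch below is illustrative only; the queue name, job payload, and processing steps are assumptions.

```ts
import { Queue, Worker } from "bullmq";

// Assumed local Redis connection.
const connection = { host: "localhost", port: 6379 };

// Queue holding pending run executions (name and payload shape are illustrative).
export const runsQueue = new Queue<{ runId: string }>("runs", { connection });

// Worker processing queued runs in the background.
export const runsWorker = new Worker<{ runId: string }>(
  "runs",
  async (job) => {
    const { runId } = job.data;
    // 1. Load the run entity from MongoDB and mark it as in progress.
    // 2. Execute the agent via the Bee Agent Framework.
    // 3. Persist the result and the final status on the entity.
    console.log(`executing run ${runId}`);
  },
  { connection },
);

// A service would enqueue a job when a run is created.
export async function scheduleRun(runId: string) {
  await runsQueue.add("execute", { runId });
}
```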
The OpenAPI schema is auto-generated from the `dtos` and exposed on the `/docs` endpoint.
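One common way to achieve this in Fastify is with the `@fastify/swagger` and `@fastify/swagger-ui` plugins. The sketch below shows the general idea; the plugin options and port are assumptions, not necessarily the project's exact configuration.

```ts
import Fastify from "fastify";
import swagger from "@fastify/swagger";
import swaggerUi from "@fastify/swagger-ui";

// Illustrative sketch: route schemas (the dtos) are collected into an OpenAPI
// document and served under /docs.
async function bootstrap() {
  const app = Fastify();

  await app.register(swagger, {
    openapi: { info: { title: "Assistants API", version: "1.0.0" } },
  });
  await app.register(swaggerUi, { routePrefix: "/docs" });

  // Any route registered with a `schema` option is included in the generated document.
  await app.listen({ port: 4000 });
}

bootstrap().catch(console.error);
```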
Run the infrastructure dependencies locally using Docker:

```sh
docker run -d -p 27017:27017 mongo:latest
docker run -d -p 6379:6379 redis:latest
```