This document provides details about the endpoints available in the BBai API.
All endpoints are relative to: `https://<hostname>:<port>/api/v1`
- `GET /status`
  - Check the status of the API.
  - Response: `{ "status": "OK", "message": "API is running" }`
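The status endpoint can serve as a simple health probe. A minimal sketch of parsing the documented response (the hostname and port are deployment-specific placeholders, so the HTTP call itself is shown only as a comment):

```python
import json

# A live request would be:
#   GET https://<hostname>:<port>/api/v1/status
# Below we parse the response body exactly as documented.
raw = '{ "status": "OK", "message": "API is running" }'

payload = json.loads(raw)
assert payload["status"] == "OK"  # healthy API
print(payload["message"])
```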
- `GET /conversation`
  - Retrieve a list of conversations with pagination and filtering options.
  - Query Parameters:
    - `page` (integer, default: 1): Page number for pagination
    - `pageSize` (integer, default: 10): Number of items per page
    - `startDate` (string, format: date): Filter conversations starting from this date
    - `endDate` (string, format: date): Filter conversations up to this date
    - `llmProviderName` (string): Filter conversations by LLM provider name
    - `startDir` (string, required): The starting directory for the project
  - Response: List of conversations with pagination details
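A sketch of building the query string for a filtered listing request. The parameter values here are hypothetical; `startDir` is the only required parameter, and the rest are optional filters:

```python
from urllib.parse import urlencode

# Hypothetical filter values for illustration only.
params = {
    "page": 2,
    "pageSize": 25,
    "startDate": "2024-01-01",
    "llmProviderName": "anthropic",
    "startDir": "/home/user/project",
}

# urlencode percent-escapes the path in startDir automatically.
url = "/api/v1/conversation?" + urlencode(params)
print(url)
```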
- `GET /conversation/:id`
  - Retrieve details of a specific conversation.
  - Query Parameters:
    - `startDir` (string, required): The starting directory for the project
  - Response: Conversation details including messages, LLM provider, and token usage
- `POST /conversation/:id`
  - Continue an existing conversation.
  - Request Body: `{ "statement": "string", "startDir": "string" }`
  - Response: LLM-generated response with conversation details
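A sketch of constructing the request body for continuing a conversation. The statement text and project path are hypothetical; the body shape matches the documented request:

```python
import json

# Hypothetical values; only the two documented fields are sent.
body = {
    "statement": "Summarize the changes in src/main.ts",
    "startDir": "/home/user/project",
}

payload = json.dumps(body)
# Send as: POST /api/v1/conversation/<id>
# with header: Content-Type: application/json
print(payload)
```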
- `DELETE /conversation/:id`
  - Delete a specific conversation.
  - Query Parameters:
    - `startDir` (string, required): The starting directory for the project
  - Response: Deletion confirmation message
- `POST /conversation/:id/clear`
  - Clear the history of a specific conversation.
  - Query Parameters:
    - `startDir` (string, required): The starting directory for the project
  - Response: Confirmation message
- `GET /ws/conversation/:id`
  - Establish a WebSocket connection for real-time conversation updates.
  - The client can send messages in the following format:
    `{ "task": "greeting" | "converse" | "cancel", "statement": "string", "startDir": "string" }`
  - The server emits events for conversation updates, including:
    - `conversationReady`
    - `conversationContinue`
    - `conversationAnswer`
    - `conversationError`
    - `conversationCancelled`
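A sketch of the client side of this protocol: serializing an outgoing task message and dispatching on the documented event names. The `statement` text and `startDir` path are hypothetical, and the assumption that each server event carries its name in a `type` field is illustrative, not confirmed by this document:

```python
import json

# Client -> server message; "task" must be one of
# "greeting", "converse", or "cancel".
message = json.dumps({
    "task": "converse",
    "statement": "What does this function do?",   # hypothetical
    "startDir": "/home/user/project",             # hypothetical
})

def handle_event(event: dict) -> str:
    """Dispatch on the documented event names.
    Assumes (for illustration) the event name arrives in a "type" field."""
    handlers = {
        "conversationReady": "connection established",
        "conversationContinue": "partial response received",
        "conversationAnswer": "final answer received",
        "conversationError": "error occurred",
        "conversationCancelled": "task cancelled",
    }
    return handlers.get(event.get("type"), "unknown event")

print(handle_event({"type": "conversationAnswer"}))
```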
The BBai API supports various LLM tools that can be used within conversations. Here are the available tools:

- `move_files`: Moves one or more files or directories to a new location within the project.
  - Parameters:
    - `sources`: Array of strings representing the paths of files or directories to be moved.
    - `destination`: String representing the path of the destination directory.
    - `overwrite` (optional): Boolean indicating whether to overwrite existing files at the destination (default: `false`).
Example usage in a conversation:

```json
{
  "toolName": "move_files",
  "toolInput": {
    "sources": ["path/to/file1.txt", "path/to/directory"],
    "destination": "path/to/new/location",
    "overwrite": true
  }
}
```
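A sketch of client-side validation against the documented `move_files` parameters. This checker is illustrative only and not part of the API itself:

```python
def validate_move_files(tool_input: dict) -> list[str]:
    """Check a move_files toolInput against the documented parameter types.
    Returns a list of error messages; empty means the input is well-formed."""
    errors = []
    sources = tool_input.get("sources")
    if not isinstance(sources, list) or not all(isinstance(s, str) for s in sources):
        errors.append("sources must be an array of strings")
    if not isinstance(tool_input.get("destination"), str):
        errors.append("destination must be a string")
    # overwrite is optional and defaults to false.
    if not isinstance(tool_input.get("overwrite", False), bool):
        errors.append("overwrite must be a boolean")
    return errors

print(validate_move_files({
    "sources": ["path/to/file1.txt"],
    "destination": "path/to/new/location",
}))
```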
The following features are mentioned in the codebase but are not fully implemented or exposed through the API:
- Adding files to a conversation
- Removing files from a conversation
- Listing files in a conversation
- Retrieving token usage
- Running CLI commands
- Loading external content
- Retrieving conversation logs
- Undoing the last change in a conversation
These features may be implemented in future versions of the API.
All endpoints return standard HTTP status codes for error conditions. Common error responses include:
- 400 Bad Request: For invalid input or missing required parameters
- 404 Not Found: When a requested resource (e.g., conversation) is not found
- 500 Internal Server Error: For unexpected server-side errors
Detailed error messages will be provided in the response body when applicable.
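A small sketch of mapping the documented status codes to client-side messages; the descriptions simply restate the list above:

```python
# Documented error status codes and their meanings.
ERROR_MEANINGS = {
    400: "Bad Request: invalid input or missing required parameters",
    404: "Not Found: the requested resource (e.g., conversation) does not exist",
    500: "Internal Server Error: unexpected server-side error",
}

def describe_error(status_code: int) -> str:
    return ERROR_MEANINGS.get(status_code, f"Unexpected status {status_code}")

print(describe_error(404))
```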
The current implementation does not include authentication. It is designed for local use only. Ensure proper security measures are in place when deploying this API in a production environment.
This documentation is for API version 1 (`v1`). Future versions may introduce changes to the endpoint structure or functionality.