
OpenBB agent copilot #15

Open · wants to merge 8 commits into main
44 changes: 44 additions & 0 deletions openbb-agent-copilot/README.md
@@ -0,0 +1,44 @@
# OpenBB Agent Copilot

This example provides a copilot that leverages the OpenBB Platform to retrieve financial data.

## Overview

This implementation uses a FastAPI application as the backend for the copilot.

## Getting started

Here's how to get your copilot up and running:

### Prerequisites

Ensure you have Poetry, a tool for dependency management and packaging in
Python, installed, and have your OpenAI API key ready.

### Installation and Running

1. Clone this repository to your local machine.
2. Set the OpenAI API key as an environment variable in your .bashrc or .zshrc file:

``` sh
# in .zshrc or .bashrc
export OPENAI_API_KEY=<your-api-key>
export OPENBB_PAT=<your-openbb-pat>
```

The latter is your OpenBB Personal Access Token (PAT), which can be found at https://my.openbb.co/app/platform/pat.
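Alternatively, since `main.py` calls `load_dotenv(".env")` at startup, you can put these variables in a `.env` file in the project directory instead of your shell profile (a sketch; the variable names match those above):

``` sh
# .env (picked up by load_dotenv at startup)
OPENAI_API_KEY=<your-api-key>
OPENBB_PAT=<your-openbb-pat>
```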

3. Install the necessary dependencies:

``` sh
poetry install --no-root
```

4. Start the API server:

``` sh
poetry run uvicorn openbb_agent_copilot.main:app --port 7777 --reload
```

This command runs the FastAPI application, making it accessible on your network.
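With the server running, the `query` endpoint can be exercised from Python using only the standard library (a sketch; the payload shape follows the `AgentQueryRequest` model in `models.py`, and the example question is illustrative):

```python
import json
import urllib.request

# Payload shape mirrors AgentQueryRequest: a non-empty list of messages,
# each with a role ("human" or "ai") and string content.
payload = {
    "messages": [
        {"role": "human", "content": "What is the latest price of AAPL?"}
    ]
}


def query_copilot(url: str = "http://localhost:7777/v1/query") -> dict:
    """POST the payload to the running copilot and return its JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage (requires the server to be running):
#     print(query_copilot()["output"])
```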
Empty file.
13 changes: 13 additions & 0 deletions openbb-agent-copilot/openbb_agent_copilot/copilots.json
@@ -0,0 +1,13 @@
{
  "example_copilot": {
    "name": "OpenBB Agent Copilot",
    "description": "AI financial analyst using the OpenBB Platform.",
    "image": "https://github.com/user-attachments/assets/010d7590-0a65-4b3f-b21a-0cbc0d95bcb9",
    "hasStreaming": false,
    "hasDocuments": false,
    "hasFunctionCalling": false,
    "endpoints": {
      "query": "http://localhost:7777/v1/query"
    }
  }
}
82 changes: 82 additions & 0 deletions openbb-agent-copilot/openbb_agent_copilot/main.py
@@ -0,0 +1,82 @@
import json
import os
import re
from pathlib import Path
from typing import AsyncGenerator

from dotenv import load_dotenv
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import JSONResponse
from magentic import (
    AssistantMessage,
    AsyncStreamedStr,
    UserMessage,
)
from openbb_agents.agent import openbb_agent

from .models import AgentQueryRequest

load_dotenv(".env")
app = FastAPI()

origins = [
    "http://localhost",
    "http://localhost:1420",
    "http://localhost:5050",
    "https://pro.openbb.dev",
    "https://pro.openbb.co",
]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


def sanitize_message(message: str) -> str:
    """Sanitize a message by escaping forbidden characters."""
    cleaned_message = re.sub(r"(?<!\{)\{(?!{)", "{{", message)
    cleaned_message = re.sub(r"(?<!\})\}(?!})", "}}", cleaned_message)
    return cleaned_message


async def create_message_stream(
    content: AsyncStreamedStr,
) -> AsyncGenerator[dict, None]:
    async for chunk in content:
        yield {"event": "copilotMessageChunk", "data": {"delta": chunk}}


@app.get("/copilots.json")
def get_copilot_description():
    """Widgets configuration file for the OpenBB Terminal Pro."""
    config_path = Path(__file__).parent.resolve() / "copilots.json"
    return JSONResponse(content=json.loads(config_path.read_text()))


@app.post("/v1/query")
async def query(request: AgentQueryRequest):
    """Query the copilot."""
    chat_messages = []
    for message in request.messages:
        if message.role == "ai":
            chat_messages.append(
                AssistantMessage(content=sanitize_message(message.content))
            )
        elif message.role == "human":
            chat_messages.append(
                UserMessage(content=sanitize_message(message.content))
            )

    try:
        result = openbb_agent(
            str(chat_messages), verbose=False, openbb_pat=os.getenv("OPENBB_PAT")
        )
        return {"output": result}
Collaborator:

You'll still need to stream the result back (just as a single chunk) using the create_message_stream function. We now use named events for our SSEs for things to be parsed and rendered correctly on the front-end.

Contributor (author):

I actually tried this and had no success with it. Do we have more documentation on how to do this? In this case the output is literally just a string.

I know we have an async openbb_agent, but I looked at the repo and it's missing the PAT argument, which is really important as it grants access to all my data.

PS: I kept getting the error shown in the screenshot below. cc @mnicstruwig

[Screenshot 2024-08-20 at 12:57:51 AM]

Collaborator:

@DidierRLopes I think there might be something I've missed on my part. Let me investigate for you, and I'll give you an update 🙏 .

Collaborator:

@DidierRLopes Fixed the outputs using SSEs -- we just had to set hasStreaming=true in the copilot configuration (since Terminal Pro officially only supports streaming output now). We just stream back the entire answer in a single chunk.
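The single-chunk streaming described above might look roughly like this (a sketch; it assumes `sse-starlette` provides the SSE response type, and `single_chunk_stream` is a hypothetical stand-in for wiring `create_message_stream` to a plain-string result):

```python
import json
from typing import AsyncGenerator

# from sse_starlette.sse import EventSourceResponse  # assumed extra dependency


async def single_chunk_stream(answer: str) -> AsyncGenerator[dict, None]:
    # Named SSE event matching create_message_stream's event shape; the whole
    # answer goes back as one chunk since openbb_agent returns a plain string.
    yield {"event": "copilotMessageChunk", "data": json.dumps({"delta": answer})}


# Inside the endpoint, instead of `return {"output": result}`:
#     return EventSourceResponse(single_chunk_stream(result))
```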


    except Exception as e:
        # NOTE: the original `except Error` referenced an undefined name;
        # catch broadly and fall back to the exception message instead.
        error_message = str(e) or "An unknown error occurred."
        return JSONResponse(status_code=400, content={"error": error_message})
48 changes: 48 additions & 0 deletions openbb-agent-copilot/openbb_agent_copilot/models.py
@@ -0,0 +1,48 @@
from enum import Enum
from typing import Any
from uuid import UUID

from pydantic import BaseModel, Field, field_validator


class RoleEnum(str, Enum):
    ai = "ai"
    human = "human"


class LlmMessage(BaseModel):
    role: RoleEnum = Field(
        description="The role of the entity that is creating the message"
    )
    content: str = Field(description="The content of the message")


class BaseContext(BaseModel):
    uuid: UUID = Field(description="The UUID of the widget.")
    name: str = Field(description="The name of the widget.")
    description: str = Field(
        description="A description of the data contained in the widget"
    )
    content: Any = Field(description="The data content of the widget")
    metadata: dict[str, Any] | None = Field(
        default=None,
        description="Additional widget metadata (e.g. the selected ticker, etc.)",
    )


class AgentQueryRequest(BaseModel):
    messages: list[LlmMessage] = Field(
        description="A list of messages to submit to the copilot."
    )
    context: list[BaseContext] | None = Field(
        default=None,
        description="Additional context.",
    )
    use_docs: bool | None = Field(
        default=None,
        description="Set True to use uploaded docs when answering the query.",
    )

    @field_validator("messages")
    def check_messages_not_empty(cls, value):
        if not value:
            raise ValueError("messages list cannot be empty.")
        return value
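The validator above rejects empty conversations before the endpoint runs. A quick standalone illustration (a minimal copy of the models above, trimmed to what the check needs, so the snippet is self-contained):

```python
from enum import Enum

from pydantic import BaseModel, Field, ValidationError, field_validator


# Minimal standalone copy of the request models above, for illustration only.
class RoleEnum(str, Enum):
    ai = "ai"
    human = "human"


class LlmMessage(BaseModel):
    role: RoleEnum
    content: str


class AgentQueryRequest(BaseModel):
    messages: list[LlmMessage] = Field(
        description="A list of messages to submit to the copilot."
    )

    @field_validator("messages")
    def check_messages_not_empty(cls, value):
        if not value:
            raise ValueError("messages list cannot be empty.")
        return value


# A well-formed request parses...
ok = AgentQueryRequest.model_validate(
    {"messages": [{"role": "human", "content": "hello"}]}
)

# ...while an empty message list is rejected by the validator.
try:
    AgentQueryRequest.model_validate({"messages": []})
    rejected = False
except ValidationError:
    rejected = True
```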