OpenBB agent copilot #15
base: main
Conversation
A couple of tweaks 🙏 .
```python
try:
    result = openbb_agent(str(chat_messages), verbose=False, openbb_pat=os.getenv("OPENBB_PAT"))

    return {"output": result}
```
You'll still need to stream the result back (just as a single chunk) using the create_message_stream function. We now use named events for our SSEs so that things are parsed and rendered correctly on the front-end.
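To make that concrete, here is a rough sketch of streaming the whole answer back as one named SSE event. The exact signature of create_message_stream isn't shown in this thread, so the sketch goes through FastAPI and sse-starlette directly; the event name copilotMessageChunk and the {"delta": ...} payload are assumptions about the front-end protocol, not confirmed here.

```python
import json

from fastapi import FastAPI
from sse_starlette.sse import EventSourceResponse

app = FastAPI()


async def single_chunk_stream(result: str):
    # Emit the entire answer as one named SSE event. The event name and
    # payload shape are assumptions -- adjust to whatever the front-end expects.
    yield {"event": "copilotMessageChunk", "data": json.dumps({"delta": result})}


@app.post("/v1/query")
async def query(request: dict):
    # In the real handler `result` would come from openbb_agent(...) as in the
    # snippet above; a fixed string keeps this sketch self-contained.
    result = "full answer produced by openbb_agent(...)"
    return EventSourceResponse(single_chunk_stream(result))
```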
I actually tried this and had no success with it. Can you share whether we have more documentation on how to do this? In this case the output is literally just a string.
I know we have an async openbb_agent, but I looked at the repo and it's missing the PAT argument, which is really important since it grants the agent access to all my data.
PS: I kept getting what's shown in the following screenshot. cc @mnicstruwig
@DidierRLopes I think there might be something I've missed on my part. Let me investigate for you, and I'll give you an update 🙏 .
@DidierRLopes Fixed the outputs using SSEs -- we just had to set hasStreaming=true in the Copilot config (since Terminal Pro officially only supports streaming output now). We just stream back the entire answer in a single chunk.
Co-authored-by: Michael Struwig <[email protected]>
```diff
@@ -3,7 +3,7 @@
   "name": "OpenBB Agent Copilot",
   "description": "AI financial analyst using the OpenBB Platform.",
   "image": "https://github.com/user-attachments/assets/010d7590-0a65-4b3f-b21a-0cbc0d95bcb9",
-  "hasStreaming": false,
+  "hasStreaming": true,
```
AHHHHH - this might have been the culprit.
Thank you!
It leverages the OpenBB Agent repo: https://github.com/OpenBB-finance/openbb-agents
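For reference, a minimal sketch of calling the agent directly, based on the snippet reviewed above. The import path is an assumption (check the openbb-agents README for the current one); the openbb_pat and verbose keyword arguments are taken from the code in this PR.

```python
import os

# Import path assumed; verify against the openbb-agents repo.
from openbb_agents.agent import openbb_agent

# The PAT is what gives the agent access to your OpenBB data.
answer = openbb_agent(
    "What was TSLA's revenue last quarter?",
    verbose=False,
    openbb_pat=os.getenv("OPENBB_PAT"),
)
print(answer)
```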