Conditional flow with tool calls #33
Hi there,

I'm interested in this package for a more complex chatbot that I've been developing. I'd prefer to stream OpenAI responses with tools. Getting this to work with the Shiny chat component has been a little tricky, but then I came across a reference to chatlas in the docstring for the Shiny Chat class. Working through your docs and examples, I can get so far with it.

This approach allows for streaming responses with tool calls and Shiny UI widgets, though the tool calls need to be self-contained. Whatever the tool returns is fed into the model for a subsequent response, rather than being available for additional conditional flow.

I would instead like to receive the function name and parameter values, call get_current_temperature() myself, and pass the JSON response back to the model. This would allow me to display a notification without relying on a side effect.
Comments

Hi, thanks for your question @r-leyshon. I'm having a hard time understanding the question, though. Could you outline, maybe in pseudo-code, what you're wanting to do? In particular, I can't quite see what you mean by "the tool call response ... being available for additional conditional flow".
@gadenbuie thanks for the quick look at that. I'll try to show what I intend to do with the streamed model response below.
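Something like this, adapted from the chatlas and Shiny chat docs (the model name and the get_current_temperature tool are just for illustration); the commented pseudo-code marks the hook I'm after:

```python
from chatlas import ChatOpenAI
from shiny.express import ui

def get_current_temperature(latitude: float, longitude: float) -> str:
    """Get the current temperature for a location."""
    # Self-contained: chatlas calls this and feeds the return value
    # straight back to the model; I never see the call or the result.
    return '{"temperature_c": 14.0}'  # stubbed for the example

chat_client = ChatOpenAI(model="gpt-4o")
chat_client.register_tool(get_current_temperature)

chat = ui.Chat(id="chat")
chat.ui()

@chat.on_user_submit
async def _(user_input: str):
    response = await chat_client.stream_async(user_input)
    # What I'd like, in pseudo-code: branch on tool calls as they stream,
    # e.g.
    #
    #     async for part in response:
    #         if is_tool_call(part):  # hypothetical helper
    #             ui.notification_show(f"Calling {part.name}...")
    #
    # but as far as I can tell the stream only gives me text chunks.
    await chat.append_message_stream(response)
```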
I've struggled to inspect the streamed parts and do things with them, at least when trying to await certain events for async appending to the message stream. My fallback is to switch off streaming, but I'd rather not do that.
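For instance, wrapping the stream chatlas hands back seemed like the obvious route, but each chunk appears to be a plain piece of text, so there's nothing to branch on (a sketch reusing chat_client and chat from above):

```python
async def inspect(stream):
    async for chunk in stream:
        # chunk is just a str fragment of the reply, so I can't tell
        # a tool call apart from ordinary prose here.
        yield chunk

@chat.on_user_submit
async def _(user_input: str):
    response = await chat_client.stream_async(user_input)
    await chat.append_message_stream(inspect(response))
```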
Thanks for the clarification. Right now, I don't think it's possible to do what you're trying to do, at least not with the streaming response approach. Given that, I think it's very reasonable to use something like a notification to indicate that a tool was called. We'll definitely keep your use case in mind as we develop chatlas and the Shiny chat component.
Gotcha, no worries; I can fall back to removing streaming. In case I wasn't clear, I need to handle the response for reasons other than showing a notification to the user. The notification is a placeholder for the conditional logic I'm using to pull the model response apart for other things that would be too onerous to share in the little example above. I wanted to say that the chatlas approach seems very friendly, and in future, if I can keep each tool as a functional unit, I'd prefer to use chatlas over handling the response myself.
Could you describe the kinds of things you're wanting to do with the model response? That would help us understand the use case. I'm going to reopen this issue so that we make sure we keep it in mind moving forward.
Sure thing, the code I've fallen back to lives here: https://github.com/ministryofjustice/github-chat. Specifically, this bit:
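In outline, it does something like the following (a simplified sketch rather than the exact code in the repo; the tool schema and model name are placeholders):

```python
import json
from openai import OpenAI

client = OpenAI()

TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "description": "Get the current temperature for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "number"},
                "longitude": {"type": "number"},
            },
            "required": ["latitude", "longitude"],
        },
    },
}]

def get_current_temperature(latitude: float, longitude: float) -> dict:
    return {"temperature_c": 14.0}  # stub; the real app queries an API

messages = [{"role": "user", "content": "How warm is it in Oslo right now?"}]
response = client.chat.completions.create(
    model="gpt-4o", messages=messages, tools=TOOLS
)
msg = response.choices[0].message

if msg.tool_calls:
    messages.append(msg)  # keep the assistant's tool-call turn in history
    for call in msg.tool_calls:
        # This is the part chatlas hides from me: the function name and
        # arguments are available here, so I can run whatever conditional
        # logic I need (notifications, logging, routing) before replying.
        args = json.loads(call.function.arguments)
        result = get_current_temperature(**args)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": json.dumps(result),
        })
    # Hand the tool results back for the model's final answer.
    response = client.chat.completions.create(model="gpt-4o", messages=messages)

print(response.choices[0].message.content)
```

That gives me the hook I need, but it means giving up streaming and hand-rolling the tool loop that chatlas already handles, which is why I'd prefer to come back to chatlas.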