[Not finished][Do not review] #6823
base: main
Conversation
…lly" This reverts commit 28e6c27.
# Add local metrics to observation
if self.state and self.state.local_metrics:
    observation.llm_metrics = copy.deepcopy(self.state.local_metrics)
What if we added it to the action instead of the observation? Somewhere
- after we got it from the agent at line 689
action = self.agent.step(self.state)
- and just before it's added to the event stream, like before this line 751
self.event_stream.add_event(action, action._source) # type: ignore [attr-defined]
Then it would be saved in the stream, so the server would be able to read it in the dict it sends to the frontend. 🤔
I think we currently have a somewhat difficult problem with sending extra bits of information to the frontend, I'm sorry about that. This idea might be one way to do it, and it should work for our purpose.
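A minimal sketch of that placement, reusing the lines quoted above; this is not the final implementation, the surrounding controller method context is assumed, and llm_metrics / local_metrics are the fields introduced in this PR's diff:

```python
import copy

# Inside the agent controller's step method (context assumed from the quoted lines above).
action = self.agent.step(self.state)

# Attach the metrics to the action rather than the observation, so they are
# persisted when the action is written to the event stream just below.
if self.state and self.state.local_metrics:
    action.llm_metrics = copy.deepcopy(self.state.local_metrics)

self.event_stream.add_event(action, action._source)  # type: ignore [attr-defined]
```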
🤗 I tried hard-coding llm_metrics in session.py before sending the "oh_event" response in send, and it successfully appeared on the frontend. Then I went back to check where the related event is prepared, and found that adding llm_metrics to actions with EventSource.AGENT lets the content be pushed into the event stream successfully.
As a result, all subscribers, including EventStreamSubscriber.SERVER in session.py, can fetch it. Ultimately, it gets wrapped in the "oh_event" and sent to the frontend.
I think I understand the full logic now, including the backend reasoning behind your advice.
🤔 But when I initially tried adding the code following your advice to test it, it didn't work. Let me check again and continue debugging. Thanks for your kindness. I think I have an idea now!
Yes, I think event_stream.add_event saves the event with everything in it, including _llm_metrics if we set it on the event. So the event gets transmitted to subscribers, including SERVER, with its contents.
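A self-contained toy sketch of that flow (not OpenHands code; all names here are illustrative): a metrics field set on the action before it enters the stream survives serialization, so any subscriber that converts the event to a dict sees it and can forward it to the frontend.

```python
from dataclasses import asdict, dataclass, field

@dataclass
class ToyAction:
    # Stand-in for an AGENT-source action event; llm_metrics mirrors the field added in this PR.
    content: str
    llm_metrics: dict = field(default_factory=dict)

def to_oh_event(event: ToyAction) -> dict:
    # Stand-in for the serialization a server-side subscriber does before sending "oh_event".
    return asdict(event)

action = ToyAction(content='run tests')
action.llm_metrics = {'accumulated_cost': 0.0123, 'prompt_tokens': 4567}

payload = to_oh_event(action)
assert payload['llm_metrics']['accumulated_cost'] == 0.0123  # metrics survive the round trip
print(payload)
```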
@AutoLTX Thank you for the work on this! Please don't worry about the fact that it's unfinished; that's fine, that's why it's a draft, and everyone does temporary stuff in a draft. This PR ended up much more complex than we first thought, I apologize for that! We seem to have two difficulties here:
What do you say if we split this PR into two, maybe three:
That may make it easier to keep in sync with the high activity here, and easier to review and approve.
@enyst Thanks for the advice! I'd like to break the PR down into three parts, as follows, for quicker review. This also avoids resolving conflicts many times.
Do you think this is an acceptable way to do the breakdown?
Definitely, and I think this one is conceptually first; we need the data, right?
Can you please help me understand where that line will be?
End-user friendly description of the problem this fixes or functionality that this introduces
Give a summary of what the PR does, explaining any non-trivial design decisions
Link of any specific issues this addresses