[Not finished][Do not review] #6823
Draft
AutoLTX wants to merge 23 commits into All-Hands-AI:main from AutoLTX:tilin/displayCost
Commits
0ab12fa  add single button test
711ee23  use toggle switch
5425b73  add npm package
62db6f8  update to avoid package conflict for react 19
c645ac1  expose usage in frontend
23f871c  fix
70c7c39  use alert temporarily
a4a9580  Merge branch 'All-Hands-AI:main' into tilin/displayCost
28e6c27  add logs for verify whether llm_metrics record cost successfully
732d63d  Merge branch 'All-Hands-AI:main' into tilin/displayCost
1188d95  Revert "add logs for verify whether llm_metrics record cost successfu…"
ae19486  add logs for verify llm metrics
250e7c9  use llm metric result
5a77243  add more logs for debugging llm metrics
6ac69b0  add logs during action and observation converting process for debugging
5d486d0  try to update the llm metric for observation in controller mode
1b9b7da  fix bugs in log
d705ac6  add more logs and a fix
ff9e092  fix
6d1f829  add one more position log
21eabba  add metrics in action and more logs
108aa37  expose llm_metrics in frontend
ffc0c48  Merge branch 'All-Hands-AI:main' into tilin/displayCost
Conversations
What if we added it to the action instead of the observation, somewhere? Then it would be saved in the stream, so the server would be able to read it in the dict it sends to the frontend. 🤔
I think we have a difficult problem at the moment with sending extra bits of information to the frontend; I'm sorry about that. This idea might be one way to do it, and it should work for our purpose.
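Roughly, a minimal self-contained sketch of that mechanism (the `Action`/`EventStream` classes below are simplified stand-ins for the OpenHands types, and the `llm_metrics` field is an assumption from this thread, not a confirmed API):

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Simplified stand-ins for the real Event/EventStream types; llm_metrics
# mirrors this thread's idea and is not the project's confirmed API.
@dataclass
class Action:
    content: str
    source: str = "agent"            # plays the role of EventSource.AGENT
    llm_metrics: Optional[dict] = None

class EventStream:
    def __init__(self) -> None:
        self._events: list[Action] = []                    # "saved in the stream"
        self._subscribers: list[Callable[[Action], None]] = []

    def subscribe(self, callback: Callable[[Action], None]) -> None:
        self._subscribers.append(callback)

    def add_event(self, event: Action) -> None:
        self._events.append(event)          # persist the event as-is
        for notify in self._subscribers:    # fan out to SERVER et al.
            notify(event)

# The server-side subscriber sees whatever was set on the action:
stream = EventStream()
stream.subscribe(lambda e: print("server received:", e))

action = Action(content="run tests")
action.llm_metrics = {"accumulated_cost": 0.0123}  # set metrics *before* publishing
stream.add_event(action)
```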
🤗 I tried hard-coding llm_metrics in session.py before sending the "oh_event" response in send, and it successfully appeared on the frontend. Then I went back to check where the related event is prepared, and found that adding llm_metrics to actions with EventSource.AGENT lets the content be pushed into the event stream successfully.
As a result, all subscribers, including EventStreamSubscriber.SERVER in session.py, can fetch it. Ultimately, it gets wrapped in the "oh_event" and sent to the frontend.
I think I understand the full logic now, including the backend reasoning behind your advice.
🤔 But initially, when I tried adding the code following your advice to test it, it didn't work. Let me check again and continue debugging. Thanks for your kindness. I think I have an idea now!
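To make that path concrete, a hedged sketch of the wrapping step: event_to_dict is approximated here with dataclasses.asdict, the "oh_event" key is taken from the comment above, and ws_send is a hypothetical stand-in for the session's websocket send.

```python
import dataclasses
import json

def event_to_dict(event) -> dict:
    # Assumption: the real serializer is richer; asdict is enough to show that
    # anything set on the action (llm_metrics included) survives this step.
    return dataclasses.asdict(event)

def send_oh_event(ws_send, event) -> None:
    # The SERVER subscriber in session.py, per the description above: wrap the
    # serialized event and push it to the frontend.
    ws_send(json.dumps({"oh_event": event_to_dict(event)}))

# Example with the Action from the previous sketch:
# send_oh_event(print, action)   # -> {"oh_event": {..., "llm_metrics": {...}}}
```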
Yes, I think eventstream.add_event saves the event with everything in it, including llm_metrics, if we set it on the event. So the event gets transmitted to subscribers, including SERVER, with its contents.
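A quick sanity check of that claim, reusing the stand-in Action and event_to_dict from the sketches above: setting llm_metrics on the event before add_event is all it takes for the serialized dict to carry it along.

```python
# Reuses Action and event_to_dict from the sketches above.
action = Action(content="edit file")
action.llm_metrics = {"accumulated_cost": 0.05}  # set it on the event first
assert "llm_metrics" in event_to_dict(action)    # it rides along to subscribers
print(event_to_dict(action)["llm_metrics"])      # {'accumulated_cost': 0.05}
```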