[BUG] Running Prompt flow locally produces errors #3751
Comments
We see something similar; however, in our case it is an error and it fails the execution:
Seeing similar errors on my end; this is halting flow execution:
Same issue, resulting in the flow being terminated:
Some keys in the tokens dict have values that are not int, which causes the issue.
As a temporary workaround, tracing can be disabled by setting PF_DISABLE_TRACING=true.
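A minimal sketch of applying that workaround in a local shell session before invoking pf test (assumes a POSIX shell; on Windows PowerShell the equivalent would be $env:PF_DISABLE_TRACING = "true"):

```shell
# Disable Prompt flow tracing for this shell session; any pf command
# run afterwards in the same session inherits the variable.
export PF_DISABLE_TRACING=true
echo "$PF_DISABLE_TRACING"
```

Note this only suppresses the tracing path that aggregates token counts; the underlying None-valued token fields are still present in the OpenAI responses.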
Rolling back the version of openai also works.
@asos-oliverfrost thank you for finding that! Rolling back worked for me. It looks like this is the specific line that breaks the behavior in prompt flow, because the new field completion_token_details is optional: openai/openai-python@v1.44.1...v1.45.0#diff-d85f41ac9f419751206af46c34ef5c8c74258660be492aa703dcbebcfc96a41bR25
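Based on that compare link, a rollback would pin the last release before the new field landed. A sketch of a requirements.txt entry (the version is taken from the v1.44.1...v1.45.0 diff above, not independently verified):

```
openai==1.44.1
```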
Rolling back OpenAI did not work for me; Pydantic is so dynamic that I still get:
Also having this issue, with similar errors:
This happens when running a version of the
This worked for me.
Oh, @mallapraveen @jomalsan, you saved me. Thanks a million.
Getting this error: Traceback (most recent call last):
I'm able to run locally, but I get TypeError: unsupported operand type(s) for +: 'int' and 'NoneType' after deploying using Azure AI Studio.
This did the trick for me!! Thank you so much <3
I have found a PR that may be relevant to this issue.
I'm also noticing this issue recently (it was working a month ago) when deploying a prompt flow from Azure AI Studio, created from the Chat playground.
We encountered the problem as well.
environment_variables:
  PF_DISABLE_TRACING: true
As @JanWerder suggested, this did the trick for me.
This is fixed with promptflow-tracing 1.16.1.
@luigiw It does not seem that this fix works with the base Docker promptflow runtime image, as the image is still based on Python 3.9.
@luigiw After updating to 1.16.1 the error goes away, which is good, but a lot of warnings still persist:
What I notice in AppInsights is that the logs for LLM tools show token consumption (completion, prompt, and total), but the "same" metrics for the whole flow show a consumed count only for total tokens and "0" for the other two:
I have found a PR that may be relevant to this issue: #3806
Describe the bug
Running a flow using
pf test
seems to work, but exceptions are reported locally.
How To Reproduce the bug
Run
pf test
on a flow locally. The flow executes successfully, but exceptions are generated when collecting token metrics for openai.
Expected behavior
A clean run without exceptions.
Screenshots
WARNING:opentelemetry.attributes:Invalid type NoneType for attribute 'computed.cumulative_token_count.completion' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
WARNING:opentelemetry.attributes:Invalid type NoneType for attribute 'llm.usage.completion_tokens_details' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
WARNING:opentelemetry.attributes:Invalid type NoneType for attribute 'computed.cumulative_token_count.completion' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
Running Information (please complete the following information):
{
"promptflow": "1.15.0",
"promptflow-azure": "1.15.0",
"promptflow-core": "1.15.0",
"promptflow-devkit": "1.15.0",
"promptflow-tracing": "1.15.0"
}
Executable 'c:\git\azure-ai-prompt-flow.venv\Scripts\python.exe'
Python (Windows) 3.10.5 (tags/v3.10.5:f377153, Jun 6 2022, 16:14:13) [MSC v.1929 64 bit (AMD64)]
Additional context
Seems like the exception is in here, on the line:
key: self._span_id_to_tokens[parent_span_id].get(key, 0) + tokens.get(key, 0)
Not all steps in my flow are LLM steps. I wonder if this could be causing the issue, since some steps in the flow don't have any tokens.
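The failure mode on that line can be sketched as follows (a minimal reproduction with hypothetical dict contents, not promptflow's actual code): dict.get(key, default) returns the default only when the key is absent, so a key that is present with value None flows straight into the addition.

```python
# Hypothetical token dicts: a key PRESENT with value None (as the new
# optional completion_token_details field can be) defeats get(key, 0),
# and int + None raises the TypeError seen in the logs.
parent_tokens = {"completion": 10, "completion_tokens_details": None}
child_tokens = {"completion": 5}

try:
    parent_tokens.get("completion_tokens_details", 0) + child_tokens.get(
        "completion_tokens_details", 0
    )
except TypeError as exc:
    print(exc)  # unsupported operand type(s) for +: 'NoneType' and 'int'


# One defensive variant: coerce None to 0 before adding.
def add_counts(a: dict, b: dict, key: str) -> int:
    return (a.get(key) or 0) + (b.get(key) or 0)


print(add_counts(parent_tokens, child_tokens, "completion"))  # 15
print(add_counts(parent_tokens, child_tokens, "completion_tokens_details"))  # 0
```

This also explains why non-LLM steps alone would not trigger it: a step with no tokens dict entry falls back to the 0 default cleanly; only an explicit None value breaks the sum.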