I noticed that when I search for "In-situ Flow Information Telemetry", I encounter the error below:
openai.error.InvalidRequestError: This model's maximum context length is 8192 tokens, however you requested 12301 tokens (12301 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.
The same thing happens when I run it locally.
Please check your text length limit to make sure the restriction you define stays within the model's token limit.
I think the chunk size you set is 1000 and the sentence length you require is larger than 8, which means the text chunk sent to OpenAI can exceed 8000 tokens and cause this problem.
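A possible workaround is to count tokens with `tiktoken` before issuing the request and cap any chunk that would overflow the context window. This is only a minimal sketch: the `CHUNK_TOKEN_LIMIT` value and the model name are assumptions, not settings from this repo.

```python
# Sketch: cap a text chunk to a token budget before sending it to the OpenAI API.
# CHUNK_TOKEN_LIMIT and the model name are assumptions, not values from this project.
import tiktoken

CHUNK_TOKEN_LIMIT = 6000  # leave headroom below the 8192-token context window


def truncate_to_token_limit(text: str, model: str = "gpt-4") -> str:
    """Return `text` cut down so it encodes to at most CHUNK_TOKEN_LIMIT tokens."""
    enc = tiktoken.encoding_for_model(model)
    tokens = enc.encode(text)
    if len(tokens) <= CHUNK_TOKEN_LIMIT:
        return text
    return enc.decode(tokens[:CHUNK_TOKEN_LIMIT])
```

Instead of truncating, the oversized chunk could also be split into several smaller chunks and sent as separate requests, which avoids silently dropping text.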