
Commit

Fixed num_context_tokens always being 0 if the value of max_context_tokens is not exceeded
mopemope authored and seratch committed Nov 24, 2023
1 parent 6112bc2 commit b1ca551
Showing 1 changed file with 3 additions and 0 deletions.
3 changes: 3 additions & 0 deletions app/openai_ops.py
@@ -78,6 +78,9 @@ def messages_within_context_window(
         if not removed:
             # Fall through and let the OpenAI error handler deal with it
             break
+    else:
+        num_context_tokens = num_tokens
+
     return messages, num_context_tokens, max_context_tokens


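For context, the added lines attach an else clause to the loop that trims old messages: in Python, a while loop's else block runs only when the loop finishes because its condition became false, not when it exits via break. Before this change, num_context_tokens was presumably only updated while messages were being removed, so if the messages already fit within max_context_tokens the loop body never ran and the function returned 0. The sketch below illustrates the pattern under stated assumptions; the calculate_num_tokens stand-in, the simplified signature (max_context_tokens passed directly), and the trimming loop shape are illustrative, not the repository's actual code.

```python
from typing import Dict, List, Tuple


def calculate_num_tokens(messages: List[Dict[str, str]]) -> int:
    # Stand-in token counter: roughly one "token" per whitespace-separated word.
    return sum(len(m["content"].split()) for m in messages)


def messages_within_context_window(
    messages: List[Dict[str, str]],
    max_context_tokens: int,
) -> Tuple[List[Dict[str, str]], int, int]:
    num_context_tokens = 0
    while (num_tokens := calculate_num_tokens(messages)) > max_context_tokens:
        removed = False
        for i, message in enumerate(messages):
            if message["role"] in ("user", "assistant"):
                num_context_tokens = num_tokens
                del messages[i]
                removed = True
                break
        if not removed:
            # Fall through and let the OpenAI error handler deal with it
            break
    else:
        # Runs only when the while condition evaluates to false (including on
        # the very first check), i.e. the loop did not exit via break. Without
        # this clause, num_context_tokens stayed 0 whenever the messages
        # already fit within max_context_tokens.
        num_context_tokens = num_tokens

    return messages, num_context_tokens, max_context_tokens


if __name__ == "__main__":
    msgs = [
        {"role": "system", "content": "You are helpful."},
        {"role": "user", "content": "hello there"},
    ]
    # Messages fit within the window, so the else branch reports 5 tokens
    # instead of 0.
    print(messages_within_context_window(msgs, max_context_tokens=100))
```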
