
Commit 3d03909
Merge pull request #741 from parea-ai/fix-get-tokens-no-model
fix: get tokens if no model is provided
joschkabraun committed Apr 12, 2024
2 parents 07c817a + 1d054e3 commit 3d03909
Showing 2 changed files with 4 additions and 3 deletions.
5 changes: 3 additions & 2 deletions parea/evals/utils.py
@@ -169,10 +169,11 @@ def run_evals_synchronous(trace_id: str, log: Log, eval_funcs: List[EvalFuncTupl
 def get_tokens(model: str, text: str) -> List[int]:
     if not text:
         return []
+    fallback_model = "cl100k_base"
     try:
-        encoding = tiktoken.encoding_for_model(model)
+        encoding = tiktoken.encoding_for_model(model or fallback_model)
     except KeyError:
-        encoding = tiktoken.get_encoding("cl100k_base")
+        encoding = tiktoken.get_encoding(fallback_model)
     try:
         return encoding.encode(text)
     except Exception as e:
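For context, here is a minimal, self-contained sketch of how the patched helper is expected to behave. This is an illustration, not the repository's exact code: the tail of the original except block is truncated in the diff above, so the final error handling below is a placeholder. With `model or fallback_model`, a missing (None or empty) model is mapped to "cl100k_base" before it reaches tiktoken; since that is an encoding name rather than a model name, tiktoken.encoding_for_model raises KeyError for it, and the except branch resolves the encoding via tiktoken.get_encoding.

    from typing import List

    import tiktoken


    def get_tokens(model: str, text: str) -> List[int]:
        # Sketch of the patched helper from parea/evals/utils.py.
        if not text:
            return []
        fallback_model = "cl100k_base"
        try:
            # When model is None or "", the fallback string is substituted.
            # "cl100k_base" is an encoding name, not a model name, so
            # encoding_for_model raises KeyError and control drops to except.
            encoding = tiktoken.encoding_for_model(model or fallback_model)
        except KeyError:
            encoding = tiktoken.get_encoding(fallback_model)
        try:
            return encoding.encode(text)
        except Exception:
            # Placeholder: the original error handling is truncated in the diff.
            return []


    # With the fix, callers that do not know the model still get token ids:
    print(get_tokens(None, "hello world"))     # falls back to cl100k_base
    print(get_tokens("gpt-4", "hello world"))  # uses the model's own encoding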
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -6,7 +6,7 @@ build-backend = "poetry.core.masonry.api"
 [tool.poetry]
 name = "parea-ai"
 packages = [{ include = "parea" }]
-version = "0.2.129"
+version = "0.2.130"
 description = "Parea python sdk"
 readme = "README.md"
 authors = ["joel-parea-ai <[email protected]>"]
