Merge pull request #212 from Mirascope/chore/docs-retries
chore: updated docs with retries
willbakst authored May 7, 2024
2 parents 7f8f65c + d777793 commit c604c90
Showing 4 changed files with 58 additions and 22 deletions.
13 changes: 12 additions & 1 deletion docs/concepts/defining_and_extracting_schemas.md
@@ -62,7 +62,18 @@ Notice how instead of “Patrick Rothfuss” the extracted author is “Rothfuss

## Retries

Sometimes the model will fail to extract the schema. This can often be a result of the prompt; however, sometimes it’s simply a failure of the model. If you want to retry the extraction some number of times, you can set `retries` equal to however many retries you want to run (defaults to 0).
Sometimes the model will fail to extract the schema. This is often a result of the prompt, but sometimes it’s simply a failure of the model. If you want to retry the extraction, you can set `retries` to the total number of runs (defaults to 1). Alternatively, you can pass in a [tenacity.Retrying](https://tenacity.readthedocs.io/en/latest/) instance to customize the retry behavior. Mirascope automatically passes the error from a failed attempt into the next call to provide context.

```python
from tenacity import Retrying, stop_after_attempt

retries = Retrying(
    stop=stop_after_attempt(3),
)
task_details = TaskExtractor(task=task).extract(retries=retries)
```

As you can see, Mirascope makes extraction extremely simple. Under the hood, Mirascope uses the provided schema to extract the generated content and validate it (see [Validation](validation.md) for more details).

```python
book = BookExtractor().extract(retries=3) # will retry up to 3 times
```
15 changes: 0 additions & 15 deletions docs/concepts/extracting_structured_information_using_llms.md
@@ -44,18 +44,3 @@ assert isinstance(task_details, TaskDetails)
print(task_details)
#> due_date='next Friday' priority='high' description='Submit quarterly report'
```

### Retry

Extraction can fail for a wide variety of reasons. When this happens, the simplest approach is to introduce retries. Mirascope uses [Tenacity](https://tenacity.readthedocs.io/en/latest/), so you can either pass in an integer or customize the retry behavior with a `Retrying` instance. Building on the example above, add:

```python
from tenacity import Retrying, stop_after_attempt

retries = Retrying(
    stop=stop_after_attempt(3),
)
task_details = TaskExtractor(task=task).extract(retries=retries)
```

As you can see, Mirascope makes extraction extremely simple. Under the hood, Mirascope uses the provided schema to extract the generated content and validate it (see [Validation](validation.md) for more details).
35 changes: 33 additions & 2 deletions docs/concepts/generating_content.md
@@ -22,7 +22,7 @@ from mirascope import OpenAICall, OpenAICallParams
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"


class RecipeRecommender(OpenAIPrompt):
class RecipeRecommender(OpenAICall):
    prompt_template = "Recommend recipes that use {ingredient} as an ingredient"

    ingredient: str
@@ -49,7 +49,7 @@ from mirascope import OpenAICall, OpenAICallParams
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"


class RecipeRecommender(OpenAIPrompt):
class RecipeRecommender(OpenAICall):
    prompt_template = "Recommend recipes that use {ingredient} as an ingredient"

    ingredient: str
@@ -177,3 +177,34 @@ recipe = recommend_recipe(food_type="japanese", ingredient="apples")
print(recipe)
# > Certainly! Here's a recipe for a delicious and refreshing Wagyu Beef and Apple roll: ...
```

### Retrying

Calls to LLM providers can fail for various reasons, such as network issues, API rate limits, or service outages. To handle unexpected failures gracefully and provide a resilient user experience, Mirascope integrates directly with [Tenacity](https://tenacity.readthedocs.io/en/latest/).
By passing the `retries` parameter to any Mirascope class that extends `BaseCall`, you can enable automatic retry functionality out of the box. Using the same basic `RecipeRecommender`, we can take advantage of Tenacity to retry until the model generates a response that does not contain certain words.

```python
import os

from mirascope import OpenAICall, OpenAICallParams
from tenacity import Retrying, retry_if_result

os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

retries = Retrying(
    before=lambda details: print(details),
    after=lambda details: print(details),
    retry=retry_if_result(lambda result: "Cheese" in result.content),
)

class RecipeRecommender(OpenAICall):
    prompt_template = "Recommend recipes that use {ingredient} as an ingredient"

    ingredient: str

    call_params = OpenAICallParams(model="gpt-3.5-turbo-0125")


response = RecipeRecommender(ingredient="apples").call(retries=retries)
print(response.content) # Content will not contain "Cheese"
```
17 changes: 13 additions & 4 deletions mirascope/base/calls.py
@@ -11,8 +11,11 @@
    Optional,
    Type,
    TypeVar,
    Union,
)

from tenacity import AsyncRetrying, Retrying

from .prompts import BasePrompt
from .tools import BaseTool
from .types import BaseCallParams, BaseCallResponse, BaseCallResponseChunk
@@ -36,7 +39,9 @@ class BaseCall(
)

    @abstractmethod
    def call(self, **kwargs: Any) -> BaseCallResponseT:
    def call(
        self, retries: Union[int, Retrying] = 1, **kwargs: Any
    ) -> BaseCallResponseT:
        """A call to an LLM.
        An implementation of this function must return a response that extends
@@ -46,7 +51,9 @@ def call(self, **kwargs: Any) -> BaseCallResponseT:
        ...  # pragma: no cover

    @abstractmethod
    async def call_async(self, **kwargs: Any) -> BaseCallResponseT:
    async def call_async(
        self, retries: Union[int, AsyncRetrying] = 1, **kwargs: Any
    ) -> BaseCallResponseT:
        """An asynchronous call to an LLM.
        An implementation of this function must return a response that extends
@@ -56,7 +63,9 @@ async def call_async(self, **kwargs: Any) -> BaseCallResponseT:
        ...  # pragma: no cover

    @abstractmethod
    def stream(self, **kwargs: Any) -> Generator[BaseCallResponseChunkT, None, None]:
    def stream(
        self, retries: Union[int, Retrying] = 1, **kwargs: Any
    ) -> Generator[BaseCallResponseChunkT, None, None]:
        """A call to an LLM that streams the response in chunks.
        An implementation of this function must yield response chunks that extend
@@ -67,7 +76,7 @@ def stream(self, **kwargs: Any) -> Generator[BaseCallResponseChunkT, None, None]

    @abstractmethod
    async def stream_async(
        self, **kwargs: Any
        self, retries: Union[int, AsyncRetrying] = 1, **kwargs: Any
    ) -> AsyncGenerator[BaseCallResponseChunkT, None]:
        """An asynchronous call to an LLM that streams the response in chunks.
