Commit eef63da

cmilesb and Copilot authored: Apply suggestions from code review
Co-authored-by: Copilot <[email protected]>
1 parent 94f6ac9

File tree

1 file changed: +2 −2 lines changed


content/operate/rc/langcache/use-langcache.md

Lines changed: 2 additions & 2 deletions

@@ -11,7 +11,7 @@ title: Use the LangCache API with your GenAI app
 weight: 10
 ---
 
-You can use the LangCache API from your client app to store and retrieve LLM, RAG or agent responses.
+You can use the LangCache API from your client app to store and retrieve LLM, RAG, or agent responses.
 
 To access the LangCache API, you need:
 
@@ -64,7 +64,7 @@ Place this call in your client app right before you call your LLM's REST API. If
 
 If LangCache does not return a response, you should call your LLM's REST API to generate a new response. After you get a response from the LLM, you can [store it in LangCache](#store-a-new-response-in-langcache) for future use.
 
-You can also scope the responses returned from LangCache by adding an `attributes` object or `scope` object to the request. LangCache will only return responses that match the attributes you specify.
+You can also scope the responses returned from LangCache by adding an `attributes` object to the request. LangCache will only return responses that match the attributes you specify.
 
 ```sh
 POST https://[host]/v1/caches/{cacheId}/search
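For context, a scoped search call against the endpoint shown in this hunk could look like the sketch below. Only the endpoint path comes from the diff; the payload fields (`prompt`, the attribute name `language`), the placeholder values, and the bearer-token header are assumptions for illustration, not the documented schema.

```shell
# Hypothetical values -- replace with your real cache ID and API key.
CACHE_ID="my-cache-id"
API_KEY="your-api-key"

# Search payload with an "attributes" object to scope the results.
# The "language" attribute is a made-up example, not a documented field.
PAYLOAD='{"prompt": "What is semantic caching?", "attributes": {"language": "en"}}'

# The actual call (requires a real host, cache ID, and key):
# curl -s -X POST "https://[host]/v1/caches/$CACHE_ID/search" \
#   -H "Authorization: Bearer $API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"

echo "$PAYLOAD"
```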
