@@ -35,7 +35,28 @@ from lmnr import Laminar
 Laminar.initialize(project_api_key="<PROJECT_API_KEY>")
 ```
 
-Note that you need to only initialize Laminar once in your application.
+You can also skip passing the `project_api_key`, in which case it will be looked
+up in the environment (or a local .env file) under the key `LMNR_PROJECT_API_KEY`.
+
+Note that you only need to initialize Laminar once in your application. Do this
+as early as possible, e.g. at server startup.
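+
+The `LMNR_PROJECT_API_KEY` fallback can be sketched as follows. This is a minimal
+illustration: the `os.environ` assignment stands in for exporting the variable in
+your shell or placing it in a local .env file.

```python
import os

# Stand-in for `export LMNR_PROJECT_API_KEY=...` (or a line in a local .env file).
os.environ["LMNR_PROJECT_API_KEY"] = "<PROJECT_API_KEY>"

# With the variable set, initialization needs no explicit key:
#   from lmnr import Laminar
#   Laminar.initialize()
```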
+
+## Set-up for self-hosting
+
+If you self-host a Laminar instance, the default connection settings are
+`http://localhost:8000` for HTTP and `http://localhost:8001` for gRPC. Initialize
+the SDK accordingly:
+
+```python
+from lmnr import Laminar
+
+Laminar.initialize(
+    project_api_key="<PROJECT_API_KEY>",
+    base_url="http://localhost",
+    http_port=8000,
+    grpc_port=8001,
+)
+```
 
 ## Instrumentation
 
@@ -171,49 +192,78 @@ You can run evaluations locally by providing executor (part of the logic used in
 
 Read the [docs](https://docs.lmnr.ai/evaluations/introduction) to learn more about evaluations.
 
-## Laminar pipelines as prompt chain managers
+## Client for HTTP operations
 
-You can create Laminar pipelines in the UI and manage chains of LLM calls there.
+Various interactions with the Laminar [API](https://docs.lmnr.ai/api-reference/) are available in `LaminarClient`
+and its asynchronous version `AsyncLaminarClient`.
 
-After you are ready to use your pipeline in your code, deploy it in Laminar by selecting the target version for the pipeline.
+### Agent
 
-Once your pipeline target is set, you can call it from Python in just a few lines.
-
-Example use:
+To run the Laminar agent, invoke `client.agent.run`:
 
 ```python
-from lmnr import Laminar
+from lmnr import LaminarClient
 
-Laminar.initialize('<YOUR_PROJECT_API_KEY>', instruments=set())
+client = LaminarClient(project_api_key="<YOUR_PROJECT_API_KEY>")
 
-result = Laminar.run(
-    pipeline='my_pipeline_name',
-    inputs={'input_node_name': 'some_value'},
-    # all environment variables
-    env={'OPENAI_API_KEY': 'sk-some-key'},
+response = client.agent.run(
+    prompt="What is the weather in London today?"
 )
+
+print(response.result.content)
 ```
 
-Resulting in:
+#### Streaming
+
+Agent runs support streaming as well.
 
 ```python
->>> result
-PipelineRunResponse(
-    outputs={'output': {'value': [ChatMessage(role='user', content='hello')]}},
-    # useful to locate your trace
-    run_id='53b012d5-5759-48a6-a9c5-0011610e3669'
-)
+from lmnr import LaminarClient
+
+client = LaminarClient(project_api_key="<YOUR_PROJECT_API_KEY>")
+
+for chunk in client.agent.run(
+    prompt="What is the weather in London today?",
+    stream=True,
+):
+    if chunk.chunkType == 'step':
+        print(chunk.summary)
+    elif chunk.chunkType == 'finalOutput':
+        print(chunk.content.result.content)
 ```
 
-## Semantic search
-
-You can perform a semantic search on a dataset in Laminar by calling `Laminar.semantic_search`.
+#### Async mode
 
 ```python
-response = Laminar.semantic_search(
-    query="Greatest Chinese architectural wonders",
-    dataset_id=uuid.UUID("413f8404-724c-4aa4-af16-714d84fd7958"),
+from lmnr import AsyncLaminarClient
+
+client = AsyncLaminarClient(project_api_key="<YOUR_PROJECT_API_KEY>")
+
+response = await client.agent.run(
+    prompt="What is the weather in London today?"
 )
+
+print(response.result.content)
 ```
 
-[Read more](https://docs.lmnr.ai/datasets/indexing) about indexing and semantic search.
+#### Async mode with streaming
+
+```python
+from lmnr import AsyncLaminarClient
+
+client = AsyncLaminarClient(project_api_key="<YOUR_PROJECT_API_KEY>")
+
+# Note that you need to await the call even though we use `async for` below
+response = await client.agent.run(
+    prompt="What is the weather in London today?",
+    stream=True,
+)
+
+# Iterate over the awaited response rather than calling `run` a second time
+async for chunk in response:
+    if chunk.chunkType == 'step':
+        print(chunk.summary)
+    elif chunk.chunkType == 'finalOutput':
+        print(chunk.content.result.content)
+```