Demonstrates combining `LangGraphPlugin` (durable task execution) with Temporal's `LangSmithPlugin` for full tracing of LLM calls through Temporal Workflows, using LangGraph's `@task` and `@entrypoint` decorators.
- Using `LangSmithPlugin` on the Temporal client for automatic trace propagation
- Using `LangGraphPlugin` on the Worker for durable LangGraph execution
- `@traceable` in three places: on the `@task` (Activity) itself, on a helper called from inside the `@task`, and on a helper called from inside the `@entrypoint` (Workflow)
- Both plugins working together: durability + observability
- The Temporal client is created with `LangSmithPlugin(add_temporal_runs=True)`.
- A Worker registers the `chat` task with `LangGraphPlugin`.
- When the Workflow runs, the `chat` task executes as a Temporal Activity.
- `@traceable` decorators emit trace data to LangSmith for the task, an in-task helper, and an in-entrypoint helper.
Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`).

```bash
export ANTHROPIC_API_KEY='your-key'
export LANGCHAIN_API_KEY='your-key'
uv run langgraph_plugin/functional_api/langsmith_tracing/main.py
```

Traces will appear in your LangSmith dashboard.
| File | Description |
|---|---|
| `workflow.py` | `@traceable` chat task + helpers, `@entrypoint`, and `ChatFunctionalWorkflow` |
| `main.py` | Starts a Worker and executes the Workflow in a single process |