
How to access and pass artifact in AgentExecutor / Agentic workflows #32675

@caesarw0

Checked other resources

  • This is a bug, not a usage question. For questions, please use the LangChain Forum (https://forum.langchain.com/).
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Example Code

The following code:

from langchain_core.tools import tool
from langchain_core.messages import HumanMessage, ToolMessage
from langchain_google_genai import ChatGoogleGenerativeAI
import random
import os
from langchain.agents import AgentExecutor, create_structured_chat_agent
from langchain import hub

google_api_key = os.getenv("GEMINI_API_KEY")

@tool(response_format="content_and_artifact")
def generate_random_int():
    """Generate a random integer between 1 and 100"""
    v = random.randint(1, 100)
    print(f"Generated a random int from function directly: {v}")
    return "Generated a random int", v


def test_llm_with_tools():
    llm = ChatGoogleGenerativeAI(model="gemini-2.5-flash", api_key=google_api_key)
    llm_with_tools = llm.bind_tools([generate_random_int])

    messages = [HumanMessage(content="Generate a random number")]
    ai_msg = llm_with_tools.invoke(messages)
    print(f"AI message: {ai_msg}")
    messages.append(ai_msg)

    for tc in ai_msg.tool_calls:
        # Call the tool with the proper tool call format to get content_and_artifact
        tool_call = {
            "name": tc["name"],
            "args": tc["args"],
            "id": tc["id"],
            "type": "tool_call"
        }
        
        # This will return a ToolMessage with .content and .artifact attributes
        tool_result = generate_random_int.invoke(tool_call)
        
        # Extract content and artifact from the ToolMessage
        if hasattr(tool_result, 'content') and hasattr(tool_result, 'artifact'):
            content = tool_result.content
            artifact = tool_result.artifact
            print(f"✓ Tool returned content: {content}")
            print(f"✓ Tool returned artifact: {artifact}")
        else:
            # Fallback if the tool doesn't return ToolMessage
            print(f"⚠️ Tool returned: {tool_result} (type: {type(tool_result)})")
            if isinstance(tool_result, tuple) and len(tool_result) == 2:
                content, artifact = tool_result
            else:
                content = str(tool_result)
                artifact = tool_result
        
        messages.append(ToolMessage(
            content=content, 
            tool_call_id=tc["id"],
            additional_kwargs={"artifact": artifact}
        ))

    artifacts = [m.additional_kwargs["artifact"] for m in messages if isinstance(m, ToolMessage)]
    print("Artifacts:", artifacts)

def test_agent_executor():
    llm = ChatGoogleGenerativeAI(model="gemini-2.5-flash", api_key=google_api_key)
    prompt = hub.pull("hwchase17/structured-chat-agent")

    agent = create_structured_chat_agent(llm, [generate_random_int], prompt)
    agent_executor = AgentExecutor(
        agent=agent,
        tools=[generate_random_int],
        verbose=True,
        handle_parsing_errors=True,
        max_iterations=10,
        return_intermediate_steps=True,  # Expose (AgentAction, observation) tuples in the result
    )
    result = agent_executor.invoke({"input": "Generate a random number"})
    print(f"Agent executor result: {result}")


if __name__ == "__main__":
    test_llm_with_tools()
    test_agent_executor()
    

Error Message and Stack Trace (if applicable)

No response

Description

Hi LangChain team 👋

I’ve been working with AgentExecutor and tool binding in an agentic workflow, and I need clarification on the artifact concept:

What I tried

  • I created a tool with @tool(response_format="content_and_artifact") that returns a tuple (content, artifact).
  • When running the tool directly (via tool.invoke(tool_call)), I get a ToolMessage object that exposes .content and .artifact as expected (a minimal direct call is sketched right after this list).
  • However, when using AgentExecutor.invoke({"input": ...}) with return_intermediate_steps=True, the intermediate_steps observations still come back as plain strings, not ToolMessage objects with an accessible .artifact.
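
To keep the contrast concrete, this is the minimal direct call I mean (the tool call id is arbitrary and just for illustration):

tool_call = {"name": "generate_random_int", "args": {}, "id": "call_1", "type": "tool_call"}
msg = generate_random_int.invoke(tool_call)
print(type(msg))      # ToolMessage
print(msg.content)    # "Generated a random int"
print(msg.artifact)   # the raw integer, e.g. 42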

Questions

  1. Accessing artifacts after AgentExecutor
  • Is there a recommended way to get the actual .artifact from intermediate_steps instead of just the stringified content?
  • Should AgentExecutor (or LangGraph workflows) automatically surface the artifact field when response_format="content_and_artifact" is used?
  2. Passing artifacts between tools
  • If Tool A produces an artifact (e.g., a parsed dataframe or structured object), what's the best practice for passing it downstream to Tool B without forcing the LLM to see the full object? (A rough sketch of what I have in mind follows this list.)
  • Is the intended design that the artifact flows through the graph state while the LLM only sees the summarized content?
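
Roughly, the pattern I have in mind for question 2 is sketched below. The use of InjectedState and reading the previous ToolMessage's artifact out of the message history are assumptions on my part about how this should be wired up in a LangGraph workflow, not something I have confirmed works:

from typing import Annotated

from langchain_core.messages import ToolMessage
from langchain_core.tools import tool
from langgraph.prebuilt import InjectedState


@tool(response_format="content_and_artifact")
def tool_a() -> tuple[str, dict]:
    """Produce a large structured object."""
    big_object = {"rows": list(range(10_000))}  # stays out of the prompt
    return "Produced a structured object with 10,000 rows", big_object


@tool
def tool_b(state: Annotated[dict, InjectedState]) -> str:
    """Consume the artifact produced by tool_a."""
    # Walk the message history backwards and pick up the artifact that the
    # tool-executing node attached to tool_a's ToolMessage.
    for msg in reversed(state["messages"]):
        if isinstance(msg, ToolMessage) and msg.artifact is not None:
            return f"Got an artifact with {len(msg.artifact['rows'])} rows"
    return "No artifact found"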

Why this matters

Artifacts are crucial for workflows where the tool output is a large or structured object (e.g., JSON, a DataFrame, vector embeddings). Passing such objects around as message content inflates token usage and loses their structure.

Appreciate your help!

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 24.5.0: Tue Apr 22 19:54:25 PDT 2025; root:xnu-11417.121.6~2/RELEASE_ARM64_T6020
Python Version: 3.10.13 | packaged by conda-forge | (main, Dec 23 2023, 15:35:25) [Clang 16.0.6 ]

Package Information

langchain_core: 0.3.74
langchain: 0.3.27
langchain_community: 0.3.27
langsmith: 0.4.11
langchain_google_genai: 2.1.9
langchain_langgraph_test: Installed. No version info available.
langchain_ollama: 0.3.6
langchain_text_splitters: 0.3.9
langgraph_sdk: 0.2.0

Optional packages not installed

langserve

Other Dependencies

aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
dataclasses-json<0.7,>=0.5.7: Installed. No version info available.
filetype: 1.2.0
google-ai-generativelanguage: 0.6.18
httpx-sse<1.0.0,>=0.4.0: Installed. No version info available.
httpx<1,>=0.23.0: Installed. No version info available.
httpx>=0.25.2: Installed. No version info available.
jsonpatch<2.0,>=1.33: Installed. No version info available.
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-azure-ai;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.66: Installed. No version info available.
langchain-core<1.0.0,>=0.3.70: Installed. No version info available.
langchain-core<1.0.0,>=0.3.72: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-perplexity;: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.9: Installed. No version info available.
langchain-together;: Installed. No version info available.
langchain-xai;: Installed. No version info available.
langchain<1.0.0,>=0.3.26: Installed. No version info available.
langsmith-pyo3>=0.1.0rc2;: Installed. No version info available.
langsmith>=0.1.125: Installed. No version info available.
langsmith>=0.1.17: Installed. No version info available.
langsmith>=0.3.45: Installed. No version info available.
numpy>=1.26.2;: Installed. No version info available.
numpy>=2.1.0;: Installed. No version info available.
ollama<1.0.0,>=0.5.1: Installed. No version info available.
openai-agents>=0.0.3;: Installed. No version info available.
opentelemetry-api>=1.30.0;: Installed. No version info available.
opentelemetry-exporter-otlp-proto-http>=1.30.0;: Installed. No version info available.
opentelemetry-sdk>=1.30.0;: Installed. No version info available.
orjson>=3.10.1: Installed. No version info available.
orjson>=3.9.14;: Installed. No version info available.
packaging>=23.2: Installed. No version info available.
pydantic: 2.11.7
pydantic-settings<3.0.0,>=2.4.0: Installed. No version info available.
pydantic<3,>=1: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic>=2.7.4: Installed. No version info available.
pytest>=7.0.0;: Installed. No version info available.
PyYAML>=5.3: Installed. No version info available.
requests-toolbelt>=1.0.0: Installed. No version info available.
requests<3,>=2: Installed. No version info available.
requests>=2.0.0: Installed. No version info available.
rich>=13.9.4;: Installed. No version info available.
SQLAlchemy<3,>=1.4: Installed. No version info available.
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
typing-extensions>=4.7: Installed. No version info available.
vcrpy>=7.0.0;: Installed. No version info available.
zstandard>=0.23.0: Installed. No version info available.
