
fix: guard None text in ItemHelpers.extract_last_content#3394

Merged
seratch merged 1 commit into openai:main from ioleksiuk:fix/guard-none-text-extract-helpers on May 15, 2026

Conversation

@ioleksiuk
Contributor

Summary

Extends the fix in #3375 (`fix: guard None text in text_message_output ...`) to the two sibling helpers in `ItemHelpers` that have the same shape.

`ResponseOutputText.text` is typed as `str` per the Responses API schema, but `src/agents/items.py:714-720` already documents (in the comment above the original #3375 fix) that provider gateways like LiteLLM and `model_construct` paths during streaming surface `None` values. PR #3375 fixed `text_message_output` for this case. Two sibling helpers were missed:

  • `extract_last_content` (declared `-> str`, items.py:678) — silently returned `None` when `last_content.text` was `None`, violating its declared type contract.
  • `extract_last_text` (declared `-> str | None`, items.py:693) — returned `None` only by passthrough, so callers using truthy semantics couldn't distinguish "no text content" from "text is None".

Repro for `extract_last_content` (declared `-> str`, returns `None`):

```python
from agents.items import ItemHelpers
from openai.types.responses import ResponseOutputMessage, ResponseOutputText

# model_construct skips validation, matching the streaming code paths
# that surface text=None in the first place
text_part = ResponseOutputText.model_construct(
    text=None, type="output_text", annotations=[])
msg = ResponseOutputMessage.model_construct(
    id="m", role="assistant", status="completed",
    type="message", content=[text_part])
print(ItemHelpers.extract_last_content(msg))  # prints None, but the function is typed -> str
```

Fix mirrors the #3375 pattern: `return last_content.text or ""` in `extract_last_content`, `return last_content.text or None` in `extract_last_text` for explicit semantics.

Test plan

Issue number

N/A — found while auditing for the same pattern that #3375 fixed.

Checks

  • I've added new tests (if relevant)
  • I've added/updated the relevant documentation (no doc changes needed)
  • I've run `make lint` and `make format`
  • I've made sure tests pass

Comment thread: src/agents/items.py
@ioleksiuk ioleksiuk force-pushed the fix/guard-none-text-extract-helpers branch from 6b52f73 to 48f00e7 Compare May 14, 2026 01:10
@ioleksiuk
Contributor Author

Thanks for the review! Added inline comments on both `extract_last_content` (line 687) and `extract_last_text` (line 701) explaining the rationale — same shape as the existing comment in `extract_text` (items.py:714-720):

`last_content.text` is typed as `str` per the Responses API schema, but provider gateways (e.g. LiteLLM) and `model_construct` paths during streaming have been observed surfacing `None`.

The two comments differ slightly in the second sentence to match their respective return types (`-> str` coerces to `""`, `-> str | None` coerces to `None` so truthiness checks still work).

Force-pushed the amended commit.

Extends the fix in openai#3375 (`fix: guard None text in text_message_output ...`)
to one more sibling helper that had the same `-> str` type contract.

`ResponseOutputText.text` is typed as `str` per the Responses API
schema, but `src/agents/items.py:714-720` already documents that
provider gateways (e.g. LiteLLM) and `model_construct` paths during
streaming surface `None` values. PR openai#3375 fixed `text_message_output`
for this case. `extract_last_content` (declared `-> str`,
items.py:678) silently violated its type contract by returning `None`
when `last_content.text` was `None`.

Repro:

    >>> text_part = ResponseOutputText.model_construct(
    ...     text=None, type="output_text", annotations=[])
    >>> msg = ResponseOutputMessage.model_construct(
    ...     id="m", role="assistant", status="completed",
    ...     type="message", content=[text_part])
    >>> ItemHelpers.extract_last_content(msg)
    None        # but the function is typed `-> str`

Fix mirrors the openai#3375 pattern: `return last_content.text or ""`.

Note: `extract_last_text` (declared `-> str | None`) is intentionally
left alone — `None` is already a valid return per its signature, and
coalescing `""` → `None` would silently change semantics for tools that
legitimately return empty text.
@ioleksiuk ioleksiuk force-pushed the fix/guard-none-text-extract-helpers branch from 48f00e7 to 4fe452d Compare May 14, 2026 01:18
@ioleksiuk changed the title from "fix: guard None text in ItemHelpers.extract_last_content/extract_last_text" to "fix: guard None text in ItemHelpers.extract_last_content" on May 14, 2026
@ioleksiuk
Contributor Author

Follow-up: on second look at the diff, the original `extract_last_text` change (`return last_content.text or None`) had a subtle semantic regression I missed: it conflated `text=""` with `text=None`. Both became `None`, which would silently change behavior for tools that legitimately return empty text; callers using `result is None` to distinguish "no content" from "empty content" would now see `None` for both.

Reverted that change and dropped the related tests/inline comment. The PR is now scoped to a single, real fix:

  • `extract_last_content` (declared `-> str`) — coerce `None` to `""` so the function never violates its type contract.

`extract_last_text` was left untouched because:

  • It's declared `-> str | None`, so returning `None` is legal per the signature.
  • The pre-existing behavior of returning `""` for empty text is meaningful (callers can still distinguish via `is None`).

Updated commit message + PR title accordingly. Sorry for the noise.

@ioleksiuk ioleksiuk requested a review from seratch May 14, 2026 01:22
@seratch seratch added this to the 0.17.x milestone May 14, 2026
@seratch
Member

seratch commented May 14, 2026

@codex review

@chatgpt-codex-connector

Codex Review: Didn't find any major issues. 👍


@seratch seratch merged commit 5e71d09 into openai:main May 15, 2026
9 checks passed