
Conversation

@seuros (Contributor) commented Jan 8, 2026

Now the LLM can see the instructions that the server emits at initialisation.

The data was there, just not used.

Fixes #7373

github-actions bot commented Jan 8, 2026

The following comment was made by an LLM, it may be inaccurate:

No duplicate PRs found

@seuros seuros force-pushed the instruction branch 2 times, most recently from 67e6dd6 to 1a95b75 Compare January 8, 2026 18:57
github-actions bot commented Jan 8, 2026

Thanks for your contribution!

This PR doesn't have a linked issue. All PRs must reference an existing issue.

Please:

  1. Open an issue describing the bug/feature (if one doesn't exist)
  2. Add Fixes #<number> or Closes #<number> to this PR description

See CONTRIBUTING.md for details.

@seuros seuros force-pushed the instruction branch 3 times, most recently from 3c77f7d to d2dc000 Compare January 12, 2026 18:21
- Add automatic instruction fetching via client.getInstructions()
- Cache server instructions during initialization
- Include instructions in LLM system prompts
- Remove fetchInstructions config (now automatic)
- Add graceful error handling

Resolve conflicts:
- mcp/index.ts: Keep both InstructionsChanged and BrowserOpenFailed events
- llm.ts: Combine MCP instructions fetching with Codex session handling
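The flow described in the commit message above can be sketched as follows. This is a minimal TypeScript sketch under stated assumptions: `McpClient`, `InitializeResult`, and `buildSystemPrompt` are hypothetical stand-ins for illustration, not the actual SDK or repository API.

```typescript
// Hypothetical sketch: cache MCP server instructions at initialization
// and fold them into the LLM system prompt. Names are illustrative only.

interface InitializeResult {
  instructions?: string; // servers may omit this field entirely
}

class McpClient {
  private cachedInstructions: string | undefined;

  // Cache the instructions the server emits in its initialize response.
  handleInitialize(result: InitializeResult): void {
    this.cachedInstructions = result.instructions;
  }

  // Automatic accessor; returns undefined when the server sent none.
  getInstructions(): string | undefined {
    return this.cachedInstructions;
  }
}

// Include cached instructions in the system prompt, degrading
// gracefully when none are available or fetching fails.
function buildSystemPrompt(base: string, client: McpClient): string {
  let instructions: string | undefined;
  try {
    instructions = client.getInstructions();
  } catch {
    instructions = undefined; // graceful error handling
  }
  return instructions
    ? `${base}\n\n## Server instructions\n${instructions}`
    : base;
}

// Usage
const client = new McpClient();
client.handleInitialize({ instructions: "Prefer read-only tools." });
console.log(buildSystemPrompt("You are a helpful agent.", client));
```

Note there is no separate `fetchInstructions` config step in this sketch: the instructions are captured as a side effect of initialization, which mirrors the "now automatic" point in the commit message.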

Development

Successfully merging this pull request may close these issues.

[FEATURE]: MCP Server instructions exposure
