feat: add OpenClaw as supported LLM provider #547
fredmisasi-oc wants to merge 1 commit into mattermost:master
Conversation
Codex usage limits have been reached for code reviews. Please check with the admins of this repo to increase the limits by adding credits.
📝 Walkthrough

This PR introduces support for a new OpenClaw LLM service type by adding a service constant, validation logic for API URL requirements, a provider registry entry with a default Claude model, corresponding tests, and UI components for configuration.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~12 minutes
🚥 Pre-merge checks: 2 passed, 1 failed (1 warning)
Actionable comments posted: 2
🧹 Nitpick comments (2)
llm/configuration.go (1)
148-150: Fix the OpenClaw auth comment.

This branch only validates APIURL, but the new comment says an API key is required. That contradiction will mislead the next person touching validation and makes it easy to reintroduce the wrong requirement.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@llm/configuration.go` around lines 148-150: The comment for the ServiceTypeOpenClaw branch is misleading: it states OpenClaw requires an API key, but the code only validates service.APIURL; update the comment to match the actual validation or change the validation to also require the API key. Specifically, in the ServiceTypeOpenClaw branch (reference: ServiceTypeOpenClaw and service.APIURL) either remove the mention of an API key from the comment or add a check for the API key field (and include service.ApiKey or the correct auth field name) so the comment and validation remain consistent.

llm/providers_test.go (1)
69-90: Prefer a table-driven provider-config test here.

This duplicates the same assertion pattern already used for Scale. Folding provider config checks into one table over {serviceType, expectedConfig} would make the next provider addition a data-only change.

As per coding guidelines "Write Go unit tests as table-driven tests whenever possible".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@llm/providers_test.go` around lines 69 - 90, Replace the single-provider TestOpenClawProviderConfig with a table-driven test that iterates over a slice of test cases (each containing serviceType and expected config values) and runs the same assertions for each case; use GetOpenAICompatibleProvider to fetch the provider in the loop and assert fields DefaultModel, FixedAPIURL, CreateTransport, and DisableStreamOptions against the expected values for that case so future providers (e.g., Scale) become data-only additions and the assertion logic is centralized.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@webapp/src/components/system_console/service.tsx`:
- Line 45: The hard-coded "OpenClaw" label is not internationalized; replace the
literal 'OpenClaw' in the display-name map entry (['openclaw', 'OpenClaw']) and
the dropdown label usage at the other location with react-intl usage (e.g.,
useIntl()/intl.formatMessage or <FormattedMessage>) so the UI text comes from a
message descriptor; add a corresponding message id and defaultMessage (e.g., id:
'service.openclaw.name', defaultMessage: 'OpenClaw') to the component's messages
bundle or shared i18n file and use that descriptor for both the map value and
the dropdown label.
- Line 72: The code added 'openclaw' to isOpenAIType but missed updating related
checks: update supportsModelFetching to include 'openclaw' so OpenClaw services
will auto-load models, and change the credential gate logic (the branch that
currently requires an API key for every non-'openaicompatible' provider) to
treat 'openclaw' like 'openaicompatible' (allow URL-only/no-API-key
configuration accepted by IsValidService); ensure all references to provider
capability checks (isOpenAIType, supportsModelFetching, and the credential-check
branch used before calling IsValidService) include 'openclaw' consistently.
---
Nitpick comments:
In `@llm/configuration.go`:
- Around line 148-150: The comment for the ServiceTypeOpenClaw branch is
misleading: it states OpenClaw requires an API key, but the code only validates
service.APIURL; update the comment to match the actual validation or change the
validation to also require the API key. Specifically, in the ServiceTypeOpenClaw
branch (reference: ServiceTypeOpenClaw and service.APIURL) either remove the
mention of an API key from the comment or add a check for the API key field (and
include service.ApiKey or the correct auth field name) so the comment and
validation remain consistent.
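Either resolution the prompt describes can be sketched in isolation. Below is a minimal stand-in for the URL-only option, assuming the names mentioned in the review (ServiceTypeOpenClaw, APIURL, plus a hypothetical APIKey field); it is not the plugin's actual IsValidService, only the shape of the branch with the comment brought in line with the check:

```go
package main

import (
	"errors"
	"fmt"
)

// ServiceConfig is a hypothetical mirror of the plugin's service config;
// field names follow the review comments, not the real struct.
type ServiceConfig struct {
	Type   string
	APIURL string
	APIKey string
}

const ServiceTypeOpenClaw = "openclaw"

// validateService sketches the branch under discussion: OpenClaw needs only
// an API URL, and the comment now matches the check. The API key stays
// optional because gateway auth is configured separately.
func validateService(s ServiceConfig) error {
	switch s.Type {
	case ServiceTypeOpenClaw:
		// OpenClaw: API URL required, API key optional.
		if s.APIURL == "" {
			return errors.New("openclaw service requires an API URL")
		}
		return nil
	default:
		// Other providers keep the usual API key requirement.
		if s.APIKey == "" {
			return errors.New("service requires an API key")
		}
		return nil
	}
}

func main() {
	fmt.Println(validateService(ServiceConfig{Type: ServiceTypeOpenClaw, APIURL: "http://localhost:18789/v1"}) == nil)
	fmt.Println(validateService(ServiceConfig{Type: ServiceTypeOpenClaw}) != nil)
}
```

The other option is simply adding an `s.APIKey == ""` check to the same branch and leaving the comment as written.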
In `@llm/providers_test.go`:
- Around line 69-90: Replace the single-provider TestOpenClawProviderConfig with
a table-driven test that iterates over a slice of test cases (each containing
serviceType and expected config values) and runs the same assertions for each
case; use GetOpenAICompatibleProvider to fetch the provider in the loop and
assert fields DefaultModel, FixedAPIURL, CreateTransport, and
DisableStreamOptions against the expected values for that case so future
providers (e.g., Scale) become data-only additions and the assertion logic is
centralized.
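The table-driven shape the prompt asks for might look like the sketch below. The registry and accessor are simplified stand-ins: field names come from the review comment (CreateTransport is left out because func fields are not comparable), and the only populated row uses the OpenClaw values from this PR's description:

```go
package main

import "fmt"

// ProviderConfig is a simplified mirror of the provider registry entry.
type ProviderConfig struct {
	DefaultModel         string
	FixedAPIURL          string
	DisableStreamOptions bool
}

// Assumed registry shape: one entry per OpenAI-compatible service type.
var openAICompatibleProviders = map[string]ProviderConfig{
	"openclaw": {DefaultModel: "anthropic/claude-sonnet-4-6"},
}

func getOpenAICompatibleProvider(serviceType string) (ProviderConfig, bool) {
	p, ok := openAICompatibleProviders[serviceType]
	return p, ok
}

func main() {
	// The table the reviewer suggests: a future provider (e.g. Scale)
	// becomes one more row here, with the assertion logic untouched.
	cases := []struct {
		serviceType string
		expected    ProviderConfig
	}{
		{"openclaw", ProviderConfig{DefaultModel: "anthropic/claude-sonnet-4-6"}},
	}
	for _, tc := range cases {
		got, ok := getOpenAICompatibleProvider(tc.serviceType)
		if !ok || got != tc.expected {
			fmt.Printf("%s: got %+v want %+v\n", tc.serviceType, got, tc.expected)
			continue
		}
		fmt.Printf("%s: ok\n", tc.serviceType)
	}
}
```

In a real `_test.go` file the loop body would call `t.Run(tc.serviceType, ...)` and `t.Errorf` instead of printing.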
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI (base), Organization UI (inherited)
Review profile: CHILL
Plan: Pro
Run ID: 54fe4ed7-e233-43ec-8b7e-cd95092900d2
📒 Files selected for processing (5)
- llm/configuration.go
- llm/providers.go
- llm/providers_test.go
- llm/service_types.go
- webapp/src/components/system_console/service.tsx
  ['cohere', 'Cohere'],
  ['mistral', 'Mistral'],
  ['asage', 'asksage (Experimental)'],
+ ['openclaw', 'OpenClaw'],
Internationalize the new OpenClaw labels.
OpenClaw is new user-facing text in both the display-name map and the dropdown, but it is hard-coded instead of going through react-intl.
As per coding guidelines "Always add i18n for new text in user-facing code".
Also applies to: 157-157
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@webapp/src/components/system_console/service.tsx` at line 45, The hard-coded
"OpenClaw" label is not internationalized; replace the literal 'OpenClaw' in the
display-name map entry (['openclaw', 'OpenClaw']) and the dropdown label usage
at the other location with react-intl usage (e.g., useIntl()/intl.formatMessage
or <FormattedMessage>) so the UI text comes from a message descriptor; add a
corresponding message id and defaultMessage (e.g., id: 'service.openclaw.name',
defaultMessage: 'OpenClaw') to the component's messages bundle or shared i18n
file and use that descriptor for both the map value and the dropdown label.
  const type = props.service.type;
  const intl = useIntl();
- const isOpenAIType = type === 'openai' || type === 'openaicompatible' || type === 'azure' || type === 'cohere' || type === 'mistral' || type === 'scale';
+ const isOpenAIType = type === 'openai' || type === 'openaicompatible' || type === 'azure' || type === 'cohere' || type === 'mistral' || type === 'scale' || type === 'openclaw';
Keep the OpenClaw capability checks in sync.
Adding openclaw to isOpenAIType only updates part of the form. supportsModelFetching on Line 81 still excludes it, and the credential gate on Line 85 still requires an API key for every non-openaicompatible provider. As a result, OpenClaw services can be selected here but won't auto-load models for the URL-only configuration that IsValidService accepts.
Suggested fix
- const supportsModelFetching = type === 'anthropic' || type === 'openai' || type === 'azure' || type === 'openaicompatible';
+ const supportsModelFetching =
+ type === 'anthropic' ||
+ type === 'openai' ||
+ type === 'azure' ||
+ type === 'openaicompatible' ||
+ type === 'openclaw';
useEffect(() => {
- // For openaicompatible, API key is optional if there's an API URL
- const hasRequiredCredentials = type === 'openaicompatible' ? (props.service.apiKey || props.service.apiURL) : props.service.apiKey;
+ // For OpenAI-compatible gateways, API key is optional if there's an API URL
+ const hasRequiredCredentials =
+ type === 'openaicompatible' || type === 'openclaw'
+ ? (props.service.apiKey || props.service.apiURL)
+ : props.service.apiKey;

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@webapp/src/components/system_console/service.tsx` at line 72, The code added
'openclaw' to isOpenAIType but missed updating related checks: update
supportsModelFetching to include 'openclaw' so OpenClaw services will auto-load
models, and change the credential gate logic (the branch that currently requires
an API key for every non-'openaicompatible' provider) to treat 'openclaw' like
'openaicompatible' (allow URL-only/no-API-key configuration accepted by
IsValidService); ensure all references to provider capability checks
(isOpenAIType, supportsModelFetching, and the credential-check branch used
before calling IsValidService) include 'openclaw' consistently.
Summary
Adds OpenClaw as a first-class supported LLM provider type in the Agents plugin.
OpenClaw is an open-source personal AI gateway that exposes an OpenAI-compatible HTTP API. This PR allows users to point the Agents plugin at their local (or remote) OpenClaw gateway as the AI backend.
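Because the gateway speaks the OpenAI chat-completions wire format, any OpenAI-style HTTP call works against it. A minimal sketch, assuming the example URL and model from this PR's configuration notes (the request shape is the standard OpenAI one, not code from this PR):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// newChatRequest builds an OpenAI-style chat-completions request against an
// OpenClaw gateway. baseURL and model are caller-supplied; the values used in
// main below are examples only.
func newChatRequest(baseURL, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(map[string]any{
		"model": model,
		"messages": []map[string]string{
			{"role": "user", "content": prompt},
		},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := newChatRequest("http://localhost:18789/v1", "openclaw:main", "Hello from Mattermost")
	if err != nil {
		panic(err)
	}
	// Sending is left to the caller, e.g. http.DefaultClient.Do(req).
	fmt.Println(req.Method, req.URL.String())
}
```

This is the kind of call the plugin's OpenAI-compatible provider path ends up making once OpenClaw is selected.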
Changes
- llm/service_types.go — adds the ServiceTypeOpenClaw = "openclaw" constant
- llm/providers.go — registers OpenClaw in the openAICompatibleProviders map with default model anthropic/claude-sonnet-4-6 and a user-configurable API URL (no fixed URL, same pattern as openaicompatible)
- llm/configuration.go — adds openclaw to the IsValidService switch; requires only APIURL (API key is optional since gateway auth is configured separately)
- webapp/src/components/system_console/service.tsx — adds OpenClaw to the provider display name map, dropdown, isOpenAIType check, and API URL field visibility
- llm/providers_test.go — adds the TestOpenClawProviderConfig unit test
Configuration

In the plugin admin console, select OpenClaw as the service type and set:

- API URL: your OpenClaw gateway endpoint (e.g. http://localhost:18789/v1)
- Model: openclaw:main (routes to the main agent) or any model supported by your gateway

The /v1/chat/completions endpoint must be enabled in OpenClaw config.