
[BUG] OpenAIModel does not work with Gemini OpenAI-compatible endpoints #330

@maximelebastard

Description


Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched ./issues and there are no duplicates of my issue

Strands Version

0.1.2

Node.js Version

23.10.0

Operating System

macOS

Installation Method

npm

Steps to Reproduce

Here is reproduction code.
You need an OpenAI API key and a Google AI Studio API key, provided as environment variables.

import { Agent, tool } from "@strands-agents/sdk";
import { OpenAIModel } from "@strands-agents/sdk/openai";
import OpenAI from "openai";
import { z } from "zod";

/**
 * Standalone repro for Strands SDK tool handling with two backends:
 * 1. Native OpenAI (expected to succeed)
 * 2. Google Gemini 2.5 Flash via the OpenAI-compatible endpoint (expected to fail)
 *
 * Run from the examples package:
 *   OPENAI_API_KEY=... GOOGLE_API_KEY=... pnpm tsx src/reproduce-strands-openapi-tool-issue.ts
 */

const getWeather = tool({
  name: "get_weather",
  description: "Get weather information for a city.",
  inputSchema: z.object({
    location: z
      .string()
      .describe("City or region the user is interested in."),
  }),
  async callback({ location }) {
    return {
      location,
      temperature: 72,
      conditions: "sunny",
    };
  },
});

const systemPrompt = [
  "You are a weather routing agent.",
  "You MUST call the `get_weather` tool exactly once before answering.",
  "Do not fabricate weather data yourself.",
].join(" ");

async function main(): Promise<void> {
  const openaiClient = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY ?? "set-your-openai-api-key",
    baseURL: process.env.OPENAI_BASE_URL,
  });
  const geminiClient = new OpenAI({
    apiKey: process.env.GOOGLE_API_KEY ?? "set-your-google-api-key",
    baseURL:
      process.env.GEMINI_OPENAI_BASE ??
      "https://generativelanguage.googleapis.com/v1beta/openai/",
  });

  const scenarios = [
    {
      label: "OpenAI (gpt-4o-mini)",
      client: openaiClient,
      modelId: process.env.OPENAI_MODEL_ID ?? "gpt-4o-mini",
    },
    {
      label: "Gemini via OpenAI proxy (gemini-2.5-flash)",
      client: geminiClient,
      modelId: process.env.GEMINI_MODEL_ID ?? "gemini-2.5-flash",
    },
  ] as const;

  const results: Array<{
    label: string;
    success: boolean;
    error?: unknown;
  }> = [];

  for (const scenario of scenarios) {
    const summary = `[${scenario.label}]`;
    console.log(`${summary} invoking agent...`);

    const model = new OpenAIModel({
      client: scenario.client,
      modelId: scenario.modelId,
      temperature: 0,
    });

    const agent = new Agent({
      systemPrompt,
      model,
      tools: [getWeather],
    });

    try {
      await agent.invoke("Can you get the weather for San Francisco?");
      console.log(`${summary} ✅ succeeded`);
      results.push({ label: scenario.label, success: true });
    } catch (error) {
      console.error(`${summary} ❌ threw an error`, error);
      results.push({ label: scenario.label, success: false, error });
    }
  }

  // Report the outcome of both scenarios (results was collected but never printed).
  console.table(
    results.map(({ label, success }) => ({ label, success })),
  );
}

main().catch((error) => {
  console.error(error);
  process.exitCode = 1;
});

Expected Behavior

Tool use works with both the native OpenAI endpoint and Google's OpenAI-compatible endpoint.

Actual Behavior

It works with OpenAI.
With Gemini, it throws an error about a missing tool_use block in the messages collection:

Error: Model indicated toolUse but no tool use blocks found in message
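For context on where this error may come from (an assumption, not confirmed against the Strands source): OpenAI-compatible providers stream tool calls as `delta.tool_calls` fragments keyed by an `index` field, and a parser that accumulates strictly by `index` can end up with no tool-use blocks if a provider omits that field. A minimal sketch of such accumulation with a tolerant fallback; `ToolCallDelta` and `accumulateToolCalls` are hypothetical names for illustration, not Strands or OpenAI SDK APIs:

```typescript
interface ToolCallDelta {
  index?: number;
  id?: string;
  function?: { name?: string; arguments?: string };
}

interface ToolCall {
  id: string;
  name: string;
  arguments: string;
}

// Merge streamed tool_call fragments into complete calls.
// OpenAI emits fragments keyed by `index`; falling back to index 0 when
// `index` is missing keeps single-chunk deltas from being dropped.
function accumulateToolCalls(deltas: ToolCallDelta[]): ToolCall[] {
  const calls = new Map<number, ToolCall>();
  for (const d of deltas) {
    const i = d.index ?? 0;
    const call = calls.get(i) ?? { id: "", name: "", arguments: "" };
    if (d.id) call.id = d.id;
    if (d.function?.name) call.name += d.function.name;
    if (d.function?.arguments) call.arguments += d.function.arguments;
    calls.set(i, call);
  }
  return [...calls.values()];
}
```

If the SDK instead discards deltas without an `index`, a Gemini response that omits it would yield an empty tool-call list even though the model signaled tool use, which would match the error above.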

Additional Context

No response

Possible Solution

No response

Related Issues

No response
