
o3-mini in Assistants API unsupported parameter #1318

Closed
1 task done
Nedomas opened this issue Feb 11, 2025 · 3 comments
Labels
bug Something isn't working

Comments

@Nedomas

Nedomas commented Feb 11, 2025

Confirm this is a Node library issue and not an underlying OpenAI API issue

  • This is an issue with the Node library

Describe the bug

On o3-mini in Assistants API I get 400 Unsupported parameter: 'temperature' is not supported with this model.

There’s no way to avoid this error, as I believe the openai Node package itself is sending the `temperature` param.

To Reproduce

Use the o3-mini model with the Assistants API.

Code snippets

await client.beta.threads.runs.create('thread-id', {
  stream: true,
  model: 'o3-mini',
  reasoning_effort: 'medium',
  instructions: 'Some instructions.',
  assistant_id: 'some-id',
})

OS

macOS

Node version

22.3.0

Library version

4.83.0

@Nedomas Nedomas added the bug Something isn't working label Feb 11, 2025
@RobertCraigie
Collaborator

Thanks for the report but I cannot reproduce this issue, the following works for me

const assistant = await openai.beta.assistants.create({
  model: 'gpt-4-1106-preview',
  name: 'Math Tutor',
  instructions: 'You are a personal math tutor. Write and run code to answer math questions.',
});

let assistantId = assistant.id;
console.log('Created Assistant with Id: ' + assistantId);

const thread = await openai.beta.threads.create({
  messages: [
    {
      role: 'user',
      content: '"I need to solve the equation `3x + 11 = 14`. Can you help me?"',
    },
  ],
});

const run = await openai.beta.threads.runs.create(thread.id, {
  model: 'o3-mini',
  reasoning_effort: 'medium',
  instructions: 'Some instructions.',
  assistant_id: assistant.id,
});
console.log(run);

@juzarantri

@RobertCraigie I am also getting the same issue. As the error suggests, I am not assigning any temperature, but it still gives me an error:

error: {
  message: "Unsupported parameter: 'temperature' is not supported with this model.",
  type: 'invalid_request_error',
  param: 'model',
  code: 'unsupported_model'
}

Code:

const assistant = await openai.beta.assistants.update(chatbotId, {
  instructions: updateFields.instruction,
  model: "o3-mini",
  ...(updateFields.model !== "o1" &&
    updateFields.model !== "o3-mini" && {
      temperature: updateFields.temperature,
    }),
});

@RobertCraigie
Collaborator

I appreciate this is not very intuitive, but when updating an assistant to a model that doesn't support temperature, you'll need to explicitly set it to null.

I've reported this to the API team to see if the behaviour here can be improved.
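
A minimal sketch of that workaround, assuming the same `assistants.update` call as in the snippet above (the `chatbotId` value and the fallback temperature of 0.7 are placeholders):

```javascript
// Assumed fix: when switching an assistant to a reasoning model such as
// o1 or o3-mini, pass an explicit `temperature: null` to clear any
// temperature previously stored on the assistant.
const model = 'o3-mini';

const params = {
  instructions: 'Some instructions.',
  model,
  // Reasoning models reject a numeric temperature, so send null for them.
  temperature: model === 'o1' || model === 'o3-mini' ? null : 0.7,
};

// const assistant = await openai.beta.assistants.update(chatbotId, params);
```

The key difference from spreading the property conditionally is that omitting `temperature` leaves the old stored value in place, while sending `null` actually removes it.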

@RobertCraigie closed this as not planned on Feb 18, 2025