-
That's ancient. We've always supported Ruby 3.1.4 and above. Do you use Ruby 3.0 or older?
-
I am using `gem 'ruby_llm', '1.2.0'` due to my current Ruby version. My Ruby LLM gem-based chat service works great when I use OpenAI API keys, and when I use AWS Bedrock directly (not through Ruby LLM), my Bedrock API calls work great too. Issues arise when I try to use AWS Bedrock through Ruby LLM.
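My Bedrock credentials are configured per the gem's docs, roughly like this (a minimal sketch, assuming the documented `bedrock_*` configuration keys; the env var names are placeholders):

```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  # Bedrock authenticates with AWS credentials rather than a provider API key.
  config.bedrock_api_key       = ENV['AWS_ACCESS_KEY_ID']
  config.bedrock_secret_key    = ENV['AWS_SECRET_ACCESS_KEY']
  config.bedrock_region        = ENV.fetch('AWS_REGION', 'us-east-1')
  config.bedrock_session_token = ENV['AWS_SESSION_TOKEN'] # only for temporary credentials
end
```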
When I select "Bedrock" as the provider and use `claude-3-5-sonnet` as the model, I get:

```
RubyLLM API error: 400 {"message":"Invocation of model ID anthropic.claude-3-5-sonnet-20241022-v2:0 with on-demand throughput isn’t supported. Retry your request with the ID or ARN of an inference profile that contains this model."}
```

And when I try to use `us.anthropic.claude-3-5-sonnet-20240620-v1:0` as the model name, I get `Unknown model: us.anthropic.claude-3-5-sonnet-20240620-v1:0 for provider: bedrock`. Same for `us.anthropic.claude-3-5-sonnet-20240620-v1` or `us.anthropic.claude-3-5-sonnet`.

In the AWS Bedrock console I do not see any model where I can add provisioned throughput. What should I do given the error message above?
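For concreteness, the two failing calls look roughly like this (a minimal sketch, assuming the documented `RubyLLM.chat(model:, provider:)` API):

```ruby
# The short alias resolves, but Bedrock rejects the request with the 400 above:
chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: :bedrock)

# The inference-profile ID fails before any request is sent, apparently
# because it is not in ruby_llm's model registry:
chat = RubyLLM.chat(model: 'us.anthropic.claude-3-5-sonnet-20240620-v1:0', provider: :bedrock)
# => Unknown model: us.anthropic.claude-3-5-sonnet-20240620-v1:0 for provider: bedrock
```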
I also tried using models like `claude-sonnet-4` and get a similar error, `Unknown model: claude-sonnet-4 for provider: bedrock`, when I try any of the following as the model name (passed as `ENV.fetch("AWS_BEDROCK_MODEL", ...)`):

- `claude-sonnet-4`
- `claude-4-sonnet`
- `anthropic.claude-sonnet-4-20250514-v1:0`
- `deepseek.r1-v1:0`

and many others. What am I doing wrong?
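My code is roughly the following (a minimal sketch pieced together from the details above; `user_message` is a placeholder, and the `RubyLLM.chat(model:, provider:)` call shape is from the gem's docs):

```ruby
# Minimal sketch of the failing path; user_message is a placeholder.
chat = RubyLLM.chat(
  model: ENV.fetch("AWS_BEDROCK_MODEL", "claude-3-5-sonnet"),
  provider: :bedrock # the same service works when this is :openai
)
response = chat.ask(user_message) # ends in the Bedrock errors described above
```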