Has anyone successfully used OpenRouter? #1610
Unanswered · asked by moltovivo in Forums - Q&A · 1 comment
OpenRouter works — it's an OpenAI-compatible endpoint. The model format is `openrouter/<vendor>/<model>`. The key thing: you need to provide your OpenRouter API key, either via `api_token` in `LLMConfig` or via an environment variable.

```python
from crawl4ai import AsyncWebCrawler, CrawlerRunConfig
from crawl4ai.extraction_strategy import LLMExtractionStrategy
from crawl4ai.async_configs import LLMConfig

strategy = LLMExtractionStrategy(
    llm_config=LLMConfig(
        provider="openrouter/google/gemini-2.5-pro-exp-03-25",
        api_token="sk-or-v1-your-openrouter-key-here",
    ),
    instruction="Extract the main content",
)

config = CrawlerRunConfig(extraction_strategy=strategy)
async with AsyncWebCrawler() as crawler:
    result = await crawler.arun(url, config=config)
```

Or set the env var instead:

```shell
export OPENROUTER_API_KEY="sk-or-v1-your-key"
```

The error you're seeing ("points me to the LiteLLM providers page") was from the old litellm dependency's error handling — it showed that message whenever it didn't recognize a provider prefix. If you're on a recent version and still seeing this, make sure the provider string starts with `openrouter/`.
Crawling is working great when I directly put an LLM model provider and token in. But when I try OpenRouter models, it seems not to be recognising them.
Has anyone gotten an OpenRouter model working, and is there something different you need to do for this?
I commented on an issue (#1115) about OpenRouter hanging. It doesn't hang for me, but I do get an error that points me to the LiteLLM providers page, as if it's not a recognised model. I've tried multiple (older, established) models with OpenRouter.