Use multiple AI providers together — fallback patterns, cost optimization, and OpenRouter unified API
One endpoint for 200+ models from 20+ providers:
```python
import os

from openai import OpenAI

client = OpenAI(
    base_url='https://openrouter.ai/api/v1',
    api_key=os.getenv('OPENROUTER_API_KEY'),
)

# Switch models by changing the model name
response = client.chat.completions.create(
    model='openai/gpt-4o',  # or anthropic/claude-3.5-sonnet, google/gemini-2.5-flash
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
```

| Task Type | Recommended Model | Why |
|---|---|---|
| Simple chat | Gemini Flash ($0.15/M) | Cheapest |
| Code | GPT-4o ($2.50/M) | Best at code |
| Document analysis | Claude Sonnet ($3/M) | 200K context |
| Math/reasoning | o3 ($10/M) | Designed for reasoning |
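A quick back-of-envelope using the table's per-million-token prices shows why routing pays off. The traffic volume and the 80/20 split below are hypothetical, chosen only to illustrate the arithmetic:

```python
PRICE_PER_M = {'gemini-flash': 0.15, 'gpt-4o': 2.50}  # $ per 1M tokens, from the table

tokens_m = 10        # hypothetical workload: 10M tokens per month
simple_share = 0.8   # hypothetical: 80% of traffic is simple chat

# Baseline: send everything to GPT-4o
all_gpt4o = tokens_m * PRICE_PER_M['gpt-4o']

# Routed: simple traffic goes to Gemini Flash, the rest stays on GPT-4o
routed = tokens_m * (simple_share * PRICE_PER_M['gemini-flash']
                     + (1 - simple_share) * PRICE_PER_M['gpt-4o'])
# all_gpt4o = $25.00 vs routed = $6.20, roughly a 75% saving
```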
Implement a smart router:
```python
class SmartRouter:
    """Route each task type to the cheapest model that handles it well."""

    def route(self, task_type):
        models = {
            'simple': 'google/gemini-2.5-flash',
            'code': 'openai/gpt-4o',
            'analysis': 'anthropic/claude-3.5-sonnet',
        }
        # Default to a capable general-purpose model for unknown task types
        return models.get(task_type, 'openai/gpt-4o')
```
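The intro also mentions fallback patterns: when one provider is down or rate-limited, retry the request against the next model in a chain. A minimal sketch follows; the function and parameter names are our own, and the completion call is injected so the pattern stays SDK-agnostic (in practice you would pass `client.chat.completions.create` and catch the openai SDK's `APIError`):

```python
def chat_with_fallback(messages, chain, create):
    """Try each model in order, returning the first successful response.

    chain:  model names to try, cheapest or preferred first
    create: the completion call, e.g. client.chat.completions.create
    """
    last_error = None
    for model in chain:
        try:
            return create(model=model, messages=messages)
        except Exception as exc:  # with the openai SDK, catch openai.APIError here
            last_error = exc      # provider outage or rate limit: try the next model
    raise last_error
```

Ordering the chain cheapest-first gives cost optimization for free: the expensive model is only billed when the cheap one fails.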