OpenRouter Support
ToolUniverse now includes full support for OpenRouter, giving you access to a wide variety of LLMs from multiple providers through a single, unified API.
Overview
OpenRouter provides access to models from:
OpenAI (GPT-5)
Anthropic (Claude Sonnet 4.5)
Google (Gemini 2.5 Flash, Gemini 2.5 Pro)
And many more!
Configuration
Environment Variables
To use OpenRouter, set the following environment variables:
# Required
export OPENROUTER_API_KEY="your_openrouter_api_key_here"
# Optional - for usage tracking and attribution
export OPENROUTER_SITE_URL="https://your-site.com"
export OPENROUTER_SITE_NAME="Your Application Name"
Getting an API Key
Visit OpenRouter
Sign up for an account
Navigate to the API Keys section
Generate a new API key
Copy and set it as the OPENROUTER_API_KEY environment variable
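You can quickly confirm that these variables are visible to your process before registering any tools. The check below is a minimal sketch using only the standard library; the variable names come from the configuration section above.

import os

# Required for OpenRouter access (see above)
if not os.environ.get("OPENROUTER_API_KEY"):
    raise RuntimeError("OPENROUTER_API_KEY is not set")

# Optional attribution variables; safe to leave unset
print("Site URL:", os.environ.get("OPENROUTER_SITE_URL", "<unset>"))
print("Site name:", os.environ.get("OPENROUTER_SITE_NAME", "<unset>"))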
Using OpenRouter with AgenticTool
Basic Configuration
Configure an AgenticTool to use OpenRouter:
from tooluniverse import ToolUniverse
# Example tool configuration using OpenRouter
tool_config = {
    "name": "OpenRouter_Summarizer",
    "description": "Summarize text using OpenRouter models",
    "type": "AgenticTool",
    "prompt": "Summarize the following text:\n{text}",
    "input_arguments": ["text"],
    "parameter": {
        "type": "object",
        "properties": {
            "text": {
                "type": "string",
                "description": "The text to summarize",
                "required": True
            }
        },
        "required": ["text"]
    },
    "configs": {
        "api_type": "OPENROUTER",
        "model_id": "openai/gpt-5",
        "temperature": 0.7,
        "return_json": False
    }
}
# Initialize ToolUniverse and register the tool
tu = ToolUniverse()
tu.register_tool_from_config(tool_config)
# Use the tool
result = tu.execute_tool("OpenRouter_Summarizer", {
    "text": "Your long text here..."
})
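Because OpenRouter exposes many providers behind one API, pointing the same tool at a different provider only requires changing model_id in configs. The variant below is an illustrative sketch that reuses tool_config from above with the Claude model listed earlier; registration and execution follow the same calls shown in the example.

# Same tool definition, different OpenRouter model: only "model_id" changes
claude_config = {
    **tool_config,
    "name": "OpenRouter_Summarizer_Claude",
    "configs": {**tool_config["configs"], "model_id": "anthropic/claude-sonnet-4.5"},
}

tu.register_tool_from_config(claude_config)
result = tu.execute_tool("OpenRouter_Summarizer_Claude", {
    "text": "Your long text here..."
})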
Using OpenRouter with ToolFinderLLM
Configure ToolFinderLLM to use OpenRouter models:
from tooluniverse import ToolUniverse
# Create ToolUniverse instance
tu = ToolUniverse()
# Configure ToolFinderLLM with OpenRouter
tool_finder_config = {
    "type": "ToolFinderLLM",
    "name": "Tool_Finder_OpenRouter",
    "description": "Find tools using OpenRouter LLMs",
    "configs": {
        "api_type": "OPENROUTER",
        "model_id": "anthropic/claude-sonnet-4.5",
        "temperature": 0.1,
        "max_new_tokens": 4096,
        "return_json": True,
        "exclude_tools": ["Tool_RAG", "Tool_Finder", "Finish"]
    }
}
# Register and use
tu.register_tool_from_config(tool_finder_config)
result = tu.execute_tool("Tool_Finder_OpenRouter", {
    "description": "tools for protein analysis",
    "limit": 5
})
Fallback Configuration
OpenRouter is included in the default fallback chain. If the primary API fails, the system will automatically try OpenRouter:
# Default fallback chain (in order):
# 1. CHATGPT (Azure OpenAI)
# 2. OPENROUTER (with openai/gpt-5)
# 3. GEMINI (Google Gemini)
# You can customize the fallback chain with an environment variable:
import os
import json
custom_chain = [
    {"api_type": "OPENROUTER", "model_id": "anthropic/claude-sonnet-4.5"},
    {"api_type": "OPENROUTER", "model_id": "openai/gpt-5"},
    {"api_type": "GEMINI", "model_id": "gemini-2.5-flash"}
]
os.environ["AGENTIC_TOOL_FALLBACK_CHAIN"] = json.dumps(custom_chain)
Advanced Configuration
Custom Model Limits
Override default token limits for specific models:
import os
import json
custom_limits = {
    "openai/gpt-5": {
        "max_output": 32000,  # Custom limit
        "context_window": 1048576
    },
    "anthropic/claude-sonnet-4.5": {
        "max_output": 4096,
        "context_window": 200000
    }
}
os.environ["OPENROUTER_DEFAULT_MODEL_LIMITS"] = json.dumps(custom_limits)