feat: Introduce OpenRouter adapter #123
base: main
Changes from all commits
@@ -0,0 +1,13 @@

    {
      "permissions": {
        "allow": [
          "Bash(pnpm install:*)",
          "Bash(node -e:*)",
          "Bash(pnpm start:*)",
          "Bash(pnpm test:lib:*)",
          "Bash(pnpm typecheck:*)",
          "Bash(pnpm build:*)",
          "Bash(find:*)"
        ]
      }
    }
@@ -0,0 +1,132 @@

---
title: OpenRouter Adapter
id: openrouter-adapter
---

The OpenRouter adapter provides access to 300+ AI models from various providers through a single unified API, including models from OpenAI, Anthropic, Google, Meta, Mistral, and many more.

## Installation

```bash
npm install @tanstack/ai-openrouter
```

## Basic Usage

```typescript
import { chat } from "@tanstack/ai";
import { openRouterText } from "@tanstack/ai-openrouter";

const stream = chat({
  adapter: openRouterText("openai/gpt-5"),
  messages: [{ role: "user", content: "Hello!" }],
});
```

## Configuration

```typescript
import { createOpenRouter, type OpenRouterConfig } from "@tanstack/ai-openrouter";

const config: OpenRouterConfig = {
  apiKey: process.env.OPENROUTER_API_KEY!,
  baseURL: "https://openrouter.ai/api/v1", // Optional
  httpReferer: "https://your-app.com", // Optional, for rankings
  xTitle: "Your App Name", // Optional, for rankings
};

const adapter = createOpenRouter(config.apiKey, config);
```
Comment on lines +28 to +39 (Contributor)

🧩 Analysis chain, scripts executed:

    #!/bin/bash
    # Find the createOpenRouter function definition and exports
    find packages/typescript/ai-openrouter -name "*.ts" -o -name "*.tsx" | head -20

    # Search for createOpenRouter function definition
    rg -n "export.*createOpenRouter|function createOpenRouter" packages/typescript/ai-openrouter -A 5

    # Search for createOpenRouter usage in the codebase
    rg -n "createOpenRouter\(" --type=ts -B 2 -A 2 | head -100

Fix the `createOpenRouter` call in the Configuration example: the documented usage does not match the function signature. The current example passes the full config (with `apiKey`) as a second argument.
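If the reviewer's reading holds, the fix would presumably collapse the call into a single argument. The exact `createOpenRouter` signature is cut off in the comment above, so the sketch below is hypothetical: it assumes the function accepts one `OpenRouterConfig` object that already carries the `apiKey`.

```typescript
import { createOpenRouter, type OpenRouterConfig } from "@tanstack/ai-openrouter";

// Hypothetical correction -- the real signature is truncated in the review
// comment above. Assumes a single OpenRouterConfig argument with apiKey inside.
const config: OpenRouterConfig = {
  apiKey: process.env.OPENROUTER_API_KEY!,
  httpReferer: "https://your-app.com",
  xTitle: "Your App Name",
};

const adapter = createOpenRouter(config);
```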
## Available Models

OpenRouter provides access to 300+ models from various providers. Models use the format `provider/model-name`:

```typescript
model: "openai/gpt-5.1"
model: "anthropic/claude-sonnet-4.5"
model: "google/gemini-3-pro-preview"
model: "meta-llama/llama-4-maverick"
model: "deepseek/deepseek-v3.2"
```

See the full list at [openrouter.ai/models](https://openrouter.ai/models).
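As a quick illustration, any id from this list drops straight into `openRouterText`, exactly as in the Basic Usage example above; the sketch below just swaps in a Claude model.

```typescript
import { chat } from "@tanstack/ai";
import { openRouterText } from "@tanstack/ai-openrouter";

// Same call shape as Basic Usage, only a different provider/model id.
const stream = chat({
  adapter: openRouterText("anthropic/claude-sonnet-4.5"),
  messages: [{ role: "user", content: "Hello!" }],
});
```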
## Example: Chat Completion

```typescript
import { chat, toServerSentEventsResponse } from "@tanstack/ai";
import { openRouterText } from "@tanstack/ai-openrouter";

export async function POST(request: Request) {
  const { messages } = await request.json();

  const stream = chat({
    adapter: openRouterText("openai/gpt-5"),
    messages,
  });

  return toServerSentEventsResponse(stream);
}
```

## Example: With Tools

```typescript
import { chat, toolDefinition } from "@tanstack/ai";
import { openRouterText } from "@tanstack/ai-openrouter";
import { z } from "zod";

const getWeatherDef = toolDefinition({
  name: "get_weather",
  description: "Get the current weather",
  inputSchema: z.object({
    location: z.string(),
  }),
});

const getWeather = getWeatherDef.server(async ({ location }) => {
  return { temperature: 72, conditions: "sunny" };
});

const stream = chat({
  adapter: openRouterText("openai/gpt-5"),
  messages,
  tools: [getWeather],
});
```

## Environment Variables

Set your API key in environment variables:

```bash
OPENROUTER_API_KEY=sk-or-...
```
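As a small sketch building on the examples above, the key can be validated once at startup before it is placed into the documented `OpenRouterConfig` shape, rather than relying on a non-null assertion:

```typescript
import type { OpenRouterConfig } from "@tanstack/ai-openrouter";

// Sketch only: fail fast when OPENROUTER_API_KEY is missing instead of
// asserting it with "!".
const apiKey = process.env.OPENROUTER_API_KEY;
if (!apiKey) {
  throw new Error("OPENROUTER_API_KEY is not set");
}

const config: OpenRouterConfig = { apiKey };
```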
## Model Routing

OpenRouter can automatically route requests to the best available provider:

```typescript
const stream = chat({
  adapter: openRouterText("openrouter/auto"),
  messages,
  providerOptions: {
    models: [
      "openai/gpt-4o",
      "anthropic/claude-3.5-sonnet",
      "google/gemini-pro",
    ],
    route: "fallback", // Use fallback if primary fails
  },
});
```

## Next Steps

- [Getting Started](../getting-started/quick-start) - Learn the basics
- [Tools Guide](../guides/tools) - Learn about tools
@@ -38,12 +38,39 @@

A powerful, type-safe AI SDK for building AI-powered applications.

- Provider-agnostic adapters (OpenAI, Anthropic, Gemini, Ollama, etc.)
- **Multimodal content support** - Send images, audio, video, and documents
- Chat completion, streaming, and agent loop strategies
- Headless chat state management with adapters (SSE, HTTP stream, custom)
- Type-safe tools with server/client execution
- Isomorphic type-safe tools with server/client execution
- **Enhanced integration with TanStack Start** - Share implementations between AI tools and server functions

### <a href="https://tanstack.com/ai">Read the docs →</b></a>

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).

**With TanStack Start**, you get a bonus: share implementations between AI tools and server functions with `createServerFnTool`:

```typescript
import { createServerFnTool } from '@tanstack/ai-react'

// Define once, get AI tool AND server function (TanStack Start only)
const getProducts = createServerFnTool({
  name: 'getProducts',
  inputSchema: z.object({ query: z.string() }),
  execute: async ({ query }) => db.products.search(query),
})

// Use in AI chat
chat({ tools: [getProducts.server] })

// Call directly from components (no API endpoint needed!)
const products = await getProducts.serverFn({ query: 'laptop' })
```

No duplicate logic, full type safety, automatic validation. The `serverFn` feature requires TanStack Start. See [docs](https://tanstack.com/ai) for details.
Comment on lines +49 to +73 (Contributor)

Fix the markdown heading level, and the code sample is missing the `z` import:

```diff
-### <a href="https://tanstack.com/ai">Read the docs →</b></a>
+## <a href="https://tanstack.com/ai">Read the docs →</b></a>
@@
 import { createServerFnTool } from '@tanstack/ai-react'
+import { z } from 'zod'
```
## Get Involved

- We welcome issues and pull requests!

@@ -88,7 +115,7 @@ We're looking for TanStack AI Partners to join our mission! Partner with us to p

- <a href="https://github.com/tanstack/config"><b>TanStack Config</b></a> – Tooling for JS/TS packages
- <a href="https://github.com/tanstack/db"><b>TanStack DB</b></a> – Reactive sync client store
- <a href="https://github.com/tanstack/devtools">TanStack Devtools</a> – Unified devtools panel
- <a href="https://github.com/tanstack/devtools"><b>TanStack Devtools</b></a> – Unified devtools panel
- <a href="https://github.com/tanstack/form"><b>TanStack Form</b></a> – Type‑safe form state
- <a href="https://github.com/tanstack/pacer"><b>TanStack Pacer</b></a> – Debouncing, throttling, batching
- <a href="https://github.com/tanstack/query"><b>TanStack Query</b></a> – Async state & caching
@@ -0,0 +1,52 @@

    {
      "name": "@tanstack/ai-openrouter",
      "version": "0.0.1",
      "description": "OpenRouter adapter for TanStack AI",
      "author": "",
      "license": "MIT",
      "repository": {
        "type": "git",
        "url": "git+https://github.com/TanStack/ai.git",
        "directory": "packages/typescript/ai-openrouter"
      },
      "type": "module",
      "module": "./dist/esm/index.js",
      "types": "./dist/esm/index.d.ts",
      "exports": {
        ".": {
          "types": "./dist/esm/index.d.ts",
          "import": "./dist/esm/index.js"
        }
      },
      "files": [
        "dist",
        "src"
      ],
      "scripts": {
        "build": "vite build",
        "clean": "premove ./build ./dist",
        "lint:fix": "eslint ./src --fix",
        "test:build": "publint --strict",
        "test:eslint": "eslint ./src",
        "test:lib": "vitest run",
        "test:lib:dev": "pnpm test:lib --watch",
        "test:types": "tsc"
      },
      "keywords": [
        "ai",
        "openrouter",
        "tanstack",
        "adapter"
      ],
      "dependencies": {
        "@openrouter/sdk": "0.3.14",
        "@tanstack/ai": "workspace:*"
      },
      "devDependencies": {
        "@vitest/coverage-v8": "4.0.14",
        "vite": "^7.2.7"
      },
      "peerDependencies": {
        "@tanstack/ai": "workspace:*"
      }
    }