OpenRouter framework integration templates for Vercel AI SDK, LangChain, and OpenAI SDK.
This skill provides complete integration templates, setup scripts, and working examples for integrating OpenRouter with popular AI frameworks: Vercel AI SDK, LangChain, and OpenAI SDK.
Vercel AI SDK coverage includes `createOpenAI()` provider setup, `streamText()` and `generateText()`, and the `useChat()` hook.

Available templates:

- `templates/vercel-ai-sdk-config.ts` - OpenRouter provider setup
- `templates/vercel-api-route.ts` - API route with streaming
- `templates/vercel-chat-component.tsx` - Chat UI component
- `templates/vercel-tools-config.ts` - Tool calling setup
- `templates/langchain-config.py` - Python ChatOpenAI setup
- `templates/langchain-config.ts` - TypeScript ChatOpenAI setup
- `templates/langchain-chain.py` - LCEL chain template
- `templates/langchain-agent.py` - Agent with tools
- `templates/langchain-rag.py` - RAG implementation
- `templates/openai-sdk-config.ts` - TypeScript configuration
- `templates/openai-sdk-config.py` - Python configuration
- `templates/openai-streaming.ts` - Streaming example
- `templates/openai-functions.ts` - Function calling

Setup and validation scripts:

```bash
# Vercel AI SDK setup
bash scripts/setup-vercel-integration.sh

# LangChain setup (Python)
bash scripts/setup-langchain-integration.sh --python

# LangChain setup (TypeScript)
bash scripts/setup-langchain-integration.sh --typescript

# Validate integration is working
bash scripts/validate-integration.sh --framework vercel

# Test streaming functionality
bash scripts/test-streaming.sh --provider openrouter

# Check version compatibility
bash scripts/check-compatibility.sh
```
Read the setup script for your target framework:

```
Read: skills/provider-integration-templates/scripts/setup-vercel-integration.sh
```

Execute the setup script to install dependencies:

```bash
bash skills/provider-integration-templates/scripts/setup-vercel-integration.sh
```
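For reference, the install step presumably amounts to something like the following for the Vercel AI SDK; the exact package set is an assumption, so read the script itself for the authoritative list:

```shell
# Hypothetical equivalent of the setup script's dependency install
# for a Vercel AI SDK project (package names assumed, not confirmed).
npm install ai @ai-sdk/openai
```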
Read the template you need:

```
Read: skills/provider-integration-templates/templates/vercel-ai-sdk-config.ts
```

Copy the template into your project:

```bash
cp skills/provider-integration-templates/templates/vercel-ai-sdk-config.ts src/lib/ai.ts
```

Customize it with project-specific values:

- Replace `YOUR_OPENROUTER_API_KEY` with your actual key or an environment variable

Read the complete examples:

```
Read: skills/provider-integration-templates/examples/vercel-streaming-example.md
```
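The customization step typically boils down to pointing the AI SDK's OpenAI provider at OpenRouter's base URL. A minimal sketch (the actual template may differ; the header wiring here is an assumption based on OpenRouter's optional attribution headers):

```typescript
// src/lib/ai.ts — sketch of an OpenRouter provider for the Vercel AI SDK.
// Assumes @ai-sdk/openai is installed and env vars follow this doc's names.
import { createOpenAI } from '@ai-sdk/openai';

export const openrouter = createOpenAI({
  baseURL: 'https://openrouter.ai/api/v1',
  apiKey: process.env.OPENROUTER_API_KEY,
  headers: {
    // Optional attribution headers used for OpenRouter rankings.
    'HTTP-Referer': process.env.OPENROUTER_SITE_URL ?? '',
    'X-Title': process.env.OPENROUTER_SITE_NAME ?? '',
  },
});

// Usage: streamText({ model: openrouter('anthropic/claude-4.5-sonnet'), ... })
```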
Run the validation script:

```bash
bash skills/provider-integration-templates/scripts/validate-integration.sh --framework vercel
```

Test streaming:

```bash
bash scripts/test-streaming.sh --provider openrouter --model anthropic/claude-4.5-sonnet
```
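Under the hood, a streaming call to OpenRouter's OpenAI-compatible endpoint needs `stream: true` and a `provider/model-name` model ID. A hypothetical request-body builder (not part of the templates) that enforces both:

```typescript
// Hypothetical builder for an OpenAI-compatible chat completion body.
// Enforces the provider/model-name ID format and the stream flag.
export function buildChatRequest(model: string, prompt: string) {
  if (!/^[\w.-]+\/[\w.:-]+$/.test(model)) {
    throw new Error(`Model ID must use provider/model-name format, got: ${model}`);
  }
  return {
    model,
    messages: [{ role: 'user' as const, content: prompt }],
    stream: true, // required for streaming responses
  };
}
```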
Typical destination paths:

- Vercel AI SDK: `src/lib/ai.ts`, `app/api/chat/route.ts`, `components/chat.tsx`
- LangChain (Python): `src/config/langchain.py`, `src/chains/chat_chain.py`

All templates set the optional `HTTP-Referer` and `X-Title` headers and use these standard environment variables:
```bash
OPENROUTER_API_KEY=sk-or-v1-...
OPENROUTER_MODEL=anthropic/claude-4.5-sonnet
OPENROUTER_SITE_URL=https://yourapp.com   # Optional: for rankings
OPENROUTER_SITE_NAME=YourApp              # Optional: for rankings
```
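To fail fast on misconfiguration, a small startup check (hypothetical helper, not part of the templates) can validate these variables before any request is made:

```typescript
// Hypothetical startup validation for the env vars listed above.
interface OpenRouterEnv {
  OPENROUTER_API_KEY?: string;
  OPENROUTER_MODEL?: string;
}

export function validateOpenRouterEnv(env: OpenRouterEnv): string[] {
  const errors: string[] = [];
  if (!env.OPENROUTER_API_KEY) {
    errors.push('OPENROUTER_API_KEY is not set');
  } else if (!env.OPENROUTER_API_KEY.startsWith('sk-or-v1-')) {
    errors.push('OPENROUTER_API_KEY should start with sk-or-v1-');
  }
  if (env.OPENROUTER_MODEL && !env.OPENROUTER_MODEL.includes('/')) {
    errors.push('OPENROUTER_MODEL should use provider/model-name format');
  }
  return errors;
}
```

Call it once at startup (e.g. `validateOpenRouterEnv(process.env)`) and abort with the returned messages if the list is non-empty.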
Templates use configurable model selection. Common models:

- `anthropic/claude-4.5-sonnet` - Most capable: best reasoning and long context, highest cost
- `meta-llama/llama-3.1-70b-instruct` - Fast, cost-effective
- `openai/gpt-4-turbo` - Strong general purpose
- `google/gemini-pro-1.5` - Long context, multimodal

Update the model selection in templates based on your use case.
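One way to make that selection configurable is a simple use-case lookup; the mapping below is a hypothetical sketch based on the list above, not something the templates ship with:

```typescript
// Hypothetical use-case → model map derived from the list above.
const MODEL_FOR_USE_CASE: Record<string, string> = {
  reasoning: 'anthropic/claude-4.5-sonnet',
  'cost-sensitive': 'meta-llama/llama-3.1-70b-instruct',
  general: 'openai/gpt-4-turbo',
  'long-context': 'google/gemini-pro-1.5',
};

export function pickModel(useCase: string): string {
  // Fall back to this doc's default model for unknown use cases.
  return MODEL_FOR_USE_CASE[useCase] ?? 'anthropic/claude-4.5-sonnet';
}
```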
Issue: API key not working

- Check that the key uses the `sk-or-v1-...` format

Issue: Streaming not working

- Ensure `stream: true` is set in the request

Issue: Model not found

- Verify the model ID uses the `provider/model-name` format

For detailed implementation guides, load these files as needed:
- `examples/vercel-streaming-example.md` - Complete Vercel AI SDK setup
- `examples/langchain-rag-example.md` - RAG implementation guide
- `examples/openai-sdk-example.md` - OpenAI SDK migration guide

Template Version: 1.0.0
Framework Support: Vercel AI SDK 4.x, LangChain 0.3.x, OpenAI SDK 1.x
Last Updated: 2025-10-31