API Integration
Integrate this MCP server into your applications.
Get your API Key
You'll need to log in and generate a Smithery API key to connect to this server.
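To avoid hardcoding the key in source code, you can read it from an environment variable. This is a minimal sketch; the variable name SMITHERY_API_KEY is an assumption, not something the SDK requires.
typescript
// Sketch: load the Smithery API key from an environment variable
// (SMITHERY_API_KEY is an illustrative name, not mandated by the SDK)
const smitheryApiKey = process.env.SMITHERY_API_KEY
if (!smitheryApiKey) {
  throw new Error("SMITHERY_API_KEY is not set")
}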
Installation
Install the MCP SDK and the Smithery SDK using npm:
bash
npm install @modelcontextprotocol/sdk @smithery/sdk
TypeScript SDK
typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js"
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js"
import { createSmitheryUrl } from "@smithery/sdk"

// Replace the placeholder values below with your actual configuration
const config = {
  "proxyUrl": "string",
  "gitlabHost": "string",
  "qdrantHost": "string",
  "qdrantPort": "string",
  "braveApiKey": "string",
  "enableTools": "string",
  "gitlabToken": "string",
  "openaiApiKey": "string",
  "qdrantApiKey": "string",
  "atlassianHost": "string",
  "openaiBaseUrl": "string",
  "useOpenrouter": "string",
  "atlassianEmail": "string",
  "atlassianToken": "string",
  "deepseekApiKey": "string",
  "googleAiApiKey": "string",
  "googleTokenFile": "string",
  "googleMapsApiKey": "string",
  "openrouterApiKey": "string",
  "useOllamaDeepseek": "string",
  "openaiEmbeddingModel": "string",
  "googleCredentialsFile": "string"
}

// Build the server URL from your configuration and Smithery API key
const serverUrl = createSmitheryUrl("https://server.smithery.ai/@athapong/aio-mcp", config, "your-smithery-api-key")
const transport = new StreamableHTTPClientTransport(serverUrl)

// Create the MCP client and connect over the Streamable HTTP transport
const client = new Client({
  name: "Test client",
  version: "1.0.0"
})
await client.connect(transport)

// Use the server tools with your LLM application
const { tools } = await client.listTools()
console.log(`Available tools: ${tools.map(t => t.name).join(", ")}`)
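Once connected, you can also invoke an individual tool with callTool. The tool name and arguments below are placeholders; use a name returned by listTools and the input schema that tool advertises.
typescript
// Call a single tool by name (placeholder name and arguments, adjust to your server's tools)
const result = await client.callTool({
  name: "example_tool",
  arguments: { query: "hello" }
})
console.log(result.content)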
Configuration Schema
Full JSON Schema for server configuration:
json
{
  "type": "object",
  "required": [
    "braveApiKey",
    "gitlabHost",
    "gitlabToken",
    "atlassianHost",
    "atlassianEmail",
    "atlassianToken"
  ],
  "properties": {
    "proxyUrl": {
      "type": "string",
      "description": "Proxy URL if required"
    },
    "gitlabHost": {
      "type": "string",
      "description": "GitLab host URL"
    },
    "qdrantHost": {
      "type": "string",
      "description": "Qdrant host URL"
    },
    "qdrantPort": {
      "type": "string",
      "description": "Port for Qdrant service"
    },
    "braveApiKey": {
      "type": "string",
      "description": "API key for Brave"
    },
    "enableTools": {
      "type": "string",
      "description": "Comma-separated list of tool groups to enable. Leave empty to enable all tools."
    },
    "gitlabToken": {
      "type": "string",
      "description": "Token for GitLab access"
    },
    "openaiApiKey": {
      "type": "string",
      "description": "API key for OpenAI"
    },
    "qdrantApiKey": {
      "type": "string",
      "description": "API key for Qdrant"
    },
    "atlassianHost": {
      "type": "string",
      "description": "Atlassian host URL"
    },
    "openaiBaseUrl": {
      "type": "string",
      "description": "Base URL for OpenAI API"
    },
    "useOpenrouter": {
      "type": "string",
      "description": "Flag to use OpenRouter"
    },
    "atlassianEmail": {
      "type": "string",
      "description": "Email for Atlassian"
    },
    "atlassianToken": {
      "type": "string",
      "description": "Token for Atlassian access"
    },
    "deepseekApiKey": {
      "type": "string",
      "description": "API key for DeepSeek"
    },
    "googleAiApiKey": {
      "type": "string",
      "description": "API key for Google AI"
    },
    "googleTokenFile": {
      "type": "string",
      "description": "Path to Google token file"
    },
    "googleMapsApiKey": {
      "type": "string",
      "description": "API key for Google Maps"
    },
    "openrouterApiKey": {
      "type": "string",
      "description": "API key for OpenRouter"
    },
    "useOllamaDeepseek": {
      "type": "string",
      "description": "Flag to use Ollama DeepSeek"
    },
    "openaiEmbeddingModel": {
      "type": "string",
      "description": "Model name for OpenAI embeddings"
    },
    "googleCredentialsFile": {
      "type": "string",
      "description": "Path to Google credentials file"
    }
  }
}
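Only the six required fields must be supplied; the rest can be omitted. The following sketch shows a minimal configuration with placeholder values, assuming optional keys may simply be left out of the object passed to createSmitheryUrl.
typescript
// Minimal config covering only the required fields (placeholder values)
const minimalConfig = {
  braveApiKey: "your-brave-api-key",
  gitlabHost: "https://gitlab.example.com",
  gitlabToken: "your-gitlab-token",
  atlassianHost: "https://your-domain.atlassian.net",
  atlassianEmail: "you@example.com",
  atlassianToken: "your-atlassian-api-token"
}
const minimalUrl = createSmitheryUrl("https://server.smithery.ai/@athapong/aio-mcp", minimalConfig, "your-smithery-api-key")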