RAG Documentation Server

API Integration

Integrate this MCP server into your applications.

Get your API Key

You'll need to log in and generate a Smithery API key to connect to this server.

Installation

Install the official MCP SDKs using npm:

bash
npm install @modelcontextprotocol/sdk @smithery/sdk

TypeScript SDK

typescript

import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js"
import { createSmitheryUrl } from "@smithery/sdk"

const config = {
  "MODEL": "nomic-embed-text",
  "QDRANT_URL": "http://localhost:6333",
  "OPENAI_API_KEY": "string",
  "QDRANT_API_KEY": "string",
  "OLLAMA_BASE_URL": "http://localhost:11434",
  "EMBEDDINGS_PROVIDER": "ollama"
}
const serverUrl = createSmitheryUrl("https://server.smithery.ai/@sanderkooger/mcp-server-ragdocs", config, "your-smithery-api-key")

const transport = new StreamableHTTPClientTransport(serverUrl)

// Create MCP client
import { Client } from "@modelcontextprotocol/sdk/client/index.js"

const client = new Client({
  name: "Test client",
  version: "1.0.0"
})
await client.connect(transport)

// Use the server tools with your LLM application.
// listTools() resolves to a result object whose `tools` property holds the array.
const { tools } = await client.listTools()
console.log(`Available tools: ${tools.map(t => t.name).join(", ")}`)
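
Once you have the tool list, you can invoke a tool with client.callTool. The helper below is a minimal sketch for looking up a tool by name before calling it; the tool name "search_documentation" in the usage comment is a hypothetical placeholder, so use the names actually returned by listTools() for this server.

```typescript
// Minimal shape of a tool entry as returned by listTools().
type ToolInfo = { name: string }

// Find a tool by name, failing loudly with the available names
// so typos are easy to spot.
function pickTool(tools: ToolInfo[], name: string): ToolInfo {
  const tool = tools.find(t => t.name === name)
  if (!tool) {
    throw new Error(
      `Tool "${name}" not found; available: ${tools.map(t => t.name).join(", ")}`
    )
  }
  return tool
}

// Usage (after the connect/listTools code above; tool name and
// arguments are assumptions for illustration):
// const tool = pickTool(tools, "search_documentation")
// const result = await client.callTool({
//   name: tool.name,
//   arguments: { query: "How do I configure Qdrant?" }
// })
```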

Configuration Schema

Full JSON Schema for server configuration:

json
{
  "type": "object",
  "properties": {
    "MODEL": {
      "type": "string",
      "default": "nomic-embed-text",
      "description": "The model name to be used by Ollama. E.g., 'nomic-embed-text'."
    },
    "QDRANT_URL": {
      "type": "string",
      "default": "http://localhost:6333",
      "description": "The URL for the Qdrant vector database."
    },
    "OPENAI_API_KEY": {
      "type": "string",
      "description": "API key for OpenAI, if using openai as the provider."
    },
    "QDRANT_API_KEY": {
      "type": "string",
      "description": "API key for Qdrant, if required."
    },
    "OLLAMA_BASE_URL": {
      "type": "string",
      "default": "http://localhost:11434",
      "description": "The base URL for the Ollama service."
    },
    "EMBEDDINGS_PROVIDER": {
      "type": "string",
      "default": "ollama",
      "description": "Embeddings provider. Options: 'ollama' or 'openai'."
    }
  }
}
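
Every property in the schema is optional, and four of them carry defaults. The sketch below (not part of the server or SDK) shows how a client might resolve a partial config against those defaults before passing it to createSmitheryUrl; the rule that OPENAI_API_KEY must be set when EMBEDDINGS_PROVIDER is "openai" is an assumption inferred from the property descriptions above.

```typescript
// Config shape mirroring the JSON Schema above.
type RagDocsConfig = {
  MODEL?: string
  QDRANT_URL?: string
  OPENAI_API_KEY?: string
  QDRANT_API_KEY?: string
  OLLAMA_BASE_URL?: string
  EMBEDDINGS_PROVIDER?: "ollama" | "openai"
}

// Apply the schema's defaults, then check the provider-specific key
// requirement (assumed from the OPENAI_API_KEY description).
function withDefaults(config: RagDocsConfig): RagDocsConfig {
  const resolved: RagDocsConfig = {
    MODEL: "nomic-embed-text",
    QDRANT_URL: "http://localhost:6333",
    OLLAMA_BASE_URL: "http://localhost:11434",
    EMBEDDINGS_PROVIDER: "ollama",
    ...config
  }
  if (resolved.EMBEDDINGS_PROVIDER === "openai" && !resolved.OPENAI_API_KEY) {
    throw new Error("OPENAI_API_KEY is required when EMBEDDINGS_PROVIDER is 'openai'")
  }
  return resolved
}
```

An empty object is a valid config: withDefaults({}) yields the all-defaults Ollama setup shown in the TypeScript example above.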