Leverage Google's Gemini model capabilities in your LLM applications. Generate text, manage stateful chats, and handle files efficiently through a standardized tool-based interface, simplifying integration while keeping performance and ease of use front and center.
Tools
exampleTool
An example tool that takes a name and returns a greeting message. Demonstrates the basic structure of an MCP tool using Zod for parameter definition.
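As a rough illustration of that structure, here is a minimal sketch of registering such a greeting tool, assuming the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`) and Zod; the actual server's registration code may differ.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "example-server", version: "1.0.0" });

// Register a tool: name, Zod parameter shape, and an async handler
// that returns MCP content blocks.
server.tool(
  "exampleTool",
  { name: z.string().describe("The name to include in the greeting") },
  async ({ name }) => ({
    content: [{ type: "text", text: `Hello, ${name}!` }],
  })
);
```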
gemini_generateContent
Generates non-streaming text content using a specified Google Gemini model. This tool takes a text prompt and returns the complete generated response from the model. It's suitable for single-turn generation tasks where the full response is needed at once. Optional parameters allow control over generation (temperature, max tokens, etc.) and safety settings.
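A hedged sketch of invoking this tool from an MCP client follows. The client API (`Client`, `StdioClientTransport`, `callTool`) comes from the official TypeScript SDK; the argument names (`prompt`, `modelName`, `generationConfig`) and the server launch command are assumptions for illustration and should be checked against the tool's actual schema.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server over stdio (hypothetical entry point).
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/server.js"],
});
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Call the tool with a prompt plus optional generation settings
// (parameter names are illustrative).
const result = await client.callTool({
  name: "gemini_generateContent",
  arguments: {
    prompt: "Explain the Model Context Protocol in one paragraph.",
    modelName: "gemini-1.5-flash",
    generationConfig: { temperature: 0.2, maxOutputTokens: 256 },
  },
});

console.log(result.content);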
gemini_generateContentStream
Generates text content as a stream using a specified Google Gemini model. This tool takes a text prompt and streams back chunks of the generated response as they become available. It's suitable for interactive use cases or handling long responses. Optional parameters allow control over generation and safety settings.
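Under the hood, a tool like this would typically consume Gemini's streaming API and forward chunks as they arrive. The sketch below shows that pattern with the `@google/generative-ai` SDK; it is an assumption about the implementation, not the server's actual code, and how chunks reach the MCP client depends on the transport in use.

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY ?? "");
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });

// generateContentStream yields partial responses as the model produces them.
const result = await model.generateContentStream(
  "Write a short poem about the sea."
);
for await (const chunk of result.stream) {
  // Each chunk carries a fragment of the generated text.
  process.stdout.write(chunk.text());
}
```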
gemini_functionCall
Generates content using a specified Google Gemini model, enabling the model to request execution of predefined functions. This tool accepts function declarations and returns either the standard text response OR the details of a function call requested by the model. NOTE: This tool only returns the *request* for a function call; it does not execute the function itself.
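To make the "request only" behavior concrete, here is a minimal sketch of Gemini function calling with the `@google/generative-ai` SDK: the model may return a function call (name plus arguments), and executing that function remains the caller's responsibility. The `get_weather` declaration is hypothetical.

```typescript
import { GoogleGenerativeAI, SchemaType } from "@google/generative-ai";

const genAI = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY ?? "");
const model = genAI.getGenerativeModel({
  model: "gemini-1.5-flash",
  tools: [
    {
      functionDeclarations: [
        {
          name: "get_weather", // hypothetical function for illustration
          description: "Look up the current weather for a city",
          parameters: {
            type: SchemaType.OBJECT,
            properties: { city: { type: SchemaType.STRING } },
            required: ["city"],
          },
        },
      ],
    },
  ],
});

const result = await model.generateContent("What is the weather in Paris?");
const calls = result.response.functionCalls();

if (calls && calls.length > 0) {
  // The model only *requests* the call; running get_weather is up to the caller.
  console.log("Requested function:", calls[0].name, calls[0].args);
} else {
  // No function call requested: fall back to the standard text response.
  console.log(result.response.text());
}
```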