Gemini MCP Server
A Python-based MCP server that enables integration of Gemini AI models with MCP-compatible applications like Cursor/Claude, allowing for interaction with Gemini APIs through the Model Context Protocol.
GitHub Stars: 1 · Last Updated: 8/22/2025
MCP Server Configuration
{
  "name": "gemini",
  "command": "docker",
  "args": [
    "run",
    "--rm",
    "-i",
    "--network",
    "host",
    "-e",
    "GEMINI_API_KEY",
    "-e",
    "GEMINI_MODEL",
    "-e",
    "GEMINI_BASE_URL",
    "-e",
    "HTTP_PROXY",
    "-e",
    "HTTPS_PROXY",
    "gemini-mcp-server:latest"
  ],
  "env": {
    "GEMINI_API_KEY": "your_api_key_here",
    "GEMINI_MODEL": "gemini-2.5-flash",
    "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai/",
    "HTTP_PROXY": "http://127.0.0.1:17890",
    "HTTPS_PROXY": "http://127.0.0.1:17890"
  }
}
README Documentation
Gemini MCP Server (in Python)
A Model Context Protocol (MCP) server for Gemini integration, implemented in Python and built on FastMCP.
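Under the hood, the server talks to Gemini through its OpenAI-compatible endpoint (the GEMINI_BASE_URL configured below). A minimal standard-library sketch of that exchange, assuming the documented defaults — the `build_chat_request` helper is illustrative, not part of the server's actual API:

```python
import json
import os
import urllib.request

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat request to Gemini."""
    # Fall back to the defaults documented in the config below.
    base_url = os.environ.get(
        "GEMINI_BASE_URL",
        "https://generativelanguage.googleapis.com/v1beta/openai/",
    )
    model = os.environ.get("GEMINI_MODEL", "gemini-2.5-flash")
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ.get('GEMINI_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Hello, Gemini!")
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen`) additionally honors the HTTP_PROXY/HTTPS_PROXY variables from the environment.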
Quick Start
- Build the Docker image:
docker build -t gemini-mcp-server .
Integration with Cursor/Claude
In MCP Settings -> Add MCP server, add this config:
{
  "mcpServers": {
    "gemini": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--network",
        "host",
        "-e",
        "GEMINI_API_KEY",
        "-e",
        "GEMINI_MODEL",
        "-e",
        "GEMINI_BASE_URL",
        "-e",
        "HTTP_PROXY",
        "-e",
        "HTTPS_PROXY",
        "gemini-mcp-server:latest"
      ],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "GEMINI_MODEL": "gemini-2.5-flash",
        "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai/",
        "HTTP_PROXY": "http://127.0.0.1:17890",
        "HTTPS_PROXY": "http://127.0.0.1:17890"
      }
    }
  }
}
Note: Don't forget to replace the GEMINI_API_KEY, GEMINI_MODEL, GEMINI_BASE_URL, HTTP_PROXY, and HTTPS_PROXY values with your actual Gemini credentials and instance URL.
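A startup check in this spirit can be sketched as follows — a hypothetical `load_config` helper that requires the API key and falls back to the documented defaults for the rest (the proxy variables are optional and omitted here); this is not part of the server's actual code:

```python
import os

REQUIRED = ["GEMINI_API_KEY"]
DEFAULTS = {
    "GEMINI_MODEL": "gemini-2.5-flash",
    "GEMINI_BASE_URL": "https://generativelanguage.googleapis.com/v1beta/openai/",
}

def load_config(env=None):
    """Collect the server's settings, failing fast if the API key is unset."""
    env = os.environ if env is None else env
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"Missing required env vars: {', '.join(missing)}")
    cfg = {k: env.get(k, default) for k, default in DEFAULTS.items()}
    cfg.update({k: env[k] for k in REQUIRED})
    return cfg
```

Failing fast like this surfaces a missing key at container start rather than on the first tool call.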