A powerful MCP (Model Context Protocol) server implementation that leverages Gemini's capabilities for context management and caching. The server gets full value from Gemini's 2M-token context window while providing tools for efficiently caching large amounts of context.
```bash
# Clone the repository
git clone https://github.com/ogoldberg/gemini-context-mcp-server
cd gemini-context-mcp-server

# Install dependencies
npm install

# Copy environment variables example
cp .env.example .env

# Add your Gemini API key to .env file
# GEMINI_API_KEY=your_api_key_here

# Build the server
npm run build

# Start the server
node dist/mcp-server.js
```

This MCP server can be integrated with a variety of MCP-compatible clients. For detailed integration instructions for each client, see the MCP Client Configuration Guide in the MCP documentation.
Use our simplified client installation commands:
```bash
# Install and configure for Claude Desktop
npm run install:claude

# Install and configure for Cursor
npm run install:cursor

# Install and configure for VS Code
npm run install:vscode
```

Each command sets up the appropriate configuration files and provides instructions for completing the integration.
Start the server:

```bash
node dist/mcp-server.js
```

Interact with it using the provided test scripts:

```bash
# Test basic context management
node test-gemini-context.js

# Test caching functionality
node test-gemini-api-cache.js
```

```javascript
import { GeminiContextServer } from './src/gemini-context-server.js';

async function main() {
  // Create server instance
  const server = new GeminiContextServer();

  // Generate a response in a session
  const sessionId = "user-123";
  const response = await server.processMessage(sessionId, "What is machine learning?");
  console.log("Response:", response);

  // Ask a follow-up in the same session (maintains context)
  const followUp = await server.processMessage(sessionId, "What are popular algorithms?");
  console.log("Follow-up:", followUp);
}

main();
```

```javascript
// Custom configuration
const config = {
  gemini: {
    apiKey: process.env.GEMINI_API_KEY,
    model: 'gemini-2.0-pro',
    temperature: 0.2,
    maxOutputTokens: 1024,
  },
  server: {
    sessionTimeoutMinutes: 30,
    maxTokensPerSession: 1000000
  }
};

const server = new GeminiContextServer(config);
```

```javascript
// Create a cache for large system instructions
const cacheName = await server.createCache(
  'Technical Support System',
  'You are a technical support assistant for a software company...',
  7200 // 2 hour TTL
);

// Generate content using the cache
const response = await server.generateWithCache(
  cacheName,
  'How do I reset my password?'
);

// Clean up when done
await server.deleteCache(cacheName);
```

The server implements the Model Context Protocol (MCP), making it compatible with tools such as Cursor and other AI-enhanced development environments.

Context management tools:
- generate_text - Generate text with context
- get_context - Get the current context for a session
- clear_context - Clear a session's context
- add_context - Add specific context entries
- search_context - Find relevant context semantically

Caching tools:
- mcp_gemini_context_create_cache - Create a cache for large contexts
- mcp_gemini_context_generate_with_cache - Generate using cached context
- mcp_gemini_context_list_caches - List all available caches
- mcp_gemini_context_update_cache_ttl - Update a cache's TTL
- mcp_gemini_context_delete_cache - Delete a cache

When used with Cursor, you can connect via the MCP configuration:
```json
{
  "name": "gemini-context",
  "version": "1.0.0",
  "description": "Gemini context management and caching MCP server",
  "entrypoint": "dist/mcp-server.js",
  "capabilities": {
    "tools": true
  },
  "manifestPath": "mcp-manifest.json",
  "documentation": "README-MCP.md"
}
```
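As a rough sketch of how a client invokes one of the tools above over MCP's JSON-RPC transport: the message below targets `generate_text`, but the argument names (`sessionId`, `message`) are illustrative assumptions, not confirmed parameter names; see README-MCP.md for the actual tool schemas.

```javascript
// Hypothetical JSON-RPC 2.0 "tools/call" request an MCP client might send.
// Argument names are illustrative; consult the tool's schema for the real ones.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "generate_text",
    arguments: {
      sessionId: "user-123",
      message: "What is machine learning?"
    }
  }
};

// MCP's stdio transport exchanges one JSON message per line.
console.log(JSON.stringify(request));
```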

For detailed usage instructions for the MCP tools, see README-MCP.md.
Create a .env file with the following options:
```bash
# Required
GEMINI_API_KEY=your_api_key_here
GEMINI_MODEL=gemini-2.0-flash

# Optional - Model Settings
GEMINI_TEMPERATURE=0.7
GEMINI_TOP_K=40
GEMINI_TOP_P=0.9
GEMINI_MAX_OUTPUT_TOKENS=2097152

# Optional - Server Settings
MAX_SESSIONS=50
SESSION_TIMEOUT_MINUTES=120
MAX_MESSAGE_LENGTH=1000000
MAX_TOKENS_PER_SESSION=2097152
DEBUG=false
```
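The optional settings can be read at startup with fallbacks to the defaults listed above. A minimal sketch of that pattern (the `envNumber` helper is hypothetical, not part of the server's actual code):

```javascript
// Illustrative helper: read a numeric environment variable with a default.
function envNumber(name, fallback) {
  const raw = process.env[name];
  if (raw === undefined) return fallback;
  const value = Number(raw);
  return Number.isNaN(value) ? fallback : value;
}

// Assemble settings, falling back to the documented defaults.
const settings = {
  temperature: envNumber("GEMINI_TEMPERATURE", 0.7),
  maxSessions: envNumber("MAX_SESSIONS", 50),
  sessionTimeoutMinutes: envNumber("SESSION_TIMEOUT_MINUTES", 120),
  debug: process.env.DEBUG === "true",
};

console.log(settings);
```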

```bash
# Build TypeScript files
npm run build

# Run in development mode with auto-reload
npm run dev

# Run tests
npm test
```

MIT