# MCP Server - Model Context Protocol Implementation

A comprehensive Python backend implementing the Model Context Protocol (MCP) with JSON-RPC 2.0, Azure OpenAI integration, and Server-Sent Events (SSE) streaming capabilities.
## Features
- Complete MCP Protocol Support: JSON-RPC 2.0 compliant implementation
- Azure OpenAI Integration: Seamless connection to Azure OpenAI services
- Streaming Responses: Real-time streaming via Server-Sent Events (SSE)
- Resource Management: File system resource discovery and access
- Tool Execution: Extensible tool registry with validation
- Authentication: JWT-based authentication system
- Monitoring: Prometheus metrics collection
- Web Interface: Built-in testing and management interface
## Architecture

```
├── app/
│   ├── core/
│   │   ├── config.py      # Configuration management
│   │   ├── errors.py      # Custom exception classes
│   │   └── logging.py     # Structured logging setup
│   ├── protocol/
│   │   ├── enums.py       # MCP protocol enumerations
│   │   └── models.py      # Pydantic models for MCP
│   ├── services/
│   │   ├── llm.py         # Azure OpenAI service
│   │   ├── resources.py   # Resource management
│   │   └── tools.py       # Tool registry and execution
│   ├── transport/
│   │   └── http.py        # HTTP transport layer
│   ├── auth.py            # JWT authentication
│   └── metrics.py         # Prometheus metrics
├── static/
│   └── app.js             # Frontend JavaScript
├── templates/
│   └── index.html         # Web interface
├── main.py                # Application entry point
└── server.py              # Flask app configuration
```
## Installation

1. Clone the repository:

```bash
git clone <repository-url>
cd mcp-server
```

2. Install dependencies:

```bash
pip install -r requirements.txt
```

3. Set up environment variables:

```bash
# Required for Azure OpenAI
export OPENAI_API_KEY="your-azure-openai-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="your-deployment-name"
export AZURE_OPENAI_API_VERSION="2024-08-01-preview"

# Optional configuration
export JWT_SECRET="your-jwt-secret"
export SESSION_SECRET="your-session-secret"
```
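Missing variables are easiest to catch before startup. Here is a minimal fail-fast sketch; the variable names match the list above, but the server's actual startup validation in `app/core/config.py` may differ:

```python
import os
import sys

# The four variables the Azure OpenAI integration requires (see above).
REQUIRED_VARS = [
    "OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_DEPLOYMENT",
    "AZURE_OPENAI_API_VERSION",
]

def check_environment() -> None:
    """Exit early with a clear message if required settings are absent."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        print(f"Missing required environment variables: {', '.join(missing)}")
        sys.exit(1)

if __name__ == "__main__":
    check_environment()
    print("Environment looks good.")
```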
## Configuration

The server supports both Azure OpenAI and standard OpenAI configurations:

### Azure OpenAI (Recommended)

```python
USE_AZURE_OPENAI = True
AZURE_OPENAI_ENDPOINT = "https://your-resource.openai.azure.com"
AZURE_OPENAI_DEPLOYMENT = "gpt-4o"
AZURE_OPENAI_API_VERSION = "2024-08-01-preview"
```

### Standard OpenAI

```python
USE_AZURE_OPENAI = False
OPENAI_MODEL = "gpt-4o"
```
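To illustrate what this toggle implies downstream, here is a sketch of how a service layer could construct the right client with the official `openai` package (>= 1.0). The actual wiring lives in `app/services/llm.py` and may differ:

```python
import os

from openai import AzureOpenAI, OpenAI

def build_client(use_azure: bool):
    """Return an Azure or standard OpenAI client based on configuration."""
    if use_azure:
        return AzureOpenAI(
            api_key=os.environ["OPENAI_API_KEY"],
            azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
            api_version=os.environ["AZURE_OPENAI_API_VERSION"],
        )
    return OpenAI(api_key=os.environ["OPENAI_API_KEY"])
```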
## Running the Server

### Development

```bash
python main.py
```

### Production

```bash
gunicorn --bind 0.0.0.0:5000 --reuse-port main:app
```

The server will be available at http://localhost:5000.
## API Endpoints

### MCP Protocol

- `POST /rpc` - JSON-RPC 2.0 endpoint for MCP requests
- `GET /events` - Server-Sent Events for streaming responses

### Management

- `GET /` - Web interface for testing and management
- `GET /health` - Health check endpoint
- `GET /metrics` - Prometheus metrics
## Authentication

The server uses JWT-based authentication. Include the token in requests:

```
# HTTP header
Authorization: Bearer <token>

# Query parameter (for SSE)
?token=<token>
```

Default development token: `devtoken`
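For anything beyond local development you will need to mint real tokens. A minimal sketch with PyJWT, assuming the server validates HS256 tokens signed with `JWT_SECRET`; the claim names here are illustrative, so check `app/auth.py` for the claims the server actually expects:

```python
import datetime
import os

import jwt  # PyJWT

def mint_token(subject: str, ttl_minutes: int = 60) -> str:
    """Create a short-lived HS256 token signed with JWT_SECRET."""
    payload = {
        "sub": subject,  # illustrative claim; the server may expect others
        "exp": datetime.datetime.now(datetime.timezone.utc)
        + datetime.timedelta(minutes=ttl_minutes),
    }
    return jwt.encode(payload, os.environ["JWT_SECRET"], algorithm="HS256")

print(mint_token("test-client"))
```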
## MCP Protocol Support

### Capabilities
- Resources: File system resource discovery and reading
- Tools: Extensible tool execution with validation
- Sampling: LLM completion requests (streaming and non-streaming)
- Logging: Structured JSON logging
### Example Requests

#### Initialize Connection

```json
{
  "jsonrpc": "2.0",
  "id": "init",
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "test-client", "version": "1.0.0"}
  }
}
```
#### List Resources

```json
{
  "jsonrpc": "2.0",
  "id": "resources",
  "method": "resources/list",
  "params": {}
}
```
#### Execute Tool

```json
{
  "jsonrpc": "2.0",
  "id": "tool",
  "method": "tools/call",
  "params": {
    "name": "calculate",
    "arguments": {"operation": "add", "a": 5, "b": 3}
  }
}
```
#### LLM Completion

```json
{
  "jsonrpc": "2.0",
  "id": "completion",
  "method": "sampling/createMessage",
  "params": {
    "messages": [{"role": "user", "content": {"type": "text", "text": "Hello, world!"}}],
    "maxTokens": 100
  }
}
```
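The same requests can be sent programmatically. A minimal client sketch using `requests` against a locally running server; the URL, port, and `devtoken` default come from the sections above:

```python
import requests

RPC_URL = "http://localhost:5000/rpc"
HEADERS = {
    "Authorization": "Bearer devtoken",
    "Content-Type": "application/json",
}

def rpc(method: str, params: dict, request_id: str = "1") -> dict:
    """Send a JSON-RPC 2.0 request and return the parsed response."""
    body = {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}
    response = requests.post(RPC_URL, json=body, headers=HEADERS, timeout=30)
    response.raise_for_status()
    return response.json()

# Initialize the session, then list the resources the server exposes.
print(rpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "test-client", "version": "1.0.0"},
}))
print(rpc("resources/list", {}))
```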
## Extending the Server

### Adding New Tools

```python
from app.services.tools import mcp_tool

@mcp_tool("my_tool", {
    "type": "object",
    "properties": {
        "param1": {"type": "string"},
        "param2": {"type": "number"}
    },
    "required": ["param1"]
})
async def my_custom_tool(param1: str, param2: float = 0.0):
    """Custom tool implementation"""
    return {"result": f"Processed {param1} with {param2}"}
```
### Custom Resource Handlers

```python
from app.services.resources import ResourceService

class CustomResourceService(ResourceService):
    async def list_resources(self, base_path: str = "."):
        # Custom resource discovery logic
        pass
```
## Monitoring
The server includes comprehensive monitoring:
- Prometheus Metrics: Request counts, response times, error rates
- Structured Logging: JSON-formatted logs with correlation IDs
- Health Checks: Application and dependency status
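As a quick way to exercise these endpoints, here is a probe sketch; the paths are the ones listed under API Endpoints, but the exact response bodies are not guaranteed here:

```python
import requests

BASE = "http://localhost:5000"

# Health endpoint: usable as a liveness/readiness probe.
health = requests.get(f"{BASE}/health", timeout=5)
print(health.status_code, health.text)

# Metrics endpoint: Prometheus exposition format, one metric per line.
metrics = requests.get(f"{BASE}/metrics", timeout=5)
for line in metrics.text.splitlines()[:10]:
    print(line)
```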
## Security
- Environment-based configuration (no hardcoded secrets)
- JWT authentication with configurable secrets
- Input validation on all endpoints
- Rate-limit headers passed through from Azure OpenAI
## Development

### Running Tests

```bash
# Test the API endpoints
curl -X POST http://localhost:5000/rpc \
  -H "Authorization: Bearer devtoken" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":"test","method":"initialize","params":{}}'

# Test streaming
curl -N "http://localhost:5000/events?token=devtoken&prompt=Hello&stream=true"
```
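The SSE stream can also be consumed from Python. A minimal sketch with `requests`; the `prompt` and `stream` query parameters mirror the curl example above, while the exact event payload format depends on the server:

```python
import requests

url = "http://localhost:5000/events"
params = {"token": "devtoken", "prompt": "Hello", "stream": "true"}

# Stream the response and print each SSE "data:" line as it arrives.
with requests.get(url, params=params, stream=True, timeout=60) as response:
    response.raise_for_status()
    for raw_line in response.iter_lines(decode_unicode=True):
        if raw_line and raw_line.startswith("data:"):
            print(raw_line[len("data:"):].strip())
```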
### Adding Dependencies

```bash
pip install <package-name>
pip freeze > requirements.txt
```
## Troubleshooting

### Common Issues

**Azure OpenAI Connection Errors**

- Verify `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_DEPLOYMENT`
- Check API key permissions
- Ensure the correct API version is set

**Authentication Failures**

- Verify the JWT token format
- Check token expiration
- Ensure the correct secret is configured

**Streaming Issues**

- Use query parameters for SSE authentication
- Check network connectivity for long-running streams
### Debug Logging

Enable debug logging by setting:

```bash
export DEBUG=true
```
## License
This project is licensed under the MIT License.
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Submit a pull request
## Support
For issues and questions:
- Check the troubleshooting section
- Review the API documentation
- Open an issue on GitHub