
Ultralytics MCP Server

A Model Context Protocol compliant server that provides RESTful API access to Ultralytics YOLO operations for computer vision tasks including training, validation, prediction, export, tracking, and benchmarking.


šŸŽÆ RCT Detector Platform - Ultralytics MCP Server

Advanced AI-powered object detection platform with intelligent dataset upload, custom model training, and MCP integration for N8N automation.

A comprehensive Model Context Protocol (MCP) server that seamlessly integrates Ultralytics YOLO models with N8N workflows, providing a complete AI-powered computer vision solution with 10GB dataset upload support and intelligent background processing.

✨ Key Features

šŸŽÆ Core Capabilities

  • šŸ”¬ Advanced AI Detection: YOLO-based object detection and analysis
  • šŸ“¦ Smart Dataset Upload: 10GB limit with intelligent ZIP structure detection
  • šŸŽÆ Custom Model Training: Train your own models with any YOLO dataset
  • šŸ¤– YOLO11 Model Variants: Choose from nano/small/medium/large/x-large base models
  • ⚔ GPU Acceleration: NVIDIA CUDA support for fast training/inference
  • 🌐 Web Interface: Beautiful Streamlit dashboard
  • šŸ“Š Real-time Monitoring: Live GPU stats and training progress
  • šŸ”Œ MCP Integration: Connect with N8N for workflow automation
  • šŸ›”ļø Background Processing: Stable upload handling for large files

Quick Start

One-Command Setup

For Windows users:

setup.bat

For Linux/Mac users:

chmod +x setup.sh
./setup.sh

Manual setup:

docker-compose up --build -d

Access the Platform

  • 🌐 Main Interface: http://localhost:8501
  • šŸ“Š TensorBoard: http://localhost:6006
  • šŸ”Œ MCP Server: http://localhost:8092
  • šŸ““ Jupyter: http://localhost:8888

Requirements

  • Docker & Docker Compose
  • NVIDIA Docker Runtime (for GPU support)
  • 8GB+ RAM recommended
  • 50GB+ free disk space
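
You can sanity-check these prerequisites from a terminal before running the setup scripts. A minimal sketch; the CUDA base image tag in the last command is only an example:

# Verify Docker and Docker Compose are available
docker --version
docker-compose --version

# Verify the NVIDIA driver is working on the host
nvidia-smi

# Verify the NVIDIA container runtime can expose the GPU to containers
# (the CUDA image tag is an example; any recent CUDA base image works)
docker run --rm --gpus all nvidia/cuda:12.1.0-base-ubuntu22.04 nvidia-smi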

šŸŽÆ Dataset Upload

Supported ZIP Structures

The platform automatically detects and organizes various ZIP structures:

āœ… Structure 1 (Flat):
dataset.zip
ā”œā”€ā”€ data.yaml
ā”œā”€ā”€ images/
│   ā”œā”€ā”€ img1.jpg
│   └── img2.jpg
└── labels/
    ā”œā”€ā”€ img1.txt
    └── img2.txt

āœ… Structure 2 (Nested):
dataset.zip
└── my_dataset/
    ā”œā”€ā”€ data.yaml
    ā”œā”€ā”€ images/
    │   ā”œā”€ā”€ train/
    │   └── val/
    └── labels/
        ā”œā”€ā”€ train/
        └── val/

āœ… Structure 3 (Split folders):
dataset.zip
ā”œā”€ā”€ data.yaml
ā”œā”€ā”€ train/
│   ā”œā”€ā”€ images/
│   └── labels/
└── val/
    ā”œā”€ā”€ images/
    └── labels/
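
Whichever structure you use, the ZIP should contain a data.yaml describing the class names and split locations. A minimal sketch for the split-folder layout (Structure 3); the class names and count below are placeholders:

# Example data.yaml for Structure 3 (class names are placeholders)
cat > data.yaml <<'EOF'
path: .              # dataset root, relative to this file
train: train/images  # training images
val: val/images      # validation images
nc: 2                # number of classes
names:
  0: class_a
  1: class_b
EOF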

Upload Process

  1. Navigate to Training page
  2. Click Upload Custom Dataset
  3. Select your ZIP file (up to 10GB)
  4. Enter dataset name
  5. Click Upload Dataset
  6. Do NOT refresh during processing
  7. Wait for completion message
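
If you are assembling the archive yourself before uploading, a flat (Structure 1) ZIP can be created from the command line; the directory name below is a placeholder:

# Package a flat dataset layout into a ZIP for upload
cd my_dataset                 # contains data.yaml, images/, labels/
zip -r ../dataset.zip data.yaml images labels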

šŸŽ® Available Services

Service              Port   Description                        Status
Streamlit Dashboard  8501   Interactive YOLO model interface   āœ… Ready
MCP Server           8092   N8N integration endpoint           āœ… Ready
TensorBoard          6006   Training metrics visualization     āœ… Ready
Jupyter Lab          8888   Development environment            āœ… Ready

šŸ› ļø MCP Tools Available

Our MCP server provides 7 specialized tools for AI workflows:

  1. detect_objects - Real-time object detection in images
  2. train_model - Custom YOLO model training
  3. evaluate_model - Model performance assessment
  4. predict_batch - Batch processing for multiple images
  5. export_model - Model format conversion (ONNX, TensorRT, etc.)
  6. benchmark_model - Performance benchmarking
  7. analyze_dataset - Dataset statistics and validation
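
For reference, MCP tool calls are JSON-RPC 2.0 messages using the tools/call method. The payload below is a hypothetical sketch for detect_objects: the argument names (image_path, confidence) are assumptions, and the authoritative schema comes from the tool definitions in src/server.js.

# Hypothetical tools/call payload for detect_objects (argument names are assumptions)
cat <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "detect_objects",
    "arguments": {
      "image_path": "/ultralytics/data/sample.jpg",
      "confidence": 0.25
    }
  }
}
EOF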

šŸ”Œ N8N Integration

Connect to N8N using our MCP server:

  1. Server Endpoint: http://localhost:8092
  2. Transport: Server-Sent Events (SSE)
  3. Health Check: http://localhost:8092/health

Example N8N Workflow

{
  "mcp_connection": {
    "transport": "sse",
    "endpoint": "http://localhost:8092/sse"
  }
}
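
Before wiring up N8N, you can confirm the SSE endpoint responds from the command line:

# Open the SSE stream used by N8N (Ctrl+C to stop); -N disables curl's output buffering
curl -N http://localhost:8092/sse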

šŸ“ Project Structure

ultralytics_mcp_server/
ā”œā”€ā”€ 🐳 docker-compose.yml          # Orchestration configuration
ā”œā”€ā”€ šŸ”§ Dockerfile.ultralytics      # CUDA-enabled Ultralytics container
ā”œā”€ā”€ šŸ”§ Dockerfile.mcp-connector    # Node.js MCP server container
ā”œā”€ā”€ šŸ“¦ src/
│   └── server.js                  # MCP server implementation
ā”œā”€ā”€ šŸŽØ main_dashboard.py           # Streamlit main interface
ā”œā”€ā”€ šŸ“„ pages/                      # Streamlit multi-page app
│   ā”œā”€ā”€ train.py                   # Model training interface
│   └── inference.py               # Inference interface
ā”œā”€ā”€ ⚔ startup.sh                   # Container initialization script
ā”œā”€ā”€ šŸ“‹ .dockerignore               # Build optimization
└── šŸ“– README.md                   # This documentation

šŸ”§ Configuration

Environment Variables

  • CUDA_VISIBLE_DEVICES - GPU device selection
  • STREAMLIT_PORT - Streamlit service port (default: 8501)
  • MCP_PORT - MCP server port (default: 8092)
  • TENSORBOARD_PORT - TensorBoard port (default: 6006)

Custom Configuration

Edit docker-compose.yml to customize:

  • Port mappings
  • Volume mounts
  • Environment variables
  • Resource limits
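
As a sketch, environment variables can also be overridden without touching the main file by adding a docker-compose.override.yml, which docker-compose merges automatically. The service name below is an assumption; match it to the service key in your docker-compose.yml:

# Override container environment variables without editing docker-compose.yml
cat > docker-compose.override.yml <<'EOF'
services:
  ultralytics-container:      # assumed service name; check docker-compose.yml
    environment:
      - CUDA_VISIBLE_DEVICES=0
      - MCP_PORT=8092
EOF

docker-compose up -d          # the override file is picked up automatically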

šŸ“Š Usage Examples

Object Detection via Streamlit

  1. Navigate to http://localhost:8501
  2. Upload an image or video
  3. Select YOLO model variant and confidence threshold
  4. Run inference and view annotated results
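
The same detection can be run from the Ultralytics CLI inside the running container, which is handy for scripting; the source path is a placeholder:

# Run detection via the Ultralytics CLI in the container (source path is a placeholder)
docker exec ultralytics-container yolo predict model=yolo11n.pt source=/ultralytics/data/sample.jpg conf=0.25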

Training Custom Models with YOLO11 Variants

  1. Go to the Training page in Streamlit
  2. Upload a custom dataset or select a built-in dataset
  3. Choose a YOLO11 model variant (see the list below)
  4. Configure epochs, batch size, and image size
  5. Click Start Training and monitor real-time progress with live GPU stats
  6. Models automatically save to the workspace for later use

Model Variant Selection:

  • yolo11n.pt - Nano: fastest training, lowest accuracy, good for testing (1.9M params)
  • yolo11s.pt - Small: balanced speed and accuracy (9.1M params)
  • yolo11m.pt - Medium: better accuracy (20.1M params)
  • yolo11l.pt - Large: high accuracy (25.3M params)
  • yolo11x.pt - X-Large: highest accuracy (43.9M params)
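
If you prefer the command line over the Streamlit form, an equivalent run can be started directly in the container with the Ultralytics CLI; the dataset path and hyperparameters below are placeholders:

# Train a YOLO11 model from the CLI (dataset path and values are placeholders)
docker exec ultralytics-container yolo detect train \
  model=yolo11n.pt \
  data=/ultralytics/datasets/my_dataset/data.yaml \
  epochs=100 imgsz=640 batch=16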

N8N Automation

  1. Create N8N workflow
  2. Add MCP connector node
  3. Configure endpoint: http://localhost:8092
  4. Use available tools for automation

šŸ” Monitoring & Debugging

Container Status

docker ps
docker-compose logs ultralytics-container
docker-compose logs mcp-connector-container

Health Checks

# MCP Server
curl http://localhost:8092/health

# Streamlit
curl http://localhost:8501/_stcore/health

# TensorBoard
curl http://localhost:6006

šŸ”„ Restart & Maintenance

Restart Services

docker-compose restart

Update & Rebuild

docker-compose down
docker-compose up --build -d

Clean Reset

docker-compose down
docker system prune -f
docker-compose up --build -d

šŸŽÆ Performance Optimization

  • GPU Memory: Automatically managed by CUDA runtime
  • Batch Processing: Optimized for multiple image inference
  • Model Caching: Pre-loaded models for faster response
  • Multi-threading: Concurrent request handling

🚨 Troubleshooting

Common Issues

Container Restart Loop

# Check logs
docker-compose logs ultralytics-container

# Restart with rebuild
docker-compose down
docker-compose up --build -d

Streamlit Not Loading

# Verify container status
docker ps

# Check if files are copied correctly
docker exec ultralytics-container ls -la /ultralytics/

GPU Not Detected

# Check NVIDIA drivers
nvidia-smi

# Verify CUDA in container
docker exec ultralytics-container nvidia-smi

šŸ”§ Development

Local Development Setup

  1. Clone repository
  2. Install dependencies: npm install (for MCP server)
  3. Set up Python environment for Streamlit
  4. Run services individually for debugging
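
A minimal sketch of step 4, assuming the default ports and the file layout shown in the Project Structure section:

# MCP server (Node.js); assumes server.js can be started directly with node
npm install
node src/server.js

# Streamlit dashboard (Python); assumes the ultralytics and streamlit packages are installed
pip install ultralytics streamlit
streamlit run main_dashboard.py --server.port 8501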

Adding New MCP Tools

  1. Edit src/server.js
  2. Add tool definition in tools array
  3. Implement handler in handleToolCall
  4. Test with N8N integration

šŸ¤ Contributing

  1. Fork the repository
  2. Create feature branch (git checkout -b feature/amazing-feature)
  3. Commit changes (git commit -m 'Add amazing feature')
  4. Push to branch (git push origin feature/amazing-feature)
  5. Open Pull Request

šŸ“„ License

This project is licensed under the AGPL-3.0 License - see the Ultralytics License for details.

šŸ™ Acknowledgments

  • Ultralytics - For the amazing YOLO implementation
  • N8N - For the workflow automation platform
  • Streamlit - For the beautiful web interface framework
  • NVIDIA - For CUDA support and GPU acceleration

šŸ“ž Support

  • šŸ› Issues: GitHub Issues
  • šŸ’¬ Discussions: GitHub Discussions
  • šŸ“§ Contact: Create an issue for support

Made with ā¤ļø for the AI Community

šŸš€ Ready to revolutionize your computer vision workflows? Start with docker-compose up -d!
