JUHE API Marketplace
by dhakalnirajan
MCP Server

Blender Open MCP

A server that integrates Blender with local AI models via the Model Context Protocol, allowing users to control Blender using natural language prompts for 3D modeling tasks.

GitHub Stars: 76
Last Updated: 3/14/2026
MCP Server Configuration
{
  "name": "blender-open-mcp",
  "command": "blender-mcp",
  "args": [
    "--transport",
    "stdio"
  ]
}

README Documentation

blender-open-mcp

Open Models MCP for Blender3D using Ollama

Control Blender 3D with natural language prompts via local AI models. Built on the Model Context Protocol (MCP), connecting Claude, Cursor, or any MCP client to Blender through a local Ollama LLM.


Architecture

MCP Client (Claude/Cursor/CLI)
         │ HTTP / stdio
         ▼
┌─────────────────────┐
│   FastMCP Server    │  ← server.py  (port 8000)
│   blender-open-mcp  │
└─────────────────────┘
    │ TCP socket           │ HTTP
    ▼                      ▼
┌──────────────┐    ┌─────────────┐
│  Blender     │    │   Ollama    │  (port 11434)
│  Add-on      │    │  llama3.2   │
│  addon.py    │    │  gemma3...  │
│  (port 9876) │    └─────────────┘
└──────────────┘
       │ bpy
       ▼
  Blender Python API

Three independent processes:

  • FastMCP Server (server.py): Exposes MCP tools over HTTP or stdio
  • Blender Add-on (addon.py): TCP socket server running inside Blender
  • Ollama: Local LLM serving natural language queries
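On the client side, the stdio transport carries newline-delimited JSON-RPC 2.0 messages, per the MCP specification. A minimal sketch of the `initialize` request an MCP client (Claude Desktop, Cursor) sends first — the protocol version string and client name below are illustrative, not values mandated by this project:

```python
import json

# JSON-RPC 2.0 "initialize" request, the first message an MCP client
# sends to the server over stdio (one JSON object per line).
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # illustrative version string
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

line = json.dumps(initialize) + "\n"  # newline-delimited framing
print(line, end="")
```

The server replies with its own capabilities, after which the client can list and call the `blender_*` tools.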

Installation

Prerequisites

Dependency   Version   Install
Blender      3.0+      blender.org
Python       3.10+     System or python.org
Ollama       Latest    ollama.com
uv           Latest    pip install uv

1. Clone and set up

git clone https://github.com/dhakalnirajan/blender-open-mcp.git
cd blender-open-mcp

# Create virtual environment and install
uv venv
source .venv/bin/activate   # Linux / macOS
# .venv\Scripts\activate    # Windows

uv pip install -e .

2. Install the Blender Add-on

  1. Open Blender
  2. Go to Edit → Preferences → Add-ons → Install...
  3. Select addon.py from the repository root
  4. Enable "Blender MCP"
  5. Open the 3D Viewport, press N, find the Blender MCP panel
  6. Click "Start MCP Server" (default port: 9876)

3. Pull an Ollama model

ollama pull llama3.2

(Other models like Gemma3 can also be used.)

Setup

  1. Start the Ollama Server: Ensure Ollama is running in the background.

  2. Start the MCP Server:

blender-mcp

Custom options:

blender-mcp \
  --host 127.0.0.1 \
  --port 8000 \
  --blender-host localhost \
  --blender-port 9876 \
  --ollama-url http://localhost:11434 \
  --ollama-model llama3.2

For stdio transport (Claude Desktop, Cursor):

blender-mcp --transport stdio

Usage

MCP Client CLI

# Interactive shell
blender-mcp-client interactive

# One-shot scene info
blender-mcp-client scene

# Call a specific tool
blender-mcp-client tool blender_get_scene_info
blender-mcp-client tool blender_create_object '{"primitive_type": "SPHERE", "name": "MySphere"}'

# Natural language prompt
blender-mcp-client prompt "Create a metallic sphere at position 0, 0, 2"

# List all available tools
blender-mcp-client tools

Python API

import asyncio
from client.client import BlenderMCPClient

async def demo():
    async with BlenderMCPClient("http://localhost:8000") as client:
        # Scene inspection
        print(await client.get_scene_info())

        # Create objects
        await client.create_object("CUBE", name="MyCube", location=(0, 0, 0))
        await client.create_object("SPHERE", name="MySphere", location=(3, 0, 0))

        # Apply materials
        await client.set_material("MyCube", "GoldMat", color=[1.0, 0.84, 0.0, 1.0])

        # Move objects
        await client.modify_object("MySphere", location=(3, 0, 2), scale=(1.5, 1.5, 1.5))

        # PolyHaven assets
        categories = await client.get_polyhaven_categories("textures")
        await client.download_polyhaven_asset("brick_wall_001", resolution="2k")
        await client.set_texture("MyCube", "brick_wall_001")

        # Render
        await client.render_image("/tmp/my_render.png")

        # AI assistance
        response = await client.ai_prompt(
            "Write bpy code to add a sun light pointing down"
        )
        print(response)

        # Execute the generated code
        await client.execute_code(response)

asyncio.run(demo())

Claude Desktop / Cursor Integration

Add to your mcp.json (or ~/.cursor/mcp.json):

{
  "mcpServers": {
    "blender-open-mcp": {
      "command": "blender-mcp",
      "args": ["--transport", "stdio"]
    }
  }
}
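If the config file already lists other servers, the entry can be merged in programmatically rather than pasted by hand. A sketch using a hypothetical `register_server` helper (not part of this project) that preserves any existing `mcpServers` entries:

```python
import json
from pathlib import Path

def register_server(config_path: Path) -> None:
    """Merge the blender-open-mcp entry into mcp.json, keeping other servers."""
    config = {}
    if config_path.exists():
        config = json.loads(config_path.read_text())
    config.setdefault("mcpServers", {})["blender-open-mcp"] = {
        "command": "blender-mcp",
        "args": ["--transport", "stdio"],
    }
    config_path.write_text(json.dumps(config, indent=2))

register_server(Path("mcp.json"))  # or Path.home() / ".cursor" / "mcp.json"
```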

Available Tools

Tool                              Description                                               Modifies Blender
blender_get_scene_info            Full scene summary: objects, camera, render settings      No
blender_get_object_info           Detailed object info: transforms, materials, mesh stats   No
blender_create_object             Add a primitive mesh (CUBE, SPHERE, CYLINDER, ...)        Yes
blender_modify_object             Change location, rotation, scale, visibility              Yes
blender_delete_object             Remove an object from the scene                           Yes ⚠️
blender_set_material              Create and assign a Principled BSDF material              Yes
blender_render_image              Render current scene to a file                            Yes
blender_execute_code              Run arbitrary Python/bpy code in Blender                  Yes ⚠️
blender_get_polyhaven_categories  List PolyHaven asset categories                           No
blender_search_polyhaven_assets   Search PolyHaven library with pagination                  No
blender_download_polyhaven_asset  Download & import a PolyHaven asset                       Yes
blender_set_texture               Apply a downloaded PolyHaven texture to an object         Yes
blender_ai_prompt                 Send a natural language prompt to Ollama                  No
blender_get_ollama_models         List available local Ollama models                        No
blender_set_ollama_model          Switch the active Ollama model                            No
blender_set_ollama_url            Update the Ollama server URL                              No
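Because blender_delete_object and blender_execute_code (marked ⚠️) mutate the scene irreversibly, it can be worth screening LLM-generated snippets before forwarding them. A deliberately naive, hypothetical denylist check — not a sandbox, and not part of blender-open-mcp:

```python
# Hypothetical pre-flight screen, NOT a security boundary: a denylist can
# always be evaded. Review LLM-generated code before executing it.
DISALLOWED = ("import os", "import subprocess", "import shutil", "eval(", "exec(")

def looks_safe(code: str) -> bool:
    """Return False if the snippet contains an obviously dangerous token."""
    lowered = code.lower()
    return not any(token in lowered for token in DISALLOWED)

print(looks_safe("import bpy\nbpy.ops.mesh.primitive_cube_add()"))  # True
print(looks_safe("import os; os.remove('scene.blend')"))            # False
```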

Default Ports

Service               Port
FastMCP Server        8000
Blender Add-on (TCP)  9876
Ollama                11434
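A quick way to see which of the three services is actually running is to probe these default ports. A small standard-library sketch:

```python
import socket

DEFAULT_PORTS = {
    "FastMCP Server": 8000,
    "Blender Add-on": 9876,
    "Ollama": 11434,
}

def is_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP listener accepts a connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for service, port in DEFAULT_PORTS.items():
    status = "up" if is_listening("127.0.0.1", port) else "not reachable"
    print(f"{service:16} (port {port:5}): {status}")
```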

Development

# Install dev dependencies
uv pip install -e ".[dev]"

# Run tests
pytest tests/ -v

# Type checking
mypy src/

# Linting
ruff check src/ client/

Troubleshooting

Problem                           Solution
Cannot connect to Blender add-on  Open Blender → N-sidebar → Blender MCP → Start MCP Server
Cannot connect to Ollama          Run ollama serve in a terminal
Object not found                  Check the exact object name via blender_get_scene_info
Render fails                      Ensure the output directory exists and is writable
PolyHaven download fails          Check internet connection; try a lower resolution

License

MIT License. See LICENSE for details.

This project is not affiliated with the Blender Foundation.
