Ollama MCP Server

🚀 A powerful bridge between Ollama and the Model Context Protocol (MCP), enabling seamless integration of Ollama's local LLM capabilities into your MCP-powered applications.

🌟 Features

Complete Ollama Integration

  • Full API Coverage: Access all essential Ollama functionality through a clean MCP interface
  • OpenAI-Compatible Chat: Drop-in replacement for OpenAI's chat completion API
  • Local LLM Power: Run AI models locally with full control and privacy

Core Capabilities

  • 🔄 Model Management

    • Pull models from registries
    • Push models to registries
    • List available models
    • Create custom models from Modelfiles
    • Copy and remove models
  • 🤖 Model Execution

    • Run models with customizable prompts
    • Chat completion API with system/user/assistant roles
    • Configurable parameters (temperature, timeout)
    • Raw mode support for direct responses
  • 🛠 Server Control

    • Start and manage Ollama server
    • View detailed model information
    • Error handling and timeout management
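
Each capability above is exposed to MCP clients as a tool call. As a minimal sketch, listing the locally installed models might look like the following; the tool name list is an assumption mirroring the ollama list CLI command, since only pull, run, chat_completion, and create are shown by name in the examples below:

// List locally installed models (tool name assumed; check the server's tool listing)
const result = await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "list",
  arguments: {}
});
console.log(result);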

🚀 Getting Started

Prerequisites

  • Ollama installed on your system
  • Node.js and npm/pnpm

Installation

  1. Install dependencies:
pnpm install
  2. Build the server:
pnpm run build

Configuration

Add the server to your MCP configuration. The OLLAMA_HOST entry is optional and overrides the default Ollama API endpoint:

For Claude Desktop:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-server/build/index.js"],
      "env": {
        "OLLAMA_HOST": "http://127.0.0.1:11434"  // Optional: customize Ollama API endpoint
      }
    }
  }
}
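
The server simply forwards requests to a running Ollama instance, so it helps to confirm that Ollama is reachable at the configured OLLAMA_HOST before starting your MCP client. A minimal sketch using Ollama's /api/tags endpoint, which lists installed models (the file name check-ollama.mjs is just an example):

// check-ollama.mjs — verify the Ollama HTTP API is reachable (Node 18+ for global fetch)
const host = process.env.OLLAMA_HOST ?? "http://127.0.0.1:11434";
const res = await fetch(`${host}/api/tags`);
if (!res.ok) throw new Error(`Ollama not reachable at ${host} (HTTP ${res.status})`);
const { models } = await res.json();
console.log(`Ollama is up with ${models.length} model(s) installed`);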

🛠 Usage Examples

Pull and Run a Model

// Pull a model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "pull",
  arguments: {
    name: "llama2"
  }
});

// Run the model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Explain quantum computing in simple terms"
  }
});

Chat Completion (OpenAI-compatible)

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [
      {
        role: "system",
        content: "You are a helpful assistant."
      },
      {
        role: "user",
        content: "What is the meaning of life?"
      }
    ],
    temperature: 0.7
  }
});
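
Because chat_completion accepts the full message history, a multi-turn conversation works by replaying earlier turns, including the assistant's previous reply, before the new user message. A sketch under that assumption (the assistant content below is a placeholder for whatever the model actually returned):

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "What is the meaning of life?" },
      { role: "assistant", content: "<previous model reply>" },
      { role: "user", content: "Summarize that answer in one sentence." }
    ],
    temperature: 0.7
  }
});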

Create Custom Model

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "create",
  arguments: {
    name: "custom-model",
    modelfile: "./path/to/Modelfile"
  }
});
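
The modelfile argument points at a standard Ollama Modelfile. As a rough sketch of what such a file can contain (the base model and settings here are placeholders, not something shipped with this server):

# Modelfile — build a custom model on top of llama2
FROM llama2
PARAMETER temperature 0.3
SYSTEM """You are a concise assistant that answers in bullet points."""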

🔧 Advanced Configuration

  • OLLAMA_HOST: Configure custom Ollama API endpoint (default: http://127.0.0.1:11434)
  • Timeout settings for model execution (default: 60 seconds)
  • Temperature control for response randomness (0-2 range)
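
For example, a slow local model can be given more headroom by raising the timeout on a run call. This is a hedged sketch: the parameter name timeout and its unit (milliseconds) are assumptions based on the 60-second default listed above, so check the tool's schema before relying on them:

// Allow up to 5 minutes for generation (timeout name and millisecond unit assumed)
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Write a detailed project plan for a small web service",
    timeout: 300000
  }
});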

šŸ¤ Contributing

Contributions are welcome! Feel free to:

  • Report bugs
  • Suggest new features
  • Submit pull requests

šŸ“ License

MIT License - feel free to use in your own projects!


Built with ❤️ for the MCP ecosystem
