
Local AI with Ollama, WebUI & MCP on Windows

A self-hosted AI stack combining Ollama for running language models, Open WebUI for user-friendly chat interaction, and MCP for connecting external tools to the chat interface. It offers full control, privacy, and flexibility without relying on the cloud.

This sample project provides an MCP-based tool server for managing employee leave balance, applications, and history. It is exposed via OpenAPI using mcpo for easy integration with Open WebUI or other OpenAPI-compatible clients.
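For orientation, here is a minimal sketch of what such a tool server can look like, built with FastMCP from the official MCP Python SDK. The tool names, signatures, and sample data below are illustrative assumptions, not the repository's actual main.py:

# Hypothetical sketch of a leave-manager MCP tool server (illustrative only).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("LeaveManager")

# In-memory sample data; a real server would use persistent storage.
employee_leaves = {"E001": {"balance": 18, "history": ["2025-01-01"]}}

@mcp.tool()
def get_leave_balance(employee_id: str) -> str:
    """Check how many leave days an employee has left."""
    data = employee_leaves.get(employee_id)
    if data is None:
        return "Employee ID not found."
    return f"{employee_id} has {data['balance']} leave day(s) remaining."

@mcp.tool()
def apply_leave(employee_id: str, leave_dates: list[str]) -> str:
    """Apply for leave on specific dates (YYYY-MM-DD)."""
    data = employee_leaves.get(employee_id)
    if data is None:
        return "Employee ID not found."
    if len(leave_dates) > data["balance"]:
        return "Insufficient leave balance."
    data["balance"] -= len(leave_dates)
    data["history"].extend(leave_dates)
    return f"Leave applied for {len(leave_dates)} day(s); {data['balance']} remaining."

if __name__ == "__main__":
    mcp.run()

mcpo then wraps this server and republishes each tool as an OpenAPI endpoint, which is what Open WebUI consumes.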


🚀 Features

  • ✅ Check employee leave balance
  • 📆 Apply for leave on specific dates
  • 📜 View leave history
  • 🙋 Personalized greeting functionality

πŸ“ Project Structure

leave-manager/
β”œβ”€β”€ main.py                  # MCP server logic for leave management
β”œβ”€β”€ requirements.txt         # Python dependencies for the MCP server
β”œβ”€β”€ Dockerfile               # Docker image configuration for the leave manager
β”œβ”€β”€ docker-compose.yml       # Docker Compose file to run leave manager and Open WebUI
└── README.md                # Project documentation (this file)

📋 Prerequisites

  1. Windows 10 or later (required for Ollama)
  2. Docker Desktop for Windows (required for Open WebUI and MCP)
    • Install from: https://www.docker.com/products/docker-desktop/

πŸ› οΈ Workflow

  1. Install Ollama on Windows
  2. Pull the deepseek-r1 model
  3. Clone the repository and navigate to the project directory
  4. Run the docker-compose.yml file to launch services

Install Ollama

➤ Windows

  1. Download the Installer:

    • Visit the Ollama download page (https://ollama.com/download) and click Download for Windows to get OllamaSetup.exe.
    • Alternatively, download from the Ollama GitHub Releases page.
  2. Run the Installer:

    • Execute OllamaSetup.exe and follow the installation prompts.
    • After installation, Ollama runs as a background service, accessible at: http://localhost:11434.
    • Verify in your browser; you should see:
      Ollama is running
      

    (Screenshots: installer window, setup progress, system-tray icon, and the browser check.)

  3. Start Ollama Server (if not already running):

    ollama serve
    
    • Access the server at: http://localhost:11434.

Verify Installation

Check the installed version of Ollama:

ollama --version

Expected Output:

ollama version 0.7.1

Pull the deepseek-r1 Model

1. Pull the Default Model (7B):

Using PowerShell:

ollama pull deepseek-r1


To Pull and Run a Specific Version:

ollama run deepseek-r1:1.5b
ollama run deepseek-r1:671b

2. List Installed Models:

ollama list

Expected Output:

NAME                    ID              SIZE
deepseek-r1:latest      xxxxxxxxxxxx    X.X GB


3. Alternative Check via API:

curl http://localhost:11434/api/tags

Expected Output: A JSON response listing installed models, including deepseek-r1:latest.


4. Test the API via PowerShell:

Invoke-RestMethod -Uri http://localhost:11434/api/generate -Method Post -Body '{"model": "deepseek-r1", "prompt": "Hello, world!", "stream": false}' -ContentType "application/json"

Expected Response: A JSON object containing the model's response to the "Hello, world!" prompt.
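The same request can be sent from Python, which is convenient for scripting (a sketch; assumes the requests package is installed):

# Python equivalent of the PowerShell test above. With "stream": False,
# /api/generate returns one JSON object whose "response" field holds the reply.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "deepseek-r1", "prompt": "Hello, world!", "stream": False},
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])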


5. Run and Chat with the Model via PowerShell:

ollama run deepseek-r1
  • This opens an interactive chat session with the deepseek-r1 model.
  • Type /bye and press Enter to exit the chat session.

(Screenshots: running the model, chatting with "Hi", and exiting the session.)


🐳 Run Open WebUI and MCP Server with Docker Compose

  1. Clone the Repository:

    git clone https://github.com/ahmad-act/Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows.git
    cd Local-AI-with-Ollama-Open-WebUI-MCP-on-Windows
    
  2. To launch both the MCP tool and Open WebUI locally (on Docker Desktop):

    docker-compose up --build
    


This will:

  • Start the Leave Manager (MCP Server) tool on port 8000
  • Launch Open WebUI at http://localhost:3000
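To confirm both services came up, a quick availability check can be run from the host (a sketch using only the Python standard library; the ports are the ones published by docker-compose.yml):

# Probe the two endpoints started by Docker Compose.
import urllib.request

checks = [
    ("Leave Manager (mcpo)", "http://localhost:8000/openapi.json"),
    ("Open WebUI", "http://localhost:3000"),
]
for name, url in checks:
    try:
        with urllib.request.urlopen(url, timeout=5) as r:
            print(f"{name}: OK (HTTP {r.status})")
    except Exception as exc:
        print(f"{name}: not reachable ({exc})")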

🌐 Add MCP Tools to Open WebUI

The MCP tools are exposed via the OpenAPI specification at http://localhost:8000/openapi.json; a short sketch after the steps below shows how to inspect the published routes.

  1. Open http://localhost:3000 in your browser.
  2. Click the Profile Icon and navigate to Settings.
  3. Select the Tools menu and click the Add (+) Button.
  4. Add a new tool by entering the URL: http://localhost:8000/.
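To see exactly which tool routes were published, the OpenAPI document itself can be inspected (a sketch; mcpo lists each MCP tool as a POST route under paths):

# Print every route advertised in openapi.json.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8000/openapi.json") as r:
    spec = json.load(r)

for path, ops in spec.get("paths", {}).items():
    for method, op in ops.items():
        if isinstance(op, dict):  # skip path-level keys such as "parameters"
            print(f"{method.upper()} {path}  {op.get('summary', '')}")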

💬 Example Prompts

Use these prompts in Open WebUI to interact with the Leave Manager tool:

  • Check Leave Balance:
    Check how many leave days are left for employee E001

  • Apply for Leave:
    Apply

  • View Leave History:
    What's the leave history of E001?

  • Personalized Greeting:
    Greet me as Alice
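The tools can also be invoked directly over HTTP, bypassing the chat UI, which is handy for debugging. A sketch, assuming the requests package is installed; the route /get_leave_balance and the employee_id parameter are assumptions that depend on how main.py names its tools:

# Call a tool endpoint directly; mcpo maps each tool to its own POST route.
import requests

resp = requests.post(
    "http://localhost:8000/get_leave_balance",  # hypothetical tool route
    json={"employee_id": "E001"},               # hypothetical parameter name
)
print(resp.status_code, resp.json())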

πŸ› οΈ Troubleshooting

  • Ollama not running: Ensure the service is active (ollama serve) and check http://localhost:11434.
  • Docker issues: Verify Docker Desktop is running and you have sufficient disk space.
  • Model not found: Confirm the deepseek-r1 model is listed with ollama list.
  • Port conflicts: Ensure ports 11434, 3000, and 8000 are free.

📚 Additional Resources

  • Ollama Documentation
  • Open WebUI Documentation
  • Docker Desktop Documentation
  • MCP Documentation
  • OpenAPI Tool Servers
  • mcpo - Works with OpenAPI tools, SDKs, and UIs
