LLM Tool-Calling Assistant

MCP Server listing on JUHE API Marketplace

Connects local LLMs to external tools (calculator, knowledge base) via MCP protocol, enabling automatic tool detection and execution to enhance query responses.

GitHub Stars: 0 · Last Updated: 8/23/2025 · Configuration: none required (see the documentation below).

README Documentation

🧠 LLM Tool-Calling Assistant with MCP Integration

Connect your local LLM to real-world tools, knowledge bases, and APIs via MCP.

This project connects a local LLM (e.g. Qwen) to tools such as a calculator or a knowledge base via the Model Context Protocol (MCP). The assistant automatically detects when a tool is needed and calls it to answer user queries.


📦 Features

  • 🔧 Tool execution through MCP server
  • 🧠 Local LLM integration via HTTP or OpenAI SDK
  • 📚 Knowledge base support (data.json)
  • ⚡ Supports stdio and sse transports

🗂 Project Files

| File | Description |
| --- | --- |
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses `aiohttp` to communicate with the local LLM |
| `client-openai.py` | Uses an OpenAI-compatible SDK for LLM + tool-call logic |
| `client-stdio.py` | MCP client using the stdio transport |
| `client-sse.py` | MCP client using the SSE transport |
| `data.json` | Q&A knowledge base |

📥 Installation

Requirements

Python 3.10+ (required by the `mcp` package)

Install dependencies:

pip install -r requirements.txt

requirements.txt

aiohttp==3.11.18
nest_asyncio==1.6.0
python-dotenv==1.1.0
openai==1.77.0
mcp==1.6.0

🚀 Getting Started

1. Run the MCP server

python server.py

This launches your tool server with functions like add, multiply, and get_knowledge_base.

2. Start a client

Option A: HTTP client (local LLM via raw API)

python client-http.py

Option B: OpenAI SDK client

python client-openai.py

Option C: stdio transport

python client-stdio.py

Option D: SSE transport

Make sure server.py sets:

transport = "sse"

Then run:

python client-sse.py

💬 Example Prompts

Math Tool Call

What is 8 times 3?

Response:

Eight times three is 24.
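Under the hood, the client inspects the model's response for tool calls and executes the matching local tool. A self-contained sketch of that dispatch step (the dict shape mirrors an OpenAI-style `tool_calls` entry; the `TOOLS` registry is an assumption for this sketch):

```python
# Illustrative dispatch step: map a detected tool call to a local tool
# and compute the result. The message shape mirrors an OpenAI-style
# tool_calls entry; the TOOLS registry here is an assumption.
import json

TOOLS = {
    "add": lambda a, b: a + b,
    "multiply": lambda a, b: a * b,
}

def dispatch(tool_call: dict) -> float:
    """Execute one tool call as produced by an OpenAI-compatible API."""
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    return TOOLS[name](**args)

call = {"function": {"name": "multiply", "arguments": '{"a": 8, "b": 3}'}}
print(dispatch(call))  # → 24
```

The result is then sent back to the model as a tool message so it can phrase the final answer.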

Knowledge Base Question

What are the healthcare benefits available to employees in Singapore?

Response will include the relevant answer from data.json.


📁 Example: data.json

[
  {
    "question": "What is Singapore's public holiday schedule?",
    "answer": "Singapore observes several public holidays..."
  },
  {
    "question": "How do I apply for permanent residency in Singapore?",
    "answer": "Submit an online application via the ICA website..."
  }
]
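A naive lookup over such a file might look like this (pure standard library). The matching strategy is an assumption, since the README does not specify how answers are retrieved:

```python
# Naive keyword-overlap lookup over data.json-style entries.
# The project's actual retrieval strategy is not documented;
# this is one plausible approach.
def best_answer(query: str, kb: list[dict]) -> str:
    """Return the answer whose question shares the most words with the query."""
    q_words = set(query.lower().split())
    best = max(kb, key=lambda e: len(q_words & set(e["question"].lower().split())))
    return best["answer"]

kb = [
    {"question": "What is Singapore's public holiday schedule?",
     "answer": "Singapore observes several public holidays..."},
    {"question": "How do I apply for permanent residency in Singapore?",
     "answer": "Submit an online application via the ICA website..."},
]

print(best_answer("How can I apply for permanent residency?", kb))
# → Submit an online application via the ICA website...
```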

🔧 Configuration

Inside client-http.py or client-openai.py, update the following:

LOCAL_LLM_URL = "..."
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"

Make sure your LLM is serving OpenAI-compatible API endpoints.


🧹 Cleanup

Clients handle tool calls and responses automatically. You can stop the server or client using Ctrl+C.


🪪 License

MIT License. See LICENSE file.
