# 🧠 LLM Tool-Calling Assistant with MCP Integration

Connect your local LLM to real-world tools, knowledge bases, and APIs via MCP.

This project connects a local LLM (e.g. Qwen) to tools such as a calculator or a knowledge base via the Model Context Protocol (MCP). The assistant automatically detects and calls these tools to answer user queries.
## 📦 Features

- 🔧 Tool execution through an MCP server
- 🧠 Local LLM integration via HTTP or the OpenAI SDK
- 📚 Knowledge base support (`data.json`)
- ⚡ Supports `stdio` and `sse` transports
## 📁 Project Files

| File | Description |
|---|---|
| `server.py` | Registers tools and starts the MCP server |
| `client-http.py` | Uses `aiohttp` to communicate with the local LLM |
| `client-openai.py` | Uses an OpenAI-compatible SDK for LLM + tool-call logic |
| `client-stdio.py` | MCP client using the `stdio` transport |
| `client-sse.py` | MCP client using SSE |
| `data.json` | Q&A knowledge base |
## 📥 Installation

Requirements: Python 3.8+

Install dependencies:

```
pip install -r requirements.txt
```

`requirements.txt`:

```
aiohttp==3.11.18
nest_asyncio==1.6.0
python-dotenv==1.1.0
openai==1.77.0
mcp==1.6.0
```
## 🚀 Getting Started

### 1. Run the MCP server

```
python server.py
```

This launches the tool server and registers functions such as `add`, `multiply`, and `get_knowledge_base`.
### 2. Start a client

Option A: HTTP client (local LLM via raw API)

```
python client-http.py
```

Option B: OpenAI SDK client

```
python client-openai.py
```

Option C: stdio transport

```
python client-stdio.py
```

Option D: SSE transport

Make sure `server.py` sets:

```
transport = "sse"
```

Then run:

```
python client-sse.py
```
## 💬 Example Prompts

### Math Tool Call

```
What is 8 times 3?
```

Response:

```
Eight times three is 24.
```

### Knowledge Base Question

```
What are the healthcare benefits available to employees in Singapore?
```

The response will include the relevant answer from `data.json`.
## 📘 Example: data.json

```json
[
  {
    "question": "What is Singapore's public holiday schedule?",
    "answer": "Singapore observes several public holidays..."
  },
  {
    "question": "How do I apply for permanent residency in Singapore?",
    "answer": "Submit an online application via the ICA website..."
  }
]
```
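Given entries like the above, answering a question is essentially a best-match lookup over the `question` field. A stdlib-only sketch of one possible strategy; the word-overlap scoring here is an illustration, not the project's actual matching logic:

```python
import json


def load_kb(path="data.json"):
    """Load the Q&A entries from the JSON knowledge base."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)


def best_answer(query, kb):
    """Return the answer whose question shares the most words with the query."""
    q_words = set(query.lower().split())
    scored = [(len(q_words & set(e["question"].lower().split())), e) for e in kb]
    score, entry = max(scored, key=lambda pair: pair[0])
    return entry["answer"] if score > 0 else None


kb = [
    {"question": "What is Singapore's public holiday schedule?",
     "answer": "Singapore observes several public holidays..."},
    {"question": "How do I apply for permanent residency in Singapore?",
     "answer": "Submit an online application via the ICA website..."},
]
print(best_answer("How can I apply for permanent residency?", kb))
# → Submit an online application via the ICA website...
```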
## 🔧 Configuration

Inside `client-http.py` or `client-openai.py`, update the following:

```python
LOCAL_LLM_URL = "..."
TOKEN = "your-api-token"
LOCAL_LLM_MODEL = "your-model"
```

Make sure your LLM serves OpenAI-compatible API endpoints.
## 🧹 Cleanup

Clients handle tool calls and responses automatically. Stop the server or a client with Ctrl+C.

## 🪪 License

MIT License. See the LICENSE file.