
Model Context Protocol Server

A server exposing intelligent tools for enhancing RAG applications with entity extraction, query refinement, and relevance checking capabilities.

31 GitHub Stars · Last Updated 11/16/2025

README Documentation

🚀 Agentic RAG with MCP Server


✨ Overview

Agentic RAG with MCP Server is a powerful project that brings together an MCP (Model Context Protocol) server and client for building Agentic RAG (Retrieval-Augmented Generation) applications.

This setup empowers your RAG system with advanced tools such as:

  • 🕵️‍♂️ Entity Extraction
  • 🔍 Query Refinement
  • ✅ Relevance Checking

The server hosts these intelligent tools, while the client shows how to seamlessly connect and utilize them.


🖥️ Server — server.py

Powered by the FastMCP class from the mcp library, the server exposes these handy tools:

| Tool Name | Description | Icon |
| --- | --- | --- |
| get_time_with_prefix | Returns the current date & time | ⏰ |
| extract_entities_tool | Uses OpenAI to extract entities from a query, enhancing document retrieval relevance | 🧠 |
| refine_query_tool | Improves the quality of user queries with OpenAI-powered refinement | ✨ |
| check_relevance | Filters out irrelevant content by checking chunk relevance with an LLM | ✅ |
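
For orientation, here is a minimal sketch of how a tool like get_time_with_prefix can be registered with FastMCP; the server name and tool body are illustrative assumptions, not the repository's exact server.py.

# Minimal FastMCP server sketch (server name and tool body are illustrative)
from datetime import datetime
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("agentic-rag")  # server name is an assumption

@mcp.tool()
def get_time_with_prefix() -> str:
    """Return the current date and time with a readable prefix."""
    return f"Current date and time: {datetime.now().isoformat()}"

if __name__ == "__main__":
    mcp.run(transport="stdio")  # expose the tools over stdio for an MCP client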

🤝 Client — mcp-client.py

The client demonstrates how to connect and interact with the MCP server (a minimal sketch follows the list below):

  • Establish a connection with ClientSession from the mcp library
  • List all available server tools
  • Call any tool with custom arguments
  • Process queries leveraging OpenAI or Gemini and MCP tools in tandem
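
The snippet below is a minimal sketch of that flow using the mcp SDK's stdio transport; the tool argument name is an assumption, and the real mcp-client.py additionally routes queries through OpenAI or Gemini.

# Minimal MCP client sketch (the "query" argument name is assumed)
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch server.py as a subprocess and communicate over stdio
    server_params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List all available server tools
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])

            # Call a tool with custom arguments
            result = await session.call_tool(
                "refine_query_tool", arguments={"query": "gpu requirements for llm inference"}
            )
            print(result.content)

asyncio.run(main())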

⚙️ Requirements

  • Python 3.9 or higher
  • openai Python package
  • mcp library
  • python-dotenv for environment variable management

🛠️ Installation Guide

# Step 1: Clone the repository
git clone https://github.com/ashishpatel26/Agentic-RAG-with-MCP-Server.git

# Step 2: Navigate into the project directory
cd Agentic-RAG-with-MCP-Server

# Step 3: Install dependencies
pip install -r requirements.txt

🔐 Configuration

  1. Create a .env file (use .env.sample as a template)
  2. Set your OpenAI model name and Gemini API key in .env:
OPENAI_MODEL_NAME="your-model-name-here"
GEMINI_API_KEY="your-gemini-api-key-here"
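
As a quick illustration (not code from the repository), these values can then be loaded at runtime with python-dotenv:

# Illustrative: read the .env values with python-dotenv
import os
from dotenv import load_dotenv

load_dotenv()  # loads variables from .env in the project root
openai_model = os.getenv("OPENAI_MODEL_NAME")
gemini_api_key = os.getenv("GEMINI_API_KEY")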

🚀 How to Use

  1. Start the MCP server:
python server.py
  2. Run the MCP client:
python mcp-client.py

📜 License

This project is licensed under the MIT License.


Thanks for Reading 🙏
