# 🔧 MCP AI Server: Model Context Protocol for Intelligent Search

Welcome to the MCP AI Server, a powerful and modular tool that combines RAG-based retrieval, Pinecone vector storage, and the Model Context Protocol (MCP) to build intelligent assistants capable of answering domain-specific questions from your own knowledge base.
## 🚀 Features

- ✅ Local MCP server with FastAPI + Claude/ChatGPT integration
- ✅ Embedding using `intfloat/multilingual-e5-large` (via SentenceTransformer)
- ✅ Fast vector search with Pinecone
- ✅ Documented tools exposed to clients like Claude and Cursor IDE
- ✅ Secure `.env` usage for managing API keys
- ✅ Clean, extensible architecture
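The e5 model family expects role prefixes on its inputs before encoding. A minimal sketch of that convention (the `prepare_*` helper names are illustrative, not part of this repository):

```python
# Sketch: input preparation for intfloat/multilingual-e5-large.
# e5 models expect "query: " / "passage: " prefixes on input text;
# these helper names are illustrative, not part of this repo.

def prepare_query(text: str) -> str:
    """Prefix a search query the way e5 models expect."""
    return f"query: {text.strip()}"

def prepare_passage(text: str) -> str:
    """Prefix a knowledge-base passage the way e5 models expect."""
    return f"passage: {text.strip()}"

# With sentence-transformers installed, encoding might then look like:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("intfloat/multilingual-e5-large")
#   vec = model.encode(prepare_query("how do I reset my password?"),
#                      normalize_embeddings=True)
```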
## 🔧 Setup Instructions

### 1. Clone the Repo

```bash
git clone git@github.com:MeetRathodNitsan/MCP1.git
cd MCP1
```
### 2. Create a Virtual Environment

```bash
python -m venv .venv

# Windows
.venv\Scripts\activate

# macOS/Linux
source .venv/bin/activate
```
### 3. Install Dependencies

```bash
pip install -r requirements.txt
```
### 4. Configure Environment Variables

Create a `.env` file in the project root:

```env
OPENAI_API_KEY=your-api-key...
PINECONE_API_KEY=...
PINECONE_ENVIRONMENT=your-env
```
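In the server itself a library such as python-dotenv is the usual way to load these values; as a sketch of the `KEY=value` format the file uses, here is a stdlib-only loader (the `load_env` name is illustrative, not part of the repo):

```python
# Sketch: minimal .env loader using only the standard library.
# Illustrates the KEY=value format; the real project would typically
# use python-dotenv instead.
import os

def load_env(path: str = ".env") -> None:
    """Read KEY=value lines into os.environ, skipping comments and blanks."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault so real environment variables take precedence
            os.environ.setdefault(key.strip(), value.strip())
```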
### 5. Run the Server

```bash
uv --directory F:/Project run main.py
```

Adjust the `--directory` path to point at your local clone.
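Once running, the server's retrieval step boils down to nearest-neighbour search over embeddings. A pure-Python sketch of the cosine-similarity top-k lookup that Pinecone performs at scale (function and variable names are illustrative):

```python
# Sketch: cosine-similarity top-k retrieval, the operation a vector
# store like Pinecone provides. Illustrative only; the server queries
# Pinecone rather than computing this in-process.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, index, k=3):
    """index: list of (doc_id, vector) pairs. Returns the k best doc ids."""
    scored = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]
```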