
MCP Base - A Generic Model Context Protocol Framework

This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that can be used to integrate LLMs into your applications.

📋 Features

  • Standardized MCP Server: A base server implementation with support for HTTP and stdio transports
  • Generic MCP Client: A client for connecting to any MCP server
  • Ollama Integration: Ready-to-use services for generating embeddings and text with Ollama
  • Supabase Integration: Built-in support for Supabase vector database
  • Modular Design: Clearly organized structure for resources, tools, and prompts
  • Sample Templates: Example implementations to help you get started quickly

🛠️ Directory Structure

_mcp-base/
├── server.ts            # Main MCP server implementation
├── client.ts            # Generic MCP client
├── utils/               # Utility services
│   ├── ollama_embedding.ts        # Embedding generation with Ollama
│   └── ollama_text_generation.ts  # Text generation with Ollama
├── tools/               # Tool implementations
│   └── sample-tool.ts   # Example tool template
├── resources/           # Resource implementations
│   └── sample-resource.ts  # Example resource template
├── prompts/             # Prompt implementations
│   └── sample-prompt.ts # Example prompt template
└── README.md            # This documentation

🚀 Getting Started

Prerequisites

  • Node.js and npm/pnpm
  • Ollama for local embedding and text generation
  • Supabase account for vector storage

Environment Setup

Create a .env file with the following variables (a sketch for reading them in code follows the listing):

PORT=3000
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text
OLLAMA_LLM_MODEL=llama3
SERVER_MODE=http  # 'http' or 'stdio'
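
A minimal sketch of reading these variables at startup is shown below. The config.ts helper and the use of dotenv are assumptions for illustration, not part of the base framework, which may read process.env directly.

// config.ts (hypothetical helper)
import "dotenv/config"; // loads the .env file into process.env

export const config = {
  port: Number(process.env.PORT ?? 3000),
  supabaseUrl: process.env.SUPABASE_URL ?? "",
  supabaseServiceKey: process.env.SUPABASE_SERVICE_KEY ?? "",
  ollamaUrl: process.env.OLLAMA_URL ?? "http://localhost:11434",
  ollamaEmbedModel: process.env.OLLAMA_EMBED_MODEL ?? "nomic-embed-text",
  ollamaLlmModel: process.env.OLLAMA_LLM_MODEL ?? "llama3",
  serverMode: (process.env.SERVER_MODE ?? "http") as "http" | "stdio",
};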

Server Initialization

  1. Import the required modules
  2. Register your resources, tools, and prompts
  3. Start the server

// Import the base server and component registrars
// (startServer is assumed to be exported by server.ts alongside the default server instance)
import server, { startServer } from "./server";
import { registerSampleResources } from "./resources/sample-resource";
import { registerSampleTool } from "./tools/sample-tool";
import { registerSamplePrompts } from "./prompts/sample-prompt";

// The Supabase client and the Ollama text/embedding services are assumed to be
// constructed elsewhere (e.g. from the utils/ services and your environment config)
// and made available here as `supabase`, `textGenerator`, and `embeddings`.

// Initialize the database if needed
async function initializeDatabase() {
  // Your database initialization logic
}

// Register your components
registerSampleResources(server, supabase);
registerSampleTool(server, textGenerator, embeddings, supabase);
registerSamplePrompts(server, supabase);

// Start the server
startServer();

Client Usage

import MCPClient from "./client";

// Create a client instance
const client = new MCPClient({
  serverUrl: "http://localhost:3000",
});

// Example: Call a tool
async function callSampleTool() {
  const result = await client.callTool("sample-tool", {
    query: "example query",
    maxResults: 5,
  });
  console.log(result);
}

// Example: Read a resource
async function readResource() {
  const items = await client.readResource("items://all");
  console.log(items);
}

// Example: Get a prompt
async function getPrompt() {
  const prompt = await client.getPrompt("simple-prompt", {
    task: "Explain quantum computing",
  });
  console.log(prompt);
}

// Don't forget to disconnect when done
await client.disconnect();

📚 Extending the Framework

Creating a New Tool

  1. Create a new file in the tools/ directory
  2. Define your tool function and schema using Zod
  3. Implement your tool logic
  4. Register the tool in your server (see the sketch below)
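
A minimal sketch of such a tool module is below. It assumes the base server is an McpServer from the MCP TypeScript SDK (so the standard tool() registration is available); the echo tool itself is hypothetical.

// tools/echo-tool.ts (hypothetical example)
import { z } from "zod";
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Registers a simple tool that echoes its input back to the caller.
export function registerEchoTool(server: McpServer) {
  server.tool(
    "echo",
    { message: z.string().describe("Text to echo back") },
    async ({ message }) => ({
      content: [{ type: "text" as const, text: `Echo: ${message}` }],
    })
  );
}

Register it next to the sample tool in your server entry point, e.g. registerEchoTool(server).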

Creating a New Resource

  1. Create a new file in the resources/ directory
  2. Define your resource endpoints and schemas
  3. Implement your resource logic
  4. Register the resource in your server (see the sketch below)
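
Under the same assumption about the server type, a resource module might look like the sketch below; the items://all URI mirrors the one used in the client example above.

// resources/items-resource.ts (hypothetical example)
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Exposes a static list of items under the items://all URI.
export function registerItemsResource(server: McpServer) {
  server.resource("items", "items://all", async (uri) => ({
    contents: [
      {
        uri: uri.href,
        text: JSON.stringify([{ id: 1, name: "example item" }]),
      },
    ],
  }));
}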

Creating a New Prompt

  1. Create a new file in the prompts/ directory
  2. Define your prompt schema and parameters
  3. Implement your prompt template
  4. Register the prompt in your server (see the sketch below)
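
A prompt module can follow the same pattern. The sketch below assumes the SDK's prompt() registration and reuses the simple-prompt name from the client example; the template text is illustrative only.

// prompts/explain-prompt.ts (hypothetical example)
import { z } from "zod";
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Registers a prompt that wraps a task description in a reusable template.
export function registerSimplePrompt(server: McpServer) {
  server.prompt(
    "simple-prompt",
    { task: z.string().describe("What the assistant should do") },
    ({ task }) => ({
      messages: [
        {
          role: "user" as const,
          content: { type: "text" as const, text: `Please complete this task: ${task}` },
        },
      ],
    })
  );
}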

📄 License

MIT
