JUHE API Marketplace
MCP Server listing by arberrexhepi

MCP Hub

An Express server implementation of the Model Context Protocol that allows websites to connect to LLMs through streamable HTTP and stdio transports, with a built-in chat UI for testing responses.

GitHub Stars: 0
Last Updated: 10/8/2025

README Documentation

MCP Burst

An Express app with AI capabilities powered by MCP (Model Context Protocol)

MCP Burst supports multiple server types and includes a built-in chat interface for testing. It acts as a bridge connecting MCP clients to an array of tools, MCP servers, and server gateways.

What it does

  • Chat Integration: Works with Ollama, OpenAI, and Docker Model Runner
  • MCP Server Support: Runs streamable HTTP servers, stdio servers such as the Docker MCP Gateway server, and n8n MCP trigger node URLs
  • Custom Tools: Create discoverable tools at the /mcp endpoint
  • Multi-Client Support: Use MCP_BURST_CLIENT=true in .env to serve multiple clients simultaneously

Quick Status Check

  • Session debugging: curl http://localhost:4000/status
  • Note: Multi-client sessions are still being thoroughly tested
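The curl check above can also be done programmatically. A minimal sketch, assuming the hub from `npm run start:hub` is listening on port 4000 and Node 18+ for global `fetch` (the return shape here is this sketch's own, not part of MCP Burst):

```javascript
// Minimal sketch of the status probe above (assumes Node 18+ global fetch).
// Returns { ok, status } instead of throwing when the hub is not running.
async function checkHubStatus(base = "http://localhost:4000") {
  try {
    const res = await fetch(`${base}/status`);
    return { ok: res.ok, status: res.status };
  } catch {
    // Connection refused: hub not started yet
    return { ok: false, status: null };
  }
}
```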

NOTE: You may need to harden security before using this in production. Otherwise, use it for testing until a security-hardened version is merged.

Running MCP Burst in MCP Clients

VSCode:

  1. In your MCP Burst folder, run the hub:
npm run start:hub
  2. Create .vscode/mcp.json in your workspace, for example using this config:
{
  "servers": {
    "mcp-burst": {
      "type": "http",
      "url": "http://127.0.0.1:4000",
    }
  }
}

Claude Desktop: Update your claude_desktop_config.json to include:

{
  "mcpServers": {
    "mcp-burst": {
      "command": "node",
      "args": ["/path/to/mcpburst/hub/stdio-client-hub-entry.js"]
    }
  }
}

Prebuilt Demo Chatbot Agent App

In your MCP Burst folder, run these commands (each server in its own terminal):

npm run start:hub
npm run start:server

Table of Contents

Features

  • Streamable HTTP MCP transport server built on the Model Context Protocol SDK
  • Serve MCP Burst as an HTTP and/or stdio client (tested with Claude Desktop and VS Code)
  • Built-in demo tools: 'echo' and 'update_session_planner'
  • Express façade handling JSON-RPC at /mcp and health checks at /health
  • Built-in Planner Tool (required for the built-in chatbot, but not necessarily for MCP clients)
  • OPTIONAL: Built-in gamification (i.e., positive reinforcement for successfully completing tasks; it worked well with Claude Desktop, so I kept it in the repo)
  • Demo Chatbot Agent App to test MCP integration, now with a session planner resource to execute multiple tools

Prerequisites

  • Node.js v16 or higher
  • npm (included with Node.js)

Installation

  1. Clone the repository:

    git clone https://github.com/arberrexhepi/mcpburst.git
    cd mcpburst
    
  2. Install root dependencies:

    npm install
    
  3. Install hub dependencies:

    cd hub && npm install && cd ..
    

Configuration

Create environment variable files in both hub/ and server/ directories:

hub/.env

PORT=4000
HOST=localhost
MCP_REQUIRE_AUTH=false   # set to 'true' to require Bearer auth
MCP_BURST_CLIENT=true    # set to false if you don't want it to be discoverable
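When MCP_REQUIRE_AUTH is set to 'true', clients would need to send a Bearer token with each request. A hypothetical helper, using the standard HTTP Authorization header shape (how the token is issued and validated is a deployment detail this README does not specify):

```javascript
// Hypothetical helper: attach a Bearer token to request headers when auth
// is required (MCP_REQUIRE_AUTH=true). The header shape is standard HTTP
// auth; token issuance/validation is up to your deployment.
function withAuth(headers, token) {
  return token ? { ...headers, Authorization: `Bearer ${token}` } : headers;
}
```

For example: `fetch(url, { headers: withAuth({ "Content-Type": "application/json" }, process.env.MCP_TOKEN) })`.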

server/.env

OPENAI_API_KEY=your_openai_api_key
MCP_ENDPOINT=http://127.0.0.1:4000/mcp
PORT=3500
STRATEGY=DMR
DOCKER_MODEL_RUNNER_URL=http://localhost:12434/engines/llama.cpp/v1/chat/completions
OLLAMA_URL=http://localhost:11434/v1/chat/completions
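The STRATEGY variable above presumably selects which chat backend the server talks to. A sketch of how such routing could work; the "OLLAMA" value and the OpenAI fallback are illustrative assumptions, not MCP Burst's actual logic:

```javascript
// Illustrative only: selecting a chat-completions endpoint from env values
// like those in server/.env. Key names mirror the .env above; the routing
// rules themselves are an assumption, not MCP Burst's implementation.
function resolveChatEndpoint(env) {
  switch (env.STRATEGY) {
    case "DMR": // Docker Model Runner
      return env.DOCKER_MODEL_RUNNER_URL;
    case "OLLAMA":
      return env.OLLAMA_URL;
    default: // fall back to OpenAI (would use OPENAI_API_KEY)
      return "https://api.openai.com/v1/chat/completions";
  }
}
```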

Usage

  1. Build and start the MCP hub server:

    npm run start:hub
    
  2. Start the front-end proxy server:

    npm run start:server
    
  3. Open your browser at http://localhost:3500 (the PORT configured in server/.env) to access the chat UI.

  4. The hub JSON-RPC endpoint is available at http://localhost:4000/mcp.
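Requests to the /mcp endpoint follow JSON-RPC 2.0, using the standard MCP method names. A sketch of two useful payloads; the 'echo' tool name comes from the Features list, but its 'message' argument name is an assumption:

```javascript
// JSON-RPC 2.0 payloads for the hub's /mcp endpoint (MCP-standard methods).
// List the tools the hub exposes:
const toolsListRequest = { jsonrpc: "2.0", id: 1, method: "tools/list", params: {} };

// Call the built-in demo 'echo' tool. NOTE: the 'message' argument name is
// an assumption; check the tools/list output for the real input schema.
const echoCallRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: { name: "echo", arguments: { message: "hello" } },
};

// With the hub running, POST a payload like so (streamable HTTP transports
// typically expect both JSON and SSE in the Accept header):
// await fetch("http://localhost:4000/mcp", {
//   method: "POST",
//   headers: { "Content-Type": "application/json", "Accept": "application/json, text/event-stream" },
//   body: JSON.stringify(toolsListRequest),
// });
```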

Directory Structure

├── LICENSE
├── package.json                    # root package config and scripts
├── README.md
├── hub/                           # MCP hub server
│   ├── package.json
│   ├── tsconfig.json
│   ├── dist/                      # TypeScript sources output to dist/
│   ├── installToolsFeature.ts
│   ├── installResourcesFeature.ts
│   ├── stdio-client-hub-entry.js
│   ├── bridgeBuilder.ts
│   ├── bridgeHttp.ts
│   ├── bridgeStdio.ts
│   ├── hub.ts
│   └── bridges/                   # sample tool definitions
│       └── hub.yaml
└── server/                        # proxy and front-end assets
    ├── index.js
    ├── extractContent.js
    ├── public/
    │   ├── index.html
    │   ├── css/
    │   │   └── styles.css
    │   └── js/
    │       └── chat.js
    └── agent_functions/
        ├── index.js
        ├── llmClient.js
        ├── mcpClient.js
        ├── planExecutor.js
        └── README.MD

Scripts

All scripts are defined in the root package.json:

  • npm run build: Compile TypeScript in hub and copy bridge files
  • npm run copy:bridges: Copy bridge YAML definitions to hub/dist
  • npm run start:hub: Build, then run the MCP hub server
  • npm run start:server: Run the Express static server with chat UI

License

  • This project is licensed under the Apache License 2.0. See the LICENSE file for details.
