
CodeAlive MCP: Deepest Context Engine for your projects (especially for large codebases)


Connect your AI assistant to CodeAlive's powerful code understanding platform in seconds!

This MCP (Model Context Protocol) server enables AI clients like Claude Code, Cursor, Claude Desktop, Continue, VS Code (GitHub Copilot), Cline, Codex, OpenCode, Qwen Code, Gemini CLI, Roo Code, Goose, Kilo Code, Windsurf, Kiro, Qoder, and Amazon Q Developer to access CodeAlive's advanced semantic code search and codebase interaction features.

What is CodeAlive?

CodeAlive is the most accurate and comprehensive Context Engine as a service: optimized for large codebases, powered by advanced GraphRAG, and accessible via MCP. It enriches the context for AI agents such as Cursor, Claude Code, and Codex, making them 35% more efficient and up to 84% faster.

It's like Context7, but for your (large) codebases.

It allows AI-Coding Agents to:

  • Find relevant code faster with semantic search
  • Understand the bigger picture beyond isolated files
  • Provide better answers with full project context
  • Reduce costs and time by removing guesswork

🛠 Available Tools

Once connected, you'll have access to these powerful tools:

  1. get_data_sources - List your indexed repositories and workspaces
  2. codebase_search - Semantic code search across your indexed codebase (main/master branch)
  3. codebase_consultant - AI consultant with full project expertise

🎯 Usage Examples

After setup, try these commands with your AI assistant:

  • "Show me all available repositories" → Uses get_data_sources
  • "Find authentication code in the user service" → Uses codebase_search
  • "Explain how the payment flow works in this codebase" → Uses codebase_consultant


🚀 Quick Start (Remote)

The fastest way to get started - no installation required! Our remote MCP server at https://mcp.codealive.ai/api provides instant access to CodeAlive's capabilities.

Step 1: Get Your API Key

  1. Sign up at https://app.codealive.ai/
  2. Navigate to API Keys (under Organization)
  3. Click "+ Create API Key"
  4. Copy your API key immediately - you won't see it again!
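
Optional sanity check: before wiring the key into any client, you can verify it against the data sources endpoint (the same request used in the Troubleshooting section below). A successful response lists your indexed repositories and workspaces; a 401 means the key was copied incorrectly:

curl -H "Authorization: Bearer YOUR_API_KEY_HERE" https://app.codealive.ai/api/v1/data_sources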

Step 2: Choose Your AI Client

Select your preferred AI client below for instant setup:

🚀 Quick Start (Agentic Installation)

You may ask your AI agent to install the CodeAlive MCP server for you.

  1. Copy-Paste the following prompt into your AI agent (remember to insert your API key):
Here is CodeAlive API key: PASTE_YOUR_API_KEY_HERE

Add the CodeAlive MCP server by following the installation guide from the README at https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/README.md

Find the section "AI Client Integrations" and locate your client (Claude Code, Cursor, Gemini CLI, etc.). Each client has specific setup instructions:
- For Gemini CLI: Use the one-command setup with `gemini mcp add`
- For Claude Code: Use `claude mcp add` with the --transport http flag
- For other clients: Follow the configuration snippets provided

Prefer the Remote HTTP option when available. If an API key is not provided above, help me issue a CodeAlive API key first.

Then allow the agent to execute the suggested steps.

  2. Restart your AI agent.

🤖 AI Client Integrations

Claude Code

Option 1: Remote HTTP (Recommended)

claude mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

Option 2: Docker (STDIO)

claude mcp add codealive-docker /usr/bin/docker run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.2.0

Replace YOUR_API_KEY_HERE with your actual API key.

Cursor

Option 1: Remote HTTP (Recommended)

  1. Open Cursor → Settings (Cmd+, or Ctrl+,)
  2. Navigate to "MCP" in the left panel
  3. Click "Add new MCP server"
  4. Paste this configuration:
{
  "mcpServers": {
    "codealive": {
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
  5. Save and restart Cursor

Option 2: Docker (STDIO)

{
  "mcpServers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}
Codex

OpenAI Codex CLI supports MCP via ~/.codex/config.toml. Remote HTTP MCP is still evolving; the most reliable way today is to launch CodeAlive via Docker (stdio).

~/.codex/config.toml (Docker stdio – recommended)

[mcp_servers.codealive]
command = "docker"
args = ["run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"]

If your Codex version advertises support for remote/HTTP transports, you can try an experimental config (may not work on all versions):

# Experimental; if supported by your Codex build
[mcp_servers.codealive]
url = "https://mcp.codealive.ai/api"
headers = { Authorization = "Bearer YOUR_API_KEY_HERE" }
Gemini CLI

One command setup (complete):

gemini mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"

Replace YOUR_API_KEY_HERE with your actual API key. That's it - no config files needed! 🎉

Continue

Option 1: Remote HTTP (Recommended)

  1. Create/edit .continue/config.yaml in your project or ~/.continue/config.yaml
  2. Add this configuration:
mcpServers:
  - name: CodeAlive
    type: streamable-http
    url: https://mcp.codealive.ai/api
    requestOptions:
      headers:
        Authorization: "Bearer YOUR_API_KEY_HERE"
  3. Restart VS Code

Option 2: Docker (STDIO)

mcpServers:
  - name: CodeAlive
    type: stdio
    command: docker
    args:
      - run
      - --rm
      - -i
      - -e
      - CODEALIVE_API_KEY=YOUR_API_KEY_HERE
      - ghcr.io/codealive-ai/codealive-mcp:v0.2.0
Visual Studio Code with GitHub Copilot

Option 1: Remote HTTP (Recommended)

  1. Open Command Palette (Ctrl+Shift+P or Cmd+Shift+P)
  2. Run "MCP: Add Server"
  3. Choose "HTTP" server type
  4. Enter this configuration:
{
  "servers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
  5. Restart VS Code

Option 2: Docker (STDIO)

Create .vscode/mcp.json in your workspace:

{
  "servers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}
Claude Desktop

Note: Claude Desktop's remote MCP support requires OAuth authentication. Use the Docker option for Bearer token support.

Docker (STDIO)

  1. Edit your config file:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Add this configuration:

{
  "mcpServers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}
  3. Restart Claude Desktop
Cline

Option 1: Remote HTTP (Recommended)

  1. Open Cline extension in VS Code
  2. Click the MCP Servers icon to configure
  3. Add this configuration to your MCP settings:
{
  "mcpServers": {
    "codealive": {
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
  4. Save and restart VS Code

Option 2: Docker (STDIO)

{
  "mcpServers": {
    "codealive": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}
OpenCode

Add CodeAlive as a remote MCP server in your opencode.json.

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "codealive": {
      "type": "remote",
      "url": "https://mcp.codealive.ai/api",
      "enabled": true,
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Qwen Code

Qwen Code supports MCP via mcpServers in its settings.json and multiple transports (stdio/SSE/streamable-http). Use streamable-http when available; otherwise use Docker (stdio).

~/.qwen/settings.json (Streamable HTTP)

{
  "mcpServers": {
    "codealive": {
      "type": "streamable-http",
      "url": "https://mcp.codealive.ai/api",
      "requestOptions": {
        "headers": {
          "Authorization": "Bearer YOUR_API_KEY_HERE"
        }
      }
    }
  }
}

Fallback: Docker (stdio)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": ["run", "--rm", "-i",
               "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
               "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"]
    }
  }
}
Roo Code

Roo Code reads a JSON settings file similar to Cline's.

Global config: mcp_settings.json (Roo) or cline_mcp_settings.json (Cline-style)

Option A — Remote HTTP

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

Option B — Docker (STDIO)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}

Tip: If your Roo build doesn't honor HTTP headers, use the Docker/STDIO option.

Goose

UI path: Settings → MCP Servers → Add → choose Streamable HTTP

Streamable HTTP configuration:

  • Name: codealive
  • Endpoint URL: https://mcp.codealive.ai/api
  • Headers: Authorization: Bearer YOUR_API_KEY_HERE

Docker (STDIO) alternative:

Add a STDIO extension with:

  • Command: docker
  • Args: run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.2.0
Kilo Code

UI path: Manage → Integrations → Model Context Protocol (MCP) → Add Server

HTTP

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

STDIO (Docker)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}
Windsurf (Codeium)

File: ~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "serverUrl": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

Note: the product is now named simply Windsurf (formerly Codeium).

Kiro

UI path: Settings → MCP → Add Server

Global file: ~/.kiro/settings/mcp.json
Workspace file: .kiro/settings/mcp.json

HTTP

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

STDIO (Docker)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}
Qoder

UI path: User icon → Qoder Settings → MCP → My Servers → + Add (Agent mode)

SSE (remote HTTP)

{
  "mcpServers": {
    "codealive": {
      "type": "sse",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

STDIO (Docker)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}
Amazon Q Developer (CLI & IDE)

Q Developer CLI

Config file: ~/.aws/amazonq/mcp.json or workspace .amazonq/mcp.json

HTTP server

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}

STDIO (Docker)

{
  "mcpServers": {
    "codealive": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE",
        "ghcr.io/codealive-ai/codealive-mcp:v0.2.0"
      ]
    }
  }
}

Q Developer IDE (VS Code / JetBrains)

Global: ~/.aws/amazonq/agents/default.json
Local (workspace): .aws/amazonq/agents/default.json

Minimal entry (HTTP):

{
  "mcpServers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      },
      "timeout": 60000
    }
  }
}

Use the IDE UI: Q panel → Chat → tools icon → Add MCP Server → choose http or stdio.


🔧 Advanced: Local Development

For developers who want to customize or contribute to the MCP server.

Prerequisites

  • Python 3.11+
  • uv (recommended) or pip

Installation

# Clone the repository
git clone https://github.com/CodeAlive-AI/codealive-mcp.git
cd codealive-mcp

# Setup with uv (recommended)
uv venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
uv pip install -e .

# Or setup with pip
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate  
pip install -e .

Local Server Configuration

Once installed locally, configure your AI client to use the local server:

Claude Code (Local)

claude mcp add codealive-local /path/to/codealive-mcp/.venv/bin/python /path/to/codealive-mcp/src/codealive_mcp_server.py --env CODEALIVE_API_KEY=YOUR_API_KEY_HERE

Other Clients (Local)

Replace the Docker command and args with:

{
  "command": "/path/to/codealive-mcp/.venv/bin/python",
  "args": ["/path/to/codealive-mcp/src/codealive_mcp_server.py"],
  "env": {
    "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
  }
}
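
For example, a Cursor or Claude Desktop entry from the Docker sections above, rewritten to run your local checkout, would look like this (the /path/to/codealive-mcp paths are placeholders for wherever you cloned the repository):

{
  "mcpServers": {
    "codealive": {
      "command": "/path/to/codealive-mcp/.venv/bin/python",
      "args": ["/path/to/codealive-mcp/src/codealive_mcp_server.py"],
      "env": {
        "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}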

Running HTTP Server Locally

# Start local HTTP server
export CODEALIVE_API_KEY="your_api_key_here"
python src/codealive_mcp_server.py --transport http --host localhost --port 8000

# Test health endpoint
curl http://localhost:8000/health
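
You can then point a client at this local instance instead of the hosted endpoint. A minimal sketch for Claude Code, assuming the local server exposes its MCP endpoint under the same /api path as the hosted service (check the server's startup output for the exact URL, and add an Authorization header only if your local build requires one):

claude mcp add --transport http codealive-local http://localhost:8000/api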

Smithery Installation

Auto-install for Claude Desktop via Smithery:

npx -y @smithery/cli install @CodeAlive-AI/codealive-mcp --client claude

🌐 Community Plugins

Gemini CLI — CodeAlive Extension

Repo: https://github.com/akolotov/gemini-cli-codealive-extension

A Gemini CLI extension that wires CodeAlive into your terminal with prebuilt slash commands and MCP configuration. It includes:

  • GEMINI.md guidance so Gemini knows how to use CodeAlive tools effectively
  • Slash commands: /codealive:chat, /codealive:find, /codealive:search
  • Easy setup via Gemini CLI's extension system

Install

gemini extensions install https://github.com/akolotov/gemini-cli-codealive-extension

Configure

# Option 1: .env next to where you run `gemini`
CODEALIVE_API_KEY="your_codealive_api_key_here"

# Option 2: environment variable
export CODEALIVE_API_KEY="your_codealive_api_key_here"
gemini

🐞 Troubleshooting

Quick Diagnostics

  1. Test the hosted service:

    curl https://mcp.codealive.ai/health
    
  2. Check your API key:

    curl -H "Authorization: Bearer YOUR_API_KEY" https://app.codealive.ai/api/v1/data_sources
    
  3. Enable debug logging: Add --debug to local server args
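
For example, appending --debug to the local HTTP invocation from the Advanced section:

export CODEALIVE_API_KEY="your_api_key_here"
python src/codealive_mcp_server.py --transport http --host localhost --port 8000 --debug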

Common Issues

  • "Connection refused" → Check internet connection
  • "401 Unauthorized" → Verify your API key
  • "No repositories found" → Check API key permissions in CodeAlive dashboard
  • Client-specific logs → See your AI client's documentation for MCP logs



📄 License

MIT License - see LICENSE file for details.


Ready to supercharge your AI assistant with deep code understanding?
Get started now →
