JUHE API Marketplace
MCP Server

@arizeai/phoenix-mcp

Phoenix MCP Server is an implementation of the Model Context Protocol for the Arize Phoenix platform. It provides a unified interface to Phoenix's capabilities. You can use Phoenix MCP Server for: Prompt Management: create, list, update, and iterate on prompts; Datasets: explore datasets and synthesize new examples.

7744 GitHub Stars · Last Updated: 11/23/2025
No Configuration
Please check the documentation below.

README Documentation

phoenix banner

<a target="_blank" href="https://github.com/Arize-ai/phoenix/tree/main/js/packages/phoenix-mcp">
    <img src="https://badge.mcpx.dev?status=on" title="MCP Enabled"/>
</a>
<a href="cursor://anysphere.cursor-deeplink/mcp/install?name=phoenix&config=eyJjb21tYW5kIjoibnB4IC15IEBhcml6ZWFpL3Bob2VuaXgtbWNwQGxhdGVzdCAtLWJhc2VVcmwgaHR0cHM6Ly9teS1waG9lbml4LmNvbSAtLWFwaUtleSB5b3VyLWFwaS1rZXkifQ%3D%3D"><img src="https://cursor.com/deeplink/mcp-install-dark.svg" alt="Add Arize Phoenix MCP server to Cursor" height=20 /></a>
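The Cursor deeplink above embeds a base64-encoded install configuration (`npx -y @arizeai/phoenix-mcp@latest --baseUrl https://my-phoenix.com --apiKey your-api-key`). For MCP clients that take a JSON config file instead, a sketch of the equivalent `mcpServers` entry looks like the following; `https://my-phoenix.com` and `your-api-key` are placeholders to replace with your own Phoenix base URL and API key:

```json
{
  "mcpServers": {
    "phoenix": {
      "command": "npx",
      "args": [
        "-y",
        "@arizeai/phoenix-mcp@latest",
        "--baseUrl", "https://my-phoenix.com",
        "--apiKey", "your-api-key"
      ]
    }
  }
}
```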

Phoenix is an open-source AI observability platform designed for experimentation, evaluation, and troubleshooting. It provides:

  • Tracing - Trace your LLM application's runtime using OpenTelemetry-based instrumentation.
  • Evaluation - Leverage LLMs to benchmark your application's performance using response and retrieval evals.
  • Datasets - Create versioned datasets of examples for experimentation, evaluation, and fine-tuning.
  • Experiments - Track and evaluate changes to prompts, LLMs, and retrieval.
  • Playground - Optimize prompts, compare models, adjust parameters, and replay traced LLM calls.
  • Prompt Management - Manage and test prompt changes systematically using version control, tagging, and experimentation.

Phoenix is vendor and language agnostic with out-of-the-box support for popular frameworks (πŸ¦™LlamaIndex, πŸ¦œβ›“LangChain, Haystack, 🧩DSPy, πŸ€—smolagents) and LLM providers (OpenAI, Bedrock, MistralAI, VertexAI, LiteLLM, Google GenAI and more). For details on auto-instrumentation, check out the OpenInference project.

Phoenix runs practically anywhere, including your local machine, a Jupyter notebook, a containerized deployment, or in the cloud.

Installation

Install Phoenix via pip:

pip install arize-phoenix

or via conda from the conda-forge channel:

conda install -c conda-forge arize-phoenix

Phoenix container images are available via Docker Hub and can be deployed using Docker or Kubernetes. Arize AI also provides cloud instances at app.phoenix.arize.com.

Packages

The arize-phoenix package includes the entire Phoenix platform. However, if you have already deployed the Phoenix platform, there are lightweight Python sub-packages and TypeScript packages that can be used in conjunction with the platform.

Python Subpackages

| Package | Description |
| --- | --- |
| arize-phoenix-otel | Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults |
| arize-phoenix-client | Lightweight client for interacting with the Phoenix server via its OpenAPI REST interface |
| arize-phoenix-evals | Tooling to evaluate LLM applications, including RAG relevance, answer relevance, and more |

TypeScript Subpackages

| Package | Description |
| --- | --- |
| @arizeai/phoenix-otel | Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults |
| @arizeai/phoenix-client | Client for the Arize Phoenix API |
| @arizeai/phoenix-evals | TypeScript evaluation library for LLM applications (alpha release) |
| @arizeai/phoenix-mcp | MCP server implementation for Arize Phoenix providing a unified interface to Phoenix's capabilities |

Tracing Integrations

Phoenix is built on top of OpenTelemetry and is vendor, language, and framework agnostic. For details about tracing integrations and example applications, see the OpenInference project.
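To illustrate the general pattern that OpenTelemetry-based instrumentation follows, here is a stdlib-only sketch (not the real OpenTelemetry or OpenInference API): an instrumentor wraps an LLM call in a span, records attributes such as the model name, and hands finished spans to an exporter, which in a real setup would ship them to Phoenix over OTLP. The `llm.model_name` attribute key follows OpenInference's semantic conventions; everything else here is a simplified stand-in.

```python
import time
from contextlib import contextmanager

# Collected spans; a real exporter would ship these to Phoenix over OTLP.
spans = []

@contextmanager
def span(name, **attributes):
    """Record a span around a block of work (simplified OpenTelemetry pattern)."""
    record = {"name": name, "attributes": attributes, "start": time.time()}
    try:
        yield record
    finally:
        record["end"] = time.time()
        spans.append(record)

def call_llm(prompt):
    """Stand-in for a provider call; a real instrumentor wraps this for you."""
    with span("llm.completion", **{"llm.model_name": "my-model", "llm.prompt": prompt}):
        return "stubbed response to: " + prompt

response = call_llm("hello")
print(response)
print(spans[0])
```

In practice you never write this by hand: installing the matching `openinference-instrumentation-*` package and activating its instrumentor adds the spans to your existing application code automatically.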

Python Integrations

| Integration | Package |
| --- | --- |
| OpenAI | openinference-instrumentation-openai |
| OpenAI Agents | openinference-instrumentation-openai-agents |
| LlamaIndex | openinference-instrumentation-llama-index |
| DSPy | openinference-instrumentation-dspy |
| AWS Bedrock | openinference-instrumentation-bedrock |
| LangChain | openinference-instrumentation-langchain |
| MistralAI | openinference-instrumentation-mistralai |
| Google GenAI | openinference-instrumentation-google-genai |
| Google ADK | openinference-instrumentation-google-adk |
| Guardrails | openinference-instrumentation-guardrails |
| VertexAI | openinference-instrumentation-vertexai |
| CrewAI | openinference-instrumentation-crewai |
| Haystack | openinference-instrumentation-haystack |
| LiteLLM | openinference-instrumentation-litellm |
| Groq | openinference-instrumentation-groq |
| Instructor | openinference-instrumentation-instructor |
| Anthropic | openinference-instrumentation-anthropic |
| Smolagents | openinference-instrumentation-smolagents |
| Agno | openinference-instrumentation-agno |
| MCP | openinference-instrumentation-mcp |
| Pydantic AI | openinference-instrumentation-pydantic-ai |
| Autogen AgentChat | openinference-instrumentation-autogen-agentchat |
| Portkey | openinference-instrumentation-portkey |

Span Processors

Normalize and convert data across other instrumentation libraries by adding span processors that unify data.

| Package | Description |
| --- | --- |
| openinference-instrumentation-openlit | OpenInference span processor for OpenLIT traces |
| openinference-instrumentation-openllmetry | OpenInference span processor for OpenLLMetry (Traceloop) traces |
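As a rough illustration of what such a span processor does, the stdlib-only sketch below rewrites span attributes emitted under one library's naming scheme into the scheme Phoenix understands. The attribute names in the mapping are hypothetical stand-ins, not the exact OpenLIT or OpenInference conventions; the real processors ship their own complete mappings.

```python
# Hypothetical mapping from a source library's span attribute names to
# OpenInference-style names (illustrative only).
ATTRIBUTE_MAP = {
    "gen_ai.request.model": "llm.model_name",
    "gen_ai.prompt": "llm.input_messages",
    "gen_ai.completion": "llm.output_messages",
}

def normalize_span(span: dict) -> dict:
    """Return a copy of the span with attributes renamed to the target scheme."""
    normalized = dict(span)
    normalized["attributes"] = {
        ATTRIBUTE_MAP.get(key, key): value
        for key, value in span.get("attributes", {}).items()
    }
    return normalized

source_span = {
    "name": "chat",
    "attributes": {"gen_ai.request.model": "gpt-x", "custom.tag": "demo"},
}
print(normalize_span(source_span))
```

Attributes without a mapping pass through unchanged, so the processor can sit in a pipeline alongside other instrumentation without losing data.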

JavaScript Integrations

| Integration | Package |
| --- | --- |
| OpenAI | @arizeai/openinference-instrumentation-openai |
| LangChain.js | @arizeai/openinference-instrumentation-langchain |
| Vercel AI SDK | @arizeai/openinference-vercel |
| BeeAI | @arizeai/openinference-instrumentation-beeai |
| Mastra | @mastra/arize |

Java Integrations

| Integration | Package |
| --- | --- |
| LangChain4j | openinference-instrumentation-langchain4j |
| SpringAI | openinference-instrumentation-springAI |

Platforms

| Platform | Description | Docs |
| --- | --- | --- |
| BeeAI | AI agent framework with built-in observability | Integration Guide |
| Dify | Open-source LLM app development platform | Integration Guide |
| Envoy AI Gateway | AI Gateway built on Envoy Proxy for AI workloads | Integration Guide |
| LangFlow | Visual framework for building multi-agent and RAG applications | Integration Guide |
| LiteLLM Proxy | Proxy server for LLMs | Integration Guide |

Community

Join our community to connect with thousands of AI builders.

  • 🌍 Join our Slack community.
  • πŸ“š Read our documentation.
  • πŸ’‘ Ask questions and provide feedback in the #phoenix-support channel.
  • 🌟 Leave a star on our GitHub.
  • 🐞 Report bugs with GitHub Issues.
  • 𝕏 Follow us on 𝕏.
  • πŸ—ΊοΈ Check out our roadmap to see where we're heading next.
  • πŸ§‘β€πŸ« Deep dive into everything Agents and LLM Evaluations on Arize's Learning Hubs.

Breaking Changes

See the migration guide for a list of breaking changes.

Copyright, Patent, and License

Copyright 2025 Arize AI, Inc. All Rights Reserved.

Portions of this code are patent protected by one or more U.S. Patents. See the IP_NOTICE.

This software is licensed under the terms of the Elastic License 2.0 (ELv2). See LICENSE.
