Stata-MCP

Let LLM help you achieve your regression analysis with Stata ✨
Evolve from reg monkey to causal thinker πŸ’ -> 🧐

News:

  • Use Stata-MCP in Claude Code; see here.
  • Want to use agent mode as a tool? It is now supported more easily here.
  • Want to evaluate your LLM? Look here.
  • StataFinder has been updated, but it is not yet stable; please set STATA_CLI in your environment.

Looking for our newest research? Click here or visit the reports website.

Looking for others?

  • STOP: StataMCP-Team Opendata Project 📊. We have open-sourced a comprehensive dataset collection for social science research, aiming to enable future AI-driven and data-powered research paradigms.
  • Trace DID: If you want the latest information about DID (Difference-in-Differences), click here. There is now a Chinese translation by Sepine Tam and StataMCP-Team 🎉
  • Jupyter Lab Usage (Important: Stata 17+) here
  • NBER-MCP & AER-MCP πŸ”§ under construction
  • Econometrics-Agent
  • TexIV: A machine learning-driven framework that transforms text data into usable variables for empirical research using advanced NLP and ML techniques
  • A VS Code or Cursor integration is available here. Confused? See the 💡 Difference.

πŸ’‘ Quick Start

Use Stata-MCP in Claude Code

You can use Stata-MCP in Claude Code to take advantage of its excellent agentic capabilities.

Before using it, please make sure you have installed Claude Code. If you don't know how to install it, visit it on GitHub.

Open your terminal, cd to your working directory, and run:

claude mcp add stata-mcp --env STATA_MCP_CWD=$(pwd) -- uvx stata-mcp

I am not sure whether this works on Windows, as I do not have a Windows device to test it on.
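
If Stata is not installed at its default path, you can also pass the STATA_CLI variable when registering the server. This is a minimal sketch, assuming --env can be repeated; the macOS path below is a placeholder, so replace it with your actual Stata executable:

claude mcp add stata-mcp \
  --env STATA_MCP_CWD=$(pwd) \
  --env STATA_CLI=/Applications/Stata/StataMP.app/Contents/MacOS/stata-mp \
  -- uvx stata-mcp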

Then, you can use Stata-MCP in Claude Code. Here are some scenarios for using it:

  • Paper Replication: Replicate empirical studies from economics papers
  • Quick Hypothesis Testing: Validate economic hypotheses through regression analysis
  • Stata Learning Assistant: Learn econometrics with step-by-step Stata explanations
  • Code Organization: Review and optimize existing Stata do-files
  • Result Interpretation: Understand complex statistical outputs and regression results
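
For example, an illustrative prompt (not taken from the docs) that fits the first two scenarios:

Use Stata's built-in auto dataset to regress price on mpg and weight, then interpret the coefficients and check for heteroskedasticity.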

Agent Mode

The details of agent mode can be found here.

git clone https://github.com/sepinetam/stata-mcp.git
cd stata-mcp

uv sync
uv pip install -e .

stata-mcp --version  # check whether stata-mcp is installed successfully
stata-mcp --agent  # start stata-mcp in agent mode

or you can directly use it with uvx:

uvx stata-mcp --version  # check whether it can be used on your computer
uvx stata-mcp --agent

You can edit the task in agent_examples/openai/main.py by changing the variables model_instructions and task_message; see #L37 and #L68.
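
Purely as a hypothetical illustration (the actual contents of agent_examples/openai/main.py differ), the two variables might look like this:

# Hypothetical values; edit agent_examples/openai/main.py to suit your own task
model_instructions = "You are a Stata research assistant. Write and run do-files, then report the results."
task_message = "Load Stata's built-in auto dataset and regress price on mpg; summarize the coefficient on mpg."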

Agent as Tool

If you want to use a Stata-Agent in another agent, here is a simple example:

import asyncio

from agents import Agent, Runner
from stata_mcp.agent_as.agent_as_tool import StataAgent

# init stata agent and set as tool
stata_agent = StataAgent()
sa_tool = stata_agent.as_tool()

# Create main Agent
agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant",
    tools=[sa_tool],
)

# Then run the agent as usual.
async def main(task: str, max_turns: int = 30):
    result = await Runner.run(agent, input=task, max_turns=max_turns)
    return result

if __name__ == "__main__":
    econ_task = "Use Stata default data to find out the relationship between mpg and price."
    asyncio.run(main(econ_task))

AI Chat-Bot Client Mode

Standard configuration requirements: please make sure Stata is installed at its default path and that the Stata CLI (for macOS and Linux) exists.

The standard config JSON is as follows; you can customize your config by adding environment variables (a sketch follows the block below).

{
  "mcpServers": {
    "stata-mcp": {
      "command": "uvx",
      "args": [
        "stata-mcp"
      ]
    }
  }
}
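
As a sketch of that customization, assuming your MCP client accepts an env field and using the STATA_CLI and STATA_MCP_CWD variables mentioned above (both paths are placeholders):

{
  "mcpServers": {
    "stata-mcp": {
      "command": "uvx",
      "args": [
        "stata-mcp"
      ],
      "env": {
        "STATA_CLI": "/usr/local/bin/stata-mp",
        "STATA_MCP_CWD": "/path/to/your/project"
      }
    }
  }
}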

For more detailed usage information, visit the Usage guide.

For advanced usage, visit the Advanced guide.

Prerequisites

  • uv - Package installer and virtual environment manager
  • Claude, Cline, ChatWise, or other LLM service
  • Stata License
  • Your API key from your LLM provider

Notes:

  1. If you are located in China, you can find a short uv usage document here.
  2. Claude is the best choice for Stata-MCP. For users in China, I recommend DeepSeek as your model provider: it is cheap and powerful, and it scores highest among Chinese providers. If you are interested, see the report How to use StataMCP improve your social science research.

Installation

With the new version, you don't need to install the stata-mcp package explicitly; you can just use the following commands to check whether your computer can run stata-mcp.

uvx stata-mcp --usable
uvx stata-mcp --version

If you want to use it locally, you can install it via pip or download the source code.

Install via pip

pip install stata-mcp

Download the source code and build

git clone https://github.com/sepinetam/stata-mcp.git
cd stata-mcp

uv build

Then you can find the built stata-mcp wheel in the dist directory. You can use it directly with uvx or install it with pip.

For example:

uvx /path/to/your/whl/stata_mcp-1.13.0-py3-none-any.whl  # replace with the wheel filename for your version
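
Alternatively, you can install the built wheel into your current environment with pip (the filename is the same placeholder as above):

pip install dist/stata_mcp-1.13.0-py3-none-any.whl  # adjust the filename to your version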

πŸ“ Documentation

  • For more detailed usage information, visit the Usage guide.
  • For advanced usage, visit the Advanced guide.
  • For common questions, visit the Questions page.
  • For the difference from Stata-MCP@hanlulong, visit the Difference page.

πŸ’‘ Questions

  • Cherry Studio 32000 error
  • Windows Support
  • Network Errors When Running Stata-MCP

πŸš€ Roadmap

  • macOS support
  • Windows support
  • Additional LLM integrations (With a new webUI)
  • Performance optimizations (Via prompt and context engineering)

For more information, refer to the Statement.

πŸ› Report Issues

If you encounter any bugs or have feature requests, please open an issue.

πŸ“„ License

Apache License 2.0

πŸ“š Citation

If you use Stata-MCP in your research, please cite this repository using one of the following formats:

BibTeX

@software{sepinetam2025stata,
  author = {Song Tan},
  title = {Stata-MCP: Let LLM help you achieve your regression analysis with Stata},
  year = {2025},
  url = {https://github.com/sepinetam/stata-mcp},
  version = {1.13.0}
}

APA

Song Tan. (2025). Stata-MCP: Let LLM help you achieve your regression analysis with Stata (Version 1.13.0) [Computer software]. https://github.com/sepinetam/stata-mcp

Chicago

Song Tan. 2025. "Stata-MCP: Let LLM help you achieve your regression analysis with Stata." Version 1.13.0. https://github.com/sepinetam/stata-mcp.

πŸ“¬ Contact

Email: sepinetam@gmail.com

Or contribute directly by submitting a Pull Request! We welcome contributions of all kinds, from bug fixes to new features.

❀️ Acknowledgements

The author sincerely thanks the official Stata team for their support and for the Stata license authorized for test development.

πŸ“ƒ Statement

The Stata referred to in this project is the commercial software Stata developed by StataCorp LLC. This project is not affiliated with, endorsed by, or sponsored by StataCorp LLC. This project does not include the Stata software or any installation packages; users must obtain and install a validly licensed copy of Stata from StataCorp. This project is licensed under Apache-2.0. The project maintainers accept no liability for any loss or damage arising from the use of this project or from actions related to Stata.

More information: refer to the Chinese version at [source/docs/README/cn/README.md]; in case of any conflict, the Chinese version shall prevail.

✨ Star History

Star History Chart
