Autonomous Analyst

🧠 Overview

Autonomous Analyst is a local, agentic AI pipeline that:

  • Analyzes tabular data (a minimal data-generation sketch follows this list)
  • Detects anomalies with Mahalanobis distance
  • Uses a local LLM (llama3.2:1b via Ollama) to generate interpretive summaries
  • Logs results to ChromaDB for semantic recall
  • Is fully orchestrated via the Model Context Protocol (MCP)
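
As a concrete starting point, here is a minimal sketch of the kind of synthetic tabular data the pipeline works with: a Gaussian cluster, a categorical column, and a few injected outliers. The column names follow the feature_1/feature_2 convention used by the dashboard, but the exact generation logic here is an assumption, not the repository's implementation.

```python
import numpy as np
import pandas as pd

def make_synthetic_data(n: int = 200, n_outliers: int = 10, seed: int = 42) -> pd.DataFrame:
    """Gaussian inliers plus a handful of far-away outliers and a categorical column."""
    rng = np.random.default_rng(seed)
    inliers = rng.normal(loc=0.0, scale=1.0, size=(n, 2))
    outliers = rng.normal(loc=6.0, scale=1.5, size=(n_outliers, 2))  # deliberately off-cluster
    features = np.vstack([inliers, outliers])
    return pd.DataFrame({
        "feature_1": features[:, 0],
        "feature_2": features[:, 1],
        "category": rng.choice(["A", "B", "C"], size=len(features)),
    })

# Example usage: make_synthetic_data().to_csv("data/synthetic.csv", index=False)
```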

⚙️ Features

| Component | Description |
| --- | --- |
| FastAPI Web UI | Friendly dashboard for synthetic or uploaded datasets |
| MCP Tool Orchestration | Each process step is exposed as a callable MCP tool (see the sketch after this table) |
| Anomaly Detection | Mahalanobis distance-based outlier detection |
| Visual Output | Saved scatter plot of inliers vs. outliers |
| Local LLM Summarization | Insights generated using llama3.2:1b via Ollama |
| Vector Store Logging | Summaries are stored in ChromaDB for persistent memory |
| Agentic Planning Tool | A dedicated LLM tool (autonomous_plan) determines next steps based on dataset context |
| Agentic Flow | LLM + memory + tool use + automatic reasoning + context awareness |
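
For a sense of what the tool orchestration looks like in code, here is a minimal sketch of registering one pipeline step as a callable MCP tool using the FastMCP helper from the mcp Python SDK. The server name and the tool body are illustrative, not the repository's actual server.py:

```python
# Illustrative only -- the real server.py registers the tools listed in the next section.
from mcp.server.fastmcp import FastMCP
import pandas as pd

mcp = FastMCP("autonomous-analyst")

@mcp.tool()
def analyze_outliers(csv_path: str) -> str:
    """Label rows of a CSV as inliers or outliers (placeholder logic)."""
    df = pd.read_csv(csv_path)
    # ...Mahalanobis-distance labelling would go here...
    return f"Analyzed {len(df)} rows"

if __name__ == "__main__":
    # Same transport as used in the Getting Started section
    mcp.run(transport="streamable-http")
```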

🧪 Tools Defined (via MCP)

| Tool Name | Description | LLM Used |
| --- | --- | --- |
| generate_data | Create synthetic tabular data (Gaussian + categorical) | ❌ |
| analyze_outliers | Label rows using Mahalanobis distance (see the sketch after this table) | ❌ |
| plot_results | Save a plot visualizing inliers vs. outliers | ❌ |
| summarize_results | Interpret and explain outlier distribution using llama3.2:1b | ✅ |
| summarize_data_stats | Describe dataset trends using llama3.2:1b | ✅ |
| log_results_to_vector_store | Store summaries to ChromaDB for future reference | ❌ |
| search_logs | Retrieve relevant past sessions using vector search (optional LLM use) | ⚠️ |
| autonomous_plan | Run the full pipeline and use the LLM to recommend next actions automatically | ✅ |
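
A minimal sketch of the Mahalanobis-distance labelling that analyze_outliers performs: compute the squared distance of each row from the feature mean and flag rows beyond a chi-squared quantile. The threshold choice and the output column name are assumptions, not necessarily what the repository uses.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2

def label_outliers(df: pd.DataFrame, cols=("feature_1", "feature_2"), alpha=0.975) -> pd.DataFrame:
    """Flag rows whose squared Mahalanobis distance exceeds a chi-squared quantile."""
    X = df[list(cols)].to_numpy()
    mean = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mean
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)  # squared Mahalanobis distance per row
    out = df.copy()
    out["outlier"] = d2 > chi2.ppf(alpha, df=len(cols))
    return out
```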

🤖 Agentic Capabilities

  • Autonomy: LLM-guided execution path selection with autonomous_plan
  • Tool Use: Dynamically invokes registered MCP tools via LLM inference
  • Reasoning: Generates technical insights from dataset conditions and outlier analysis
  • Memory: Persists and recalls knowledge using ChromaDB vector search
  • LLM: Powered by Ollama with llama3.2:1b (temperature = 0.1 for near-deterministic output; see the sketch below)
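
A minimal sketch of what the llama3.2:1b summarization calls might look like using the ollama Python client; the prompt, function name, and response handling are illustrative rather than the repository's summarizer:

```python
import ollama

def summarize(text: str) -> str:
    """Ask the local llama3.2:1b model for a short analytical summary."""
    response = ollama.chat(
        model="llama3.2:1b",
        messages=[{"role": "user", "content": f"Summarize these findings:\n{text}"}],
        options={"temperature": 0.1},  # low temperature for near-deterministic output
    )
    return response["message"]["content"]
```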

🚀 Getting Started

1. Clone and Set Up

git clone https://github.com/MadMando/mcp-autonomous-analyst.git
cd mcp-autonomous-analyst
conda create -n mcp-agentic python=3.11 -y
conda activate mcp-agentic
pip install uv
uv pip install -r requirements.txt

2. Start the MCP Server

mcp run server.py --transport streamable-http

3. Start the Web Dashboard

uvicorn web:app --reload --port 8001

Then visit: http://localhost:8001


🌐 Dashboard Flow

  • Step 1: Upload your own dataset or click Generate Synthetic Data
  • Step 2: The system runs anomaly detection on feature_1 vs. feature_2
  • Step 3: A visual plot of inliers vs. outliers is generated
  • Step 4: Summaries are created via the LLM
  • Step 5: Results are optionally logged to the vector store for recall (see the sketch after this list)
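
A minimal sketch of how summaries might be logged to and recalled from ChromaDB for the logging and search steps; the collection name and storage path are assumptions for illustration:

```python
import chromadb

# Persistent local store; the path is an assumption for illustration
client = chromadb.PersistentClient(path="./chroma_store")
collection = client.get_or_create_collection("analysis_summaries")

def log_summary(session_id: str, summary: str) -> None:
    """Store a summary so later sessions can recall it via vector search."""
    collection.add(ids=[session_id], documents=[summary])

def search_logs(query: str, k: int = 3) -> list[str]:
    """Return the k most similar past summaries."""
    results = collection.query(query_texts=[query], n_results=k)
    return results["documents"][0]
```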

📁 Project Layout

📦 autonomous-analyst/
├── server.py                  # MCP server
├── web.py                     # FastAPI + MCP client (frontend logic)
├── tools/
│   ├── synthetic_data.py
│   ├── outlier_detection.py
│   ├── plotter.py
│   ├── summarizer.py
│   └── vector_store.py
├── static/                    # Saved plot
├── data/                      # Uploaded or generated dataset
├── requirements.txt
├── .gitignore
└── README.md

📚 Tech Stack

  • MCP SDK: mcp (see the client sketch after this list)
  • LLM Inference: Ollama running llama3.2:1b
  • UI Server: FastAPI + Uvicorn
  • Memory: ChromaDB vector database
  • Data: pandas, matplotlib, scikit-learn
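
A minimal sketch of how a client such as web.py might invoke one of the MCP tools over the streamable HTTP transport; the endpoint URL, tool name, and empty argument dict are assumptions based on SDK defaults, not the repository's actual client code:

```python
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Assumes the MCP server from the Getting Started section is on its default port/path
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("generate_data", {})
            print(result.content)

asyncio.run(main())
```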

✅ .gitignore Additions

__pycache__/
*.pyc
*.pkl
.env
static/
data/

🙌 Acknowledgements

This project wouldn't be possible without the incredible work of the open-source community. Special thanks to:

| Tool / Library | Purpose | Repository |
| --- | --- | --- |
| 🧠 Model Context Protocol (MCP) | Agentic tool orchestration & execution | modelcontextprotocol/python-sdk |
| 💬 Ollama | Local LLM inference engine (llama3.2:1b) | ollama/ollama |
| 🔍 ChromaDB | Vector database for logging and retrieval | chroma-core/chroma |
| 🌐 FastAPI | Interactive, fast web interface | tiangolo/fastapi |
| ⚡ Uvicorn | ASGI server powering the FastAPI backend | encode/uvicorn |

💡 If you use this project, please consider starring or contributing to the upstream tools that make it possible.

This repo was created with the assistance of a local RAG LLM running llama3.2:1b.
