VideoDB Agent Toolkit

AI Agent toolkit for VideoDB

The VideoDB Agent Toolkit exposes VideoDB context to LLMs and agents, enabling integration with AI-driven IDEs such as Cursor and chat agents such as Claude Code. The toolkit automates context generation, maintenance, and discoverability: it auto-syncs SDK versions, docs, and examples, and is distributed through MCP and llms.txt.

🚀 Quick Overview

The toolkit offers context files designed for use with LLMs, structured around key components:

llms-full.txt — Comprehensive context for deep integration.

llms.txt — Lightweight metadata for quick discovery.

MCP (Model Context Protocol) — A standardized protocol for exposing VideoDB context and tools to agents and AI-driven IDEs.

These components leverage automated workflows to ensure your AI applications always operate with accurate, up-to-date context.

📦 Toolkit Components

1. llms-full.txt (View »)


llms-full.txt consolidates everything your LLM agent needs, including:

  • Comprehensive VideoDB overview.

  • Complete SDK usage instructions and documentation.

  • Detailed integration examples and best practices.

Real-world examples:

  • VideoDB's Director code-assistant agent (View Implementation)
  • VideoDB's Discord bot that powers customer support and community help (View Implementation)

You can also integrate llms-full.txt directly into your own LLM-powered workflows, agent systems, or AI coding environments, for example by injecting it as system context, as sketched below.
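A minimal sketch of that pattern, assuming the generated file is published at the repository's context/llms-full.txt path (the raw URL below is an assumption for illustration) and that you pass the resulting messages to whichever chat-completions client you already use:

import urllib.request

# Assumed raw URL of the generated context file; adjust to the repo/branch you actually use.
LLMS_FULL_URL = "https://raw.githubusercontent.com/video-db/agent-toolkit/main/context/llms-full.txt"

def load_videodb_context() -> str:
    """Download llms-full.txt so it can be injected as system context."""
    with urllib.request.urlopen(LLMS_FULL_URL) as resp:
        return resp.read().decode("utf-8")

def build_messages(user_question: str) -> list[dict]:
    """Prepend the VideoDB context to a standard chat-style message list."""
    return [
        {"role": "system", "content": load_videodb_context()},
        {"role": "user", "content": user_question},
    ]

# Example: messages = build_messages("Create a highlight reel from my uploaded video with VideoDB.")
# Pass `messages` to any chat-completions-style client (OpenAI, Anthropic, local models, etc.).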

2. llms.txt (View »)


A streamlined file following the Answer.AI llms.txt proposal. Ideal for quick metadata exposure and LLM discovery.

ℹ️ Recommendation: Use llms.txt for lightweight discovery and metadata integration. Use llms-full.txt for complete functionality.
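As a quick illustration of the lighter-weight option, a discovery step might fetch only llms.txt and scan its markdown link lines to decide which resources to pull next (the raw URL below is an assumption for illustration):

import re
import urllib.request

# Assumed raw URL of the lightweight context file; adjust to wherever llms.txt is published.
LLMS_TXT_URL = "https://raw.githubusercontent.com/video-db/agent-toolkit/main/context/llms.txt"

with urllib.request.urlopen(LLMS_TXT_URL) as resp:
    llms_txt = resp.read().decode("utf-8")

# Per the llms.txt proposal, resources are listed as markdown links: "- [Title](url): notes"
for title, url in re.findall(r"\[([^\]]+)\]\(([^)]+)\)", llms_txt):
    print(f"{title}: {url}")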

3. MCP (Model Context Protocol)

The VideoDB MCP Server connects to the Director backend framework and exposes a single tool that covers many workflows. For development, it can be installed and run in an isolated environment via uvx. For more details on MCP, see here.

Install uv

The MCP server runs through uvx, so install uv first.

For macOS/Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

For Windows:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

For more detailed installation steps for uv, see here.

Run the MCP Server

You can run the MCP server using uvx with the following command:

uvx videodb-director-mcp --api-key=VIDEODB_API_KEY
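Once the command above works, any MCP-capable client can launch and talk to the server over stdio. Below is a minimal sketch using the official MCP Python SDK (the mcp package); the command and API-key placeholder mirror the uvx invocation above, and the exact tools returned depend on the server version:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the VideoDB Director MCP server as a stdio subprocess (same command as above).
server_params = StdioServerParameters(
    command="uvx",
    args=["videodb-director-mcp", "--api-key=<VIDEODB_API_KEY>"],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())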

Update VideoDB Director MCP package

To ensure you're using the latest version of the MCP server with uvx, start by clearing the cache:

uv cache clean

This command removes any outdated cached packages of videodb-director-mcp, allowing uvx to fetch the most recent version.

If you always want to use the latest version of the MCP server, update your command as follows:

uvx videodb-director-mcp@latest --api-key=<VIDEODB_API_KEY>

🧠 Anatomy of LLM Context Files

LLM context files in VideoDB are modular, automatically generated, and continuously updated from multiple sources:

🧩 Modular Structure:

  • Instructions — Best practices and prompt guidelines View »

  • SDK Context — SDK structure, classes, and interface definitions View »

  • Docs Context — Summarized product documentation View »

  • Examples Context — Real-world notebook examples View »

Token breakdown: token statistics for each component are published alongside the generated context files.

Automated Maintenance:

  • Managed through GitHub Actions for automated updates.
  • Triggered by changes to SDK repositories, documentation, or examples.
  • Maintained centrally via a config.yaml file.

🛠️ Automation with GitHub Actions

Automatic context generation ensures your applications always have the latest information:

🔹 SDK Context Workflow (View)

  • Automatically generates documentation from SDK repo updates.
  • Uses Sphinx for Python SDKs.

🔹 Docs Context Workflow (View)

  • Scrapes and summarizes documentation using FireCrawl and LLM-powered summarization.

🔹 Examples Context Workflow (View)

  • Converts and summarizes notebooks into practical context examples.

🔹 Master Context Workflow (View)

  • Combines all sub-components into unified llms-full.txt.
  • Generates standards-compliant llms.txt.
  • Updates documentation with token statistics for transparency.

🛠️ Customization via config.yaml

The config.yaml file centralizes all configurations, allowing easy customization:

  • Inclusion & Exclusion Patterns for documentation and notebook processing
  • Custom LLM Prompts for precise summarization tailored to each document type
  • Layout Configuration for combining context components seamlessly

config.yaml > llms_full_txt_file defines how llms-full.txt is assembled:

llms_full_txt_file:
  input_files:
    - name: Instructions
      file_path: "context/instructions/prompt.md"
    - name: SDK Context
      file_path: "context/sdk/context/index.md"
    - name: Docs Context
      file_path: "context/docs/docs_context.md"
    - name: Examples Context
      file_path: "context/examples/examples_context.md"
  output_files:
    - name: llms_full_txt
      file_path: "context/llms-full.txt"
    - name: llms_full_md
      file_path: "context/llms-full.md"
  layout: |
    {{FILE1}}

    {{FILE2}}

    {{FILE3}}

    {{FILE4}}
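Conceptually, the assembly step driven by this section is a straightforward merge. The following is an illustrative sketch rather than the toolkit's actual build script: it loads config.yaml, substitutes each input file into the {{FILE1}}…{{FILE4}} placeholders of the layout, and writes the configured outputs:

from pathlib import Path

import yaml  # PyYAML

def assemble_llms_full(config_path: str = "config.yaml") -> None:
    """Merge the configured input files into llms-full.txt (and llms-full.md) per the layout."""
    cfg = yaml.safe_load(Path(config_path).read_text())["llms_full_txt_file"]

    # Fill {{FILE1}}, {{FILE2}}, ... with the contents of each input file, in order.
    merged = cfg["layout"]
    for i, entry in enumerate(cfg["input_files"], start=1):
        merged = merged.replace(f"{{{{FILE{i}}}}}", Path(entry["file_path"]).read_text())

    # Write every configured output file.
    for out in cfg["output_files"]:
        Path(out["file_path"]).write_text(merged)

assemble_llms_full()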

💡 Best Practices for Context-Driven Development

  • Automate Context Updates: Leverage GitHub Actions to maintain accuracy.
  • Tailored Summaries: Use custom LLM prompts to ensure context relevance.
  • Seamless Integration: Continuously integrate with existing LLM agents or IDEs.

By following these practices, you ensure your AI applications have reliable, relevant, and up-to-date context—critical for effective agent performance and developer productivity.


🚀 Get Started

Clone the toolkit repository and follow the setup instructions in config.yaml to start integrating VideoDB contexts into your LLM-powered applications today.

Explore further:

  • VideoDB SDK
  • Documentation
  • Cookbook Examples
