MCP Server

MCP Weather Demo

A demonstration server for ModelContextProtocol that provides weather tools and prompts for Claude AI, allowing LLMs to access external data without manual copying.

GitHub Stars: 0
Last Updated: 11/21/2025
MCP Server Configuration

{
  "name": "weather",
  "command": "/ABSOLUTE/PATH/TO/bin/bun",
  "args": [
    "/ABSOLUTE/PATH/TO/src/server/index.ts"
  ]
}

README Documentation

mcp

A demo of the ModelContextProtocol (MCP) using the Anthropic AI SDK.

Built following the Server Quickstart and the Client Quickstart.

Getting Started

Prerequisites

  • Bun (v1.2.17)
  • asdf (optional)
  • asdf-bun (optional)
  • Claude Desktop (required if only running the server)

Installation

If using asdf, run:

asdf install

Whether or not you are using asdf, install dependencies:

bun install

Server

The server allows a client to access resources, tools, and prompts.

It needs a client to interact with an LLM. Claude Desktop serves as the client if you are only running the server.
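For reference, a minimal tool registration might look like the sketch below. It follows the Server Quickstart pattern; the tool name, its parameters, and the McpServer/StdioServerTransport wiring are illustrative assumptions, not the exact contents of src/server/index.ts.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical weather server; the real src/server/index.ts may differ.
const server = new McpServer({ name: "weather", version: "1.0.0" });

// Register a simple tool that a client (e.g. Claude Desktop) can call.
server.tool(
  "get-forecast",
  { latitude: z.number(), longitude: z.number() },
  async ({ latitude, longitude }) => ({
    content: [{ type: "text", text: `Forecast for ${latitude},${longitude}: ...` }],
  })
);

// Expose the server over stdio so a client can spawn it as a subprocess.
await server.connect(new StdioServerTransport());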

Set Up

You need to add the server to Claude Desktop by modifying the claude_desktop_config.json file in your Library/Application Support/Claude directory. If this file does not exist, you can create it.

vi ~/Library/Application\ Support/Claude/claude_desktop_config.json

Add the following to the file:

{
  "mcpServers": {
    "weather": {
      "command": "/ABSOLUTE/PATH/TO/bin/bun",
      "args": ["/ABSOLUTE/PATH/TO/src/server/index.ts"]
    }
  }
}

⚠️ Bun Path

If you are using asdf, you will need to use the absolute path to the bun executable. You can find the install directory by running asdf where bun; the executable is at bin/bun inside that directory.
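For example, assuming asdf-bun is installed, you can print the full path with:

echo "$(asdf where bun)/bin/bun"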

If you're just using bun without asdf, you can use bun as the command.

Run

Once you've modified the claude_desktop_config.json file, restart Claude Desktop.

You should now see the weather tools and prompts in Claude Desktop!

Client

Instead of using Claude Desktop, you can also run a client to handle the interaction with the LLM.

This would be suitable for building a chat interface or web application that uses Anthropic's API. With MCP, you can give the LLM access to external data without having to manually copy and paste it into a prompt.
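As a rough sketch (following the Client Quickstart; the client metadata and the way the server is spawned are illustrative assumptions), the client launches the server over stdio and asks it for its tools:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the weather server as a subprocess and talk to it over stdio.
const transport = new StdioClientTransport({
  command: "bun",
  args: ["src/server/index.ts"],
});

const client = new Client({ name: "demo-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// The tool definitions returned here can later be handed to the Anthropic API.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));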

Set Up

Get an Anthropic API key from the Anthropic API Keys page.

Create a .env file in the root of the project and add the following:

ANTHROPIC_API_KEY=your_api_key_here
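The Anthropic TypeScript SDK reads ANTHROPIC_API_KEY from the environment by default, and Bun loads .env automatically, so constructing the API client can be as simple as:

import Anthropic from "@anthropic-ai/sdk";

// Picks up ANTHROPIC_API_KEY from the environment (Bun loads .env on startup).
const anthropic = new Anthropic();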

Run

Now you can run the client:

bun run dev

This gives you an interactive CLI where you can ask the LLM questions. Note that the LLM now has access to the tools defined in the server!
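Under the hood, a client like this typically forwards the MCP tool definitions to the Anthropic Messages API. Continuing from the client and Anthropic snippets above, the general shape looks like the sketch below; the model name and user message are placeholders, and the exact wiring in this repo may differ.

// Map MCP tool definitions to the shape the Anthropic Messages API expects.
const anthropicTools = tools.map((tool) => ({
  name: tool.name,
  description: tool.description,
  input_schema: tool.inputSchema,
}));

const response = await anthropic.messages.create({
  model: "claude-3-5-sonnet-latest", // placeholder model name
  max_tokens: 1024,
  tools: anthropicTools,
  messages: [{ role: "user", content: "What's the weather in Paris?" }],
});

// If Claude decides to use a tool, call it through the MCP client.
for (const block of response.content) {
  if (block.type === "tool_use") {
    const result = await client.callTool({
      name: block.name,
      arguments: block.input as Record<string, unknown>,
    });
    console.log(result);
  }
}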
