# mcp
A demo of the Model Context Protocol (MCP) using the Anthropic AI SDK. It provides a weather server whose tools and prompts let an LLM access external data without manual copying and pasting.
Built following the Server Quickstart and the Client Quickstart.
## Getting Started
### Prerequisites
- Bun (v1.2.17)
- asdf (optional)
- asdf-bun (optional)
- Claude Desktop (required if only running the server)
### Installation
If using `asdf`, run:

```sh
asdf install
```
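`asdf` reads the version to install from the repo's `.tool-versions` file, which presumably pins the version listed above:

```
bun 1.2.17
```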
Whether or not you use `asdf`, install the dependencies:

```sh
bun install
```
## Server
The server gives a client access to resources, tools, and prompts. It needs a client to interact with an LLM; if you are only running the server, Claude Desktop serves as that client.
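As a sketch of how such a server is wired up with the MCP TypeScript SDK (the actual `src/server/index.ts` may differ; the tool shown here is illustrative, based on the MCP Server Quickstart):

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Create an MCP server; the name matches the "weather" key used in
// claude_desktop_config.json below.
const server = new McpServer({ name: "weather", version: "1.0.0" });

// Register a hypothetical forecast tool. The real server's tools may differ.
server.tool(
  "get-forecast",
  { latitude: z.number(), longitude: z.number() },
  async ({ latitude, longitude }) => ({
    content: [
      { type: "text", text: `Forecast for ${latitude},${longitude}: ...` },
    ],
  }),
);

// Claude Desktop (or another client) talks to the server over stdio.
await server.connect(new StdioServerTransport());
```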
### Set Up
You need to add the server to Claude Desktop by modifying the `claude_desktop_config.json` file in your `~/Library/Application Support/Claude` directory. If this file does not exist, you can create it.

```sh
vi ~/Library/Application\ Support/Claude/claude_desktop_config.json
```
Add the following to the file:
```json
{
  "mcpServers": {
    "weather": {
      "command": "/ABSOLUTE/PATH/TO/bin/bun",
      "args": ["/ABSOLUTE/PATH/TO/src/server/index.ts"]
    }
  }
}
```
**⚠️ Bun Path**

If you are using `asdf`, you will need to use the absolute path to the `bun` executable. Running `asdf where bun` prints the install directory; the executable is at `bin/bun` inside it.

If you're just using `bun` without `asdf`, you can use `bun` as the command.
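For example, to print the full path to paste into the config (assuming an `asdf`-managed install):

```sh
# `asdf where bun` prints the install directory of the current bun version;
# the executable lives under bin/ inside it.
echo "$(asdf where bun)/bin/bun"
```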
### Run
Once you've modified the `claude_desktop_config.json` file, restart Claude Desktop.

You should now see the `weather` tools and prompts in Claude Desktop!
## Client
Instead of using Claude Desktop, you can run your own client to handle the interaction with the LLM. This is suitable for building a chat interface or web application on top of Anthropic's API. With MCP, you can give the LLM access to data without having to manually copy and paste it into a prompt.
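A minimal sketch of such a client, based on the MCP Client Quickstart (the actual client in this repo may differ; the model name and the question are placeholders):

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import Anthropic from "@anthropic-ai/sdk";

// Reads ANTHROPIC_API_KEY from the environment (Bun loads .env automatically).
const anthropic = new Anthropic();

// Spawn the weather server as a child process and connect over stdio.
const mcp = new Client({ name: "weather-client", version: "1.0.0" });
await mcp.connect(
  new StdioClientTransport({ command: "bun", args: ["src/server/index.ts"] }),
);

// Translate the server's tool list into Anthropic's tool format.
const { tools } = await mcp.listTools();
const anthropicTools = tools.map((t) => ({
  name: t.name,
  description: t.description,
  input_schema: t.inputSchema,
}));

// Ask a question; the model can now request the server's tools.
const response = await anthropic.messages.create({
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  messages: [{ role: "user", content: "What's the forecast in Paris?" }],
  tools: anthropicTools,
});

for (const block of response.content) {
  if (block.type === "text") {
    console.log(block.text);
  } else if (block.type === "tool_use") {
    // Execute the requested tool on the MCP server. A full client would
    // send this result back to Claude to get a final natural-language answer.
    const result = await mcp.callTool({
      name: block.name,
      arguments: block.input as Record<string, unknown>,
    });
    console.log(result.content);
  }
}
```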
### Set Up
Get an Anthropic API key from the Anthropic API Keys page.

Create a `.env` file in the root of the project and add the following:

```
ANTHROPIC_API_KEY=your_api_key_here
```
### Run
Now you can run the client:

```sh
bun run dev
```

This gives you an interactive CLI where you can ask the LLM questions. Note that the LLM has access to the tools defined in the server!
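For reference, `bun run dev` executes the `dev` script from `package.json`, which presumably points at the client entry (the path here is an assumption):

```json
{
  "scripts": {
    "dev": "bun run src/client/index.ts"
  }
}
```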