# MCP Memory Tracker

A Model Context Protocol (MCP) server that provides persistent memory capabilities using OpenAI's vector stores, allowing AI assistants to save and search through memories across conversations.
## Features

- **Save Memories**: Store text-based memories in OpenAI vector stores
- **Search Memories**: Semantic search through saved memories using natural language queries
- **Persistent Storage**: Memories are stored in OpenAI's cloud infrastructure
- **MCP Compatible**: Works with any MCP-compatible client (such as Claude Desktop)
## Prerequisites

- Python 3.8+
- OpenAI API key
- uv package manager (recommended) or pip
## Installation

1. Clone the repository:

   ```shell
   git clone <repository-url>
   cd mcp-memory-tracker
   ```

2. Install dependencies:

   ```shell
   # Using uv (recommended)
   uv sync

   # Or using pip
   pip install -r requirements.txt
   ```

3. Set up environment variables by creating a `.env` file in the project root:

   ```
   OPENAI_API_KEY=your_openai_api_key_here
   ```
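The server reads this key from the environment via `python-dotenv`. As a stdlib-only sketch of a fail-fast check (the `require_api_key` helper is illustrative and not part of `server.py`):

```python
import os


def require_api_key() -> str:
    """Return OPENAI_API_KEY from the environment, failing loudly if unset.

    In server.py, python-dotenv's load_dotenv() would populate os.environ
    from the .env file before a check like this runs.
    """
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY not found - check your .env file")
    return key
```

Failing at startup with a clear message is easier to debug than a cryptic authentication error on the first tool call.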
## Usage

### Running the MCP Server

```shell
# Using uv
uv run server.py

# Or using Python directly
python server.py
```
### Available Tools

#### `save_memory(memory: str)`

Saves a text memory to the vector store.

**Parameters:**

- `memory` (string): The text content to save

**Returns:**

```json
{
  "status": "saved",
  "vector store id": "vs_xxxxx"
}
```

**Example:**

```python
save_memory("I met John at the coffee shop on Main Street. He's a software engineer who loves hiking.")
```
#### `search_memories(query: str)`

Searches through saved memories using semantic search.

**Parameters:**

- `query` (string): Natural language search query

**Returns:**

```json
{
  "status": "success",
  "results": ["matching memory content..."]
}
```

**Example:**

```python
search_memories("Who did I meet at the coffee shop?")
```
## Integration with MCP Clients

### Claude Desktop

Add this server to your Claude Desktop configuration:

```json
{
  "mcpServers": {
    "memory-tracker": {
      "command": "uv",
      "args": ["run", "/path/to/mcp-memory-tracker/server.py"],
      "env": {
        "OPENAI_API_KEY": "your_api_key_here"
      }
    }
  }
}
```
### Other MCP Clients

This server implements the standard MCP protocol and should work with any compatible client. Refer to your client's documentation for configuration details.
## How It Works

- **Vector Store Management**: The server automatically creates and manages an OpenAI vector store named "memories"
- **Memory Storage**: When you save a memory, it is uploaded as a text file to the vector store
- **Semantic Search**: The search functionality uses OpenAI's vector search capabilities to find relevant memories based on meaning, not just keywords
## Configuration

The server uses the following constant, which can be modified in `server.py`:

- `VECTOR_STORE_NAME`: Name of the OpenAI vector store (default: `"memories"`)
## Dependencies

- `fastmcp`: MCP server framework
- `openai`: OpenAI Python SDK
- `python-dotenv`: Environment variable management
## Troubleshooting

### Common Issues

- **"OPENAI_API_KEY not found"**: Make sure your `.env` file is properly configured
- **"'SyncPage' object has no attribute..."**: Indicates an API response structure mismatch; check your OpenAI SDK version
- **File upload errors**: Ensure your OpenAI API key has vector store permissions
### Debug Mode

Add print statements to see detailed responses:

```python
print(f"Vector store ID: {vector_store.id}")
print(f"Search results: {results}")
```
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Submit a pull request
## License

[Add your license here]
## Support

For issues and questions:

- Check the troubleshooting section above
- Review the OpenAI API documentation
- Check the MCP protocol documentation