Localizable XStrings MCP Server
A Model Context Protocol (MCP) server that provides tools for working with iOS Localizable.xcstrings files. This tool enables automated translation workflows and localization management for iOS/macOS projects using Xcode String Catalogs.
Features
- Extract Language Support: Get all supported language codes from .xcstrings files
- Key Management: Extract all localization keys and base language strings
- Automated Translation: Translate strings using the OpenAI API
- Batch Processing: Chunked translation (50 strings per chunk) with async concurrency
- File Management: Apply translations back to .xcstrings files while preserving structure
- Cost-Effective: Defaults to the low-cost gpt-4o-mini model (configurable via OPENAI_MODEL)
Setup
Prerequisites
- Python 3.12+
- uv (Python package manager)
- OpenAI API key (for translation features)
Installation
- Clone the repository:
git clone git@github.com:iamnotagentleman/localizable-xcstrings-mcp.git
cd localizable-xcstrings-mcp
- Install dependencies with uv:
uv sync
Configuration
- Get an OpenAI API key from platform.openai.com
- Create a .env file and add your OpenAI API key:
  OPENAI_API_KEY=your_openai_api_key_here
- Optional: customize other settings in the .env file (a sample .env follows this list):
  - OPENAI_MODEL: Choose the translation model (default: gpt-4o-mini)
  - TRANSLATION_CHUNK_SIZE: Adjust batch size for large files
  - TRANSLATION_TEMPERATURE: Control translation creativity (0.0-1.0)
  - TRANSLATION_MAX_CONCURRENT_CHUNKS: Limit concurrent API requests
  - TRANSLATION_RATE_LIMIT_DELAY: Delay between API calls
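For reference, a complete .env might look like the sketch below. Only OPENAI_API_KEY is required; the values shown for chunk size and concurrency match the defaults described under Translation Features, while the temperature and delay values are illustrative placeholders, not documented defaults.

```
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o-mini
TRANSLATION_CHUNK_SIZE=50
TRANSLATION_TEMPERATURE=0.3
TRANSLATION_MAX_CONCURRENT_CHUNKS=3
TRANSLATION_RATE_LIMIT_DELAY=1
```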
Usage
Running the MCP Server
Start the server with:
uv run src/localizable_xstrings_mcp/server.py
This launches the FastMCP server. From a connected MCP client, you can:
- Load .xcstrings files by path
- Extract language information and keys
- Translate strings to target languages
- Apply translations back to files
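For MCP clients configured through a JSON file (Claude Desktop, for example), registration typically looks something like the snippet below. This is only a sketch: the exact file location and schema depend on the client, and the relative path assumes the client launches the server from the repository root.

```json
{
  "mcpServers": {
    "localizable-xcstrings": {
      "command": "uv",
      "args": ["run", "src/localizable_xstrings_mcp/server.py"]
    }
  }
}
```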
Available Tools
- Get Languages: Extract supported language codes from .xcstrings files
- Get Keys: List all localization keys
- Get Base Strings: Extract base language key-value pairs
- Translate: Preview translations using the OpenAI API
- Apply Translations: Translate and apply to .xcstrings files
- Apply Missing: Translate and apply only missing translations for a target language
- Translate Key: Translate specific keys to multiple languages
Adding to Claude Code
To use this MCP server with Claude Code, follow these steps:
1. Install and Configure
First, make sure the project's dependencies are installed:
uv sync
2. Add to Claude Code
Register the server with the claude mcp add command:
claude mcp add localizable-xcstrings --scope user -- uv run --with fastmcp fastmcp run server.py
3. Restart Claude Code
After installation, restart Claude Code to load the new MCP server.
4. Verify Installation
In Claude Code, you should now have access to these tools:
- get_languages_tool
- get_keys_tool
- get_base_strings_tool
- translate_tool
- apply_tool
- apply_missing_tool
- translate_key_tool
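You can also confirm the registration from a terminal; recent versions of the Claude Code CLI include a listing subcommand:

```bash
claude mcp list
```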
Example Workflow
- Extract information from your .xcstrings file: use get_languages_tool with the path to your Localizable.xcstrings file.
- Get all localization keys: use get_keys_tool to see all string identifiers.
- Translate to a new language: use apply_tool with a target language (e.g., "de" for German). Make sure your .env file is configured with your OpenAI API key.
- Translate specific keys: use translate_key_tool for individual string translations.

A programmatic version of this workflow, using the FastMCP Python client, is sketched below.
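This is a minimal sketch assuming the standalone fastmcp 2.x package is installed; the tool argument name used here (path) is an assumption for illustration, so check the actual signatures in src/localizable_xstrings_mcp/server.py.

```python
import asyncio

from fastmcp import Client  # assumes the standalone fastmcp 2.x package


async def main() -> None:
    # Connecting to a .py path spawns the server over stdio.
    # Run this from the repository root so the relative path resolves.
    async with Client("src/localizable_xstrings_mcp/server.py") as client:
        tools = await client.list_tools()
        print("Available tools:", [tool.name for tool in tools])

        # "path" is an assumed argument name; adjust to the real signature.
        result = await client.call_tool(
            "get_languages_tool",
            {"path": "path/to/Localizable.xcstrings"},
        )
        print("Supported languages:", result)


asyncio.run(main())
```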
Environment Variables
All configuration is managed through environment variables in the .env file:

| Variable | Required | Default | Description |
|---|---|---|---|
| OPENAI_API_KEY | Yes | - | Your OpenAI API key |
| OPENAI_MODEL | No | gpt-4o-mini | OpenAI model for translations |
| OPENAI_BASE_URL | No | - | Custom API base URL |
File Format Support
This tool works with Xcode 15+ String Catalog files (.xcstrings). These files use a JSON structure to store localized strings and metadata.
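For orientation, a minimal String Catalog with one hypothetical key ("welcome_title") localized into English and German looks roughly like this; files produced by Xcode may carry additional metadata:

```json
{
  "sourceLanguage" : "en",
  "strings" : {
    "welcome_title" : {
      "localizations" : {
        "en" : {
          "stringUnit" : { "state" : "translated", "value" : "Welcome" }
        },
        "de" : {
          "stringUnit" : { "state" : "translated", "value" : "Willkommen" }
        }
      }
    }
  },
  "version" : "1.0"
}
```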
Translation Features
- Chunked Processing: Large translation jobs are split into 50-string chunks
- Async Concurrency: Up to 3 chunks are processed simultaneously (sketched below)
- Token Limit Protection: Prevents API context limit issues
- Progress Reporting: Shows processing status for large jobs
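As an illustration only (not the server's actual code), the chunking and concurrency limiting described above can be expressed with an asyncio semaphore roughly like this; translate_chunk is a stand-in for the real OpenAI call:

```python
import asyncio

CHUNK_SIZE = 50            # strings per chunk ("Chunked Processing")
MAX_CONCURRENT_CHUNKS = 3  # chunks in flight at once ("Async Concurrency")


async def translate_chunk(chunk: list[str], target_language: str) -> list[str]:
    # Stand-in for one OpenAI API request; the real server sends the whole
    # chunk in a single call to stay within token limits.
    await asyncio.sleep(0)
    return [f"[{target_language}] {text}" for text in chunk]


async def translate_all(strings: list[str], target_language: str) -> list[str]:
    chunks = [strings[i:i + CHUNK_SIZE] for i in range(0, len(strings), CHUNK_SIZE)]
    semaphore = asyncio.Semaphore(MAX_CONCURRENT_CHUNKS)

    async def bounded(chunk: list[str]) -> list[str]:
        # Limit how many chunks hit the API at the same time.
        async with semaphore:
            return await translate_chunk(chunk, target_language)

    translated_chunks = await asyncio.gather(*(bounded(c) for c in chunks))
    return [text for chunk in translated_chunks for text in chunk]


# Example: asyncio.run(translate_all(["Hello", "Goodbye"], "de"))
```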
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Run the test suite
- Submit a pull request
Support
For issues and questions:
- Check the test files for usage examples
- Open an issue on the repository