MCP Server
MCP OI-Wiki
Enhances large language models with competitive programming knowledge by leveraging OI-Wiki content through vector search, allowing models to retrieve relevant algorithms and techniques.
GitHub Stars: 24
Last Updated: 8/23/2025
MCP Server Configuration
{
  "name": "oi-wiki",
  "command": "uv",
  "args": [
    "--directory",
    "/mcp-oi-wiki",
    "run",
    "python",
    "main.py"
  ]
}
README Documentation
mcp-oi-wiki
Give large language models the boost of OI-Wiki!
How does it work?
The current 462 pages of OI-Wiki are summarized with DeepSeek-V3, the summaries are embedded as semantic vectors, and a vector database is built from them.
At query time, the server finds the closest vector in the database and returns the corresponding wiki markdown.
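Conceptually, the query step is a nearest-neighbor search over the summary embeddings. Below is a minimal sketch, assuming the database is just a list of (embedding, markdown) pairs and that the question has already been embedded; the actual retrieval code in main.py may differ.

import numpy as np

def retrieve(question_vec: np.ndarray,
             db: list[tuple[np.ndarray, str]]) -> str:
    """Return the wiki markdown whose summary embedding has the highest
    cosine similarity with the (already embedded) question vector."""
    q = question_vec / np.linalg.norm(question_vec)
    best_score, best_doc = -1.0, ""
    for vec, markdown in db:
        score = float(np.dot(q, vec / np.linalg.norm(vec)))
        if score > best_score:
            best_score, best_doc = score, markdown
    return best_doc

# Toy usage with made-up 3-dimensional "embeddings":
db = [
    (np.array([1.0, 0.0, 0.0]), "# Segment tree\n..."),
    (np.array([0.0, 1.0, 0.0]), "# Dijkstra\n..."),
]
print(retrieve(np.array([0.9, 0.1, 0.0]), db))  # returns the segment tree page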
Usage
Make sure you have uv installed.
First, clone this repository:
cd <path of MCP servers>
git clone --recurse-submodules https://github.com/ShwStone/mcp-oi-wiki.git
Then open your MCP configuration file (for mcpo or Claude):
{
  "mcpServers": {
    "oi-wiki": {
      "command": "uv",
      "args": [
        "--directory",
        "<path of MCP servers>/mcp-oi-wiki",
        "run",
        "python",
        "main.py"
      ]
    }
  }
}
Update
You can generate your own db/oi-wiki.db.
Put your SiliconFlow API key in the api.key file.
Then run:
uv run script/request.py
Download the summary results from the batch inference page to result.jsonl.
Finally, run:
uv run script/gendb.py
This generates a new db/oi-wiki.db.
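As a rough sketch of what the database-generation step might do, the snippet below assumes result.jsonl holds one JSON object per line with "path" and "summary" fields (field names are an assumption) and uses a toy embed() stand-in rather than the real embedding call in script/gendb.py.

import json
import sqlite3

import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in for the real embedding call (e.g. an OpenAI-compatible
    embeddings endpoint); returns a deterministic toy vector here."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(8)
    return v / np.linalg.norm(v)

def build_db(jsonl_path: str, db_path: str) -> None:
    """Read {path, summary} records from result.jsonl, embed each summary,
    and store (path, summary, embedding blob) rows in a SQLite database."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pages (path TEXT, summary TEXT, vec BLOB)"
    )
    with open(jsonl_path, encoding="utf-8") as f:
        for line in f:
            rec = json.loads(line)
            vec = embed(rec["summary"]).astype(np.float32).tobytes()
            conn.execute(
                "INSERT INTO pages VALUES (?, ?, ?)",
                (rec["path"], rec["summary"], vec),
            )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    build_db("result.jsonl", "db/oi-wiki.db")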
Thanks