JUHE API Marketplace
MCP Server

cognee

Memory manager for AI apps and Agents using various graph and vector stores and allowing ingestion from 30+ data sources

GitHub Stars: 7537
Last Updated: 10/3/2025
Configuration: none; please check the documentation below.

README Documentation


cognee - Memory for AI Agents in 6 lines of code

Demo · Learn more · Join Discord · Join r/AIMemory · Docs · cognee community repo



Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Extract, Cognify, Load) pipelines.

🌐 Available Languages: Deutsch | Español | français | 日本語 | 한국어 | Português | Русский | 中文

Why cognee?

Get Started

Get started quickly with a Google Colab notebook, a Deepnote notebook, or the starter repo.

About cognee

cognee works locally and stores your data on your device. Our hosted solution is just our deployment of OSS cognee on Modal, with the goal of making development and productionization easier.

Self-hosted package:

  • Interconnects any kind of documents: past conversations, files, images, and audio transcriptions
  • Replaces RAG systems with a memory layer based on graphs and vectors
  • Reduces developer effort and cost, while increasing quality and precision
  • Provides Pythonic data pipelines that manage data ingestion from 30+ data sources (see the sketch after this list)
  • Is highly customizable with custom tasks, pipelines, and a set of built-in search endpoints
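
For example, raw text and local files go through the same add() call before a single cognify() run builds one interconnected graph. A minimal sketch, assuming cognee.add() accepts local file paths as well as plain strings; the file path below is purely illustrative, and the full list of supported sources is in the docs:

import asyncio
import cognee

async def build_memory():
    # Raw text and a local document are ingested through the same call
    await cognee.add("Cognee turns documents into AI memory.")
    await cognee.add("/path/to/meeting_notes.pdf")  # hypothetical file path

    # Build one interconnected knowledge graph from everything added above
    await cognee.cognify()

if __name__ == "__main__":
    asyncio.run(build_memory())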

Hosted platform: see the Hosted Platform section below.

Self-Hosted (Open Source)

📦 Installation

You can install cognee using pip, poetry, uv, or any other Python package manager.

Cognee supports Python 3.10 to 3.12

With uv

uv pip install cognee
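
With pip or poetry

The equivalent commands for the other package managers mentioned above:

pip install cognee

poetry add cognee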

Detailed instructions can be found in our docs

💻 Basic Usage

Setup

import os

# Set your OpenAI API key (or the key for whichever LLM provider you configure)
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"

You can also set these variables by creating a .env file from our template. To use a different LLM provider, check out our documentation for more info.
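
The provider settings can likewise be set directly in code. A minimal sketch; LLM_PROVIDER and LLM_MODEL are assumed variable names, so confirm them against the .env template shipped with your version:

import os

# Assumed variable names; check the .env template in the repo for your version
os.environ["LLM_PROVIDER"] = "openai"        # e.g. "openai" or "ollama"
os.environ["LLM_MODEL"] = "gpt-4o-mini"      # any model your provider supports
os.environ["LLM_API_KEY"] = "YOUR_API_KEY"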

Simple example

Python

This script will run the default pipeline:

import cognee
import asyncio

async def main():
    # Add text to cognee
    await cognee.add("Cognee turns documents into AI memory.")

    # Generate the knowledge graph
    await cognee.cognify()

    # Add memory algorithms to the graph
    await cognee.memify()

    # Query the knowledge graph
    results = await cognee.search("What does cognee do?")

    # Display the results
    for result in results:
        print(result)

if __name__ == '__main__':
    asyncio.run(main())

Example output:

  Cognee turns documents into AI memory.
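
Search results can also be requested in different forms (for example, a graph-grounded answer rather than raw text chunks). A minimal sketch, assuming the SearchType enum exported by recent cognee releases; the exact import path and parameter names may differ in your version:

import cognee
import asyncio
from cognee import SearchType  # assumed export; may live elsewhere in older releases

async def query():
    # Ask for an LLM-composed answer grounded in the knowledge graph
    results = await cognee.search(
        "What does cognee do?",
        query_type=SearchType.GRAPH_COMPLETION,  # assumed enum member
    )
    for result in results:
        print(result)

if __name__ == "__main__":
    asyncio.run(query())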

Via CLI

Let's get the basics covered:

cognee-cli add "Cognee turns documents into AI memory."

cognee-cli cognify

cognee-cli search "What does cognee do?"
cognee-cli delete --all

or run

cognee-cli -ui

Hosted Platform

Get up and running in minutes with automatic updates, analytics, and enterprise security.

  1. Sign up on Cogwit
  2. Add your API key to the local UI and sync your data to Cogwit

Demos

  1. Cogwit Beta demo
  2. Simple GraphRAG demo
  3. cognee with Ollama (local models)

Contributing

Your contributions are at the core of making this a true open source project. Any contributions you make are greatly appreciated. See CONTRIBUTING.md for more information.

Code of Conduct

We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.

Citation

We now have a paper you can cite:

@misc{markovic2025optimizinginterfaceknowledgegraphs,
      title={Optimizing the Interface Between Knowledge Graphs and LLMs for Complex Reasoning}, 
      author={Vasilije Markovic and Lazar Obradovic and Laszlo Hajdu and Jovan Pavlovic},
      year={2025},
      eprint={2505.24478},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2505.24478}, 
}


Key Features

  • Model Context Protocol
  • Secure Communication
  • Real-time Updates
  • Open Source
