diff --git a/README.md b/README.md
index d51a380b14..5be2758efe 100644
--- a/README.md
+++ b/README.md
@@ -5,27 +5,27 @@
- cognee - Memory for AI Agents in 6 lines of code
+ Cognee - Accurate and Persistent AI Memory

 Demo
 .
- Learn more
+ Docs
+ .
+ Learn More
 ·
 Join Discord
 ·
 Join r/AIMemory
 .
- Docs
- .
- cognee community repo
+ Community Plugins & Add-ons

 [![GitHub forks](https://img.shields.io/github/forks/topoteretes/cognee.svg?style=social&label=Fork&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/network/)
 [![GitHub stars](https://img.shields.io/github/stars/topoteretes/cognee.svg?style=social&label=Star&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/stargazers/)
 [![GitHub commits](https://badgen.net/github/commits/topoteretes/cognee)](https://GitHub.com/topoteretes/cognee/commit/)
- [![Github tag](https://badgen.net/github/tag/topoteretes/cognee)](https://github.com/topoteretes/cognee/tags/)
+ [![GitHub tag](https://badgen.net/github/tag/topoteretes/cognee)](https://github.com/topoteretes/cognee/tags/)
 [![Downloads](https://static.pepy.tech/badge/cognee)](https://pepy.tech/project/cognee)
 [![License](https://img.shields.io/github/license/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/blob/main/LICENSE)
 [![Contributors](https://img.shields.io/github/contributors/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/graphs/contributors)
@@ -41,11 +41,7 @@

-
-
-
-
-Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Extract, Cognify, Load) pipelines.
+Use your data to build personalized and dynamic memory for AI Agents. Cognee lets you replace RAG with scalable and modular ECL (Extract, Cognify, Load) pipelines.

 🌐 Available Languages
@@ -53,7 +49,7 @@ Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Ext
 Deutsch |
 Español |
- français |
+ Français |
 日本語 |
 한국어 |
 Português |
@@ -67,69 +63,65 @@ Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Ext
+## About Cognee
+Cognee is an open-source tool and platform that transforms your raw data into persistent and dynamic AI memory for Agents. It combines vector search with graph databases to make your documents both searchable by meaning and connected by relationships.
-## Get Started
-
-Get started quickly with a Google Colab notebook , Deepnote notebook or starter repo
-
+You can use Cognee in two ways:
-## About cognee
+1. [Self-host Cognee Open Source](https://docs.cognee.ai/getting-started/installation), which stores all data locally by default.
+2. [Connect to Cognee Cloud](https://platform.cognee.ai/), which runs the same OSS stack on managed infrastructure for easier development and productionization.
-cognee works locally and stores your data on your device.
-Our hosted solution is just our deployment of OSS cognee on Modal, with the goal of making development and productionization easier.
+### Cognee Open Source (self-hosted):
-Self-hosted package:
+- Interconnects any type of data — including past conversations, files, images, and audio transcriptions
+- Replaces traditional RAG systems with a unified memory layer built on graphs and vectors
+- Reduces developer effort and infrastructure cost while improving quality and precision
+- Provides Pythonic data pipelines for ingestion from 30+ data sources
+- Offers high customizability through user-defined tasks, modular pipelines, and built-in search endpoints
-- Interconnects any kind of documents: past conversations, files, images, and audio transcriptions
-- Replaces RAG systems with a memory layer based on graphs and vectors
-- Reduces developer effort and cost, while increasing quality and precision
-- Provides Pythonic data pipelines that manage data ingestion from 30+ data sources
-- Is highly customizable with custom tasks, pipelines, and a set of built-in search endpoints
+### Cognee Cloud (managed):
+- Hosted web UI dashboard
+- Automatic version updates
+- Resource usage analytics
+- GDPR-compliant, enterprise-grade security
-Hosted platform:
-- Includes a managed UI and a [hosted solution](https://www.cognee.ai)
+## Basic Usage & Feature Guide
+To learn more, [check out this short, end-to-end Colab walkthrough](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing) of Cognee's core features.
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing)
-## Self-Hosted (Open Source)
+## Quickstart
+Let’s try Cognee in just a few lines of code. For detailed setup and configuration, see the [Cognee Docs](https://docs.cognee.ai/getting-started/installation#environment-configuration).
-### 📦 Installation
+### Prerequisites
-You can install Cognee using either **pip**, **poetry**, **uv** or any other python package manager..
+- Python 3.10 to 3.12
-Cognee supports Python 3.10 to 3.12
+### Step 1: Install Cognee
-#### With uv
+You can install Cognee with **pip**, **poetry**, **uv**, or your preferred Python package manager.
 ```bash
 uv pip install cognee
 ```
-Detailed instructions can be found in our [docs](https://docs.cognee.ai/getting-started/installation#environment-configuration)
-
-### 💻 Basic Usage
-
-#### Setup
-
-```
+### Step 2: Configure the LLM
+```python
 import os
 os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"
-
 ```
+Alternatively, create a `.env` file using our [template](https://github.com/topoteretes/cognee/blob/main/.env.template).
-You can also set the variables by creating .env file, using our template.
-To use different LLM providers, for more info check out our documentation
-
-
-#### Simple example
+To integrate other LLM providers, see our [LLM Provider Documentation](https://docs.cognee.ai/setup-configuration/llm-providers).
+### Step 3: Run the Pipeline
+Cognee will take your documents, generate a knowledge graph from them, and then answer queries using the combined relationships in the graph.
-##### Python
-
-This script will run the default pipeline:
+Now, run a minimal pipeline:
 ```python
 import cognee
@@ -147,7 +139,7 @@ async def main():
     await cognee.memify()

     # Query the knowledge graph
-    results = await cognee.search("What does cognee do?")
+    results = await cognee.search("What does Cognee do?")

     # Display the results
     for result in results:
@@ -158,69 +150,61 @@ if __name__ == '__main__':
     asyncio.run(main())
 ```
-Example output:
-```
- Cognee turns documents into AI memory.
+As you can see, the output is generated from the document we previously stored in Cognee:
+
+```bash
+ Cognee turns documents into AI memory.
 ```
-##### Via CLI
-Let's get the basics covered
+### Use the Cognee CLI
-```
+As an alternative, you can get started with these essential commands:
+
+```bash
 cognee-cli add "Cognee turns documents into AI memory."
 cognee-cli cognify
-cognee-cli search "What does cognee do?"
+cognee-cli search "What does Cognee do?"
 cognee-cli delete --all
 ```
-
+
+To open the local UI, run:
-```
+```bash
 cognee-cli -ui
 ```
+## Demos & Examples
-
-
-
-### Hosted Platform
-
-Get up and running in minutes with automatic updates, analytics, and enterprise security.
-
-1. Sign up on [cogwit](https://www.cognee.ai)
-2. Add your API key to local UI and sync your data to Cogwit
-
-
-
-
-## Demos
+See Cognee in action:
-1. Cogwit Beta demo:
+### Cognee Cloud Beta Demo
-[Cogwit Beta](https://github.com/user-attachments/assets/fa520cd2-2913-4246-a444-902ea5242cb0)
+[Watch Demo](https://github.com/user-attachments/assets/fa520cd2-2913-4246-a444-902ea5242cb0)
-2. Simple GraphRAG demo
+### Simple GraphRAG Demo
-[Simple GraphRAG demo](https://github.com/user-attachments/assets/d80b0776-4eb9-4b8e-aa22-3691e2d44b8f)
+[Watch Demo](https://github.com/user-attachments/assets/d80b0776-4eb9-4b8e-aa22-3691e2d44b8f)
-3. cognee with Ollama
+### Cognee with Ollama
-[cognee with local models](https://github.com/user-attachments/assets/8621d3e8-ecb8-4860-afb2-5594f2ee17db)
+[Watch Demo](https://github.com/user-attachments/assets/8621d3e8-ecb8-4860-afb2-5594f2ee17db)
-## Contributing
-Your contributions are at the core of making this a true open source project. Any contributions you make are **greatly appreciated**. See [`CONTRIBUTING.md`](CONTRIBUTING.md) for more information.
+## Community & Support
+### Contributing
+We welcome contributions from the community! Your input helps make Cognee better for everyone. See [`CONTRIBUTING.md`](CONTRIBUTING.md) to get started.
-## Code of Conduct
+### Code of Conduct
-We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.
+We're committed to fostering an inclusive and respectful community. Read our [Code of Conduct](https://github.com/topoteretes/cognee/blob/main/CODE_OF_CONDUCT.md) for guidelines.
-## Citation
+## Research & Citation
-We now have a paper you can cite:
+We recently published a research paper on optimizing knowledge graphs for LLM reasoning:
 ```bibtex
 @misc{markovic2025optimizinginterfaceknowledgegraphs,