mem0x

Fork of mem0 with fixes

Quickstart

Install the SDK via pip:

pip install git+https://github.com/turtlebasket/mem0x@main

Or install the SDK via npm:

npm install git+https://github.com/turtlebasket/mem0x#main
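
Both commands pull the fork directly from its main branch. For the Python SDK, a quick import is enough to confirm the install went through (nothing here is mem0x-specific; Memory is the same entry point used in the examples below):

# Sanity check: this import should succeed after the pip install above
from mem0 import Memory
print("mem0 installed:", Memory.__name__)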

Development

# clone this fork and enter it
git clone https://github.com/turtlebasket/mem0x
cd mem0x
# add upstream mem0 as a remote; the local `mem0` branch mirrors upstream main
# and must be kept in sync before cherry-picking any upstream changes
git remote add mem0 https://github.com/mem0ai/mem0.git
git fetch mem0
git checkout mem0
git merge mem0/main

Basic Usage

Mem0 requires an LLM to function, with gpt-4o-mini from OpenAI as the default. However, it supports a variety of LLMs; for details, refer to our Supported LLMs documentation.
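
To use a different model or provider, the memory can be built from a config dict via Memory.from_config. The exact provider names and config keys are listed in the Supported LLMs documentation and may differ between versions, so treat the snippet below as a sketch:

from mem0 import Memory

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",      # any supported model
            "temperature": 0.1,
        },
    }
}

memory = Memory.from_config(config)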

The first step is to instantiate the memory:

from openai import OpenAI
from mem0 import Memory

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
memory = Memory()         # default config also uses OpenAI (gpt-4o-mini)

def chat_with_memories(message: str, user_id: str = "default_user") -> str:
    # Retrieve relevant memories
    relevant_memories = memory.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])

    # Generate Assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    memory.add(messages, user_id=user_id)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()
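
Beyond search() and add(), the Memory instance exposes a few more operations from the upstream mem0 API. Return shapes can vary between versions, so the snippet below (which continues with the memory instance from the example above) is a sketch rather than a full reference:

# List everything stored for a user
all_memories = memory.get_all(user_id="default_user")
for entry in all_memories["results"]:
    print(entry["id"], entry["memory"])

# Remove a single memory by id, or wipe a user's memories entirely
# memory.delete(memory_id="<memory-id>")
# memory.delete_all(user_id="default_user")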

For detailed integration steps, see the Quickstart and API Reference.

🔗 Integrations & Demos

  • ChatGPT with Memory: Personalized chat powered by Mem0 (Live Demo)
  • Browser Extension: Store memories across ChatGPT, Perplexity, and Claude (Chrome Extension)
  • LangGraph Support: Build a customer bot with LangGraph + Mem0 (Guide)
  • CrewAI Integration: Tailor CrewAI outputs with Mem0 (Example)
