How to Use Mem0 for Persistent AI Memory: A Complete Step-by-Step Guide
By Braincuber Team
Published on April 21, 2026
Mem0 solves a fundamental problem with LLMs: they have no memory between sessions. Every conversation starts from a blank slate. For casual use, this is fine. But for real-world applications, your users expect the AI to remember them. Mem0 adds a persistent memory layer that stores, retrieves, and updates information about users across sessions.
What You'll Learn:
- What Mem0 is and how it differs from conversation history
- Setting up Mem0 with both Platform and self-hosted options
- Basic memory operations: add, search, update, delete
- Building a memory-powered AI agent with OpenAI Agents SDK
- Custom categories and instructions for fine-tuned behavior
- Graph memory for tracking relationships
What Is Mem0?
Mem0 is an open-source memory layer that sits between your application and the LLM. It automatically extracts relevant information from conversations, stores it, and retrieves it when needed. The project raised $24M in October 2025 and works with any LLM provider: OpenAI, Anthropic, Ollama, or your own models.
User Memory
Persists across all conversations with a specific person. If someone mentions they prefer morning study sessions, that fact stays available in every future session.
Session Memory
Tracks context within a single conversation, like the current recipe being discussed.
Agent Memory
Stores information specific to a particular AI agent instance.
Graph Memory
Tracks relationships between entities, which is useful when order, progression, or connections between facts matter.
Under the hood, Mem0 combines vector search with graph relationships. When you add a conversation to memory, it automatically extracts the important bits. You do not need to manually tag what should be remembered.
Performance Note
On the LOCOMO benchmark, Mem0 scored 26% higher than OpenAI's built-in memory. It responds 91% faster by selectively retrieving relevant memories instead of processing full conversation history. Token usage drops by about 90%.
Getting Started With Mem0
Install the package along with python-dotenv for managing API keys:
```bash
pip install mem0ai python-dotenv
```
Sign up at app.mem0.ai and grab your API key. Create a .env file:

```
MEM0_API_KEY=your-api-key-here
```
Initializing the Client
Initialize the Mem0 client using the API key:
```python
from mem0 import MemoryClient
from dotenv import load_dotenv
import os

load_dotenv()
client = MemoryClient(api_key=os.getenv("MEM0_API_KEY"))
```
Adding Memory
To store a memory, pass a conversation in the OpenAI chat format. The user_id parameter scopes memories to a specific user:
```python
messages = [
    {"role": "user", "content": "I'm a vegetarian and allergic to nuts."},
    {"role": "assistant", "content": "Got it! I'll remember that."}
]
client.add(messages, user_id="user123")
```
Searching Memory
To retrieve memories, use search() with a natural language query:
```python
results = client.search("dietary restrictions", filters={"user_id": "user123"})
```
The response contains the extracted memories with relevance scores. Notice that Mem0 automatically split "vegetarian" and "allergic to nuts" into two separate facts.
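A common next step is to fold the retrieved memories into the system prompt before calling the LLM. The helper below is a minimal sketch that assumes each search result is a dict with a "memory" key (the sample data is hand-written for illustration; verify the response shape against your client version):

```python
def memories_to_prompt(results):
    """Format Mem0 search results into a system-prompt snippet.

    Assumes each result is a dict with a "memory" key, e.g.
    {"memory": "Is vegetarian", "score": 0.61}. Check the actual
    response shape of your mem0ai client version.
    """
    if not results:
        return "No stored memories for this user."
    lines = [f"- {r['memory']}" for r in results]
    return "Known facts about this user:\n" + "\n".join(lines)

# Hand-written sample results for illustration:
sample = [
    {"memory": "Is vegetarian", "score": 0.61},
    {"memory": "Is allergic to nuts", "score": 0.58},
]
print(memories_to_prompt(sample))
```

Prepending this snippet to the system prompt is what lets the model answer "What can I cook for you?" without re-asking about dietary restrictions.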
Building a Mem0 AI Agent
Now let's build a learning companion agent that autonomously decides when to store, retrieve, and update memories. We will use the OpenAI Agents SDK to give our agent memory tools it can call on its own.
Setting Up Dependencies
```bash
pip install openai-agents mem0ai python-dotenv
```
Creating Memory Tools
The agent needs three tools: search, save, and update. We use the @function_tool decorator to expose these to the agent.
Important Note
Memory processing is asynchronous. After calling save_memory, there is a brief delay before the new memory becomes searchable. In production, account for this in your user interface.
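The three tools can be sketched as below. To keep the sketch self-contained, they are written as a factory over any Mem0-style client; in the real agent you would wrap each returned function with the SDK's @function_tool decorator before registering it. The function names, docstrings, and return strings are illustrative choices, not part of either SDK:

```python
def make_memory_tools(client, user_id):
    """Build search/save/update tools bound to one user.

    `client` is any object exposing Mem0-style search/add/update
    methods. Wrap each returned function with the OpenAI Agents
    SDK's @function_tool before handing it to the agent.
    """

    def search_memory(query: str) -> str:
        """Look up stored facts relevant to the query."""
        results = client.search(query, filters={"user_id": user_id})
        return "\n".join(r["memory"] for r in results) or "No memories found."

    def save_memory(fact: str) -> str:
        """Persist a new fact about the user."""
        client.add([{"role": "user", "content": fact}], user_id=user_id)
        return "Saved."

    def update_memory(memory_id: str, new_text: str) -> str:
        """Revise an existing memory by id."""
        client.update(memory_id=memory_id, text=new_text)
        return "Updated."

    return search_memory, save_memory, update_memory
```

The docstrings matter: the Agents SDK surfaces them to the model as tool descriptions, which is how the agent decides on its own when to search versus save.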
Fine-Tuning Mem0 Behavior
Production applications need finer control over what gets stored and how it is retrieved. Mem0 offers project-level settings.
Custom Categories
Define your own categories that match your domain:
```python
client.project.update(custom_categories=[
    {"name": "topics", "description": "Programming languages, frameworks, or subjects"},
    {"name": "skill_levels", "description": "Proficiency: beginner, intermediate, advanced"},
    {"name": "goals", "description": "Learning objectives and targets"},
    {"name": "progress", "description": "Completed courses, chapters, or milestones"},
    {"name": "preferences", "description": "Learning style, schedule, or format preferences"}
])
```
Custom Instructions
Control exactly what gets extracted from conversations:
```python
client.project.update(custom_instructions="""
Extract and remember:
- Programming topics and technologies mentioned
- Current skill level for each topic
- Learning goals and deadlines
- Progress updates and completions
- Preferred learning resources (videos, docs, exercises)
Do not store:
- Personal identifiers beyond the user_id
- Payment or financial information
- Off-topic conversation that isn't about learning
""")
```
Graph Memory
Graph memory adds relationship tracking. When order, progression, or connections between entities matter, graph memory gives your agent a richer context.
Pro Feature
Graph memory is only available with a Pro plan ($249/month) or higher.
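On the Platform, graph extraction is typically switched on per call. The enable_graph flag below is an assumption based on recent Platform docs, so verify it against your mem0ai client version; a thin wrapper keeps that assumption in one place:

```python
def add_with_graph(client, messages, user_id):
    """Store a conversation with graph extraction enabled.

    Assumes the Platform client accepts an `enable_graph` keyword
    on add() (check your mem0ai version). Requires a Pro plan or
    higher, per Mem0's pricing.
    """
    return client.add(messages, user_id=user_id, enable_graph=True)
```

Once stored with the graph enabled, later searches can surface not just facts but how they connect, such as "finished chapter 3" preceding "started chapter 4".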
Platform vs Open Source
| Aspect | Platform | Open Source |
|---|---|---|
| Setup time | Minutes | Hours to days |
| Infrastructure | Fully managed | Self-managed |
| Pricing | $0-249+/month | Your infrastructure costs |
| Compliance | SOC 2, GDPR included | You implement |
Frequently Asked Questions
What is Mem0, and how does it differ from conversation history?
Mem0 is a memory layer that extracts and stores relevant facts from conversations, not full transcripts. It retrieves only contextually relevant memories per query, reducing token usage by ~90%.
Is Mem0 free to use?
The Platform offers a free tier with 10,000 memories for prototyping. Paid plans start at $19/month (Starter) and $249/month (Pro). The open-source version is free but requires self-hosted infrastructure.
Can I use Mem0 with LLM providers other than OpenAI?
Yes. Mem0 works with any LLM provider, including Anthropic, Ollama, Groq, and local models. The open-source version supports 16+ LLM providers and 24+ vector databases.
What's the difference between user, session, and agent memory?
User memory persists across all conversations with a specific person. Session memory tracks context within a single conversation. Agent memory stores information specific to a particular AI agent instance.
How do I delete or update stored memories?
Use client.update(memory_id=id, text=new_text) to modify existing memories or client.delete(memory_id=id) to remove them. For bulk deletion, use client.delete_all(user_id="user_id").
Ready to Add Memory to Your AI?
Mem0 enables you to build AI applications that actually remember users. Start with the free tier for prototyping, then scale to production as needed.
