Quick Answer
Redis caching keeps frequently-accessed data in RAM (0.1-1ms reads) instead of hitting the database (10-100ms per query), roughly 100x faster per lookup.
• The problem: a D2C dashboard loads 500 products for 200 users per day. Without caching, every user triggers 500 database queries: 100,000 queries daily (500 × 200), CPU maxed out, 8-second page loads.
• With Redis: the first user triggers 500 queries and the results are cached in RAM; the next 199 users get instant cached results with zero database queries. Total: 500 queries instead of 100,000 (a 200x reduction), and pages load in 0.2s instead of 8s (40x faster).
• How it works: Redis is an ultra-fast in-memory data store. A database query takes 10-100ms; a Redis GET takes 0.1-1ms, about 100x faster.
• Installation: on Ubuntu, sudo apt-get install redis-server, start it with systemctl start redis-server, and verify with redis-cli ping (returns PONG). On Docker, use the redis:7-alpine image with persistence enabled (appendonly yes).
• Odoo configuration: edit odoo.conf with session_store = redis (user sessions stored in Redis instead of the database), session_store_url = redis://localhost:6379, ormcache_redis_url = redis://localhost:6379 (cached ORM values), ormcache_redis_expire = 3600 (1 hour), and workers = 4 for a multi-worker setup.
• Caching strategies: (1) session caching (automatic; logged-in sessions served from Redis instantly), (2) ORM caching (@ormcache on expensive computations: calculate once, then serve from cache), (3) custom data caching (@ormcache('self.env.uid') for expensive queries such as monthly sales), (4) cache invalidation (critical: clear the cache whenever the underlying data changes).
• Real pattern: an e-commerce dashboard showing 50 categories with sales stats. Without caching: 50+ queries per request, 5-second loads, and 5,000+ queries per second at 100 concurrent users. With caching: a few aggregated queries (read_group) on the first request, cached for 1 hour, and every later request in that hour is served from Redis with zero database queries. Load time drops from 5s to 0.1s (50x faster) and database load drops from 50+ queries per request to a handful per hour.
• Impact: blazing-fast pages on modest infrastructure, and $100k-$300k saved versus constantly buying bigger servers.
The Caching Performance Problem
Your D2C dashboard loads a product catalog. Same 500 products, displayed to 200 users per day.
| Scenario | Queries | Load Time | Result |
|---|---|---|---|
| Without Caching | 100,000 daily (500 × 200) | 8 seconds | CPU maxed out |
| With Redis Caching | 500 total (first user only) | 0.2 seconds | Instant for 199 users |
100,000 → 500 queries
8s → 0.2s
That's 40x faster with Redis caching.
We've implemented 150+ Odoo systems. The ones with Redis caching? Blazing fast, happy users, modest infrastructure. The ones without? Constantly buying bigger servers to handle the load, still slow, money wasted. That's $100,000-$300,000 in unnecessary infrastructure costs.
What Is Redis?
Redis: Ultra-fast in-memory data store (like Memcached, but more powerful).
Speed Comparison
| Storage Type | Access Time | Speed Difference |
|---|---|---|
| Database Query | 10-100ms | Baseline (slowest) |
| Redis Get | 0.1-1ms | 100x faster than database |
| Server RAM (in-process) | 0.01ms | Fastest, but not shared across workers and lost between requests |
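You can measure the gap on your own server: redis-cli ships a built-in latency tester (this assumes Redis is already running locally on the default port).
# Measure round-trip latency to a local Redis (Ctrl+C to stop)
redis-cli --latency
# On localhost, min/avg/max are typically well under 1ms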
Installing Redis
On Linux (Ubuntu/Debian)
# Install Redis
sudo apt-get update
sudo apt-get install redis-server
# Start Redis
sudo systemctl start redis-server
sudo systemctl enable redis-server # Auto-start on reboot
# Verify running
redis-cli ping
# Should return: PONG
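Before pointing Odoo at it, it is worth setting a few basics in /etc/redis/redis.conf (the Ubuntu/Debian default path); the values below are an illustrative sketch, so adjust the bind address and memory limit to your server:
# /etc/redis/redis.conf (illustrative values)
bind 127.0.0.1                  # listen locally only; widen if Odoo runs on another host
maxmemory 2gb                   # cap Redis memory usage
maxmemory-policy allkeys-lru    # evict least-recently-used keys when full
appendonly yes                  # persist data across restarts
# Apply the changes
sudo systemctl restart redis-server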
Using Docker (Recommended for Production)
version: '3'
services:
  db:
    image: postgres:15
    environment:
      - POSTGRES_USER=odoo
      - POSTGRES_PASSWORD=odoo
      - POSTGRES_DB=postgres
    volumes:
      - db_data:/var/lib/postgresql/data
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    command: redis-server --appendonly yes  # Persistence
  odoo:
    image: odoo:18
    ports:
      - "8069:8069"
    environment:
      - HOST=db
      - USER=odoo
      - PASSWORD=odoo
      - REDIS_URL=redis://redis:6379
    depends_on:
      - db
      - redis
volumes:
  db_data:
  redis_data:
docker-compose up -d
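Once the stack is up, confirm the Redis container answers from inside the Compose network (service names match the file above):
# Ping Redis inside the running container
docker-compose exec redis redis-cli ping
# Should return: PONG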
Configuring Odoo for Redis
Edit odoo.conf
[options]
# Session caching (stores user sessions in Redis)
session_store = redis
session_store_url = redis://localhost:6379
# ORM caching (stores calculated values in Redis)
ormcache_redis_url = redis://localhost:6379
# Expire cached values after 1 hour
ormcache_redis_expire = 3600
# Multi-worker setup
workers = 4
longpolling_port = 8072
For Remote Redis Server
session_store_url = redis://192.168.1.10:6379/0
ormcache_redis_url = redis://192.168.1.10:6379/1
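Before restarting Odoo, confirm the remote instance is reachable from the Odoo server (host and database numbers match the URLs above):
# Test connectivity to the remote Redis databases
redis-cli -h 192.168.1.10 -p 6379 -n 0 ping
redis-cli -h 192.168.1.10 -p 6379 -n 1 ping
# Both should return: PONG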
Restart Odoo
sudo systemctl restart odoo
Verify Working
# Check Redis keys
redis-cli
> KEYS *
> MEMORY STATS
> INFO stats
Caching Strategies
Strategy 1: Session Caching (Automatic)
• User logs in
• Session data stored in Redis (not database)
• Subsequent requests get session from Redis (instant)
• User logs out → Session deleted from Redis
Result: Faster login/logout, less database load.
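To confirm sessions are actually landing in Redis, log in to Odoo and list the keys; the exact key prefix depends on the session-store module you use, so the pattern below is an assumption:
# List session-related keys without blocking Redis
redis-cli --scan --pattern "*session*"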
Strategy 2: ORM Caching (Computed Values)
from odoo import models, fields, api
from odoo.tools import ormcache
class Product(models.Model):
    _inherit = 'product.product'

    # Without caching, this would be recalculated from scratch on every access
    profit_margin = fields.Float(compute='_compute_profit')

    @ormcache('product_id')  # Cache by product ID
    def _get_profit_margin(self, product_id):
        """Expensive part of the calculation, cached per product."""
        product = self.browse(product_id)
        if product.list_price > 0:
            profit = product.list_price - product.standard_price
            return (profit / product.list_price) * 100
        return 0.0

    @api.depends('list_price', 'standard_price')
    def _compute_profit(self):
        """The compute delegates to the cached method, so repeat work is skipped."""
        for product in self:
            product.profit_margin = product._get_profit_margin(product.id)
Strategy 3: Custom Data Caching
from odoo import models
from odoo.tools import ormcache

class SalesAnalytics(models.Model):
    _name = 'sales.analytics'
    _description = 'Sales Analytics'

    @ormcache('self.env.uid')  # Cache per user
    def get_monthly_sales(self):
        """Expensive aggregation, served from cache after the first call."""
        return self.env['sale.order'].read_group(
            domain=[('state', '=', 'done')],
            fields=['amount_total:sum'],
            groupby=['date_order:month'],
        )

    # Manual cache management
    def invalidate_cache(self):
        """Clear the cached result when data changes."""
        self.get_monthly_sales.clear_cache(self)
Strategy 4: Cache Invalidation (CRITICAL!)
class Product(models.Model):
    _inherit = 'product.product'

    @api.constrains('standard_price')
    def _invalidate_profit_on_cost_change(self):
        """Clear the cached profit margin when the cost changes."""
        # Nuclear option (clears every ormcache entry in the registry):
        # self.env.registry.clear_cache()
        # Selective clear: only the cached margin method
        self._get_profit_margin.clear_cache(self)

class SaleOrder(models.Model):
    _inherit = 'sale.order'

    def action_confirm(self):
        """Invalidate related caches on order confirmation."""
        result = super().action_confirm()
        # Clear the cached sales analytics so dashboards pick up the new order
        self.env['sales.analytics'].invalidate_cache()
        return result
⚠️ Critical:
Always implement cache invalidation when data changes. Stale cache = incorrect data displayed to users.
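Constraints only fire on create/write; if cached data also depends on records being deleted, overriding write() and unlink() covers both paths. A minimal sketch, assuming the cached _get_profit_margin method from Strategy 2:
from odoo import models

class Product(models.Model):
    _inherit = 'product.product'

    def write(self, vals):
        result = super().write(vals)
        if 'list_price' in vals or 'standard_price' in vals:
            # Drop cached margins whenever a pricing field changes
            self._get_profit_margin.clear_cache(self)
        return result

    def unlink(self):
        # Cached entries for deleted products are no longer valid
        self._get_profit_margin.clear_cache(self)
        return super().unlink()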
Real D2C Implementation: E-Commerce Dashboard
Scenario: E-commerce dashboard showing 50 product categories with sales stats.
WITHOUT CACHING
from odoo import models

class EcommerceDashboard(models.Model):
    _name = 'ecommerce.dashboard'
    _description = 'E-commerce Dashboard'
def get_dashboard_data(self):
"""Expensive: 50 queries per request."""
categories = self.env['product.category'].search([])
data = []
for category in categories:
# Each category: search orders, aggregate
orders = self.env['sale.order'].search([
('order_line.product_id.categ_id', '=', category.id),
('state', '=', 'done')
])
total_sales = sum(o.amount_total for o in orders)
order_count = len(orders)
data.append({
'category': category.name,
'sales': total_sales,
'orders': order_count,
})
return data
# Time: 5 seconds per request
# Queries: 50+ per request
# With 100 concurrent users: 5000+ queries/second
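To verify numbers like these on your own instance, Odoo can log every SQL statement it runs; start the server with the SQL logger at DEBUG (the path to odoo-bin depends on your install), load the dashboard once, and count the query lines. Log volume is heavy, so only do this temporarily:
# Log every SQL query Odoo executes (development only)
./odoo-bin --log-handler=odoo.sql_db:DEBUG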
WITH CACHING
from odoo import models
from odoo.tools import ormcache

class EcommerceDashboard(models.Model):
    _name = 'ecommerce.dashboard'
    _description = 'E-commerce Dashboard'

    @ormcache()
    def get_dashboard_data(self):
        """Cached: a few aggregated queries the first time, then served from Redis."""
        categories = self.env['product.category'].search([])
        # Aggregate confirmed order lines by product in a single query
        line_data = self.env['sale.order.line'].read_group(
            domain=[('order_id.state', '=', 'done')],
            fields=['price_total:sum'],
            groupby=['product_id'],
        )
        # Map each product to its category (a couple of batched reads, not one search per category)
        product_ids = [d['product_id'][0] for d in line_data if d['product_id']]
        categ_by_product = {
            p.id: p.categ_id.id
            for p in self.env['product.product'].browse(product_ids)
        }
        # Roll per-product totals up to categories (no further queries)
        sales_by_cat, orders_by_cat = {}, {}
        for d in line_data:
            if not d['product_id']:
                continue
            cat_id = categ_by_product.get(d['product_id'][0])
            sales_by_cat[cat_id] = sales_by_cat.get(cat_id, 0.0) + (d['price_total'] or 0.0)
            # line count as a simple proxy for order volume per category
            orders_by_cat[cat_id] = orders_by_cat.get(cat_id, 0) + d['product_id_count']
        # Build result (no queries)
        data = []
        for category in categories:
            data.append({
                'category': category.name,
                'sales': sales_by_cat.get(category.id, 0),
                'orders': orders_by_cat.get(category.id, 0),
            })
        return data
def invalidate_dashboard_cache(self):
"""Clear cache when orders/products change."""
self.get_dashboard_data.clear_cache(self)
# Usage
class SaleOrder(models.Model):
_inherit = 'sale.order'
def action_confirm(self):
result = super().action_confirm()
# Invalidate dashboard cache
dashboard = self.env['ecommerce.dashboard']
dashboard.invalidate_dashboard_cache()
return result
Performance Comparison
| Metric | Without Cache | With Redis Cache |
|---|---|---|
| First Request | 50+ queries, 5 seconds | A few aggregated queries, result cached in Redis (1 hour) |
| Subsequent Requests | 50+ queries, 5 seconds each | 0 queries, served from Redis (instant) |
| Load Time | 5 seconds | 0.1 seconds (50x faster) |
| Database Load | 50+ queries on every request | A few queries per hour (only when the cache refreshes) |
Monitoring & Troubleshooting
Monitor Redis
# Check memory usage
redis-cli INFO memory
# Check keys in cache (KEYS blocks Redis; prefer --scan on busy production instances)
redis-cli --scan
redis-cli DBSIZE
# Monitor in real-time
redis-cli MONITOR
# Clear all cache (use carefully!)
redis-cli FLUSHALL
Debug Odoo Caching
# Check if Redis is connected
from odoo.tools import config
print(config.get('session_store_url'))
# Clear all ORM caches programmatically
self.env.registry.clear_cache()
# Log ormcache hit/miss statistics to the server log
from odoo.tools.cache import log_ormcache_stats
log_ormcache_stats()
Common Issues
| Problem | Solution |
|---|---|
| Redis not connecting | Check redis.conf, verify port 6379 open, restart Redis |
| Cache not invalidating | Call self.env.registry.clear_cache() or the specific method's clear_cache() |
| Memory growing unbounded | Set maxmemory policy in redis.conf (maxmemory 2gb, maxmemory-policy allkeys-lru) |
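The memory limits can also be applied to a running instance without a restart (values are examples; persist them in redis.conf afterwards):
# Apply memory limits at runtime
redis-cli CONFIG SET maxmemory 2gb
redis-cli CONFIG SET maxmemory-policy allkeys-lru
redis-cli CONFIG GET maxmemory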
Your Action Items
Install Redis
❏ Install Redis server (or Docker)
❏ Verify Redis running (redis-cli ping)
❏ Configure firewall (port 6379)
Configure Odoo
❏ Update odoo.conf with Redis URL
❏ Set session_store = redis
❏ Set ormcache_redis_url
❏ Restart Odoo
Implement Caching
❏ Identify expensive queries/calculations
❏ Add @ormcache decorator
❏ Implement cache invalidation
❏ Monitor Redis usage
Monitor & Maintain
❏ Watch Redis memory usage
❏ Set maxmemory limits
❏ Clear unused cache regularly
❏ Profile for cache hit ratio
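For the last item, Redis exposes the counters behind the hit ratio: keyspace_hits / (keyspace_hits + keyspace_misses); a high ratio means most reads are served from cache.
# Pull the hit/miss counters
redis-cli INFO stats | grep -E "keyspace_hits|keyspace_misses"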
Free Redis Caching Assessment
Stop running slow Odoo systems on huge servers. We'll identify cacheable queries/calculations, set up Redis (Docker recommended), configure Odoo for Redis, implement caching where it matters, and monitor cache performance. Most D2C brands can achieve 20-50x speedups with proper caching. Cost: 8-12 hours consulting. Value: $150,000+ in infrastructure you don't need to buy.
