The Reporting Performance Problem
Your D2C store generates a monthly sales report spanning 10 million order lines. A user clicks "Generate Report," and Odoo loads all 10 million records into memory:
| Metric | Value |
|---|---|
| Memory needed | 10M × 2KB = 20GB |
| Server has | 16GB |
| Result | CRASH or 5-minute timeout |
The Naive Approach
# Loads every record into memory, then iterates in Python
lines = self.env['sale.order.line'].search([('state', '=', 'done')])
report_data = []
for line in lines:
    report_data.append({
        'product': line.product_id.name,
        'qty': line.product_uom_qty,
        'revenue': line.price_total,
    })
The Optimized Approach
# Single SQL GROUP BY query; only the aggregates come back to Python
report_data = self.env['sale.order.line'].read_group(
    domain=[('state', '=', 'done')],
    fields=['product_id', 'product_uom_qty:sum', 'price_total:sum'],
    groupby=['product_id'],
)
5 minutes → 0.5 seconds
600x faster
Memory used: 50MB vs 20GB
We've implemented 150+ Odoo systems. The ones with optimized reports? Executives use dashboards daily, get instant insights, and make data-driven decisions. The ones without? Reports time out, companies bolt on $100K+ data warehouses, and executives stop trusting the data. In our experience that adds up to $200,000-$500,000 in lost analytics value.
read_group() Method (The Silver Bullet)
What it does: Aggregates data using SQL GROUP BY (fast, efficient).
Basic Syntax
result = self.env['model'].read_group(
    domain=[],    # Filter records
    fields=[],    # What to aggregate
    groupby=[],   # Group by what
)
# Returns: List of grouped aggregates
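Each element of the returned list is a plain dict: the grouped field comes back as an (id, display_name) tuple, every aggregate keeps its field name as the key, a per-group count is added automatically, and a `__domain` entry lets you drill down into that group. A minimal sketch, reusing the sale.order fields from the examples below:

```python
groups = self.env['sale.order'].read_group(
    domain=[('state', '=', 'done')],
    fields=['amount_total:sum'],
    groupby=['user_id'],
)
for g in groups:
    # g['user_id'] is an (id, display_name) tuple, or False if unset
    # g['user_id_count'] is the number of orders in the group
    # g['__domain'] can be passed back to search() to list the group's records
    print(g['user_id'], g['user_id_count'], g['amount_total'])
```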
Real D2C Example 1: Sales by Salesman
# Slow: one search() per salesman (N+1 queries)
salesmen = self.env['res.users'].search([])
report = []
for salesman in salesmen:
    orders = self.env['sale.order'].search([
        ('user_id', '=', salesman.id),
        ('state', '=', 'done'),
    ])
    total = sum(o.amount_total for o in orders)
    report.append({
        'salesman': salesman.name,
        'total_revenue': total,
        'order_count': len(orders),
    })
# Time: 60 seconds (one search for each of 100 salesmen)
# Fast: one read_group call, grouped by salesman in SQL
result = self.env['sale.order'].read_group(
    domain=[('state', '=', 'done')],
    fields=['user_id', 'amount_total:sum'],
    groupby=['user_id'],
)
# Transform result
report = [
    {
        'salesman': r['user_id'][1],        # (id, name) tuple from read_group
        'total_revenue': r['amount_total'],
        'order_count': r['user_id_count'],  # Count added automatically per group
    }
    for r in result
]
# Time: 0.5 seconds (single SQL query)
# 120x FASTER!
Real D2C Example 2: Revenue by Product and Month
# Aggregate sales by product and month.
# sale.order has no product_id field, so use the built-in sales analysis
# model sale.report, which exposes product_id, date and price_total.
result = self.env['sale.report'].read_group(
    domain=[('state', '=', 'done')],
    fields=[
        'product_id',
        'price_total:sum',
    ],
    groupby=['product_id', 'date:month'],
    orderby='product_id',
    lazy=False,  # apply both groupby levels in one query
)
# Result: one row per (product, month) pair
# A single SQL query handles all the grouping
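Consuming the two-level result is straightforward: each row carries the product tuple, the month label, and the summed revenue. A small sketch that pivots the rows into a per-product dict, assuming the sale.report call above with lazy=False:

```python
# Pivot the grouped rows into {product_name: {month_label: revenue}}
by_product = {}
for r in result:
    product = r['product_id'][1] if r['product_id'] else 'Undefined'
    month = r['date:month']  # human-readable label, e.g. "March 2024"
    by_product.setdefault(product, {})[month] = r['price_total']
```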
Available Aggregation Functions
| Function | Syntax | Description |
|---|---|---|
| Sum | field:sum | Total sum |
| Average | field:avg | Average value |
| Count | field:count | Count records |
| Max | field:max | Maximum value |
| Min | field:min | Minimum value |
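Several aggregators can be combined in one call. A minimal sketch using the same sale.order.line fields as the earlier examples:

```python
# Per product: total quantity sold, average unit price, largest single line
stats = self.env['sale.order.line'].read_group(
    domain=[('order_id.state', '=', 'done')],
    fields=[
        'product_uom_qty:sum',  # total quantity sold
        'price_unit:avg',       # average selling price
        'price_total:max',      # largest single order line
    ],
    groupby=['product_id'],
)
for s in stats:
    print(s['product_id'], s['product_uom_qty'], s['price_unit'], s['price_total'])
```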
SQL Direct Queries (For Complex Reports)
When to use: Complex aggregations, multiple joins, custom calculations.
def get_monthly_sales_by_region(self):
    """Complex report: SQL is cleaner and faster."""
    query = """
        SELECT
            c.id as country_id,
            c.name as country,
            DATE_TRUNC('month', so.date_order) as month,
            COUNT(DISTINCT so.id) as order_count,
            SUM(sol.product_uom_qty) as total_qty,
            SUM(sol.price_total) as total_revenue,
            AVG(sol.price_unit) as avg_price
        FROM sale_order so
        JOIN sale_order_line sol ON sol.order_id = so.id
        JOIN res_partner rp ON rp.id = so.partner_id
        JOIN res_country c ON c.id = rp.country_id
        WHERE so.state = 'done'
          AND so.date_order >= DATE_TRUNC('month', CURRENT_DATE - INTERVAL '12 months')
        GROUP BY c.id, c.name, DATE_TRUNC('month', so.date_order)
        ORDER BY month DESC, total_revenue DESC
    """
    self.env.cr.execute(query)
    return self.env.cr.dictfetchall()
Why SQL is Faster
• Database executes natively (no Python overhead)
• Single query vs multiple ORM calls
• Optimizer handles joins efficiently
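One caveat before dropping to raw SQL: the ORM keeps pending writes in memory, so a direct query may not see records created or updated earlier in the same transaction. A hedged sketch; the flush method name depends on your Odoo version (flush_model() since 16, flush() on older releases):

```python
# Push pending ORM writes to the database before reading it directly
self.env['sale.order'].flush_model()       # Odoo 16+; use .flush() on older versions
self.env['sale.order.line'].flush_model()

self.env.cr.execute("SELECT COUNT(*) FROM sale_order WHERE state = %s", ('done',))
confirmed_count = self.env.cr.fetchone()[0]
```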
Security: ALWAYS Parameterize User Input!
# BAD: string interpolation allows SQL injection
query = f"SELECT * FROM sale_order WHERE partner_id = {user_input}"

# GOOD: let psycopg2 escape the value via query parameters
query = "SELECT * FROM sale_order WHERE partner_id = %s"
self.env.cr.execute(query, (user_input,))
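The same rule applies to lists of values, where string-building is most tempting. A minimal sketch (partner_ids stands in for whatever the user selected):

```python
partner_ids = [7, 42, 99]  # hypothetical values coming from user input

# psycopg2 expands a tuple parameter into a SQL list, so IN %s stays safe
query = "SELECT id, name FROM sale_order WHERE partner_id IN %s"
self.env.cr.execute(query, (tuple(partner_ids),))
rows = self.env.cr.dictfetchall()
```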
Real D2C Example: Complete Dashboard Report
Scenario: Executive dashboard with 4 metrics (all need to be instant).
from odoo import models, fields, api
from odoo.tools import ormcache


class SalesDashboard(models.Model):
    _name = 'sales.dashboard'
    _description = 'Sales Dashboard'

    @ormcache()
    def get_dashboard_metrics(self):
        """Get all metrics, cached until the ormcache is cleared."""
        # Metric 1: Total revenue (empty groupby returns a single aggregate row)
        total_rev = self.env['sale.order'].read_group(
            domain=[('state', '=', 'done')],
            fields=['amount_total:sum'],
            groupby=[],
        )[0]['amount_total']

        # Metric 2: Top 5 products (grouped)
        top_products = self.env['sale.order.line'].read_group(
            domain=[('order_id.state', '=', 'done')],
            fields=['product_id', 'price_total:sum'],
            groupby=['product_id'],
            limit=5,
            orderby='price_total DESC',
        )

        # Metric 3: Revenue by salesman (per-group count is returned automatically)
        sales_by_rep = self.env['sale.order'].read_group(
            domain=[('state', '=', 'done')],
            fields=['user_id', 'amount_total:sum'],
            groupby=['user_id'],
            orderby='amount_total DESC',
        )

        # Metric 4: Monthly trend (SQL for date formatting)
        monthly_trend = self._get_monthly_trend()

        return {
            'total_revenue': total_rev,
            'top_products': [
                {
                    'product': p['product_id'][1],
                    'revenue': p['price_total'],
                }
                for p in top_products
            ],
            'sales_by_rep': [
                {
                    'salesman': r['user_id'][1],
                    'revenue': r['amount_total'],
                    'orders': r['user_id_count'],
                }
                for r in sales_by_rep
            ],
            'monthly_trend': monthly_trend,
        }
    def _get_monthly_trend(self):
        """Get monthly revenue trend."""
        query = """
            SELECT
                DATE_TRUNC('month', so.date_order)::date as month,
                SUM(so.amount_total) as revenue
            FROM sale_order so
            WHERE so.state = 'done'
              AND so.date_order >= CURRENT_DATE - INTERVAL '12 months'
            GROUP BY DATE_TRUNC('month', so.date_order)
            ORDER BY month DESC
        """
        self.env.cr.execute(query)
        return self.env.cr.dictfetchall()
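Because the metrics sit in ormcache, they go stale when new orders are confirmed, so clear the cache whenever the underlying data changes. A minimal usage sketch from an Odoo shell; the cache-clearing call is version dependent (clear_caches() on models in older releases, env.registry.clear_cache() in 17+):

```python
# First call computes and caches; later calls return the cached dict
metrics = env['sales.dashboard'].get_dashboard_metrics()
print(metrics['total_revenue'])

# After new orders are confirmed, drop the cache so the next call recomputes
env['sales.dashboard'].clear_caches()  # env.registry.clear_cache() on Odoo 17+
```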
Materialized Views (For Recurring Reports)
Problem: Same report runs daily. Why recalculate every time?
Solution: Materialized view (snapshot updated once daily)
-- Create materialized view (cached report)
CREATE MATERIALIZED VIEW mv_sales_report AS
SELECT
    DATE_TRUNC('month', so.date_order)::date as month,
    rp.country_id,
    c.name as country,
    COUNT(DISTINCT so.id) as order_count,
    SUM(sol.price_total) as total_revenue
FROM sale_order so
JOIN sale_order_line sol ON sol.order_id = so.id
JOIN res_partner rp ON rp.id = so.partner_id
JOIN res_country c ON c.id = rp.country_id
WHERE so.state = 'done'
GROUP BY DATE_TRUNC('month', so.date_order), rp.country_id, c.name;
-- Unique index: required by REFRESH ... CONCURRENTLY
CREATE UNIQUE INDEX idx_mv_sales_unique ON mv_sales_report(month, country_id);
-- Index for faster queries
CREATE INDEX idx_mv_sales_month ON mv_sales_report(month DESC);
Refresh Daily
# In cron job or daily script
psql odoo_db -c "REFRESH MATERIALIZED VIEW CONCURRENTLY mv_sales_report;"
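If you would rather keep the refresh inside Odoo instead of a system cron, a daily scheduled action can run the same statement. A minimal sketch of the model method a hypothetical ir.cron record would call:

```python
def refresh_sales_report_view(self):
    """Daily ir.cron hook: rebuild the cached report table."""
    # Plain REFRESH runs fine inside Odoo's transaction; the CONCURRENTLY
    # variant shown above generally has to run outside a transaction block,
    # which is why the psql/system-cron route is simpler for it.
    self.env.cr.execute("REFRESH MATERIALIZED VIEW mv_sales_report")
```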
Query Materialized View (Instant)
def get_sales_by_country_and_month(self):
    """Query materialized view (instant)."""
    query = """
        SELECT month, country, order_count, total_revenue
        FROM mv_sales_report
        ORDER BY month DESC, total_revenue DESC
    """
    self.env.cr.execute(query)
    return self.env.cr.dictfetchall()
10 seconds → 0.1 seconds
100x faster
Only refreshed once daily (2-3 seconds cost)
Your Action Items
For Simple Reports
❏ Use read_group() instead of loops
❏ Add caching with @ormcache
❏ Clear cached results when source data changes
For Complex Reports
❏ Write SQL directly (parameterize inputs!)
❏ Use indexes on groupby fields
❏ Test query speed with EXPLAIN ANALYZE (see the sketch after this list)
For Recurring Reports
❏ Create materialized views
❏ Refresh daily or weekly
❏ Query views instead of raw tables
❏ Monitor view size
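For the EXPLAIN ANALYZE item above, you can run the check from an Odoo shell by prepending it to the query; a minimal sketch against the salesman aggregation used earlier:

```python
# EXPLAIN ANALYZE returns the real plan and timings for the aggregation
env.cr.execute("""
    EXPLAIN ANALYZE
    SELECT user_id, SUM(amount_total)
    FROM sale_order
    WHERE state = 'done'
    GROUP BY user_id
""")
for (plan_line,) in env.cr.fetchall():
    print(plan_line)  # a "Seq Scan" on a large table suggests a missing index
```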
Free Reporting Performance Audit
Stop waiting for reports to load. Most D2C brands have 5-10 reports that time out. Optimizing them is typically 4-6 hours of work and delivers 10-100x speedups. We'll profile your slowest reports, identify missing indexes, choose the right approach for each query (read_group vs raw SQL), implement caching, build materialized views where needed, and measure the improvements.
