If your Magento store is slow, the database is almost always the first place to look. A single unoptimized query running on every page load can add hundreds of milliseconds to your response time — and on a store with thousands of products, slow queries are almost inevitable.
Here's the complete playbook.
Step 1: Enable the slow query log
MySQL and MariaDB have a built-in slow query log. Enable it on your server:
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;
SET GLOBAL slow_query_log_file = '/var/log/mysql/slow-queries.log';
SET GLOBAL log_queries_not_using_indexes = 'ON';
Set long_query_time to 1 second as a starting threshold. Once you've fixed the worst offenders, lower it to 0.5 or even 0.1.
For permanent configuration, add to my.cnf:
[mysqld]
slow_query_log = 1
slow_query_log_file = /var/log/mysql/slow-queries.log
long_query_time = 1
log_queries_not_using_indexes = 1
Step 2: Analyze the slow query log
After collecting a few hours of data, use mysqldumpslow to summarize:
mysqldumpslow -s t -t 20 /var/log/mysql/slow-queries.log
This shows your 20 slowest queries sorted by total execution time. You'll typically find a handful of queries responsible for 80%+ of your database load.
Step 3: Run EXPLAIN on slow queries
For each slow query, run EXPLAIN to understand what MySQL is doing:
EXPLAIN SELECT e.entity_id, a1.value AS name
FROM catalog_product_entity e
LEFT JOIN catalog_product_entity_varchar a1
ON e.entity_id = a1.entity_id AND a1.attribute_id = 73
WHERE e.entity_id IN (1, 2, 3, 4, 5);
Key things to look for in the output:
| Column | Bad sign | Good sign |
|---|---|---|
| type | ALL (full table scan) | ref, eq_ref, const |
| key | NULL (no index used) | An index name |
| rows | High number | Low number |
| Extra | Using filesort, Using temporary | Using index |
A type: ALL with rows: 500000 means MySQL is scanning your entire products table for every request. That's a critical fix.
Step 4: Add missing indexes
Once you've identified a query doing a full table scan, check if an index would help:
-- Check existing indexes
SHOW INDEX FROM catalog_product_entity_varchar;
-- Add an index if missing
ALTER TABLE catalog_product_entity_varchar
ADD INDEX idx_entity_attribute (entity_id, attribute_id);
Warning: Adding indexes on large tables takes time and can lock the table. On MySQL 5.7+, use online DDL — ALTER TABLE ... ALGORITHM=INPLACE, LOCK=NONE — or run the change during a maintenance window.
Step 5: Common Magento slow query patterns
EAV overload — too many attribute joins:
Magento's EAV architecture means product data is spread across multiple tables. A single product load can generate 20–30 joins. The fix: use flat catalog tables or selectively index frequently-accessed attributes. (Note that flat catalog is deprecated as of Magento 2.3, so on newer versions prefer attribute-level fixes.)
bin/magento indexer:reindex catalog_product_flat
bin/magento config:set catalog/frontend/flat_catalog_product 1
Unindexed product collection filters:
Custom collection filters on non-indexed attributes cause full scans. Before filtering on an attribute, check that it's marked searchable ("Use in Search") or backed by a proper index.
ORDER BY on non-indexed columns:
Sorting on columns without indexes causes Using filesort — MySQL sorts in memory or on disk instead of using an index.
-- Add a composite index for common sort patterns
ALTER TABLE sales_order ADD INDEX idx_created_status (created_at, status);
N+1 queries in loops:
// Bad: 1 query per product
foreach ($products as $product) {
    $category = $product->getCategory(); // new query each iteration
}

// Good: load all at once (getListByProductIds is an illustrative batch-load method)
$productIds = $products->getAllIds();
$categories = $categoryRepository->getListByProductIds($productIds);
Step 6: Query result caching
Some queries are inherently expensive but don't change often — complex reports, faceted navigation counts, bestseller calculations. Cache their results:
$cacheKey = 'category_product_count_' . $categoryId;
$cached = $this->cache->load($cacheKey);
if ($cached !== false) {
    $result = unserialize($cached);
} else {
    $result = $this->runExpensiveQuery($categoryId);
    $this->cache->save(serialize($result), $cacheKey, ['category_' . $categoryId], 3600);
}
Automating the process
Manual slow query analysis works but doesn't scale. The BetterMagento Query Optimizer automates the entire workflow:
- Continuously monitors your slow query log
- Runs EXPLAIN analysis automatically
- Suggests specific indexes with the exact ALTER TABLE statements to run
- Tracks query performance over time
- Alerts you when new slow queries appear
For most stores, it finds 5–15 high-impact indexes that can be added in under an hour — often cutting total database query time by 40–60%.
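The EXPLAIN-triage part of such a workflow is straightforward to prototype yourself. A minimal sketch, assuming you've captured EXPLAIN FORMAT=JSON output for each candidate query (the keys below — access_type, rows_examined_per_scan — are standard MySQL 5.7+ JSON-explain fields; the row threshold is an arbitrary illustration, and this is not the BetterMagento tool itself):

```python
import json

def flag_explain_problems(explain_json, row_threshold=10000):
    """Walk a MySQL `EXPLAIN FORMAT=JSON` document and flag
    full table scans and large row estimates."""
    problems = []

    def walk(node):
        if isinstance(node, dict):
            if "table" in node and isinstance(node["table"], dict):
                t = node["table"]
                name = t.get("table_name", "?")
                if t.get("access_type") == "ALL":
                    problems.append(f"{name}: full table scan (access_type=ALL)")
                if t.get("rows_examined_per_scan", 0) > row_threshold:
                    problems.append(f"{name}: examines ~{t['rows_examined_per_scan']} rows")
            for v in node.values():
                walk(v)
        elif isinstance(node, list):
            for v in node:
                walk(v)

    walk(json.loads(explain_json))
    return problems
```

Run it over saved plans nightly and you have the skeleton of an alerting loop: any new "full table scan" line is a candidate for Step 4's index treatment.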
Summary checklist
- [ ] Enable slow query log (threshold: 1s)
- [ ] Analyze with mysqldumpslow weekly
- [ ] Run EXPLAIN on top 10 slow queries
- [ ] Add missing indexes
- [ ] Enable flat catalog tables
- [ ] Cache results of expensive but stable queries
- [ ] Set up automated monitoring
Originally published on magevanta.com