
# Finding Slow Queries in PostgreSQL (Without Guessing)
Here's the quantitative method used by DBAs and by tools like pganalyze and AWS Performance Insights.

Connect to your database and create the extension:

```sql
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
```

Then tell PostgreSQL to load it at startup. The easiest way is with `ALTER SYSTEM` (no need to edit config files):

```sql
ALTER SYSTEM SET shared_preload_libraries = 'pg_stat_statements';
```

Now restart PostgreSQL. If you're using Docker:

```shell
docker restart <your-pg-container-name>
```

The extension now tracks every query, grouping structurally similar ones together.

## Find the Queries That Cost the Most

After the restart, run some queries so the extension can collect data. Then run:

```sql
SELECT
    query,
    calls,
    round(total_exec_time::numeric, 2) AS total_ms,
    round(mean_exec_time::numeric, 2) AS avg_ms,
    round((100 * total_exec_time::numeric
           / sum(total_exec_time::numeric) OVER ())::numeric, 2) AS pct_of_total
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 10;
```

The `pct_of_total` column shows what share of the database's total execution time each statement is responsible for, so the biggest potential wins sit at the top of the list.
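If you want to confirm the restart actually picked up the library, and start measuring from a clean slate, a quick sanity check looks like this (both are standard PostgreSQL commands; `pg_stat_statements_reset()` requires appropriate privileges):

```sql
-- Verify the library is listed after the restart
SHOW shared_preload_libraries;

-- Optionally clear previously collected statistics so the top-10 query
-- reflects only fresh traffic
SELECT pg_stat_statements_reset();
```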
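To see what the window function in that query is doing, here is the same `pct_of_total` arithmetic reproduced in plain Python with made-up numbers (the query strings and timings below are illustrative, not real `pg_stat_statements` output):

```python
# Each tuple mimics one pg_stat_statements row:
# (query, calls, total_exec_time in milliseconds) -- sample data, not real stats
rows = [
    ("SELECT * FROM orders WHERE status = $1", 1200, 9000.0),
    ("UPDATE users SET last_seen = $1",         300,  750.0),
    ("SELECT count(*) FROM events",              50,  250.0),
]

# SUM(total_exec_time) OVER () -- the grand total across all statements
grand_total = sum(total for _, _, total in rows)

for query, calls, total in rows:
    avg_ms = round(total / calls, 2)           # mean_exec_time
    pct = round(100 * total / grand_total, 2)  # pct_of_total
    print(f"{pct:6.2f}%  {avg_ms:8.2f} ms avg  {query}")
# The first row prints 90.00% -- one statement dominating total time,
# which is exactly the pattern this report is designed to surface.
```

A query can rank high here either by being slow per call (high `avg_ms`) or by being cheap but extremely frequent (high `calls`); sorting by `total_exec_time` catches both.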




