
Next.js job board: reliable scrapes with pg locks
TL;DR:

- I stopped duplicate cron runs with Postgres advisory locks.
- I moved “job already exists” into one SQL upsert.
- I rate-limit per source, not globally.
- I keep Next.js out of the scraping path entirely.

**Context**

I’m building a job board for Psychiatric Mental Health Nurse Practitioners: Next.js 14 on Vercel, with Supabase (Postgres) behind it. The board has 8,000+ active listings across 2,000+ companies, and I scrape 200+ jobs daily from multiple sources.

My first version was naive. A Vercel Cron hit an API route. The route scraped, then wrote rows. It worked… until it didn’t. One day I had 3 copies of the same job. Another day I missed an entire source. Brutal.

The core issue wasn’t “scraping is hard”. It was “cron is not a single-threaded program”. Retries happen. Overlaps happen. Two regions happen. And “check-then-insert” is a race. So I rebuilt the pipeline around the database: Postgres decides what runs, and Postgres decides what’s new.

**1) I don’t trust cron. I make Postgres gate it.**

Vercel Cron i
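The advisory-lock gate from the summary can be sketched in plain SQL. This is a minimal illustration, not the post’s exact code: the lock key `884201` is an arbitrary constant I’ve made up to identify the scrape job, and session-level locks assume the worker holds one connection for the whole run.

```sql
-- Minimal sketch: gate a cron run on a session-level advisory lock.
-- 884201 is a hypothetical app-chosen key identifying "the scrape job".
SELECT pg_try_advisory_lock(884201) AS acquired;
-- pg_try_advisory_lock never blocks: it returns true if this session
-- now holds the lock, false if another run already holds it.
-- If acquired = false, exit immediately; an overlapping run is active.

-- ... do the scrape on this same connection ...

-- Release when done (locks are also released if the session dies,
-- which is what makes this safe against crashed runs).
SELECT pg_advisory_unlock(884201);
```

The key property is that the lock lives in Postgres, not in the app: a retried or duplicated cron invocation in another region hits the same lock and backs off.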
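The “job already exists” check collapses into one atomic statement the same way. A sketch, assuming a hypothetical `jobs` table with a unique index on `(source, external_id)` — the table and column names are mine, not from the post:

```sql
-- One upsert instead of check-then-insert (which races under concurrency).
-- Requires: CREATE UNIQUE INDEX ON jobs (source, external_id);
INSERT INTO jobs (source, external_id, title, company, url, last_seen_at)
VALUES ($1, $2, $3, $4, $5, now())
ON CONFLICT (source, external_id)
DO UPDATE SET
  title        = EXCLUDED.title,
  url          = EXCLUDED.url,
  last_seen_at = now();
```

Postgres resolves new-vs-existing inside the statement, so two concurrent writers can both run this safely: one inserts, the other updates, and you never get the “3 copies of the same job” failure mode.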
Continue reading on Dev.to Webdev


