
# Monitor Page Load Times Across Any URL Fleet Without Running a Browser
Every performance monitoring tool I've tried has the same problem: it runs inside your infrastructure. Lighthouse needs Node. WebPageTest needs a server. Puppeteer needs Chrome running somewhere. For a quick external check of "is my site actually fast for real users?", they're all overkill. I wanted something dead simple: give it a list of URLs, get back load times, detect regressions. Here's what I built — and how you can replicate it in under 50 lines.

## The approach

Instead of running a headless browser, I use a screenshot API that already has a Puppeteer cluster running in the cloud. The API's `load_time_ms` field captures real browser load time — navigation start to load event — which is exactly what matters for Core Web Vitals (CWV).

The endpoint I'm using is SnapAPI's `/v1/analyze`:

```bash
curl "https://snapapi.tech/v1/analyze?url=https://example.com" \
  -H "X-API-Key: YOUR_KEY"
```

Response includes:

```json
{
  "url": "https://example.com",
  "title": "Example Domain",
  "load_time_ms": 83,
  "page_type
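Here's a minimal sketch of the monitor loop described above, in Python. It assumes the `/v1/analyze` endpoint and `load_time_ms` field shown in the response; the `find_regressions` helper, the `REGRESSION_FACTOR` threshold, and the baseline format are my own illustrative choices, not part of the API.

```python
"""Fleet load-time monitor: a minimal sketch.

Assumes SnapAPI's /v1/analyze endpoint as described above.
The regression threshold and helpers are illustrative.
"""
import json
import urllib.parse
import urllib.request

API = "https://snapapi.tech/v1/analyze"
REGRESSION_FACTOR = 1.5  # assumed: flag URLs 50% slower than their baseline


def analyze(url: str, api_key: str) -> dict:
    """Fetch load metrics for one URL via the screenshot API."""
    req = urllib.request.Request(
        f"{API}?url={urllib.parse.quote(url, safe='')}",
        headers={"X-API-Key": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def find_regressions(current: dict, baseline: dict) -> list:
    """Compare current load_time_ms per URL against a stored baseline.

    Returns (url, baseline_ms, current_ms) tuples for flagged URLs.
    URLs with no baseline yet are skipped, not flagged.
    """
    flagged = []
    for url, ms in current.items():
        base = baseline.get(url)
        if base is not None and ms > base * REGRESSION_FACTOR:
            flagged.append((url, base, ms))
    return flagged
```

A run over a fleet would then look something like `current = {u: analyze(u, key)["load_time_ms"] for u in urls}`, persisting `current` (e.g. to a JSON file) as the next run's baseline before calling `find_regressions(current, previous)`.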

