Four Bugs We Found in Our Node.js Rate Limiter (And How We Fixed Them)

By Bill Tu, via Dev.to

We recently shipped `node-rate-limiter-pro`, a high-performance rate limiter for Node.js with Token Bucket, Sliding Window, and Redis support. It benchmarks at ~2M ops/sec in memory, roughly 10x faster than express-rate-limit. Then we did a proper code review.

We found four bugs. None of them were obvious at first glance, and all of them are the kind that only shows up in production under real load. This post walks through each one: what went wrong, why it matters, and the fix.

Bug #1: The Middleware That Swallowed Errors

The problem: our Express middleware called `await this.consume(key)` without a try/catch.

```js
// Before (broken)
return async (req, res, next) => {
  const key = keyFn(req);
  const result = await this.consume(key); // 💥 if Redis is down, this throws
  res.setHeader('X-RateLimit-Limit', result.limit);
  // ...
  next();
};
```

If you're using the in-memory store, this will never bite you: the memory store can't fail. But the moment you plug in Redis and the connection drops, `consume()` throws.
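The fix itself is cut off above, but the obvious shape of it is to wrap `consume()` in a try/catch and decide what to do when the store is unreachable. A minimal sketch of that guard follows; `rateLimitMiddleware`, the standalone `limiter` argument, and the `result.allowed` field are illustrative assumptions, not necessarily this library's actual API.

```javascript
// Sketch of a fail-open guard around consume(). If the backing store
// (e.g. Redis) is unreachable, the request is let through instead of
// every call turning into a 500.
function rateLimitMiddleware(limiter, keyFn) {
  return async (req, res, next) => {
    const key = keyFn(req);
    let result;
    try {
      result = await limiter.consume(key); // may throw when the Redis store is down
    } catch (err) {
      // Fail open: serving unthrottled traffic beats rejecting all of it.
      return next();
    }
    res.setHeader('X-RateLimit-Limit', result.limit);
    if (result.allowed === false) { // `allowed` is an assumed result field
      res.statusCode = 429;
      return res.end('Too Many Requests');
    }
    next();
  };
}
```

Whether to fail open (let requests through) or fail closed (reject them) when the store is down is a policy choice; for most APIs, fail open with an alert on the error is the safer default.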

Continue reading on Dev.to
