
RLAAS (Rate Limiting As A Service): Rate Limiting Across Modern Systems
RLAAS (Rate Limiting As A Service) is an open-source, policy-driven platform for controlling HTTP traffic, logs, spans, metrics, and events with one reusable engine.

The Problem Nobody Talks About

Every engineering team knows it needs rate limiting. But most solutions only protect one layer: the API gateway. What happens to everything else? Here are the real pain points I kept running into:

- Log floods: A bug sends millions of error logs to your observability stack. Costs spike. Dashboards break. On-call engineers drown in noise.
- Metric storms: A chatty service emits 50x its normal Datadog metrics during a deployment. Your bill triples overnight.
- Kafka cascades: A slow consumer falls behind. Retries pile up. One service takes down the entire event pipeline.
- Sidecar blind spots: Traffic between services inside a mesh never hits your gateway. Nothing enforces limits there.
- Copy-paste rate limiting: Every team reimplements throttling logic in its own service, with its own bugs.
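To make the "copy-paste rate limiting" problem concrete, here is the kind of throttling logic teams typically hand-roll per service. This is a generic token-bucket sketch for illustration, not RLAAS code; the `TokenBucket` class and its parameters are hypothetical names chosen for the example.

```python
import time

class TokenBucket:
    """Minimal token-bucket limiter: refills `rate` tokens/sec, bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # steady-state tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity    # start full so an idle service can burst
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        """Return True and consume `cost` tokens if the request fits the budget."""
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False
```

Every copy of logic like this drifts independently (clock handling, burst sizing, distributed state), which is exactly the duplication a shared policy engine is meant to remove.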
Continue reading on Dev.to



