Testing a new rate-limiting service – feedback welcome

Hey all,

I’m building a project called Rately. It’s a rate-limiting service that runs on Cloudflare Workers (so at the edge, close to your clients).

The idea is simple: instead of only limiting by IP, you can set rules based on your own data — things like:

- URL params (/users/:id/posts → limit per user ID)

- Query params (?api_key=123 → limit per API key)

- Headers (X-Org-ID, Authorization, etc.)

Example:

Say your API has an endpoint /users/42/posts. With Rately you can tell it: “apply a limit of 100 requests/min per userId”.

So user 42 and user 99 each get their own bucket automatically. No custom nginx or middleware needed.
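Roughly, the idea looks like this in TypeScript — an illustrative sketch only, not Rately's actual API (the pattern matcher, the `FixedWindowLimiter` class, and the fixed-window strategy are all my simplifications):

```typescript
// Extract ":id"-style params from a path pattern,
// e.g. "/users/:id/posts" + "/users/42/posts" -> { id: "42" }
function matchParams(pattern: string, path: string): Record<string, string> | null {
  const p = pattern.split("/").filter(Boolean);
  const s = path.split("/").filter(Boolean);
  if (p.length !== s.length) return null;
  const params: Record<string, string> = {};
  for (let i = 0; i < p.length; i++) {
    if (p[i].startsWith(":")) params[p[i].slice(1)] = s[i];
    else if (p[i] !== s[i]) return null;
  }
  return params;
}

// Fixed-window counter: allow up to `limit` requests per key per `windowMs`.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();
  constructor(private limit: number, private windowMs: number) {}

  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // New key, or the previous window expired: start a fresh window.
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    entry.count++;
    return entry.count <= this.limit;
  }
}

// Each user ID gets its own bucket automatically:
const limiter = new FixedWindowLimiter(100, 60_000); // 100 req/min
const params = matchParams("/users/:id/posts", "/users/42/posts");
if (params) {
  limiter.allow(`userId:${params.id}`); // user 42's bucket
  limiter.allow("userId:99");           // user 99's bucket, counted separately
}
```

The point is just that the limit key is derived from your own data (here a path param), so buckets appear per user without any extra wiring on your side.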

It has two working modes:

- Proxy mode – you point your API domain (CNAME) to Rately. Requests come in, Rately enforces your limits, then forwards to your origin. Easiest drop-in.

```
Client ---> Rately (enforce limits) ---> Origin API
```

- Control plane mode – you keep running your own API as usual, but your code or middleware can call Rately’s API to ask “is this request allowed?” before handling it. Gives you more flexibility without routing all traffic through Rately.

```
Client ---> Your API ---> Rately /check (allow/deny) ---> Your API logic
```
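In code, control plane mode looks roughly like this. An in-memory stub stands in for the real HTTP call to the /check endpoint; the endpoint name, response shape, and limit values here are illustrative assumptions, not the actual API:

```typescript
type CheckResult = { allowed: boolean; remaining: number };

const windowMs = 60_000; // 1-minute window (illustrative)
const limit = 100;       // 100 requests per window (illustrative)
const buckets = new Map<string, { windowStart: number; count: number }>();

// Stand-in for what would really be something like:
//   await fetch("https://<rately-host>/check", { ... })
function ratelyCheck(key: string, now: number = Date.now()): CheckResult {
  let b = buckets.get(key);
  if (!b || now - b.windowStart >= windowMs) {
    b = { windowStart: now, count: 0 };
    buckets.set(key, b);
  }
  b.count++;
  return { allowed: b.count <= limit, remaining: Math.max(0, limit - b.count) };
}

// Middleware-style usage inside your own API handler:
function handleRequest(userId: string): { status: number; body: string } {
  const { allowed } = ratelyCheck(`userId:${userId}`);
  if (!allowed) return { status: 429, body: "Too Many Requests" };
  return { status: 200, body: "ok" }; // your API logic runs only when allowed
}
```

Your service stays in the request path and keeps full control; it just consults the check before doing any real work.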

I’m looking for a few developers with APIs who want to test it out. I’ll help with setup.

3 points | by 0megion 1 day ago

3 comments

  • nik736 22 hours ago
    Rails 8 introduced built-in rate limiting, similar to what you described. Since it's already built in, I have no use for your service, but good luck!
    • 0megion 19 hours ago
      It's already too late if requests are hitting your service. What happens when a customer suddenly sends 100x the traffic? Your Rails backend has to scale, or crash, just to handle the rate limiting.
      • nik736 19 hours ago
        For me it's not a big deal: my app autoscales and rate-limit details are stored in Redis, so it's super fast. For 100x the traffic it wouldn't even need to scale, since it's simply hitting Redis.
        • 0megion 1 hour ago
          Interesting, but your Rails app still needs to run the rate-limit logic against Redis. Your DB may be protected, but you still receive the traffic through your network, meaning more ingress cost, along with CPU to process all the incoming requests.
  • galaxy_gas 1 day ago
    I can't see myself ever using this unless it's self-hosted as a library with no phone-home. A remote internet API call on every request when I'm getting millions of rps is intolerable.
    • 0megion 1 day ago
      That's fair. Maybe Proxy mode would be the way to go in this case? Instead of calling another API, you pass your requests through Rately, and only the requests that pass the limits reach your origin.
      • galaxy_gas 1 day ago
        CF Workers is pay-per-request, isn't it? This seems impossibly expensive to proxy.
        • 0megion 23 hours ago
          Correct, pricing would be based on traffic volume.
  • nidssc 1 day ago
    [dead]