Is using ioredis in proxy.ts supported for rate limiting? (self-hosted, Node.js runtime, Next.js 16) #91716
Hi! With Next.js 16, I'm self-hosting on a single instance (not serverless) and want to implement rate limiting in `proxy.ts` using my own Redis:

```ts
// lib/redis.ts
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL!);

export default redis;
```

```ts
// proxy.ts
import redis from '@/lib/redis';
import { NextRequest, NextResponse } from 'next/server';

export async function proxy(request: NextRequest) {
  // Take the first hop of X-Forwarded-For as the client IP
  const ip = request.headers.get('x-forwarded-for')?.split(',')[0] ?? '127.0.0.1';
  // One key per IP per 1-second window
  const key = `rl:${ip}:${Math.floor(Date.now() / 1000)}`;
  const count = await redis.incr(key);
  if (count === 1) await redis.expire(key, 2);
  if (count > 10) {
    return NextResponse.json({ error: 'Too many requests' }, { status: 429 });
  }
  return NextResponse.next();
}
```

However, the docs state:

My questions:

Environment:

Thanks!
I added that because, if you import a singleton in Proxy, it won't be the same singleton when imported in a render pass or a Route Handler. You can, however, put it on `globalThis`, but that's mutable shared state across requests, which might be fine for a Redis connection; if you stash user data there, though, that's a big no-no 🙅

I'm not sure we're talking about the same thing. Do you mean that the next time the proxy runs, you want to reuse the connection, or query the same pool for a connection? Proxy is meant to run only on Node.js AFAIK, so that bit should be fine.
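A minimal sketch of the `globalThis` caching pattern mentioned above. The `getRedis` helper name is hypothetical, and a plain in-memory object stands in for the real `new Redis(process.env.REDIS_URL!)` so the sketch is self-contained:

```ts
// Hypothetical sketch: cache a connection-like singleton on globalThis so that
// separate module evaluations (proxy vs. route handlers) reuse one instance.
type FakeRedis = { id: number; incr: (key: string) => number };

const store = new Map<string, number>();

function createClient(): FakeRedis {
  // Stand-in for `new Redis(process.env.REDIS_URL!)` from ioredis.
  return {
    id: Math.random(),
    incr: (key) => {
      const next = (store.get(key) ?? 0) + 1;
      store.set(key, next);
      return next;
    },
  };
}

// Stash the client on globalThis under a well-known property.
const g = globalThis as typeof globalThis & { __redis?: FakeRedis };

export function getRedis(): FakeRedis {
  if (!g.__redis) g.__redis = createClient();
  return g.__redis;
}
```

Every caller of `getRedis()` gets the same instance even if the module itself is evaluated more than once, which is exactly why it's fine for a connection but dangerous for per-user data.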
To add to the previous answer: yes, using ioredis in `proxy.ts` works when you self-host on the Node.js runtime.

### Connection Singleton

The warning about shared modules means that a module imported in `proxy.ts` won't be the same instance as one imported in a render pass or a Route Handler. On a single self-hosted instance that's acceptable: you just end up with one connection per context. A slightly hardened client:

```ts
// lib/redis.ts
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL!, {
  maxRetriesPerRequest: 3, // fail fast instead of queueing commands forever
  lazyConnect: true,       // don't open the socket at import time
});

export default redis;
```

### The Rate Limiting Pattern

Your implementation is solid. One improvement: use a Lua script or a `MULTI` pipeline so the `INCR` and `EXPIRE` are sent together instead of as two round trips:

```ts
// proxy.ts
import redis from "@/lib/redis";
import { NextRequest, NextResponse } from "next/server";

const WINDOW_SEC = 2;
const MAX_REQUESTS = 10;

export async function proxy(request: NextRequest) {
  const ip = request.headers.get("x-forwarded-for")?.split(",")[0] ?? "127.0.0.1";
  // One key per IP per fixed window
  const key = `rl:${ip}:${Math.floor(Date.now() / (WINDOW_SEC * 1000))}`;
  const results = await redis
    .multi()
    .incr(key)
    .expire(key, WINDOW_SEC + 1)
    .exec();
  // exec() resolves to [error, result] pairs; [0][1] is the INCR result
  const current = results?.[0]?.[1] as number;
  if (current > MAX_REQUESTS) {
    return NextResponse.json({ error: "Too many requests" }, { status: 429 });
  }
  return NextResponse.next();
}
```

### Should You Do This in proxy.ts or Nginx?

For a self-hosted single instance, either works. The advantage of `proxy.ts` is that the limiter lives in your application code, so it can share configuration with the rest of the app and ships with every deploy; Nginx's `limit_req` keeps the work out of Node entirely and rejects requests before they ever reach it.

Regarding future stability: Proxy is meant to run on the Node.js runtime when self-hosting (as noted above), so a TCP client like ioredis should keep working there.
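The fixed-window arithmetic in the snippet above can be exercised without a Redis server. A minimal in-memory sketch of the same counter (the `allowRequest` helper name is hypothetical; a `Map` stands in for Redis, so keys never expire here the way `EXPIRE` would clean them up):

```ts
// Fixed-window rate limiter: same key scheme as the Redis version,
// `rl:<ip>:<window>`, with an in-memory Map standing in for Redis.
const WINDOW_SEC = 2;
const MAX_REQUESTS = 10;

const counters = new Map<string, number>();

export function allowRequest(ip: string, nowMs: number = Date.now()): boolean {
  // Same bucketing as Math.floor(Date.now() / (WINDOW_SEC * 1000)) above
  const window = Math.floor(nowMs / (WINDOW_SEC * 1000));
  const key = `rl:${ip}:${window}`;
  const count = (counters.get(key) ?? 0) + 1; // the INCR step
  counters.set(key, count);
  return count <= MAX_REQUESTS;
}
```

The 11th request inside one window is rejected, while a timestamp that falls into the next window starts a fresh counter; in Redis the old key simply expires instead of lingering in the `Map`.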