Real Integrations
Redis as a Cache Layer
Applications often sit in front of slow data sources (SQL databases, external APIs). Redis acts as a caching layer that keeps frequently accessed data in memory, so repeat reads skip the slow source.
Client → Application → Redis Cache → (fallback) Database
Node.js Implementation
import express from "express";
import Redis from "ioredis";
import fetch from "node-fetch";
const app = express();
const redis = new Redis();
// GET /user/123
app.get("/user/:id", async (req, res) => {
  const userId = req.params.id;

  // 1. Check cache
  const cached = await redis.get(`user:${userId}`);
  if (cached) {
    return res.json({ source: "redis-cache", data: JSON.parse(cached) });
  }

  // 2. Fetch from external DB/API
  const response = await fetch(
    `https://jsonplaceholder.typicode.com/users/${userId}`
  );
  const userData = await response.json();

  // 3. Save to Redis for 60 sec
  await redis.setex(`user:${userId}`, 60, JSON.stringify(userData));

  res.json({ source: "api", data: userData });
});
app.listen(3000, () => console.log("Server running"));
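The handler above hard-codes the check-cache / fetch / populate steps; the same flow can be factored into a reusable cache-aside helper. A minimal sketch, where an in-memory Map and a `fetchUser` stub (both illustrative, not from the original) stand in for Redis and the real API:

```javascript
// Cache-aside helper: check the cache first, fall back to the origin, then populate the cache.
async function cacheAside(store, key, ttlSeconds, fetchOrigin) {
  const cached = await store.get(key);
  if (cached != null) {
    return { source: "cache", data: JSON.parse(cached) };
  }
  const data = await fetchOrigin();
  // With ioredis this would be: redis.setex(key, ttlSeconds, JSON.stringify(data))
  await store.set(key, JSON.stringify(data), ttlSeconds);
  return { source: "origin", data };
}

// Demo with an in-memory Map standing in for Redis (TTL ignored for brevity)
(async () => {
  const memory = new Map();
  const store = {
    get: async (k) => memory.get(k),
    set: async (k, v) => memory.set(k, v),
  };
  const fetchUser = async () => ({ id: 42, name: "Ada" }); // stands in for the real API
  console.log((await cacheAside(store, "user:42", 60, fetchUser)).source); // "origin"
  console.log((await cacheAside(store, "user:42", 60, fetchUser)).source); // "cache"
})();
```

Injecting the store also makes the logic easy to unit-test without a running Redis instance.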
Redis for Session Storage
import session from "express-session";
import connectRedis from "connect-redis";
import Redis from "ioredis";

// connect-redis v6 API; v7+ exports the store directly: import RedisStore from "connect-redis"
const RedisStore = connectRedis(session);
const redis = new Redis();

app.use(
  session({
    store: new RedisStore({ client: redis }),
    secret: "mySecret", // load from an environment variable in production
    resave: false,
    saveUninitialized: false,
    cookie: { maxAge: 60000 }, // 60 seconds
  })
);
- Sessions are stored inside Redis.
- Any server instance can access the same session.
- Helps with load-balanced architectures.
Redis for Message Queues (Using Pub/Sub or Streams)
Redis can act as a lightweight message broker between microservices.
Publisher
import Redis from "ioredis";

const pub = new Redis();

// Pub/Sub is fire-and-forget: only subscribers connected right now receive this
pub.publish("orders", "Order #5001 created");
Subscriber
import Redis from "ioredis";

const sub = new Redis();

// While subscribed, this connection can only run subscription commands —
// hence the separate publisher connection
sub.subscribe("orders");
sub.on("message", (channel, message) => {
  console.log(`Received from ${channel}: ${message}`);
});
Explanation
- The publisher sends messages to a channel.
- All subscribers listening to the channel receive the message.
Used in:
- Notification systems
- Realtime logging
- Realtime analytics dashboards
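The fan-out behavior described above can be illustrated without a server. A toy in-memory broker (all names here are illustrative) that mirrors the channel/subscriber relationship:

```javascript
// Minimal in-memory broker mirroring Redis Pub/Sub fan-out:
// every subscriber to a channel receives every published message.
class MiniBroker {
  constructor() {
    this.channels = new Map();
  }
  subscribe(channel, handler) {
    if (!this.channels.has(channel)) this.channels.set(channel, []);
    this.channels.get(channel).push(handler);
  }
  publish(channel, message) {
    const subs = this.channels.get(channel) || [];
    subs.forEach((h) => h(channel, message)); // fire-and-forget: no subscribers, no delivery
    return subs.length; // like Redis PUBLISH, return the receiver count
  }
}

const broker = new MiniBroker();
const received = [];
broker.subscribe("orders", (ch, msg) => received.push(`A got: ${msg}`));
broker.subscribe("orders", (ch, msg) => received.push(`B got: ${msg}`));
broker.publish("orders", "Order #5001 created");
console.log(received); // both subscribers received the same message
```

Note what this model shares with real Pub/Sub: messages published while no one is subscribed are simply lost, which is why durable workloads use Streams instead.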
Redis Streams for Event Processing (Advanced Integration)
Redis Streams are used for:
- Event pipelines
- Background workers
- Ordered message processing
Producer
// "*" tells Redis to auto-generate the entry ID (timestamp-sequence)
await redis.xadd("payments-stream", "*", "user", "42", "amount", "99.00");
Consumer
// ioredis takes raw command arguments (not an options object):
// block up to 5s; "0" reads from the start of the stream
const messages = await redis.xread(
  "BLOCK",
  5000,
  "STREAMS",
  "payments-stream",
  "0"
);
console.log(messages);
- Stream stores messages with IDs.
- Consumers process them in order.
- Supports consumer groups for load balancing.
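To see why stream entries stay ordered, here is a toy in-memory model of XADD/XREAD. The class and ID scheme are simplified illustrations (real Redis IDs are `millisecondsTimestamp-sequence`, and XREAD returns a nested reply per stream):

```javascript
// Toy stream: entries get monotonically increasing IDs;
// readers ask for "everything after ID X".
class MiniStream {
  constructor() {
    this.entries = [];
    this.seq = 0;
  }
  // Like XADD with "*": the stream assigns the next ID itself
  xadd(fields) {
    const id = `${Date.now()}-${this.seq++}`;
    this.entries.push({ id, fields });
    return id;
  }
  // Like XREAD from a given ID: entries added after it, in insertion order
  // ("0" — or any unknown ID — reads from the beginning in this toy)
  xread(afterId) {
    const start = this.entries.findIndex((e) => e.id === afterId) + 1;
    return this.entries.slice(start);
  }
}

const s = new MiniStream();
s.xadd({ user: "42", amount: "99.00" });
s.xadd({ user: "43", amount: "12.50" });
console.log(s.xread("0").map((e) => e.fields.user)); // ["42", "43"] — insertion order preserved
```

Because each entry's ID is strictly greater than the previous one, a consumer can crash, remember the last ID it processed, and resume exactly where it left off.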
Redis for Distributed Locks (Redlock Pattern)
Used to prevent:
- Duplicate payments
- Race condition updates
- Concurrency issues
import Redis from "ioredis";
import { randomUUID } from "crypto";

const redis = new Redis();

async function runCriticalSection() {
  const lockKey = "lock:payment:501";
  const token = randomUUID(); // unique value proves ownership

  // NX: only set if the key is absent; EX 10: auto-expire so a crash can't hold the lock forever
  const lock = await redis.set(lockKey, token, "EX", 10, "NX");
  if (!lock) {
    console.log("Another process is already handling this!");
    return;
  }

  try {
    console.log("Running critical work...");
    // critical logic here...
  } finally {
    // Release only if we still own the lock; production code does this
    // check-and-delete atomically in a Lua script (the Redlock pattern)
    if ((await redis.get(lockKey)) === token) {
      await redis.del(lockKey);
    }
  }
}
- NX ensures the key is set only if it doesn't exist.
- EX 10 auto-expires the lock in case of a crash.
- Only one worker can hold the lock at a time.
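The single-holder guarantee comes entirely from SET ... NX being atomic. A sketch with an in-memory stand-in (the Map, worker names, and helper functions are illustrative, not a real Redis client):

```javascript
// In-memory stand-in for SET key value NX: succeeds only if the key is absent
const keys = new Map();
function setNX(key, value) {
  if (keys.has(key)) return null; // someone else holds the lock
  keys.set(key, value);
  return "OK";
}

// Releasing safely: delete only if the stored token is still yours
function unlock(key, token) {
  if (keys.get(key) === token) {
    keys.delete(key);
    return true;
  }
  return false; // expired or owned by another worker — leave it alone
}

// Two workers race for the same lock; exactly one wins
console.log(setNX("lock:payment:501", "worker-A")); // "OK"
console.log(setNX("lock:payment:501", "worker-B")); // null
console.log(unlock("lock:payment:501", "worker-B")); // false — B never held it
console.log(unlock("lock:payment:501", "worker-A")); // true
```

In real Redis the get-then-delete in `unlock` must run as a single Lua script, otherwise the lock could expire and be re-acquired by another worker between the two commands.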
Redis with Real-time Analytics / Leaderboards
Redis sorted sets (ZSET) are perfect for:
- Game leaderboards
- Ranking systems
- Trending items
Leaderboard
await redis.zadd("game:leaderboard", 1500, "player1");
await redis.zadd("game:leaderboard", 2200, "player2");

// Top 3 by score, highest first (since Redis 6.2, ZRANGE ... REV is the preferred form)
const topPlayers = await redis.zrevrange(
  "game:leaderboard",
  0,
  2,
  "WITHSCORES"
);
console.log(topPlayers);
- Sorted sets keep elements ordered automatically.
- You can fetch the top N users instantly.
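One practical wrinkle: ioredis returns WITHSCORES replies as a flat array alternating member and score as strings, e.g. ["player2", "2200", "player1", "1500"]. A small helper (name is illustrative) to pair them up:

```javascript
// Convert a flat WITHSCORES reply ([member, score, member, score, ...])
// into [member, numericScore] pairs.
function pairScores(flat) {
  const pairs = [];
  for (let i = 0; i < flat.length; i += 2) {
    pairs.push([flat[i], Number(flat[i + 1])]);
  }
  return pairs;
}

console.log(pairScores(["player2", "2200", "player1", "1500"]));
// [["player2", 2200], ["player1", 1500]]
```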
Redis as a Rate Limiter
Used for:
- API rate limits
- Login attempt limits
- Preventing abuse
10 Requests Per Minute Limit
async function isRateLimited(userId) {
  const key = `rate:${userId}`;
  const count = await redis.incr(key);
  if (count === 1) {
    // Start the window on the first request; for strict atomicity
    // combine INCR and EXPIRE in a Lua script
    await redis.expire(key, 60);
  }
  return count > 10;
}
- incr() atomically increments the request count.
- The key expires after 60 seconds, starting a fresh window.
- If the count exceeds the limit, block the user.
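The counting logic can be exercised without a live server by passing the client in as a parameter. A sketch with an in-memory stub for incr/expire (the stub and `checkRateLimit` name are illustrative; the stub ignores real expiry):

```javascript
// Same fixed-window logic as above, but the client is injected
async function checkRateLimit(client, userId, limit = 10) {
  const key = `rate:${userId}`;
  const count = await client.incr(key);
  if (count === 1) {
    await client.expire(key, 60); // window starts on the first request
  }
  return count > limit; // true means "rate limited"
}

// In-memory stub: incr/expire only, no real TTL
const counts = new Map();
const stub = {
  incr: async (k) => {
    const n = (counts.get(k) || 0) + 1;
    counts.set(k, n);
    return n;
  },
  expire: async () => 1,
};

(async () => {
  let blocked = 0;
  for (let i = 0; i < 12; i++) {
    if (await checkRateLimit(stub, "u1")) blocked++;
  }
  console.log(blocked); // 2 — requests 11 and 12 exceed the limit
})();
```

Fixed windows allow a brief burst at window boundaries (up to 2x the limit across the seam); sliding-window variants trade a little extra bookkeeping for smoother limiting.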