
Redis

Build with Redis for caching, sessions, and real-time data.

Quick Summary
You are a backend specialist who integrates Redis into projects. Redis is an
in-memory data store used for caching, sessions, rate limiting, queues, pub/sub,
and real-time data. Upstash provides serverless Redis with an HTTP API for edge
and serverless environments.

## Key Points

- Set TTL (expiry) on all cache keys — prevent stale data and memory bloat
- Use pipelines for multiple operations — reduces round trips
- Use Upstash for serverless/edge, ioredis for long-lived servers
- Namespace keys with prefixes — `user:`, `session:`, `cache:` — to stay organized
- Use hashes for objects with multiple fields — more memory-efficient than JSON strings
- Use sorted sets for ranked/scored data — built-in ordering and range queries
- Always handle the cache-miss case — Redis is a cache, not the source of truth

## Anti-Patterns

- Using Redis as the primary database for critical data — it's a cache first
- Not setting TTLs — memory grows unbounded until OOM
- Creating a new connection per request — reuse the client
- Storing large values (>1MB) — Redis is for small, hot data
- Using KEYS command in production — blocks the server; use SCAN instead

## Quick Example

```bash
npm install ioredis
```

```typescript
import Redis from 'ioredis';
const redis = new Redis(process.env.REDIS_URL!);
```
Full skill (`skilldb get database-services-skills/Redis`), to paste into your CLAUDE.md or agent config:

# Redis Integration

You are a backend specialist who integrates Redis into projects. Redis is an in-memory data store used for caching, sessions, rate limiting, queues, pub/sub, and real-time data. Upstash provides serverless Redis with an HTTP API for edge and serverless environments.

## Core Philosophy

### In-memory speed, persistent when needed

Redis stores data in memory — reads and writes take microseconds. It can persist to disk for durability, but the primary value is speed. Use it for data that benefits from fast access.

### Data structures, not just key-value

Redis isn't just get/set. It has sorted sets (leaderboards), lists (queues), sets (unique collections), hashes (objects), streams (event logs), and HyperLogLog (cardinality estimation). Pick the right structure for the job.

### Cache, don't replace

Redis complements your primary database. Cache hot data in Redis, serve it fast, and fall back to the database on cache miss. Don't use Redis as your only data store for data you can't afford to lose.

## Setup

### Install (ioredis — TCP connections)

```bash
npm install ioredis
```

```typescript
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL!);
```

### Install (Upstash — HTTP, serverless-friendly)

```bash
npm install @upstash/redis
```

```typescript
import { Redis } from '@upstash/redis';

const redis = new Redis({
  url: process.env.UPSTASH_REDIS_REST_URL!,
  token: process.env.UPSTASH_REDIS_REST_TOKEN!,
});
```

## Key Techniques

### Basic operations

```typescript
// String (get/set)
await redis.set('user:123', JSON.stringify(user));
await redis.set('session:abc', data, 'EX', 3600); // Expires in 1 hour
const cachedUser = JSON.parse((await redis.get('user:123')) ?? '{}');

// Delete
await redis.del('user:123');

// Check existence
const exists = await redis.exists('user:123');

// Set expiry
await redis.expire('user:123', 300); // 5 minutes

// Increment
await redis.incr('page:views:homepage');
await redis.incrby('user:123:credits', 10);
```

### Caching pattern

```typescript
async function getCachedUser(userId: string) {
  const cacheKey = `user:${userId}`;

  // Check cache
  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  // Cache miss — fetch from database
  const user = await db.query.users.findFirst({ where: eq(users.id, userId) });
  if (user) {
    await redis.set(cacheKey, JSON.stringify(user), 'EX', 300); // Cache 5 min
  }

  return user;
}

// Invalidate on update
async function updateUser(userId: string, data: Partial<User>) {
  await db.update(users).set(data).where(eq(users.id, userId));
  await redis.del(`user:${userId}`); // Invalidate cache
}
```

### Rate limiting

```typescript
async function rateLimit(key: string, limit: number, windowSec: number): Promise<boolean> {
  const current = await redis.incr(key);
  if (current === 1) {
    await redis.expire(key, windowSec);
  }
  return current <= limit;
}

// Usage in API route
const allowed = await rateLimit(`rate:${ip}`, 100, 60); // 100 req/min
if (!allowed) return new Response('Too Many Requests', { status: 429 });
```

### Sliding window rate limiter

```typescript
async function slidingWindowRateLimit(key: string, limit: number, windowMs: number): Promise<boolean> {
  const now = Date.now();
  const windowStart = now - windowMs;

  const pipeline = redis.pipeline();
  pipeline.zremrangebyscore(key, 0, windowStart); // Remove old entries
  pipeline.zadd(key, now, `${now}-${Math.random()}`); // Add current
  pipeline.zcard(key); // Count entries in window
  pipeline.expire(key, Math.ceil(windowMs / 1000)); // Auto-cleanup

  const results = await pipeline.exec();
  const count = results![2][1] as number;
  return count <= limit;
}
```

### Hash (objects)

```typescript
// Store user as hash
await redis.hset('user:123', { name: 'Alice', email: 'alice@example.com', plan: 'pro' });

// Get single field
const name = await redis.hget('user:123', 'name');

// Get all fields
const user = await redis.hgetall('user:123');

// Increment a field
await redis.hincrby('user:123', 'loginCount', 1);
```

### Sorted set (leaderboard)

```typescript
// Add scores
await redis.zadd('leaderboard', 1500, 'alice', 2300, 'bob', 1800, 'charlie');

// Top 10
const top10 = await redis.zrevrange('leaderboard', 0, 9, 'WITHSCORES');

// User's rank (0-indexed)
const rank = await redis.zrevrank('leaderboard', 'alice');

// Increment score
await redis.zincrby('leaderboard', 100, 'alice');
```

### Pub/Sub

```typescript
// Publisher
await redis.publish('notifications', JSON.stringify({ userId: '123', type: 'new_message' }));

// Subscriber (separate connection)
const sub = new Redis(process.env.REDIS_URL!);
sub.subscribe('notifications');
sub.on('message', (channel, message) => {
  const data = JSON.parse(message);
  console.log(`${channel}:`, data);
});
```

### Queue (list-based)

```typescript
// Producer: push to queue
await redis.lpush('email:queue', JSON.stringify({ to: 'user@example.com', subject: 'Hello' }));

// Consumer: pop from queue (blocking); resolves to null if the timeout elapses
const result = await redis.brpop('email:queue', 30); // Wait up to 30s
if (result) {
  const [, job] = result;
  const email = JSON.parse(job);
  await sendEmail(email);
}
```

### Session storage

```typescript
// Store session
const sessionId = crypto.randomUUID();
await redis.set(`session:${sessionId}`, JSON.stringify({
  userId: user.id,
  email: user.email,
  createdAt: Date.now(),
}), 'EX', 86400); // 24 hours

// Retrieve session
const session = JSON.parse((await redis.get(`session:${sessionId}`)) ?? 'null');

// Destroy session
await redis.del(`session:${sessionId}`);
```

## Best Practices

- Set TTL (expiry) on all cache keys — prevent stale data and memory bloat
- Use pipelines for multiple operations — reduces round trips
- Use Upstash for serverless/edge, ioredis for long-lived servers
- Namespace keys with prefixes — `user:`, `session:`, `cache:` — to stay organized
- Use hashes for objects with multiple fields — more memory-efficient than JSON strings
- Use sorted sets for ranked/scored data — built-in ordering and range queries
- Always handle the cache-miss case — Redis is a cache, not the source of truth

## Anti-Patterns

- Using Redis as the primary database for critical data — it's a cache first
- Not setting TTLs — memory grows unbounded until OOM
- Creating a new connection per request — reuse the client
- Storing large values (>1MB) — Redis is for small, hot data
- Using KEYS command in production — blocks the server; use SCAN instead
- Not handling connection errors — Redis connections can drop
- Pub/Sub without a separate subscriber connection — blocks the main client

Install this skill directly: skilldb add database-services-skills
