
How to Use Redis for Caching

6 min read

Redis is an extremely fast in-memory data store that is a natural fit for caching. Let's learn how to use it.

What is Redis?

Redis (Remote Dictionary Server) is an open-source, in-memory key-value data store. Because data lives in RAM rather than on disk, reads and writes typically complete in well under a millisecond, which is exactly what a cache needs.

Uses of Redis

Redis is used for:
- Caching (the most common use)
- Session storage
- Real-time analytics
- Message queue/broker
- Leaderboards
- Rate limiting
- Pub/Sub messaging

Advantages of Redis

- Extremely fast (in-memory)
- Supports multiple data structures
- Persistence options
- Clustering and replication
- Atomic operations
- Pub/Sub support

Install Redis

Ubuntu/Debian

# Install Redis
sudo apt update
sudo apt install redis-server

sudo systemctl start redis-server
sudo systemctl enable redis-server

# Check status
sudo systemctl status redis-server

# Test connection
redis-cli ping

Response: PONG

Docker

# Run Redis container
docker run -d \
    --name redis \
    -p 6379:6379 \
    redis:alpine

# With persistence
docker run -d \
    --name redis \
    -p 6379:6379 \
    -v redis-data:/data \
    redis:alpine redis-server --appendonly yes

Redis CLI Basics

Basic Commands

# Connect to Redis
redis-cli

# Set value
SET name "Budi"
SET age 25

# Get value
GET name
GET age

# Set with expiration (seconds)
SET session:123 "data" EX 3600
SETEX session:456 3600 "data"

# Check TTL
TTL session:123

# Delete key
DEL name

# Check if key exists
EXISTS name

# Get all keys (use with caution in production)
KEYS *
KEYS user:*

Data Types

# String
SET greeting "Hello"
APPEND greeting " World"
GET greeting  # "Hello World"

# Integer operations
SET counter 0
INCR counter      # 1
INCRBY counter 5  # 6
DECR counter      # 5

# List
LPUSH mylist "first"
RPUSH mylist "last"
LRANGE mylist 0 -1  # Get all
LPOP mylist
RPOP mylist

# Set (unique values)
SADD myset "apple" "banana" "orange"
SMEMBERS myset
SISMEMBER myset "apple"  # 1 (true)

# Hash
HSET user:1 name "Budi" age 25 email "[email protected]"
HGET user:1 name
HGETALL user:1
HINCRBY user:1 age 1

# Sorted Set
ZADD leaderboard 100 "player1" 200 "player2" 150 "player3"
ZRANGE leaderboard 0 -1 WITHSCORES
ZREVRANGE leaderboard 0 2  # Top 3
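
To make the leaderboard semantics concrete, here is a tiny plain-JavaScript stand-in for ZADD/ZREVRANGE. This is a conceptual sketch only; Redis itself implements sorted sets with a skip list plus a hash table, not a sort-on-read:

```javascript
// Minimal sketch of sorted-set semantics (ZADD / ZREVRANGE) in plain JS.
// Illustrates ordering behavior only, not Redis's actual data structure.
class SortedSet {
  constructor() {
    this.scores = new Map(); // member -> score
  }
  zadd(score, member) {
    this.scores.set(member, score);
  }
  // Highest scores first, like ZREVRANGE start stop
  zrevrange(start, stop) {
    return [...this.scores.entries()]
      .sort((a, b) => b[1] - a[1])
      .slice(start, stop + 1)
      .map(([member]) => member);
  }
}

const board = new SortedSet();
board.zadd(100, "player1");
board.zadd(200, "player2");
board.zadd(150, "player3");
console.log(board.zrevrange(0, 2)); // ["player2", "player3", "player1"]
```

The same call shape maps directly onto the CLI commands above: one score per member, with reads sorted by score at query time.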

Caching Patterns

Cache-Aside Pattern

// Node.js dengan ioredis
const Redis = require("ioredis");
const redis = new Redis();

async function getUser(userId) {
  // 1. Check cache
  const cached = await redis.get(`user:${userId}`);
  if (cached) {
    console.log("Cache hit");
    return JSON.parse(cached);
  }

  // 2. Cache miss - fetch from database
  console.log("Cache miss");
  const user = await db.users.findById(userId);

  // 3. Store in cache
  await redis.set(
    `user:${userId}`,
    JSON.stringify(user),
    "EX",
    3600 // 1 hour
  );

  return user;
}

// Invalidate cache on update
async function updateUser(userId, data) {
  await db.users.update(userId, data);
  await redis.del(`user:${userId}`);
}

Write-Through Pattern

async function saveUser(userId, userData) {
  // Save to database
  await db.users.save(userId, userData);

  // Update cache immediately
  await redis.set(`user:${userId}`, JSON.stringify(userData), "EX", 3600);
}

TTL Strategy

// Different TTL for different data
const TTL = {
  USER_PROFILE: 3600, // 1 hour
  PRODUCT_LIST: 300, // 5 minutes
  SESSION: 86400, // 24 hours
  RATE_LIMIT: 60, // 1 minute
  STATIC_CONFIG: 604800, // 1 week
};

async function cacheData(key, data, type) {
  await redis.set(key, JSON.stringify(data), "EX", TTL[type]);
}
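
A small guard around the TTL table keeps an unknown type from caching without an expiry. The `DEFAULT` fallback below is an addition for illustration, not part of the original snippet:

```javascript
// TTL table plus a defensive lookup. DEFAULT is a hypothetical fallback
// so a typo'd type still gets a short, safe expiry instead of undefined.
const TTL = {
  USER_PROFILE: 3600, // 1 hour
  PRODUCT_LIST: 300, // 5 minutes
  SESSION: 86400, // 24 hours
  DEFAULT: 300, // assumption: short fallback for unknown types
};

function ttlFor(type) {
  return TTL[type] ?? TTL.DEFAULT;
}

console.log(ttlFor("USER_PROFILE")); // 3600
console.log(ttlFor("typo"));        // 300 (falls back to DEFAULT)
```

Passing `undefined` as the `EX` argument would make the `SET` call fail, so the fallback is cheap insurance.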

Express.js Integration

Setup Redis Client

// redis-client.js
const Redis = require("ioredis");

const redis = new Redis({
  host: process.env.REDIS_HOST || "localhost",
  port: process.env.REDIS_PORT || 6379,
  password: process.env.REDIS_PASSWORD || undefined,
  maxRetriesPerRequest: 3,
});

redis.on("connect", () => console.log("Redis connected"));
redis.on("error", (err) => console.error("Redis error:", err));

module.exports = redis;

Caching Middleware

// cache-middleware.js
const redis = require("./redis-client");

function cache(duration) {
  return async (req, res, next) => {
    const key = `cache:${req.originalUrl}`;

    try {
      const cached = await redis.get(key);
      if (cached) {
        return res.json(JSON.parse(cached));
      }

      // Store original json method
      const originalJson = res.json.bind(res);

      // Override json method to cache the response body.
      // Fire-and-forget: don't delay the response on the cache write.
      res.json = (data) => {
        redis
          .set(key, JSON.stringify(data), "EX", duration)
          .catch((err) => console.error("Cache write error:", err));
        return originalJson(data);
      };

      next();
    } catch (error) {
      console.error("Cache error:", error);
      next();
    }
  };
}

module.exports = cache;

Using Cache Middleware

const express = require("express");
const cache = require("./cache-middleware");
const app = express();

// Cache for 5 minutes
app.get("/api/products", cache(300), async (req, res) => {
  const products = await db.products.findAll();
  res.json(products);
});

// Cache for 1 hour
app.get("/api/users/:id", cache(3600), async (req, res) => {
  const user = await db.users.findById(req.params.id);
  res.json(user);
});

// No cache
app.post("/api/users", async (req, res) => {
  const user = await db.users.create(req.body);
  res.json(user);
});

Session Storage

Express Session with Redis

const session = require("express-session");
const RedisStore = require("connect-redis").default;
const redis = require("./redis-client");

app.use(
  session({
    store: new RedisStore({ client: redis }),
    secret: process.env.SESSION_SECRET,
    resave: false,
    saveUninitialized: false,
    cookie: {
      secure: process.env.NODE_ENV === "production",
      httpOnly: true,
      maxAge: 24 * 60 * 60 * 1000, // 24 hours
    },
  })
);

Rate Limiting

Simple Rate Limiter

async function rateLimiter(userId, limit = 100, window = 60) {
  const key = `ratelimit:${userId}`;

  const current = await redis.incr(key);

  if (current === 1) {
    await redis.expire(key, window);
  }

  if (current > limit) {
    return {
      allowed: false,
      remaining: 0,
      resetIn: await redis.ttl(key),
    };
  }

  return {
    allowed: true,
    remaining: limit - current,
    resetIn: await redis.ttl(key),
  };
}

// Express middleware
async function rateLimitMiddleware(req, res, next) {
  const userId = req.ip; // or req.user.id
  const result = await rateLimiter(userId, 100, 60);

  res.set("X-RateLimit-Remaining", result.remaining);
  res.set("X-RateLimit-Reset", result.resetIn);

  if (!result.allowed) {
    return res.status(429).json({ error: "Too many requests" });
  }

  next();
}

Pub/Sub Messaging

Publisher

const Redis = require("ioredis");
const publisher = new Redis();

// Publish message
async function publishEvent(channel, data) {
  await publisher.publish(channel, JSON.stringify(data));
}

// Example usage
publishEvent("notifications", {
  type: "new_order",
  userId: 123,
  orderId: 456,
});

Subscriber

const Redis = require("ioredis");
const subscriber = new Redis();

// Subscribe to channel
subscriber.subscribe("notifications", (err, count) => {
  console.log(`Subscribed to ${count} channels`);
});

// Handle messages
subscriber.on("message", (channel, message) => {
  const data = JSON.parse(message);
  console.log(`Received on ${channel}:`, data);

  // Process message based on type
  switch (data.type) {
    case "new_order":
      // Handle new order
      break;
    case "payment_received":
      // Handle payment
      break;
  }
});

Configuration

redis.conf Optimization

# /etc/redis/redis.conf

# Memory limit
maxmemory 256mb
maxmemory-policy allkeys-lru

# Persistence (RDB snapshots)
save 900 1      # Save if >= 1 key changed in 900 sec
save 300 10     # Save if >= 10 keys changed in 300 sec
save 60 10000   # Save if >= 10000 keys changed in 60 sec

# AOF persistence (more durable)
appendonly yes
appendfsync everysec

# Security
requirepass yourpassword
bind 127.0.0.1

# Performance
tcp-keepalive 300
timeout 0

Eviction Policies

allkeys-lru      # Remove least recently used
allkeys-lfu      # Remove least frequently used
volatile-lru     # LRU among keys with TTL
volatile-lfu     # LFU among keys with TTL
volatile-ttl     # Remove shortest TTL first
noeviction       # Return error when full
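
To see what `allkeys-lru` means in practice, here is a toy LRU cache in JavaScript. This is conceptual only: Redis actually uses an approximated LRU that samples a few keys per eviction rather than tracking exact order:

```javascript
// Toy LRU cache illustrating allkeys-lru eviction. A JS Map preserves
// insertion order, so the first key is always the least recently used.
class LruCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key); // re-insert to mark as most recently used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict least recently used (first key in insertion order)
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const cache = new LruCache(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");    // touch "a" so "b" becomes least recently used
cache.set("c", 3); // capacity exceeded: evicts "b"
console.log([...cache.map.keys()]); // ["a", "c"]
```

The `volatile-*` policies behave the same way but only consider keys that have a TTL set, which is why they are safer when Redis doubles as a primary store.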

Monitoring

Redis CLI Commands

# Server info
INFO

# Memory usage
INFO memory
MEMORY USAGE key

# Connected clients
CLIENT LIST

# Slow queries
SLOWLOG GET 10

# Monitor commands in real time (debug only, adds significant overhead)
MONITOR

# Stats
INFO stats

Key Metrics

- Memory usage
- Connected clients
- Hit rate (hits / (hits + misses))
- Evicted keys
- Blocked clients
- Commands per second
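
The hit rate comes from the `keyspace_hits` and `keyspace_misses` counters reported by `INFO stats`. Those field names are real Redis output; the parsing below is a simplified sketch of the `field:value` lines INFO returns:

```javascript
// Compute cache hit rate from INFO stats counters.
// Input is the "field:value" line format that INFO returns.
function hitRate(infoStatsText) {
  const stats = {};
  for (const line of infoStatsText.split("\n")) {
    const [key, value] = line.split(":");
    if (key && value !== undefined) stats[key.trim()] = Number(value);
  }
  const hits = stats.keyspace_hits || 0;
  const misses = stats.keyspace_misses || 0;
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}

const sample = "keyspace_hits:980\nkeyspace_misses:20";
console.log(hitRate(sample)); // 0.98
```

A hit rate well below ~0.9 for a read-heavy cache usually means TTLs are too short, keys are too fine-grained, or the working set doesn't fit in `maxmemory`.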

Conclusion

Redis is a powerful tool for caching and many other use cases. Start with simple caching, then explore advanced features such as Pub/Sub and clustering.

Written by

Hendra Wijaya
