# Building Scalable REST APIs with Redis Caching

Caching is the secret weapon of high-performance APIs. Discover how Redis caching can transform your API from struggling under load to effortlessly handling millions of requests per day.
## The Performance Crisis

Picture this: your API is live, users are happy, and suddenly traffic spikes 10x. Within minutes:

- Response times climb from 50ms to 5 seconds
- Database connections max out
- Server CPU hits 100%
- Users start complaining
- Your phone won't stop ringing

The culprit? Every request is hitting your database. The solution? Intelligent caching with Redis.
## Why Redis?

Redis (Remote Dictionary Server) is an in-memory data store that's blazingly fast. The figures below are illustrative; actual latencies depend on indexing, query complexity, and network:

| Operation | MongoDB | PostgreSQL | Redis |
|---|---|---|---|
| Read | 50-100ms | 20-50ms | < 1ms |
| Write | 30-80ms | 15-40ms | < 1ms |
| Complex query | 200-500ms | 100-300ms | < 5ms |

That's often 50-100x faster for common operations.
### Beyond Speed

Redis offers more than just speed:

- **Built-in TTL (Time-To-Live)** - automatic cache expiration
- **Data structures** - lists, sets, sorted sets, hashes
- **Pub/Sub** - real-time messaging
- **Persistence** - optional disk snapshots
- **Replication** - high availability
- **Clustering** - horizontal scalability
## Caching Strategies

### 1. Cache-Aside (Lazy Loading)

The most common pattern, and the JifiJs default:

```typescript
async function getUser(userId: string) {
  // 1. Check cache first
  const cacheKey = `user:${userId}`;
  const cached = await cacheService.get(cacheKey);
  if (cached) {
    return { error: false, data: cached, fromCache: true };
  }

  // 2. Cache miss - query database
  const user = await User.findById(userId);
  if (!user) {
    return { error: true, message: 'User not found' };
  }

  // 3. Store in cache (1 hour TTL)
  await cacheService.set(cacheKey, user, 3600);
  return { error: false, data: user, fromCache: false };
}
```

**Performance impact:**

- First request: 50ms (database)
- Subsequent requests: < 1ms (cache)
- 50x improvement for cached data
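How much cache-aside helps overall depends on the hit rate. A quick back-of-the-envelope helper, using the illustrative 1ms/50ms figures above (these are assumptions, not measurements):

```typescript
// Expected average latency for cache-aside, given a hit rate.
// hitLatencyMs / missLatencyMs are illustrative defaults, not measurements.
function expectedLatencyMs(
  hitRate: number,      // fraction of requests served from cache (0..1)
  hitLatencyMs = 1,     // cost of a cache hit
  missLatencyMs = 50    // cost of a miss (database round trip)
): number {
  return hitRate * hitLatencyMs + (1 - hitRate) * missLatencyMs;
}
```

At a 95% hit rate the average drops from 50ms to about 3.5ms; pushing the hit rate to 99% cuts it by more than half again.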
### 2. Write-Through Caching

Update the cache immediately when data changes:

```typescript
async function updateUser(userId: string, data: any) {
  // 1. Update database
  const user = await User.findByIdAndUpdate(userId, data, { new: true });

  // 2. Update cache immediately
  const cacheKey = `user:${userId}`;
  await cacheService.set(cacheKey, user, 3600);

  return { error: false, data: user };
}
```

**Benefits:**

- Cache always up to date
- No stale data issues
- Consistent read performance
### 3. Write-Behind (Write-Back) Caching

Update the cache first, the database later:

```typescript
async function updateUserScore(userId: string, points: number) {
  const cacheKey = `score:${userId}`;

  // 1. Update cache immediately
  await cacheService.increment(cacheKey, points);

  // 2. Queue database update
  await queue.add('update-score', { userId, points });

  return { error: false, message: 'Score updated' };
}
```

**Use cases:**

- High-frequency writes (game scores, analytics)
- Non-critical data
- Asynchronous updates acceptable
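The consumer side of the queue is not shown above. Here is a minimal in-memory sketch of what the flush logic could look like, with a hypothetical `ScoreFlusher` invoked by the queue worker; batching per-user deltas means one database write can cover many score events:

```typescript
// Minimal in-memory sketch of the write-behind flush side.
// A real system would drive this from a job queue; names are illustrative.
class ScoreFlusher {
  private pending = new Map<string, number>(); // userId -> accumulated points

  // Called by the queue consumer for each 'update-score' job.
  enqueue(userId: string, points: number): void {
    this.pending.set(userId, (this.pending.get(userId) ?? 0) + points);
  }

  // Periodically flush accumulated deltas to the database in one batch.
  // Returns the number of users written.
  async flush(writeToDb: (userId: string, delta: number) => Promise<void>): Promise<number> {
    const batch = [...this.pending.entries()];
    this.pending.clear();
    for (const [userId, delta] of batch) {
      await writeToDb(userId, delta); // one write per user instead of per event
    }
    return batch.length;
  }
}
```

The trade-off is the usual write-behind one: a crash between `enqueue` and `flush` loses the pending deltas, which is why this pattern suits non-critical data.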
## JifiJs Caching Implementation

JifiJs makes caching effortless with the BaseService class:

### Built-in Cache Methods

```typescript
class ProductService extends BaseService<IProduct> {
  // Automatic caching with findByIdCached
  async getProduct(id: string) {
    return await this.findByIdCached(
      id,
      { populate: 'category' }, // Query options
      null,                     // Filter
      3600                      // TTL in seconds
    );
  }

  // Custom cache-aside pattern
  async getFeaturedProducts() {
    return await this.cacheGetOrSet(
      'products:featured',
      async () => {
        return await this.find(
          { featured: true, inStock: true },
          { sort: { createdAt: -1 }, limit: 10 }
        );
      },
      3600 // 1 hour
    );
  }

  // Manual cache operations
  async updateProduct(id: string, data: any) {
    const result = await this.update(id, data);
    if (!result.error) {
      // Invalidate specific cache
      await this.invalidateCache(id);
      // Invalidate pattern (all product caches)
      await this.cacheDeletePattern('products:*');
    }
    return result;
  }

  // Cache statistics
  async getCacheStats() {
    return this.cacheService.getStats();
  }
}
```
### Dual-Backend Support

JifiJs automatically uses Redis if available and falls back to an in-memory cache:

```bash
# Redis configuration (.env)
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your-password
REDIS_DB=0
```

If Redis is unavailable, JifiJs falls back to memory automatically - no code changes needed.
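The actual JifiJs fallback code isn't shown here, but the idea can be sketched as an in-memory backend exposing the same get/set-with-TTL contract a Redis backend would; the interface name and shape below are assumptions for illustration:

```typescript
// Assumed contract shared by the Redis and in-memory backends (illustrative).
interface CacheBackend {
  get(key: string): Promise<unknown | null>;
  set(key: string, value: unknown, ttlSeconds: number): Promise<void>;
}

class MemoryBackend implements CacheBackend {
  private store = new Map<string, { value: unknown; expiresAt: number }>();

  async get(key: string): Promise<unknown | null> {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expiresAt) { // lazy expiration, mimicking Redis TTL
      this.store.delete(key);
      return null;
    }
    return entry.value;
  }

  async set(key: string, value: unknown, ttlSeconds: number): Promise<void> {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }
}
```

Because both backends satisfy the same interface, application code stays identical whichever one is active.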
## Real-World Performance Gains

### Case Study 1: E-Commerce Product Listing

**Scenario:** Product catalog with 10,000 items, 1,000 requests/minute

Before caching:

- Database queries: 1,000/minute
- Average response time: 120ms
- Database CPU: 80%
- Scalability limit: ~2,000 req/min

After caching (1-hour TTL):

- Database queries: ~17/minute (only cache misses)
- Average response time: 2ms
- Database CPU: 10%
- Scalability limit: 100,000+ req/min

**Result:** 60x faster, 98.3% fewer database queries

### Case Study 2: User Authentication

**Scenario:** 500 concurrent users, frequent auth checks

Before caching:

- Auth check time: 80ms (2 DB queries: user + auth)
- Database load: high
- Auth checks/sec: 500

After caching:

- Auth check time: < 1ms (cache hit)
- Database load: minimal
- Auth checks/sec: 50,000+

**Result:** 80x faster authentication

### Case Study 3: Analytics Dashboard

**Scenario:** Complex aggregation queries for a dashboard

Before caching:

- Query time: 2.5 seconds
- Database CPU: 95% during queries
- Concurrent users: 10

After caching (5-minute TTL):

- Query time: 5ms
- Database CPU: 5%
- Concurrent users: 1,000+

**Result:** 500x faster, massively improved scalability
## Advanced Caching Patterns

### 1. Multi-Level Caching

Layer caches for maximum performance:

```typescript
class OrderService extends BaseService<IOrder> {
  // Level-1 cache in process memory (illustrative; a bounded LRU is safer)
  private memoryCache = new Map<string, any>();

  async getOrder(orderId: string) {
    // Level 1: Application memory (fastest)
    if (this.memoryCache.has(orderId)) {
      return this.memoryCache.get(orderId);
    }

    // Level 2: Redis (very fast)
    const cached = await this.cacheGet(`order:${orderId}`);
    if (cached) {
      this.memoryCache.set(orderId, cached);
      return cached;
    }

    // Level 3: Database (slowest)
    const order = await this.findById(orderId);
    if (order.data) {
      await this.cacheSet(`order:${orderId}`, order.data, 3600);
      this.memoryCache.set(orderId, order.data);
    }
    return order;
  }
}
```
### 2. Cache Warming

Preload the cache before traffic spikes:

```typescript
// Run before an expected traffic spike
async function warmCache() {
  console.log('Warming cache...');

  // Popular products
  const products = await productService.find(
    { popular: true },
    { limit: 100 }
  );

  for (const product of products.data) {
    await productService.cacheSet(
      product._id.toString(),
      product,
      7200 // 2 hours
    );
  }

  console.log('Cache warmed!');
}

// Schedule before Black Friday, product launches, etc.
```
### 3. Conditional Caching

Cache based on data characteristics:

```typescript
// Defined as a BaseService method so `this.find` / `this.cacheGetOrSet` exist
async getCategoryProducts(categoryId: string) {
  const category = await Category.findById(categoryId);

  // Cache popular categories longer
  const ttl = category.popular ? 7200 : 1800;

  return await this.cacheGetOrSet(
    `category:${categoryId}:products`,
    async () => {
      return await this.find({ category: categoryId });
    },
    ttl
  );
}
```
### 4. Cache Stampede Prevention

Prevent multiple concurrent requests from hitting the database for the same key:

```typescript
class CacheService {
  private lockMap = new Map<string, Promise<any>>();

  async getOrSet<T>(key: string, fetcher: () => Promise<T>, ttl: number): Promise<T> {
    // Check cache
    const cached = await this.get<T>(key);
    if (cached) return cached;

    // Check if a fetch for this key is already in progress
    if (this.lockMap.has(key)) {
      return await this.lockMap.get(key);
    }

    // Start fetch and lock
    const fetchPromise = fetcher();
    this.lockMap.set(key, fetchPromise);

    try {
      const data = await fetchPromise;
      await this.set(key, data, ttl);
      return data;
    } finally {
      this.lockMap.delete(key);
    }
  }
}
```
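To see the single-flight behavior in isolation, here is a self-contained variant where the Redis-backed get/set is swapped for a plain Map (a demo sketch, not the JifiJs implementation):

```typescript
// Self-contained stampede demo: storage is a Map so the behavior is observable.
class DemoCache {
  private store = new Map<string, unknown>();
  private lockMap = new Map<string, Promise<unknown>>();

  async getOrSet<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
    if (this.store.has(key)) return this.store.get(key) as T;

    // A fetch for this key is already in flight: share its promise.
    if (this.lockMap.has(key)) {
      return this.lockMap.get(key) as Promise<T>;
    }

    const fetchPromise = fetcher();
    this.lockMap.set(key, fetchPromise); // set synchronously, before awaiting
    try {
      const data = await fetchPromise;
      this.store.set(key, data);
      return data;
    } finally {
      this.lockMap.delete(key);
    }
  }
}
```

Firing many concurrent `getOrSet` calls for the same key invokes the fetcher exactly once; the remaining callers await the shared promise. The lock registration must happen synchronously before the first `await`, otherwise a second caller can slip through.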
## Cache Invalidation Strategies

> "There are only two hard things in Computer Science: cache invalidation and naming things." - Phil Karlton

### 1. Time-Based (TTL)

The simplest approach, and the JifiJs default:

```typescript
// Cache expires automatically after 1 hour
await cacheService.set('key', data, 3600);
```

**Pros:** simple, predictable. **Cons:** may serve stale data until expiration.
### 2. Event-Based Invalidation

Invalidate when data changes:

```typescript
async function updateProduct(id: string, data: any) {
  const result = await productService.update(id, data);

  if (!result.error) {
    // Invalidate specific product cache
    await cacheService.delete(`product:${id}`);
    // Invalidate related caches
    await cacheService.deletePattern('products:list:*');
    await cacheService.deletePattern(`category:${data.category}:*`);
  }

  return result;
}
```

**Pros:** always fresh data. **Cons:** more complex; requires careful tracking of related keys.
### 3. Version-Based Invalidation

Add a version to cache keys:

```typescript
const cacheVersion = 'v2'; // Increment when the schema changes

async function getUser(id: string) {
  return await cacheService.getOrSet(
    `user:${cacheVersion}:${id}`,
    async () => await User.findById(id),
    3600
  );
}

// A version bump invalidates all user caches
```

**Pros:** clean slate on version bump. **Cons:** invalidates all data, not selective.
## Monitoring & Optimization

### Cache Hit Rate

Track cache effectiveness:

```typescript
const stats = await cacheService.getStats();
const total = stats.hits + stats.misses;

console.log({
  hits: stats.hits,
  misses: stats.misses,
  hitRate: total > 0 ? (stats.hits / total) * 100 : 0, // guard against division by zero
  size: stats.size
});

// Target: > 80% hit rate
```
### Cache Size Management

Prevent unbounded growth with a Redis eviction policy:

```conf
# redis.conf
maxmemory 2gb
maxmemory-policy allkeys-lru  # Evict least recently used keys
```

Or with periodic cleanup in the application:

```typescript
setInterval(async () => {
  await cacheService.deletePattern('temp:*');
}, 3600000); // Every hour
```
### Performance Metrics

Monitor cache performance:

```typescript
// JifiJs automatic logging includes cache info
{
  timestamp: '2025-12-20T10:30:00Z',
  endpoint: '/api/products',
  execution_time: 2, // ms
  cache_hit: true,
  database_queries: 0
}
```
## Best Practices

### 1. Cache What's Expensive

Prioritize caching:

- ✅ Complex aggregations
- ✅ Frequently accessed data
- ✅ External API calls
- ✅ Authentication checks
- ❌ Unique, one-time queries
- ❌ Rapidly changing data
### 2. Choose Appropriate TTLs

```typescript
// Reference data (rarely changes)
await cache.set('countries', data, 86400); // 24 hours

// User session (moderate changes)
await cache.set('user:auth', data, 3600); // 1 hour

// Real-time data (frequent changes)
await cache.set('stock:price', data, 10); // 10 seconds
```
### 3. Handle Cache Failures Gracefully

```typescript
async function getUser(id: string) {
  try {
    const cached = await cache.get(`user:${id}`);
    if (cached) return cached;
  } catch (err) {
    console.error('Cache error:', err);
    // Fall through to the database
  }
  return await User.findById(id);
}
```
### 4. Use Structured Keys

```text
# Good: hierarchical, predictable
user:123
user:123:profile
user:123:orders
category:electronics:products

# Bad: flat, hard to invalidate
user_123_data
electronics_products
```
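A small helper can enforce the hierarchical convention and make pattern invalidation mechanical. This is an illustrative sketch over an in-memory Map; against Redis you would typically SCAN with MATCH and delete the results:

```typescript
// Build hierarchical keys from parts: cacheKey("user", 123) -> "user:123"
const cacheKey = (...parts: Array<string | number>): string => parts.join(":");

// Delete keys matching a trailing-wildcard pattern (e.g. "user:123:*")
// from an in-memory store. Returns the number of keys deleted.
function deletePattern(store: Map<string, unknown>, pattern: string): number {
  const wildcard = pattern.endsWith("*");
  const prefix = wildcard ? pattern.slice(0, -1) : pattern;
  let deleted = 0;
  for (const key of [...store.keys()]) {
    const match = wildcard ? key.startsWith(prefix) : key === pattern;
    if (match) {
      store.delete(key);
      deleted++;
    }
  }
  return deleted;
}
```

With flat keys like `user_123_data` there is no shared prefix to match, which is exactly why they are hard to invalidate.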
### 5. Cache Serialization

```typescript
// Automatic JSON serialization in JifiJs
await cache.set('user', { id: 1, name: 'John' }, 3600);
const user = await cache.get('user'); // Object returned

// Manual handling for types JSON doesn't round-trip, such as Date
await cache.set('date', new Date().toISOString(), 3600);
const date = new Date(await cache.get('date'));
```
## Scaling Redis

### 1. Redis Cluster

Horizontal scaling for massive datasets:

```conf
# redis.conf
cluster-enabled yes
cluster-config-file nodes.conf
cluster-node-timeout 5000
```

```typescript
// Connect to the cluster
import Redis from 'ioredis';

const cluster = new Redis.Cluster([
  { host: 'redis-1', port: 6379 },
  { host: 'redis-2', port: 6379 },
  { host: 'redis-3', port: 6379 }
]);
```

**Capacity:** scales horizontally; add nodes as the dataset grows.
### 2. Redis Sentinel

High availability with automatic failover:

```typescript
import Redis from 'ioredis';

const sentinel = new Redis({
  sentinels: [
    { host: 'sentinel-1', port: 26379 },
    { host: 'sentinel-2', port: 26379 },
    { host: 'sentinel-3', port: 26379 }
  ],
  name: 'mymaster'
});
```

**Uptime:** 99.99%+ is achievable with proper configuration.
### 3. Managed Redis

Cloud providers handle scaling for you:

- **AWS ElastiCache** - auto-scaling, backups
- **Azure Cache for Redis** - multi-region replication
- **Google Cloud Memorystore** - high availability
## Conclusion

Redis caching transforms APIs from database-bound to lightning-fast. The performance gains are not incremental; they are transformative.

**Key takeaways:**

- Caching can reduce database load by 95%+
- Response times improve 50-500x
- Scalability limits increase 10-100x
- JifiJs makes caching automatic and effortless

Start caching today:

```bash
# Install Redis
docker run -d -p 6379:6379 redis:7-alpine

# Configure JifiJs (.env)
REDIS_URL=redis://localhost:6379

# That's it! Caching is now active
```

JifiJs handles the complexity. You enjoy the performance.
Happy caching! ⚡
