Cache overview

The Cache interface is the framework’s opportunistic key/value cache. Five operations cover ~95% of real cases: get, set, atomic increment, set-if-absent, delete.

interface Cache {
  get<V>(key: string): Promise<Option<V>>;
  set<V>(key: string, value: V, ttlMs?: number): Promise<void>;
  incr(key: string, ttlMs?: number): Promise<number>;
  setIfAbsent<V>(key: string, value: V, ttlMs?: number): Promise<boolean>;
  delete(...keys: string[]): Promise<void>;
  mget<V>(keys: string[]): Promise<Map<string, V>>;
  mset(entries: Array<[string, unknown, number?]>): Promise<void>;
}

Small surface — no pattern scans, no pub/sub (the cluster has its own pub/sub).
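To make the contract concrete, here is a minimal in-process sketch of the interface in the spirit of InMemoryCache. The Option shape (isSome()/value) and the lazy-expiry strategy are assumptions for illustration, not the framework’s actual implementation:

```typescript
// Minimal sketch of the Cache contract over an in-process Map.
// Option here is a local stand-in with the isSome()/value shape used below.
type Option<V> = { isSome(): boolean; value?: V };
const some = <V>(value: V): Option<V> => ({ isSome: () => true, value });
const none = <V>(): Option<V> => ({ isSome: () => false });

class MapCache {
  private store = new Map<string, { value: unknown; expiresAt?: number }>();

  // Lazy expiry: stale entries are dropped on read.
  private live(key: string) {
    const e = this.store.get(key);
    if (!e) return undefined;
    if (e.expiresAt !== undefined && Date.now() >= e.expiresAt) {
      this.store.delete(key);
      return undefined;
    }
    return e;
  }

  async get<V>(key: string): Promise<Option<V>> {
    const e = this.live(key);
    return e ? some(e.value as V) : none<V>();
  }

  async set<V>(key: string, value: V, ttlMs?: number): Promise<void> {
    const expiresAt = ttlMs !== undefined ? Date.now() + ttlMs : undefined;
    this.store.set(key, { value, expiresAt });
  }

  async incr(key: string, ttlMs?: number): Promise<number> {
    const e = this.live(key);
    const next = ((e?.value as number) ?? 0) + 1;
    if (next === 1) await this.set(key, next, ttlMs); // TTL only on creation
    else this.store.set(key, { value: next, expiresAt: e?.expiresAt });
    return next;
  }

  async setIfAbsent<V>(key: string, value: V, ttlMs?: number): Promise<boolean> {
    if (this.live(key)) return false;
    await this.set(key, value, ttlMs);
    return true;
  }

  async delete(...keys: string[]): Promise<void> {
    for (const k of keys) this.store.delete(k);
  }

  async mget<V>(keys: string[]): Promise<Map<string, V>> {
    const out = new Map<string, V>();
    for (const k of keys) {
      const e = this.live(k);
      if (e) out.set(k, e.value as V);
    }
    return out;
  }

  async mset(entries: Array<[string, unknown, number?]>): Promise<void> {
    for (const [k, v, ttl] of entries) await this.set(k, v, ttl);
  }
}
```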

Backend          Use
InMemoryCache    Single-pod / tests. In-process Map.
RedisCache       Multi-pod production. Wraps ioredis.
MemcachedCache   Multi-pod where Memcached fits. Wraps memjs.

Pick by deployment shape:

  • Single pod — InMemoryCache. Fast, no extra peer deps.
  • Multi-pod with Redis already — RedisCache.
  • Multi-pod with Memcached already — MemcachedCache.
  • Multi-pod, no preference — RedisCache. More features (pub/sub, persistence, etc.) and the wider ecosystem.

Caches are lossy by design. A get returning None means “not cached” — the caller’s job is to fall back to the source of truth:

const cached = await cache.get<User>(`user:${id}`);
if (cached.isSome()) return cached.value;
const user = await db.users.findById(id); // ← source of truth
await cache.set(`user:${id}`, user, 60_000);
return user;

If set fails (Redis down, network blip), the cache stays empty — but the call still returns the right answer (the source of truth was consulted).

Cache implementations return defaults on transient failures rather than throwing. Misuse (bad TTL, malformed value) throws.
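That failure contract makes the cache-aside pattern easy to package as a helper. A sketch, where CacheLike, the inline Option shape, and getOrLoad are illustrative stand-ins for the framework’s real types:

```typescript
// Sketch: cache-aside as a reusable helper. A transient set failure only
// costs a future hit; the caller always gets the source-of-truth answer.
type Opt<V> = { isSome(): boolean; value?: V };

interface CacheLike {
  get<V>(key: string): Promise<Opt<V>>;
  set<V>(key: string, value: V, ttlMs?: number): Promise<void>;
}

async function getOrLoad<V>(
  cache: CacheLike,
  key: string,
  ttlMs: number,
  load: () => Promise<V>, // source of truth
): Promise<V> {
  const hit = await cache.get<V>(key);
  if (hit.isSome()) return hit.value as V;
  const fresh = await load();
  try {
    await cache.set(key, fresh, ttlMs); // best effort
  } catch {
    // transient backend failure: swallow, return the right answer anyway
  }
  return fresh;
}
```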

await cache.set('key', value, 60_000); // expires in 60s
await cache.set('key', value); // no expiry
  • With ttlMs — entry expires after the window.
  • Without — entry stays until evicted (LRU on InMemoryCache; backend-policy on Redis / Memcached).

For most uses, set a TTL. Entries without one accumulate until evicted; explicit TTLs keep memory use predictable.

const count = await cache.incr(`requests:${userId}`, 60_000);
if (count > 100) throw new Error('rate limit');

incr returns the new count after incrementing. When ttlMs is provided and the counter was just created (count === 1), the TTL is applied; later increments leave the expiry untouched, which is what makes the window fixed.

Used by rate-limit middleware for fixed-window counters.
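One way to sketch such a fixed-window counter is to fold the window index into the key, so each window starts from a fresh counter. This is illustrative, not necessarily how the middleware builds its keys; Incr and allow are stand-in names:

```typescript
// Sketch: fixed-window rate limiting over an incr-style counter.
// `Incr` stands in for cache.incr; the window index in the key means a
// stale counter can never leak requests into the next window.
type Incr = (key: string, ttlMs?: number) => Promise<number>;

async function allow(
  incr: Incr,
  userId: string,
  limit: number,
  windowMs: number,
  now: number = Date.now(),
): Promise<boolean> {
  const bucket = Math.floor(now / windowMs); // which window are we in?
  const count = await incr(`requests:${userId}:${bucket}`, windowMs);
  return count <= limit;
}
```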

const got = await cache.setIfAbsent('lock:key', 'me', 5_000);
if (got) {
  // I won the race; do the work
} else {
  // Someone else has it
}

Atomic CAS-style write. Used by idempotency-key middleware to detect “I’m the first request with this key.”
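The same primitive, paired with delete, can be wrapped into a best-effort lock helper. A sketch under stated assumptions — withLock and LockCache are illustrative names, and this is advisory locking, not a fencing-token-safe distributed lock:

```typescript
// Sketch: best-effort mutual exclusion on top of setIfAbsent + delete.
// The TTL bounds how long a crashed holder can block everyone else.
interface LockCache {
  setIfAbsent<V>(key: string, value: V, ttlMs?: number): Promise<boolean>;
  delete(...keys: string[]): Promise<void>;
}

async function withLock<T>(
  cache: LockCache,
  key: string,
  ttlMs: number,
  fn: () => Promise<T>,
): Promise<T | undefined> {
  const got = await cache.setIfAbsent(key, "me", ttlMs);
  if (!got) return undefined; // someone else holds it
  try {
    return await fn();
  } finally {
    await cache.delete(key); // release; abandoned locks expire via TTL
  }
}
```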

const users = await cache.mget<User>(['user:1', 'user:2', 'user:3']);
// → Map<string, User> — missing keys absent

Round-trip optimization. Critical for shared-entity-hydration patterns after a sharding rebalance — pull every active entity’s state in one Redis call instead of N.

mset is the dual:

await cache.mset([
  ['user:1', user1, 60_000],
  ['user:2', user2, 60_000],
]);
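The two compose naturally into batch cache-aside: one mget for the hits, one trip to the source of truth for the misses, one mset to write them back. A sketch — BatchCache, getMany, and loadMany are illustrative names, not framework exports:

```typescript
// Sketch: batch cache-aside over mget/mset. Only the misses touch the
// source of truth; everything comes back in a single Map.
interface BatchCache {
  mget<V>(keys: string[]): Promise<Map<string, V>>;
  mset(entries: Array<[string, unknown, number?]>): Promise<void>;
}

async function getMany<V>(
  cache: BatchCache,
  keys: string[],
  ttlMs: number,
  loadMany: (missing: string[]) => Promise<Map<string, V>>, // source of truth
): Promise<Map<string, V>> {
  const found = await cache.mget<V>(keys);
  const missing = keys.filter((k) => !found.has(k));
  if (missing.length > 0) {
    const loaded = await loadMany(missing);
    await cache.mset(
      [...loaded].map(([k, v]) => [k, v, ttlMs] as [string, unknown, number?]),
    );
    for (const [k, v] of loaded) found.set(k, v);
  }
  return found;
}
```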

The Cache API reference covers the full interface.