# In-memory cache
InMemoryCache is the default Cache implementation.
In-process, LRU eviction, per-entry TTL — fast and zero
dependencies.
```ts
import { InMemoryCache } from 'actor-ts';

const cache = new InMemoryCache({ maxEntries: 10_000 });
```

## When to use it

Three scenarios:
- Tests — fast, no IO, clean teardown.
- Single-pod production — no need to share cache state.
- Dev / local — same code without Redis on the laptop.
For multi-pod production, use Redis or Memcached instead; otherwise each pod holds its own independent copy of the cache.
## Configuration

```ts
interface InMemoryCacheSettings {
  maxEntries?: number; // LRU cap; default 10_000
  cleanupMs?: number;  // expired-entry sweep cadence; default 60_000
}
```

| Field | Purpose |
|---|---|
| `maxEntries` | Bound on cache size. LRU eviction kicks in beyond this. |
| `cleanupMs` | How often to sweep expired entries. Without the sweep, expired entries linger until accessed. |
For most apps, defaults are fine.
## LRU eviction

```text
maxEntries: 1000
→ 1001st distinct key inserted → least-recently-used entry evicted
```

LRU means frequently-accessed entries stay; rarely-accessed ones go. Good for read-heavy caches where the hot set fits in memory but the long tail doesn’t.
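The eviction mechanics can be illustrated with a self-contained sketch in plain TypeScript (not the actor-ts implementation), exploiting the fact that a `Map` iterates in insertion order, so re-inserting on access makes the first key the least recently used:

```typescript
// Minimal LRU sketch: Map iteration order doubles as recency order.
class LruSketch<K, V> {
  private entries = new Map<K, V>();
  constructor(private maxEntries: number) {}

  get(key: K): V | undefined {
    const value = this.entries.get(key);
    if (value !== undefined) {
      // Re-insert to mark this key as most recently used.
      this.entries.delete(key);
      this.entries.set(key, value);
    }
    return value;
  }

  set(key: K, value: V): void {
    this.entries.delete(key);
    this.entries.set(key, value);
    if (this.entries.size > this.maxEntries) {
      // First key in iteration order is the least recently used.
      const lru = this.entries.keys().next().value as K;
      this.entries.delete(lru);
    }
  }

  has(key: K): boolean {
    return this.entries.has(key);
  }
}

const demo = new LruSketch<string, number>(2);
demo.set('a', 1);
demo.set('b', 2);
demo.get('a');    // touch 'a', so 'b' is now least recently used
demo.set('c', 3); // over capacity: evicts 'b', not 'a'
console.log(demo.has('a'), demo.has('b'), demo.has('c')); // true false true
```

A real implementation would add TTL and sweep bookkeeping on top; the recency ordering is the part this sketch isolates.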
## TTL handling

```ts
await cache.set('key', value, 60_000); // expires at now + 60s
```

Two cleanup paths:

- Lazy — on `get`, expired entries return `None` (and are removed).
- Periodic sweep — every `cleanupMs`, the cache walks expired entries and removes them. Reduces memory for write-then-never-read keys.
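Both paths can be sketched in plain TypeScript. This is illustrative only: the `expiresAt` field, the injectable `now` parameter, and the `sweep` method are assumptions for the sketch, not the library's internals.

```typescript
// TTL sketch: lazy expiry on read plus a periodic sweep.
type Entry<V> = { value: V; expiresAt: number };

class TtlSketch<K, V> {
  private entries = new Map<K, Entry<V>>();

  set(key: K, value: V, ttlMs: number, now = Date.now()): void {
    this.entries.set(key, { value, expiresAt: now + ttlMs });
  }

  // Lazy path: an expired entry is removed on access and reads as absent.
  get(key: K, now = Date.now()): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (entry.expiresAt <= now) {
      this.entries.delete(key);
      return undefined;
    }
    return entry.value;
  }

  // Periodic path: walk all entries and drop the expired ones,
  // reclaiming memory for keys that are written but never read again.
  sweep(now = Date.now()): void {
    for (const [key, entry] of this.entries) {
      if (entry.expiresAt <= now) this.entries.delete(key);
    }
  }

  get size(): number {
    return this.entries.size;
  }
}

const demo = new TtlSketch<string, string>();
demo.set('k', 'v', 60_000, 0);       // expires at t = 60_000
console.log(demo.get('k', 59_999));  // 'v' (still live)
console.log(demo.get('k', 60_000));  // undefined (lazily removed)

demo.set('never-read', 'v', 1_000, 0);
demo.sweep(5_000);                   // periodic sweep reclaims it
console.log(demo.size);              // 0
```

Passing `now` explicitly keeps the sketch testable without a real clock; a production cache would schedule `sweep` on a timer every `cleanupMs`.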
## Sharing across the system

```ts
import { CacheExtensionId } from 'actor-ts';

system.extension(CacheExtensionId).configure({
  defaultCache: new InMemoryCache({ maxEntries: 50_000 }),
});

// Reach via:
const cache = system.extension(CacheExtensionId).cache;
```

This gives you a system-wide cache that multiple consumers (HTTP middleware, projection actors, custom code) share. Useful when:
- You want one configured cache instance, not one per consumer.
- Cache stats accumulate across the system.
For per-route caches, instantiate `new InMemoryCache()` directly without going through the extension.
## When it’s wrong for production

Multi-pod deployments: each pod holds its own independent copy of the cache, and contents are lost whenever the process restarts. Use Redis or Memcached in those setups.

## Where to next
Section titled “Where to next”- Cache overview — the bigger picture.
- Redis cache — multi-pod alternative.
- Memcached cache — alternative.
- CachedSnapshotStore — one consumer of the cache abstraction.