Concepts

Caching

How systems remember expensive answers. Every caching decision is a tradeoff between freshness, memory, and complexity.

Fundamentals
What a cache is, why it exists, and where it sits in a system. The mental model before you touch any strategy.
Cache Hit · Cache Miss · Hit Rate · Cache Aside
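The cache-aside pattern named above is the simplest mental model: the application checks the cache, falls back to the source of truth on a miss, and fills the cache itself. A minimal dict-backed sketch (the `CacheAside` class and its backing store are illustrative, not a specific library):

```python
class CacheAside:
    """Cache-aside: the application owns both the cache lookup and the fallback read."""

    def __init__(self, backing_store):
        self.store = backing_store   # source of truth (a dict stands in for a DB)
        self.cache = {}
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self.cache:        # cache hit: serve from memory
            self.hits += 1
            return self.cache[key]
        self.misses += 1             # cache miss: go to the source of truth
        value = self.store[key]
        self.cache[key] = value      # populate so the next read hits
        return value

    def hit_rate(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0
```

Reading the same key twice gives one miss, then one hit, for a hit rate of 0.5 — the metric most cache dashboards track.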
Writing Strategies
Read-through, write-through, write-back, write-around. When data changes, these strategies decide which copy gets updated first.
Write-Through · Write-Back · Read-Through · Write-Around
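The core contrast — write-through versus write-back — can be sketched in a few lines. A write-through put updates cache and store together; a write-back put only marks the key dirty and defers the store write to a later flush. Class names and the dict-backed store are illustrative:

```python
class WriteThroughCache:
    """Write-through: every write updates the cache AND the store synchronously."""

    def __init__(self, store):
        self.store = store
        self.cache = {}

    def put(self, key, value):
        self.cache[key] = value
        self.store[key] = value       # store is never stale, at the cost of write latency


class WriteBackCache:
    """Write-back: writes land in the cache; dirty keys flush to the store later."""

    def __init__(self, store):
        self.store = store
        self.cache = {}
        self.dirty = set()

    def put(self, key, value):
        self.cache[key] = value
        self.dirty.add(key)           # store is stale until flush()

    def flush(self):
        for key in self.dirty:
            self.store[key] = self.cache[key]
        self.dirty.clear()
```

The tradeoff is visible in the code: write-back absorbs bursts of writes cheaply, but any data in `dirty` is lost if the cache node dies before `flush()`.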
Eviction Policies
LRU, LFU, FIFO, TTL. When the cache is full, eviction policy decides what gets thrown out — and getting it wrong is a performance cliff.
LRU · LFU · FIFO · TTL
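LRU is the policy worth internalizing first. A compact sketch using Python's `collections.OrderedDict`, which keeps insertion order and can move a key to the end in O(1):

```python
from collections import OrderedDict


class LRUCache:
    """Least-recently-used eviction: when full, drop the key untouched the longest."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()     # order = recency, oldest first

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)    # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used
```

The performance cliff mentioned above shows up when the working set is slightly larger than `capacity`: a sequential scan over `capacity + 1` keys evicts every entry just before it is needed again, driving the hit rate to zero.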
Population Strategies
Refresh-ahead and cache warming. How you pre-fill a cache so users never hit a cold miss on a hot path.
Refresh-Ahead · Cache Warming · Lazy Loading · Pre-Population
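Both ideas fit one sketch: `warm()` pre-populates keys before traffic arrives, and `get()` reloads an entry that is inside a refresh window just before its TTL, so hot keys never expire under a reader. All names and timings are illustrative, and a production version would do the refresh in a background task rather than inline:

```python
import time


class RefreshAheadCache:
    """Refresh-ahead with warming: reload hot keys shortly BEFORE they expire."""

    def __init__(self, loader, ttl=60.0, refresh_window=10.0, clock=time.monotonic):
        self.loader = loader                  # function key -> value (source of truth)
        self.ttl = ttl
        self.refresh_window = refresh_window  # how long before expiry we refresh
        self.clock = clock                    # injectable for testing
        self.data = {}                        # key -> (value, expires_at)

    def warm(self, keys):
        """Cache warming: pre-populate before traffic arrives, so no cold misses."""
        for key in keys:
            self._load(key)

    def get(self, key):
        now = self.clock()
        entry = self.data.get(key)
        if entry is None or entry[1] <= now:
            return self._load(key)            # cold or expired: lazy load (blocking)
        if entry[1] - now <= self.refresh_window:
            return self._load(key)            # near expiry: refresh ahead of time
        return entry[0]

    def _load(self, key):
        value = self.loader(key)
        self.data[key] = (value, self.clock() + self.ttl)
        return value
```

The injectable `clock` makes the time-dependent behavior testable without sleeping.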
Cache Invalidation
TTL, event-driven, write-through, versioning, stale-while-revalidate. The hardest problem in caching — keeping the cache honest without killing performance.
TTL · Event-Driven · Versioning · Stale-While-Revalidate
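Stale-while-revalidate is the strategy that best captures the "honest without killing performance" tradeoff: an expired-but-recent entry is served immediately and queued for refresh, instead of blocking the reader on a reload. A minimal sketch (names and TTLs are illustrative; a real system drains the refresh queue from a background worker):

```python
import time


class SWRCache:
    """Stale-while-revalidate: serve a stale entry now, refresh it out of band."""

    def __init__(self, loader, ttl=30.0, stale_ttl=300.0, clock=time.monotonic):
        self.loader = loader
        self.ttl = ttl                  # fresh period
        self.stale_ttl = stale_ttl      # how long a stale value may still be served
        self.clock = clock              # injectable for testing
        self.data = {}                  # key -> (value, fetched_at)
        self.pending_refresh = set()

    def get(self, key):
        now = self.clock()
        entry = self.data.get(key)
        if entry is None or now - entry[1] > self.stale_ttl:
            return self._refresh(key)          # missing or too old: must block
        if now - entry[1] > self.ttl:
            self.pending_refresh.add(key)      # stale: serve it, refresh later
        return entry[0]

    def revalidate(self):
        """Drain queued refreshes (a background worker would call this)."""
        for key in list(self.pending_refresh):
            self._refresh(key)
        self.pending_refresh.clear()

    def _refresh(self, key):
        value = self.loader(key)
        self.data[key] = (value, self.clock())
        return value
```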
Distributed Caching
Why a single cache node fails at scale. Consistent hashing, cache coherence, replication, two-level caching, and node failure handling.
Consistent Hashing · Cache Coherence · Replication · Two-Level Cache
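Consistent hashing is why losing one cache node doesn't invalidate the whole fleet: each node owns many points on a hash ring, a key maps to the next point clockwise, and removing a node remaps only the keys that node owned. A sketch with virtual nodes (the class and parameters are illustrative):

```python
import bisect
import hashlib


class ConsistentHashRing:
    """Hash ring with virtual nodes: removing a node remaps only its own keys."""

    def __init__(self, nodes=(), vnodes=100):
        self.vnodes = vnodes            # points per physical node (evens out load)
        self.ring = []                  # sorted list of (point, node)
        for node in nodes:
            self.add(node)

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add(self, node):
        for i in range(self.vnodes):
            bisect.insort(self.ring, (self._hash(f"{node}#{i}"), node))

    def remove(self, node):
        self.ring = [(p, n) for p, n in self.ring if n != node]

    def node_for(self, key):
        point = self._hash(key)
        idx = bisect.bisect(self.ring, (point, "")) % len(self.ring)  # clockwise walk
        return self.ring[idx][1]
```

With naive `hash(key) % num_nodes` placement, dropping one of three nodes remaps roughly two thirds of all keys; on the ring, only the keys the dead node owned move.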
Cache Problems
Stampede, cold start, penetration, avalanche. The failure modes that hit hardest when traffic spikes or the cache restarts.
Cache Stampede · Cold Start · Penetration · Avalanche
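The standard defense against a stampede is single-flight loading: when a hot key is missing, exactly one caller recomputes it while concurrent callers for the same key wait on a per-key lock, instead of all hammering the backing store at once. A minimal thread-based sketch (class and names are illustrative):

```python
import threading


class SingleFlightCache:
    """Stampede protection: at most one loader runs per key at a time."""

    def __init__(self, loader):
        self.loader = loader
        self.cache = {}
        self.locks = {}                     # key -> per-key lock
        self.guard = threading.Lock()       # protects the lock table itself

    def _lock_for(self, key):
        with self.guard:
            return self.locks.setdefault(key, threading.Lock())

    def get(self, key):
        if key in self.cache:
            return self.cache[key]
        with self._lock_for(key):           # only one loader per key
            if key in self.cache:           # another thread may have filled it
                return self.cache[key]
            value = self.loader(key)
            self.cache[key] = value
            return value
```

The double check inside the lock is essential: every waiter that queued behind the first loader finds the value already cached and returns without touching the store.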
Redis
Data structures, patterns, persistence, streams, and the single-threaded event loop. Redis internals that explain why it's fast and where it breaks.
Data Structures · Persistence · Streams · Event Loop