Data Fetching
Dec 05, 2023

Cache Ledgers: Deep Dive #87

Niko Rahman

State Modeling Specialist

Cache ledgers make caching a first-class part of your route story. Instead of assuming cache behavior is implicit (and therefore un-auditable), you treat it like any other contract:

  • the shell selects a cache lane,
  • fetch helpers apply it consistently,
  • and the UI renders evidence about what the cache did (or refused to do).

This is the natural continuation of the last arc (#81–#86). Now we need the thing that makes caches safe to live with: a ledger that turns caching into inspectable state.

A cache is only "helpful" when it is honest. Cache honesty means: every cache decision is written into a ledger and rendered into the DOM. If you can't test it by reading the UI, you can't trust it during migrations.

What a cache ledger is (and isn’t)

A cache ledger is a route-local document that records cache behavior:

  • whether a read was served from cache or bypassed,
  • whether a cache entry was considered stale,
  • whether a dedupe happened (and how many consumers joined),
  • and what the route decided to do when cache truth disagreed with network truth.

A cache ledger is not:

  • a global cache implementation,
  • a proxy layer,
  • or a metrics pipeline.

It exists to keep route behavior stable across both file-router and component-router motion.

Cache vocabulary (render-visible keys)

To keep everything auditable, use a stable vocabulary in receipts and DOM:

  • cacheLaneId: ghost | sticky | bypass
  • cacheResult: hit | miss | stale | bypassed
  • cacheKey: a stable key, e.g. warehouse.orders:list
  • entryAgeMs: how old the entry was when read
  • ttlMs: the lane’s effective ttl
  • joinCount: how many consumers joined a shared entry
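Assuming these six keys are the whole render-visible surface, the vocabulary can be pinned down as a type so receipts can't drift (the type names here are mine, not part of the series):

```typescript
// Sketch only: a type that pins the vocabulary above.
// The union members mirror the value lists in this section.
type CacheLaneId = "ghost" | "sticky" | "bypass";
type CacheResult = "hit" | "miss" | "stale" | "bypassed";

interface CacheVocabulary {
  cacheLaneId: CacheLaneId;
  cacheResult: CacheResult;
  cacheKey: string;   // stable key, e.g. "warehouse.orders:list"
  entryAgeMs: number; // age of the entry when it was read
  ttlMs: number;      // the lane's effective TTL
  joinCount: number;  // consumers sharing one entry
}

// Example payload using the vocabulary:
const example: CacheVocabulary = {
  cacheLaneId: "sticky",
  cacheResult: "hit",
  cacheKey: "warehouse.orders:list",
  entryAgeMs: 1200,
  ttlMs: 30000,
  joinCount: 3,
};
console.log(example.cacheResult); // "hit"
```

Typing the vocabulary up front means a typo like `"stalee"` fails at compile time instead of silently producing an unmatchable DOM selector in tests.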

Worked Example: a cache-ledgered /warehouse surface

Continuing the file:/warehouse route story from #81-#86, we’ll make caching explicit.

We will:

  1. define cache receipts,
  2. implement a cache ledger with bounded history,
  3. add a cache-aware fetch helper that emits receipts,
  4. render ledger evidence into the shell,
  5. show a route-flow test posture.

Step 1: Cache receipt schema (small, stable, render-safe)

export function createCacheReceipt(input: {
  cacheLaneId: string;
  cacheKey: string;
  cacheResult: string;
  entryAgeMs: number;
  ttlMs: number;
  joinCount: number;
  laneId: string;
  laneKey: string;
  requestKey: string;
  url: string;
}) {
  return {
    id: `${input.laneKey}:cache:${input.cacheKey}`,
    at: Date.now(),
    ...input,
  };
}

The receipt id is a contract. Keep it stable.
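To see why the id is a contract: it derives only from `laneKey` and `cacheKey`, so repeated reads of the same entry produce the same id, and evidence keyed by it updates in place instead of multiplying rows. A self-contained sketch (with a minimal copy of the factory so it runs alone):

```typescript
// Minimal copy of createCacheReceipt from Step 1, so this snippet runs alone.
function createCacheReceipt(input: {
  laneKey: string;
  cacheKey: string;
  [k: string]: unknown;
}) {
  return { id: `${input.laneKey}:cache:${input.cacheKey}`, at: Date.now(), ...input };
}

// Same lane + same cache key => same id, even as the result changes.
const first = createCacheReceipt({
  laneKey: "warehouse:orders",
  cacheKey: "warehouse.orders:list",
  cacheResult: "miss",
});
const second = createCacheReceipt({
  laneKey: "warehouse:orders",
  cacheKey: "warehouse.orders:list",
  cacheResult: "hit",
});

console.log(first.id); // "warehouse:orders:cache:warehouse.orders:list"
console.log(first.id === second.id); // true
```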

Step 2: Cache ledger hook (bounded evidence)

import { useLiveState } from "react";

export function useCacheLedger(namespace: string) {
  return useLiveState(namespace, {
    cacheLaneId: "ghost",
    receipts: [],
    cap: 80,
    last: null,
  });
}

export function pushCacheReceipt(ledger: any, receipt: any) {
  const cap = ledger.read("cap");
  const next = [...ledger.read("receipts"), receipt].slice(-cap);
  ledger.write("receipts", next);
  ledger.write("last", receipt);
}
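The bounding behavior is easy to verify in isolation with a stub ledger (the stub is mine; it only mimics the `read`/`write` surface the hook uses):

```typescript
// Stub ledger mimicking the read/write surface pushCacheReceipt needs.
function makeStubLedger(cap: number) {
  const state: Record<string, unknown> = { cap, receipts: [], last: null };
  return {
    read: (key: string) => state[key],
    write: (key: string, value: unknown) => { state[key] = value; },
  };
}

function pushCacheReceipt(ledger: any, receipt: any) {
  const cap = ledger.read("cap");
  const next = [...ledger.read("receipts"), receipt].slice(-cap);
  ledger.write("receipts", next);
  ledger.write("last", receipt);
}

const ledger = makeStubLedger(3);
for (let i = 0; i < 5; i++) pushCacheReceipt(ledger, { id: `r${i}` });

// slice(-3) keeps only the newest three receipts.
console.log((ledger.read("receipts") as any[]).map((r) => r.id)); // [ "r2", "r3", "r4" ]
```

The `slice(-cap)` is what keeps the evidence bounded: the ledger always holds the newest `cap` receipts, so the DOM panel can't grow without limit on long-lived routes.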

Step 3: Cache-aware render fetch helper (cache lane applied + receipt emitted)

This helper is intentionally pragmatic: it assumes caches can lie, so it records the lie.

import { useRenderFetch, useCacheBypass, useRenderEffect } from "react";
import { CacheLanes } from "../cache-lanes";
import { createCacheReceipt, pushCacheReceipt } from "./cache-ledger";

export function useCacheLedgerFetch(opts: {
  lane: any;
  cacheLaneId: string;
  cacheLedger: any;
  requestKey: string;
  cacheKey: string;
  url: string;
}) {
  const { lane, cacheLaneId, cacheLedger, requestKey, cacheKey, url } = opts;
  const lanePolicy = CacheLanes[cacheLaneId];

  // Invented-but-consistent cache metadata: we treat fetch results as "annotated" surfaces.
  // Assumption: cacheLaneId is stable for this component's lifetime, so the
  // hook branch below never flips between renders.
  const result =
    lanePolicy.cache === "bypass"
      ? useCacheBypass(url, { laneId: lane.laneId, laneKey: lane.laneKey })
      : useRenderFetch(url, { cache: lanePolicy.cache, laneId: lane.laneId, laneKey: lane.laneKey });

  useRenderEffect(() => {
    const cacheResult =
      lanePolicy.cache === "bypass"
        ? "bypassed"
        : result?.cacheHit
          ? "hit"
          : result?.cacheStale
            ? "stale"
            : "miss";

    const receipt = createCacheReceipt({
      cacheLaneId,
      cacheKey,
      cacheResult,
      entryAgeMs: Number(result?.cacheAgeMs ?? 0),
      ttlMs: Number(lanePolicy.ttlMs ?? 0),
      joinCount: Number(result?.cacheJoinCount ?? 1),
      laneId: lane.laneId,
      laneKey: lane.laneKey,
      requestKey,
      url,
    });

    cacheLedger.write("cacheLaneId", cacheLaneId);
    pushCacheReceipt(cacheLedger, receipt);
  });

  return result;
}

The invented cacheHit/cacheStale/cacheAgeMs/cacheJoinCount metadata is intentional: it keeps cache behavior render-visible without forcing you to teach every component a cache implementation detail.
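The helper also leans on a `CacheLanes` table from `../cache-lanes` that this entry never shows. A minimal sketch consistent with the lane ids and the two fields the helper reads (`cache` and `ttlMs`); the exact shape and TTL values are my assumptions, not the series' canonical table:

```typescript
// Assumed shape of the CacheLanes policy table from "../cache-lanes".
// Only the fields useCacheLedgerFetch reads are sketched here.
type CacheLanePolicy = {
  cache: "ghost" | "sticky" | "bypass"; // mode handed to the fetch layer
  ttlMs: number;                        // effective TTL recorded in receipts
};

const CacheLanes: Record<string, CacheLanePolicy> = {
  ghost: { cache: "ghost", ttlMs: 5_000 },    // short-lived, refresh-friendly
  sticky: { cache: "sticky", ttlMs: 60_000 }, // long-lived shared entries
  bypass: { cache: "bypass", ttlMs: 0 },      // never read from cache
};

console.log(CacheLanes["bypass"].cache); // "bypass"
```

Whatever the real table looks like, keeping `ttlMs` on the policy (rather than on each call site) is what lets the receipt report the lane's effective TTL without every track knowing it.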

Step 4: Shell panel (cache evidence as UI)

export function CacheLedgerPanel({ cacheLedger }: any) {
  const lane = cacheLedger.read("cacheLaneId");
  const receipts = cacheLedger.read("receipts");

  return (
    <aside data-ledger="cache" data-cache-lane={lane}>
      <header>
        <strong>Cache Ledger</strong>{" "}
        <span className="text-sm text-[color:var(--docs-muted)]">
          {receipts.length} receipts
        </span>
      </header>

      <ol>
        {receipts.map((r: any) => (
          <li key={r.id} data-cache-receipt={r.id} data-result={r.cacheResult}>
            <strong>{r.cacheKey}</strong>{" "}
            <span className="text-sm text-[color:var(--docs-muted)]">
              {r.cacheResult} age:{r.entryAgeMs}ms ttl:{r.ttlMs}ms joins:{r.joinCount}
            </span>
          </li>
        ))}
      </ol>

      <pre data-cache-ledger-json>{JSON.stringify(cacheLedger.read(), null, 2)}</pre>
    </aside>
  );
}

Step 5: Using the helper in a track (orders list)

import { useCacheLedgerFetch } from "./useCacheLedgerFetch";

export default function OrdersTrack({ lane, cacheLaneId, cacheLedger }: any) {
  const orders = useCacheLedgerFetch({
    lane,
    cacheLaneId,
    cacheLedger,
    requestKey: "orders:list",
    cacheKey: "warehouse.orders:list",
    url: "/api/orders?limit=50",
  });

  return (
    <section data-track="orders" data-cache-lane={cacheLaneId}>
      <h2>Orders</h2>
      <ul>
        {(orders?.items ?? []).map((o: any) => (
          <li key={o.id} data-order={o.id}>
            {o.number}
          </li>
        ))}
      </ul>
    </section>
  );
}

Step 6: Route-flow testing posture (assert on evidence)

test("orders emits cache receipts", async () => {
  const app = await mountRoute("/warehouse");
  await app.click("button", { text: "orders" });

  await app.expect("[data-ledger='cache']").toExist();
  await app.expect("[data-result]").toExist();
  await app.expect("[data-cache-receipt*='warehouse.orders:list']").toExist();
});

You’re testing the route story, not the cache internals.

Checklist

  • Cache lane is stored as route state and rendered as a data attribute.
  • Cache reads emit receipts with stable cache keys.
  • Ledger history is bounded.
  • Evidence is rendered into the DOM.
  • Tests assert against rendered receipts.