Most static sites are fast by architecture and slow by policy.

That sounds backwards until you inspect the actual headers going to the browser and the edge. A site can be built with Astro, shipped as static HTML, deployed on Cloudflare Pages, and still leave a surprising amount of performance on the table if its cache policy is vague, inconsistent, or copied from a generic starter.

This is where the operational details start to matter.

What Cloudflare’s own docs make clear

Cloudflare’s official Pages headers documentation states that a _headers file in your static asset directory is parsed and applied to static asset responses, not served as an asset itself. The same docs also warn that _headers rules do not apply to responses generated by Pages Functions or SSR responses, where you must attach headers in function code instead.

Separately, Cloudflare’s Cache docs explain that Origin Cache Control is enabled by default on Free, Pro, and Business plans, and that Cloudflare follows Cache-Control directives from the origin unless you deliberately override them with Cache Rules.

That gives you the right mental model:

  • _headers governs static Pages responses
  • Cache-Control still matters deeply
  • dynamic responses must set headers in code, not by wishful thinking
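
That last point deserves a concrete sketch. For a response generated by a Pages Function, _headers is ignored, so the cache policy has to travel with the Response object itself. A minimal illustration — the route, payload, and directive values here are hypothetical, not taken from Cloudflare's docs:

```typescript
// functions/api/latest.ts -- hypothetical Pages Function route.
// _headers rules do not apply to this response, so the cache
// policy must be attached where the Response is constructed.
export const onRequest = async (): Promise<Response> => {
  const body = JSON.stringify({ updated: new Date().toISOString() });
  return new Response(body, {
    headers: {
      "Content-Type": "application/json",
      // Browsers revalidate every time; the edge may hold it briefly.
      "Cache-Control": "public, max-age=0, s-maxage=60",
    },
  });
};
```

The exact directive string is a placeholder; the point is only that dynamic responses carry their policy in code, not in _headers.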

Editorial sites need three cache tiers

A serious publication usually wants three distinct behaviors:

  1. fingerprinted assets should be cached aggressively
  2. HTML should be fresh enough to reflect new publishing quickly
  3. search assets should be stable but replaceable on rebuild

If those tiers collapse into one default, you either get stale content or wasted performance.

Tier 1: Immutable assets should be boring

If a CSS, JS, font, or image asset is fingerprinted by the build, treat it like it will never change under the same URL:

    /_astro/*
      Cache-Control: public, max-age=31556952, immutable

    /fonts/*
      Cache-Control: public, max-age=31556952, immutable

    /og/covers/*
      Cache-Control: public, max-age=31556952, immutable

This is the easiest performance win on a static site because the browser and the edge can both relax. The asset URL changes when the file changes. That is exactly what immutable caching is for.

Tier 2: HTML should be short-lived and honest

HTML is where editorial sites change most frequently. If you cache it like an asset bundle, publishing starts to feel laggy and mysterious.

For a fast-moving publication, I prefer conservative HTML caching:

    /en/*
      Cache-Control: public, max-age=0, must-revalidate

    /ar/*
      Cache-Control: public, max-age=0, must-revalidate

This keeps browsers honest while still allowing Cloudflare to participate in revalidation rather than blindly serving stale markup forever.
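
One way to internalize what max-age=0, must-revalidate buys you is to model the freshness check a cache performs. The helpers below are a toy sketch, not any real Cloudflare API; they only show that this policy turns every reuse into a conditional request rather than a blind cache hit:

```typescript
// Toy model of a cache's freshness decision. parseCacheControl and
// isFresh are illustrative helpers, not part of any real library.
function parseCacheControl(header: string): Map<string, string | true> {
  const directives = new Map<string, string | true>();
  for (const part of header.split(",")) {
    const [name, value] = part.trim().split("=");
    directives.set(name.toLowerCase(), value ?? true);
  }
  return directives;
}

// A stored response may be reused silently only while its age is
// below max-age. With "max-age=0, must-revalidate" that window is
// zero, so every reuse triggers revalidation (If-None-Match), and a
// 304 keeps the transfer cheap while the markup stays honest.
function isFresh(header: string, ageSeconds: number): boolean {
  const maxAge = Number(parseCacheControl(header).get("max-age") ?? 0);
  return ageSeconds < maxAge;
}
```

Here `isFresh("public, max-age=0, must-revalidate", 0)` is false: even markup cached a moment ago must be revalidated before reuse, which is exactly the behavior an editorial site wants for HTML.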

Tier 3: Search assets deserve explicit handling

Pagefind output is static, but it behaves differently from decorative assets. It is part of the site’s editorial surface. If a major content update goes live, the search index needs to reflect it without ambiguity.

That makes this a strong compromise:

    /pagefind/*
      Cache-Control: public, max-age=3600, stale-while-revalidate=60

The point is not perfection. The point is predictable search freshness without sacrificing cache efficiency on every request.

Revalidation is where people get sloppy

Cloudflare’s docs around cache revalidation and request collapsing are useful because they remind you that freshness does not have to mean constant origin pressure. stale-while-revalidate and stale-if-error are not just CDN trivia. They are reliability tools.

For editorial search or lightweight indexes, these directives are especially useful:

  • stale-while-revalidate smooths freshness during rebuild cycles
  • stale-if-error gives you a grace window when origin revalidation fails
  • s-maxage lets shared caches like Cloudflare use a different freshness lifetime than browsers when needed
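
Put together for the Pagefind tier, those directives might look like this; the specific numbers are illustrative, not a recommendation from Cloudflare's docs:

    /pagefind/*
      Cache-Control: public, max-age=3600, s-maxage=600, stale-while-revalidate=60, stale-if-error=86400

Read literally: browsers hold the index for an hour, the edge refreshes it every ten minutes, brief staleness is tolerated during revalidation, and stale-if-error gives a day of grace if origin revalidation fails.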

Where teams still get caught

The most common mistakes I see are:

  • caching HTML and static assets the same way
  • assuming _headers also covers dynamic Pages Function responses
  • forgetting that search artifacts are part of the product surface
  • overriding origin intent without documenting why
  • shipping aggressive caching before verifying build output is actually fingerprinted

These are not dramatic errors, but they make performance harder to reason about, which is why they persist.

The playbook I would run on a content-heavy site

If I were launching a bilingual publication on Cloudflare Pages today, I would keep the policy simple:

  • immutable caching for fingerprinted assets and covers
  • revalidated caching for HTML
  • bounded freshness for Pagefind artifacts
  • function-level headers for any future SSR or API behavior
  • periodic header inspection after each deployment
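
The last bullet can be partially automated. A small check script can compare each tier's live Cache-Control value against what the policy says it should carry; the tiers and directive lists below are this article's own examples, and missingDirectives is a hypothetical helper:

```typescript
// Hypothetical post-deploy drift check for the three cache tiers.
const EXPECTED: Record<string, string[]> = {
  "/_astro/": ["max-age=31556952", "immutable"],
  "/en/": ["max-age=0", "must-revalidate"],
  "/ar/": ["max-age=0", "must-revalidate"],
  "/pagefind/": ["max-age=3600", "stale-while-revalidate=60"],
};

// Returns the expected directives that a live Cache-Control value
// fails to carry for a given path, or [] if the tier looks right.
function missingDirectives(path: string, cacheControl: string): string[] {
  const tier = Object.keys(EXPECTED).find((prefix) => path.startsWith(prefix));
  if (!tier) return [];
  const actual = cacheControl.split(",").map((d) => d.trim());
  return EXPECTED[tier].filter((d) => !actual.includes(d));
}

// After each deployment, something like:
//   const res = await fetch("https://your-site.example/en/");
//   console.log(missingDirectives("/en/", res.headers.get("Cache-Control") ?? ""));
```

Running this against a handful of representative URLs after each deploy catches the quiet regressions, like a rebuild that stops fingerprinting assets, before anyone notices them as slowness.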

That is enough to keep the site fast, debuggable, and easy to explain to the whole team.

Final view

Cache strategy is one of those areas where professionalism looks almost boring. There is no magic trick here, just clean intent expressed with headers that match how the site actually behaves.

That is exactly why it matters. On a serious editorial platform, boring cache policy is often the difference between “fast in theory” and “fast every day.”