
Field note

Why Cloudflare 1020 can block valid traffic and what to check first

April 8, 2026 · 2 min read

Cloudflare 1020 is usually a rules problem, not a mysterious SEO penalty. The failure path tends to sit in WAF logic, bot handling, rate limits, or environment mismatches.

Cloudflare error 1020 ("Access Denied") means the request matched a firewall rule that blocked it at the edge, before it ever reached your origin. That sounds obvious, but teams still waste time treating it like a vague platform issue instead of tracing the exact rule path.

Identify the blocked path first

Before changing anything, isolate:

  • which URL or path is failing
  • whether the failure is global or country-specific
  • whether it hits users, bots, logged-in staff, or everyone
  • when the error started

If the answer is "only some traffic, only some paths, after a recent rules change," that is already pointing you away from origin problems and toward the edge.
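One quick way to see that pattern is to group the blocked requests along each of those dimensions. A minimal sketch, assuming you have exported firewall events into simple records (the field names and sample entries here are hypothetical):

```python
from collections import Counter

# Hypothetical sample of blocked-request records. Field names loosely
# mirror a firewall-events export, but the data is made up.
blocked = [
    {"path": "/api/v1/items", "country": "DE", "client": "crawler"},
    {"path": "/api/v1/items", "country": "DE", "client": "browser"},
    {"path": "/checkout",     "country": "US", "client": "browser"},
    {"path": "/api/v1/items", "country": "DE", "client": "monitor"},
]

# Count blocks along each dimension to see whether the failure is
# path-specific, country-specific, or client-specific.
for key in ("path", "country", "client"):
    counts = Counter(entry[key] for entry in blocked)
    print(key, counts.most_common())
```

If one path or one country dominates the counts, you already have a strong hint about which rule family to inspect first.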

The common causes are not exotic

Most valid-traffic 1020 incidents come from a short list:

  • a custom WAF rule that is too broad
  • bot or threat score rules catching good requests
  • rate limits firing on normal behavior
  • country or ASN rules applied too aggressively
  • preview, staging, or admin patterns copied into production rules

This is why the first useful move is comparing the blocked request against the rule logic, not clearing caches at random.

Check crawler and real-user paths separately

Sometimes the site "works" for normal browsing but fails for a subset of crawlers, monitoring tools, or logged-out requests. That matters because Cloudflare can create an SEO problem without the browser path looking obviously broken.

Test the live edge behavior on the exact path that matters:

  • the public page request
  • any rendered assets it depends on
  • any redirects in front of it

The wrong rule on one step can make the final issue look bigger or stranger than it really is.
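The walk itself is mechanical: follow the redirect chain, then check each dependent asset, and record where the block appears. A sketch with a stubbed edge (the URL table below is invented; Cloudflare serves error 1020 as an HTTP 403):

```python
# Stubbed edge responses: url -> (status, redirect location or None).
# These entries are hypothetical; in practice each lookup would be a
# real request against the live edge.
EDGE = {
    "http://example.com/pricing":  (301, "https://example.com/pricing"),
    "https://example.com/pricing": (200, None),
    "https://example.com/app.js":  (403, None),  # blocked by a rule
}

def check_chain(url, assets=()):
    """Follow redirects for `url`, then check each dependent asset,
    returning every step that comes back blocked (403)."""
    failures = []
    while url:
        status, location = EDGE[url]
        if status == 403:
            failures.append(url)
        url = location
    for asset in assets:
        status, _ = EDGE[asset]
        if status == 403:
            failures.append(asset)
    return failures

print(check_chain("http://example.com/pricing",
                  assets=["https://example.com/app.js"]))
```

Here the page itself passes but one asset is blocked, which is exactly the shape of failure that looks fine in a quick browser check.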

Validate the final rule set

After changing rules, confirm the outcome from the same request path that was failing before. The goal is not "fewer errors in the dashboard." The goal is a working configuration that allows valid traffic, allows the required bots or crawlers, and still keeps the intended protection in place.
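That validation can be scripted as a small regression check: replay the request shapes that were failing, plus a few that must stay blocked, and assert the decisions match intent. A sketch where `decide` is a hypothetical stand-in for however you read the edge's allow/block outcome:

```python
def decide(request):
    # Hypothetical stand-in for the updated rule set: block only
    # requests from a known-bad ASN.
    return "block" if request["asn"] in {64496} else "allow"

# Representative cases: the traffic that was wrongly blocked before,
# a required crawler, and traffic that should remain blocked.
expectations = [
    ({"asn": 13335, "desc": "previously blocked real user"}, "allow"),
    ({"asn": 15169, "desc": "required crawler"}, "allow"),
    ({"asn": 64496, "desc": "known-bad ASN"}, "block"),
]

for request, expected in expectations:
    actual = decide(request)
    assert actual == expected, f"{request['desc']}: got {actual}"
print("rule set behaves as intended")
```

Running this after every rules change keeps "we fixed it" tied to the actual failing path rather than to a quieter dashboard.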