
Critical issue

Noindex regressions on important pages

Important pages can fall out of search when a deploy leaves a noindex directive on URLs that should remain indexable.

Why it matters

A noindex directive is an explicit instruction not to show a page in search results. When it lands on revenue, signup, or content pages by accident, those pages can drop out of the index entirely even though they still load normally for users.

Common signals

  • A live page contains a noindex robots meta tag or X-Robots-Tag header.
  • Search traffic drops after a release while the affected page still returns HTTP 200.
  • Template-level robots settings differ between staging and production builds.
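The first signal above can be checked mechanically: a page is noindexed if either its HTML carries a robots meta tag containing "noindex" or the response includes an X-Robots-Tag header with that directive. A minimal stdlib-only sketch (the function name and the headers-dict shape are illustrative, not any particular tool's API):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content values of <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                self.directives.append(a.get("content", "").lower())

def has_noindex(html: str, headers: dict) -> bool:
    """True if the page carries a noindex directive in either the
    robots meta tag or the X-Robots-Tag response header."""
    parser = RobotsMetaParser()
    parser.feed(html)
    if any("noindex" in d for d in parser.directives):
        return True
    return "noindex" in headers.get("X-Robots-Tag", "").lower()
```

Note that both sources must be checked: a page with a clean HTML template can still be blocked by a server-level X-Robots-Tag header, which is easy to miss in a visual inspection.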

Stable fix pattern

  1. Remove noindex from production templates that serve public, indexable pages.
  2. Keep noindex only on intentionally private, duplicate, or low-value surfaces.
  3. Re-crawl priority pages after deploys to confirm the directive is gone.
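Step 3 above can be sketched as a post-deploy check that re-fetches each priority URL and reports any that answer HTTP 200 but still carry a noindex signal. The fetch callable is injected so the check runs without a live network; its (status, headers, body) signature, the regex, and the URLs are assumptions for illustration, not a real crawler's interface:

```python
import re

# Crude meta-tag pattern; assumes name= precedes content= (a sketch, not a full parser).
META_NOINDEX = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def verify_indexable(urls, fetch):
    """Re-crawl priority URLs after a deploy and return those that still
    carry a noindex directive despite returning HTTP 200.

    fetch(url) is assumed to return (status_code, headers_dict, body_text).
    """
    regressions = []
    for url in urls:
        status, headers, body = fetch(url)
        header_blocked = "noindex" in headers.get("X-Robots-Tag", "").lower()
        meta_blocked = META_NOINDEX.search(body) is not None
        if status == 200 and (header_blocked or meta_blocked):
            regressions.append(url)
    return regressions
```

A stub fetch makes this usable in a deploy pipeline test before pointing it at production:

```python
def fake_fetch(url):
    pages = {
        "https://example.com/pricing": (200, {}, '<meta name="robots" content="noindex">'),
        "https://example.com/signup": (200, {"X-Robots-Tag": "noindex"}, "<html></html>"),
        "https://example.com/home": (200, {}, "<html></html>"),
    }
    return pages[url]
```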

How SitePulse will monitor this in v1

  • Detects noindex directives on configured priority URLs.
  • Flags changed indexability state between scan runs.
  • Links the affected URL to approved remediation guidance instead of a generic health score.
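Flagging a changed indexability state between scan runs amounts to diffing per-URL results from consecutive scans. A minimal sketch, assuming each run is stored as a mapping of URL to an indexable-or-not flag (the dict shape and the "regressed"/"recovered" labels are illustrative, not SitePulse's actual data model):

```python
def indexability_changes(previous, current):
    """Compare two scan runs ({url: True if indexable}) and flag URLs
    whose state changed: "regressed" means a page became noindexed,
    "recovered" means a noindex directive was removed."""
    changes = {}
    for url, state in current.items():
        before = previous.get(url)
        if before is not None and before != state:
            changes[url] = "regressed" if before and not state else "recovered"
    return changes
```

URLs that appear for the first time in the current run are skipped rather than flagged, since there is no prior state to compare against.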