No crawl baseline before patching
Without a rendered crawl snapshot, you cannot prove which patch fixed the issue.
Symptom
You can still ship markup and page fixes, but without a clean baseline you cannot prove which change produced the result.
Exact cause
SurfaceOps detected this state because StoreSteady does not yet have a crawl-derived markup baseline for your catalog. Until the storefront pages are crawlable and observed, markup-side validation and before/after proof are limited.
This is an internal evidence-collection state: valuable in the workflow, but too thin to stand alone as a broad public SEO page, so it should remain available without being treated as index-first.
Which system wins
This is a workflow and evidence problem, not a Merchant Center disapproval. Crawl baselines are what let StoreSteady prove markup-side changes later.
Correct edit point
Fix the scanner or crawl workflow first so a clean pre-change snapshot exists before the next patch cycle.
Manual fix steps
- Make sure your storefront product pages are publicly accessible to crawlers.
- Run another sync after the storefront is reachable and product pages return real HTML.
- If the store is password-protected, use a public storefront for crawl validation.
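The reachability check above can be sketched in a few lines. This is a minimal illustration, not part of StoreSteady itself: the function names, the user-agent string, and the password-wall heuristic are all assumptions for the example.

```python
import urllib.request


def looks_like_real_html(status: int, content_type: str, body: str) -> bool:
    """Heuristic: did the page return genuine product HTML rather than
    an error page, an empty shell, or a password wall?"""
    if status != 200 or "text/html" not in content_type:
        return False
    lowered = body.lower()
    # A password-protected storefront typically serves a login form
    # (with a password input) instead of product markup.
    return "<body" in lowered and 'type="password"' not in lowered


def check_page(url: str) -> bool:
    """Fetch a storefront URL with a generic crawler UA and apply the heuristic."""
    req = urllib.request.Request(url, headers={"User-Agent": "baseline-check/1.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read(65536).decode("utf-8", errors="replace")
        return looks_like_real_html(
            resp.status, resp.headers.get("Content-Type", ""), body
        )
```

Running `check_page` against a handful of product URLs before the next sync is a quick way to confirm the pages return real HTML to an anonymous client, not just to a logged-in browser session.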
Validation steps
- Capture a fresh crawl of the affected product or policy pages before making the next markup change.
- Store the rendered HTML and extracted structured data so the before-state is preserved.
- After patching, compare the new crawl against the saved baseline to confirm what actually changed.
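The three validation steps above can be sketched as a small capture-and-compare routine. This is an illustrative sketch, not StoreSteady's implementation: the file layout, the function names, and the regex-based JSON-LD extraction are assumptions made for the example.

```python
import hashlib
import json
import pathlib
import re

# Matches embedded JSON-LD blocks; a real crawler would use a proper
# HTML parser, but a regex is enough to illustrate the workflow.
JSONLD_RE = re.compile(
    r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)


def extract_structured_data(html: str) -> list:
    """Pull every parseable JSON-LD block out of a rendered page."""
    blocks = []
    for match in JSONLD_RE.findall(html):
        try:
            blocks.append(json.loads(match))
        except json.JSONDecodeError:
            pass  # skip broken blocks; a broken block is itself a finding
    return blocks


def save_baseline(page_id: str, html: str, directory: str = "baselines") -> None:
    """Preserve the before-state: raw HTML plus extracted structured data."""
    out = pathlib.Path(directory)
    out.mkdir(parents=True, exist_ok=True)
    (out / f"{page_id}.html").write_text(html, encoding="utf-8")
    (out / f"{page_id}.jsonld.json").write_text(
        json.dumps(extract_structured_data(html), indent=2), encoding="utf-8"
    )


def diff_against_baseline(page_id: str, new_html: str, directory: str = "baselines") -> dict:
    """Compare a post-patch crawl with the saved baseline."""
    out = pathlib.Path(directory)
    old_html = (out / f"{page_id}.html").read_text(encoding="utf-8")
    old_sd = json.loads((out / f"{page_id}.jsonld.json").read_text(encoding="utf-8"))
    return {
        "html_changed": hashlib.sha256(old_html.encode()).hexdigest()
        != hashlib.sha256(new_html.encode()).hexdigest(),
        "structured_data_changed": old_sd != extract_structured_data(new_html),
    }
```

Calling `save_baseline` before the patch and `diff_against_baseline` after it gives you a yes/no answer on whether the rendered HTML or the structured data actually changed, which is exactly the before/after proof this state is about.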
Expected resolution time
Once product pages are publicly reachable, StoreSteady can capture crawl baselines on the next sync.
Scan your store for this exact issue
Before you connect the app, check whether your storefront shows the same public-signal problem described in this article. Start with the no-crawl-baseline check.