After a migration, "Crawled - currently not indexed" usually means Google found the URLs, spent crawl budget on them, and then decided the technical picture was not strong enough to index them. That is different from a pure discovery problem.
Start with the migration delta
Do not begin with a giant audit. Start with the exact delta between the old site and the new one:
- which templates changed
- which URLs changed
- which canonicals changed
- which sections lost links
- which sitemap entries are new, stale, or duplicated
If the affected URLs cluster around one template or one section, the fix is usually there.
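One way to get that delta quickly is to diff the old and new XML sitemaps. A minimal sketch, assuming standard sitemap files (the example URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace; <loc> elements live under it.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs in one sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(NS + "loc")}

def sitemap_delta(old_xml, new_xml):
    """Diff two sitemaps so the audit starts from the exact URL delta."""
    old, new = sitemap_urls(old_xml), sitemap_urls(new_xml)
    return {
        "added": sorted(new - old),    # new or renamed URLs
        "removed": sorted(old - new),  # URLs that should now redirect
        "kept": sorted(old & new),
    }
```

Grouping the "added" URLs by path prefix usually reveals whether the affected cluster maps to one template or section.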
Check template signals before content assumptions
Teams often jump to "maybe the pages are thin" too early. After a migration, the more common failures are structural:
- canonical tags still pointing to old paths
- self-referencing canonicals missing on new templates
- duplicate parameter or faceted paths leaking into the crawl set
- internal links still pointing to redirected URLs
- XML sitemaps listing URLs that are technically valid but weak, duplicate, or low-priority
When those signals conflict, coverage states drift even if the page copy is acceptable.
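The canonical and noindex checks in that list can be scripted against a few rendered pages per template. A minimal sketch using only the standard library (the function names are illustrative, not from any particular tool):

```python
from html.parser import HTMLParser

class TemplateSignals(HTMLParser):
    """Pull the canonical link and robots meta out of one rendered page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")
        elif tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.noindex = "noindex" in (a.get("content") or "").lower()

def template_issues(html, page_url):
    """Flag the structural conflicts that commonly follow a migration."""
    p = TemplateSignals()
    p.feed(html)
    issues = []
    if p.canonical is None:
        issues.append("missing self-referencing canonical")
    elif p.canonical != page_url:
        issues.append("canonical points elsewhere: " + p.canonical)
    if p.noindex:
        issues.append("robots meta contains noindex")
    return issues
```

Running this over one sample URL per template surfaces canonicals still pointing at old paths much faster than a full-site crawl.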
Fix discovery and prioritization together
Google does not judge pages in isolation. If the internal linking path is weak and the sitemap is noisy, important URLs compete with garbage. That is how migrations quietly create crawl waste.
The fastest first pass is usually:
- remove bad or duplicate URLs from sitemaps
- fix canonical and noindex conflicts on templates
- tighten internal links to the affected sections
- make sure redirected legacy paths are not still being promoted
That sequence does more than a round of manual recrawl requests with no structural change.
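The first pass above can be sketched as two small checks, assuming you already have a redirect map and a URL-to-canonical map from a crawl (the data shapes here are assumptions, not a specific crawler's output format):

```python
def clean_sitemap(entries, redirect_map, canonical_of):
    """Split sitemap entries into keepers and URLs that should not be promoted."""
    keep, drop = [], []
    for url in entries:
        if url in redirect_map:
            drop.append((url, "redirects to " + redirect_map[url]))
        elif canonical_of.get(url, url) != url:
            drop.append((url, "canonicalizes to " + canonical_of[url]))
        else:
            keep.append(url)
    return keep, drop

def stale_internal_links(links, redirect_map):
    """Internal links still pointing at redirected legacy paths."""
    return {href: redirect_map[href] for href in links if href in redirect_map}
```

The `drop` list is the sitemap noise to remove; the stale-link map is the internal-linking work, since each entry should link straight to its redirect target instead.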
Validate the recovery in the right order
Verification should stay practical:
- inspect the affected templates
- confirm the live canonical, robots, and linking state
- verify sitemap cleanup
- watch the coverage mix in GSC over the next crawl cycle
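The per-URL part of that checklist can be folded into one gate. A minimal sketch, assuming `state` holds whatever your crawler or URL-inspection tooling reports (the field names are hypothetical):

```python
def verify_url_state(url, state):
    """Run the practical checks above against one URL's observed live state."""
    problems = []
    if state.get("status") != 200:
        problems.append("final status is %s, not 200" % state.get("status"))
    if state.get("canonical") != url:
        problems.append("canonical does not self-reference")
    if "noindex" in (state.get("robots") or ""):
        problems.append("robots directives block indexing")
    if not state.get("internally_linked"):
        problems.append("no internal links reach this URL")
    if not state.get("in_sitemap"):
        problems.append("missing from the cleaned sitemap")
    return problems
```

Only URLs that come back with an empty problem list are worth watching in the coverage report; anything else still has a structural fix pending.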
The goal is not instant indexing on every page. The goal is a cleaner crawl path, stronger technical signals, and a measurable reduction in the wrong URLs competing for attention.