TL;DR
- Most 2026 migration checklists are still 2018 advice with a fresh coat of paint. They cover Google but ignore ChatGPT, Perplexity, Gemini, and Google AI Mode, which cache and recall your URLs very differently.
- Google needs around 180 days of redirects minimum. Common Crawl, which feeds many LLM training and retrieval pipelines, releases monthly archives. If you migrate the wrong week you can lose 2-3 months of AI visibility on top of the Google dip.
- On my last client migration we kept 94% of Google clicks within 30 days but lost 41% of ChatGPT citations for six weeks because the new URL had no archive history at the time of indexing.
- Server-side 301s, a clean URL map, and a Change of Address request handle Google. None of those signals reach LLMs directly.
- The AI layer needs its own checklist: lower DNS TTL early, ship llms.txt before launch, refresh schema and About entities, keep the old domain online and answering crawlers for at least 12 months, and submit fresh pages to Wayback before they get linked.
- If you only do one thing differently in 2026: do not retire the old domain. Park it, keep it crawlable, and 301 at the edge so AI bots that already cached the old URL get the new content with the right canonical.
Why I'm writing this (and who it's for)
I run a link-building and SEO agency. In the last 12 months I have helped six clients migrate domains or replatform. Three were straight domain changes, two were CMS swaps with the same domain, one was a subdomain to subfolder consolidation.
Five went well. One did not. The one that went sideways was the one where I followed the standard checklist that every top-ranking guide pushes. Google rankings recovered in two weeks. ChatGPT citations took nine. Perplexity took eleven. The client's branded query volume dropped 18% during that window because their name had quietly disappeared from AI answers and they did not realise until a sales call mentioned it.
This post is for SEO leads, founders, and in-house marketers who are planning a migration in 2026. It is the checklist I now actually use, including the AI layer that the traditional technical SEO fundamentals only half cover.
What's actually different about migrations in 2026
For most of the last decade a site migration was a Google problem. You mapped URLs, you 301'd them, you updated the sitemap, you filed a Change of Address, you watched Search Console for three months. That still matters. Google's own guidance on site moves with URL changes still recommends server-side 301s, a fresh sitemap, and keeping redirects for at least a year.
The new layer is AI search. Roughly a third of all search activity now touches an AI surface in some form. I wrote about that shift in AI bots account for 33% of search activity in 2026 and it changes migration risk in three specific ways.
First, AI engines do not crawl on the same schedule as Google. ChatGPT's retrieval index is updated more slowly than Googlebot. Perplexity blends fresh search with cached snippets. Gemini and Google AI Mode use Google's index but apply their own ranking on top. When you move a URL, each surface processes the change on a different clock.
Second, AI training and retrieval data has a long shadow. Common Crawl publishes archives monthly and those archives feed many downstream LLM training pipelines. If your old URLs were captured in a recent Common Crawl snapshot, your old content can keep getting cited even after the URL 404s.
Third, AI engines lean heavily on stable entity signals. About pages, author bios, schema, brand mentions, and external citations. A migration that breaks those signals temporarily can drop you out of AI answers for weeks even when your Google rankings hold. I unpacked the entity side of this in knowledge graphs and entity optimisation for AI search.
The honest pre-migration audit (start 6 weeks out)
Most guides tell you to start two weeks before launch. That is too late. Six weeks gives you time to fix things that surface in the audit, and to lower DNS TTL without anyone noticing.
Here is what I actually do in the first week:
- Export every URL that earned a Google click or impression in the last 12 months. Search Console > Performance > Pages, max date range, export. This is your priority list for redirects.
- Export every URL with backlinks. Ahrefs, Semrush, or Search Console > Links. These are the URLs that hold ranking equity.
- Run an LLM citation audit. Manually query ChatGPT, Perplexity, Google AI Mode, and Gemini for your 30 top branded and category queries. Note every URL of yours that gets cited. This is the most overlooked step in 2026 migrations and the gap most of the top-10 guides do not cover. I documented the longer version in AI search platform citation strategy 2026.
- Pull your Wayback Machine snapshots. Check archive.org/web for the old URLs that get cited. If they have strong Wayback history, you need to preserve those exact paths or 301 them precisely.
- Crawl with a JavaScript-rendering crawler. Screaming Frog or Sitebulb in JS mode. AI bots have inconsistent JS support. Anything that depends on client-side rendering is at risk.
- Audit your schema. Old URLs in your structured data are easy to miss. The schema markup 2026 guide covers the formats that matter now.
The output of week one is a single spreadsheet with five tabs: Google priority pages, backlinked URLs, AI-cited URLs, Wayback-archived URLs, and schema-referenced URLs. Then you de-duplicate them. That combined list is your real redirect map.
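That de-duplication step is easy to botch in a spreadsheet, so I usually script it. A minimal sketch; the five lists are placeholders for your real tab exports, and the trailing-slash normalisation is my own convention for collapsing variants:

```python
# Merge the five audit tabs into one de-duplicated redirect list.
# The URLs below are hypothetical stand-ins for the exported tabs.
google_priority = ["/services/", "/blog/post-a"]
backlinked = ["/blog/post-a", "/resources/guide.pdf"]
ai_cited = ["/glossary/", "/resources/guide.pdf"]
wayback_archived = ["/services/", "/old-landing"]
schema_referenced = ["/about/"]

def build_redirect_list(*url_lists):
    """Union all tabs, normalising trailing slashes so variants collapse."""
    seen = set()
    merged = []
    for urls in url_lists:
        for url in urls:
            key = url.rstrip("/") or "/"
            if key not in seen:
                seen.add(key)
                merged.append(url)
    return merged

redirect_map_urls = build_redirect_list(
    google_priority, backlinked, ai_cited, wayback_archived, schema_referenced
)
print(len(redirect_map_urls))
```

The order of the arguments matters: because the first occurrence wins, put the Google priority tab first so its URL variants are the ones that survive into the map.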
What this looks like in practice
On the migration I ran for an HVAC client (the same project I wrote up in the Be Cool Refrigeration case study), the Google priority list had 142 URLs. The AI citation audit added another 23 URLs that Google did not rank well but ChatGPT cited regularly: old PDF service brochures, a glossary page, and three blog posts from 2022 that we had nearly killed. If we had pruned those, we would have lost AI mentions on the exact queries the client was trying to win.
Build the URL map (and the redirect strategy LLMs actually respect)
URL mapping is still the most important single deliverable. The rule is one old URL to one new URL, no chains, no loops. Google's docs are explicit that Googlebot can follow up to 10 hops but you should redirect to the final destination directly.
Here is what I do differently in 2026:
- 301 every URL, even ones you would normally 410. AI crawlers are messier than Googlebot. A 410 on a URL that ChatGPT cached six months ago can vaporise that citation. A 301 carries it forward to the closest equivalent.
- Map AI-cited URLs to their closest topical match, not just their structural match. If ChatGPT cites your old /glossary/canonical-tag page and the new site does not have a glossary, redirect that path to your best canonical content, not to the homepage. Homepage redirects are citation killers.
- Keep the old domain alive if you are changing domain. Park it, point an A record at a tiny server or worker, and 301 at the edge. The old domain needs to keep resolving and answering AI bots for at least 12 months. Bing's own guidance suggests keeping redirects for 1 to 2 years, preferably longer. For AI I extend that to 24 months minimum.
- Document canonical decisions in the redirect map itself. When the new site consolidates two old pages into one, which page gets the canonical credit? Get this right or you create the exact ambiguity LLMs hate.
A quick reminder on robots: do not block the old domain after migration. I see this on at least one migration a year. People think they are tidying up. They are wiping AI memory. Your old robots.txt should keep allowing crawl for at least the redirect window. The robots.txt SEO guide covers the safer patterns.
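Before the map ships, I run a quick sanity pass over it for exactly the failure modes above: chains, loops, and homepage dumps. A hedged sketch against a hypothetical map; the real input would be your redirect spreadsheet exported as old-to-new pairs:

```python
# Validate a redirect map before launch: flag chains, loops, and
# homepage redirects. The map below is a hypothetical example.
redirects = {
    "/old-blog/canonical-tags": "/blog/canonical-tags",
    "/glossary/canonical-tag": "/old-blog/canonical-tags",  # chains onward
    "/pdfs/brochure.pdf": "/",                              # homepage dump
}

def audit_redirects(redirects):
    """Return (source, problem) pairs for every suspect mapping."""
    issues = []
    for src, dest in redirects.items():
        if dest == src:
            issues.append((src, "loop: redirects to itself"))
        elif dest in redirects:
            issues.append((src, "chain: destination is itself redirected"))
        if dest == "/":
            issues.append((src, "homepage redirect: likely citation killer"))
    return issues

for src, problem in audit_redirects(redirects):
    print(f"{src}: {problem}")
```

Resolving every chain down to its final destination before launch is cheaper than waiting for crawlers to discover the extra hop.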
The AI layer (this is the part the other checklists skip)
If you only take one section away from this post, take this one.
Pre-migration AI prep (T-minus 14 days)
- Publish a clean llms.txt on the old domain. A simple file at /llms.txt listing canonical URLs and a one-line description. I went deep on the data behind this in llms.txt and AI citations: 2026 data and recommendation. It is not a magic bullet, but it gives compliant AI crawlers a hint sheet.
- Submit your top 50 cited pages to the Wayback Machine. Use the Save Page Now feature at archive.org/web. This locks a fresh snapshot of your current content into a public archive that several AI pipelines reference. Once you migrate, those snapshots remain searchable.
- Refresh your About, Author, and Organization schema with new URLs primed. Set the sameAs arrays to point to your new social profiles and the new canonical domain. Test in Google's Rich Results Test.
- Update Wikipedia, Wikidata, and any high-authority citations of your old URL. Not all of them. Just the top 5-10 with strong topical authority. These are entity anchors and they carry AI weight far beyond their raw link value. I wrote more about this in unlinked brand mentions vs backlinks 2026.
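For reference, llms.txt is still an informal proposal, but the commonly used shape is a small markdown file: an H1 with the site name, a one-line blockquote summary, and sections of annotated links. A hypothetical example (all URLs are placeholders):

```
# Example Co

> Commercial refrigeration and HVAC services. Canonical content lives at example.com.

## Key pages

- [Services](https://example.com/services/): overview of service lines
- [Glossary](https://example.com/glossary/): definitions AI engines cite often
```

Keep it short. The point is a curated hint sheet of your canonical URLs, not a second sitemap.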
Migration day AI tasks
- Switch the llms.txt to the new domain on launch.
- Keep the old domain's llms.txt redirecting 301 to the new one.
- Re-submit the new top 50 pages to Wayback within the first 6 hours of launch. This creates the archival proof that the new URL exists with the right content.
- Manually query 5-10 of your most important AI queries on ChatGPT, Perplexity, and Google AI Mode. Record the URLs cited. This is your pre-recrawl baseline.
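The Wayback submissions can be scripted against the public Save Page Now endpoint at web.archive.org/save/ if you have 50 pages to push. A cautious sketch; the polite delay is my assumption, and heavy or reliable use should go through the authenticated SPN2 API instead:

```python
# Queue pages for the Wayback Machine's public Save Page Now endpoint.
# Assumption: a few seconds between requests is polite; rate limits
# apply, and the authenticated SPN2 API is the robust route.
import time
import urllib.request

SPN_ENDPOINT = "https://web.archive.org/save/"

def spn_url(page_url):
    """Build the Save Page Now URL for a page."""
    return SPN_ENDPOINT + page_url

def archive_pages(urls, delay_seconds=5, dry_run=True):
    """Request a fresh snapshot of each URL. dry_run skips the network."""
    requested = []
    for url in urls:
        target = spn_url(url)
        if not dry_run:
            urllib.request.urlopen(target, timeout=60)
            time.sleep(delay_seconds)
        requested.append(target)
    return requested

print(archive_pages(["https://example.com/services/"])[0])
```

Run it with dry_run=False on migration day against the new top 50, and keep the output list as your archival receipt.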
Post-migration AI recovery (weeks 1-12)
This is the part nobody talks about. Google often recovers in 2-4 weeks. AI recovery takes longer because LLMs do not have a Change of Address tool. You have to rebuild signal volume.
- Re-query every 7 days for the first month, then every 14 days for the next two months. Track which citations have shifted to the new URL versus which are still pointing at the old one.
- For citations stuck on the old URL, check that the 301 is firing cleanly with a curl test. Even one misconfigured redirect can leave AI bots stuck on the wrong path.
- Publish 2-3 fresh pieces of content on the new domain that reference your old high-citation pages by name. This creates new internal pathways and gives AI bots a fresh signal that the new domain owns the topic. The pattern I use is similar to the one in how to get cited in ChatGPT and AI Overviews.
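For the stuck-citation check, what matters is classifying what the old URL actually returned. A small helper that mirrors the curl test; feed it the status code and Location header from `curl -sI` or any HTTP client:

```python
# Classify the result of a post-migration redirect check, given the
# status code and Location header the old URL returned.
def classify_redirect(status, location, expected_new_url):
    """Return a human-readable verdict for one old URL."""
    if status == 200:
        return "no redirect fired: old URL still serving content"
    if status in (302, 307):
        return "temporary redirect: should be a permanent 301/308"
    if status in (301, 308) and location == expected_new_url:
        return "ok"
    if status in (301, 308):
        return f"permanent redirect to wrong target: {location}"
    return f"unexpected status {status}"

print(classify_redirect(301, "https://new.example.com/blog/",
                        "https://new.example.com/blog/"))
```

The 302 branch is the one that bites: temporary redirects pass a casual browser check but do not consolidate signals the way a 301 does.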
What Google still wants (the traditional checklist)
The AI layer does not replace the Google layer. It sits on top. The Google playbook in 2026 has barely changed and the official documentation is the source of truth. Here's what Google actually says, lifted directly from Site Moves and Migrations:
- Use server-side permanent redirects (301 or 308) where technically possible.
- Avoid redirect chains. Point old URL directly to final new URL.
- Submit the new sitemap in Search Console.
- File a Change of Address request from the old property in Search Console.
- Keep redirects for at least one year.
Google's Change of Address tool documentation adds that Google will prefer the new site over the old when determining canonical pages, and that you should maintain redirects for at least 180 days, longer if you still see traffic.
A few practical notes that are easy to miss:
- The Change of Address tool only works at the domain property level. If you only verified a URL-prefix property, switch to domain property at least two weeks before migration so the tool is available.
- Verify the new property in Search Console before launch. Do not wait until the day of.
- Internal links should already point to the new URLs on launch day. Do not rely on redirects to clean up internal navigation.
- Watch page weight on the new site. Googlebot only fetches the first 15MB of an HTML file, and migrations often introduce bloat (extra theme files, new third-party scripts) that slows crawling and can quietly waste crawl budget.
- If your new site uses heavy JavaScript, remember Google removed its JavaScript SEO accessibility documentation but the practical guidance still applies. Render the important stuff server-side.
The DNS and CDN piece (this is where most teams improvise badly)
Developers often handle DNS at the last minute. That is fine for an internal app. It is not fine for SEO. Three rules.
Rule 1: Lower TTL early. Cloudflare's DNS TTL documentation explains that all proxied records have a TTL of Auto, which is set to 300 seconds. For DNS-only records you can go down to 60 seconds on non-Enterprise plans. Two weeks before migration, drop your records to 300s or 600s. Then wait the duration of your previous TTL before changing anything. If your old TTL was 24 hours, you wait 24 hours.
Rule 2: Migrate DNS during a low-traffic window. I usually pick a Tuesday or Wednesday early morning in the client's lowest-traffic region. Avoid weekends. AI crawlers seem to spike crawl on weekends in my logs and you want them hitting clean state.
Rule 3: Pre-warm the new CDN. Most CDNs cache by URL. The first request to each URL after launch is uncached and slow. Slow first responses hurt Core Web Vitals at exactly the moment Googlebot is recrawling. Run a sitemap-based pre-warm script the night before launch.
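The pre-warm script itself is simple: pull every loc value out of the sitemap and request each URL once. A sketch with an inline stand-in for the fetched sitemap.xml; set dry_run=False to actually issue the requests:

```python
# Pre-warm a CDN from the XML sitemap: extract every <loc> and fetch
# each URL once so the first real visitor gets a cache hit.
import urllib.request
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Return every <loc> value from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(NS + "loc")]

def prewarm(urls, dry_run=True):
    """Fetch each URL once; dry_run skips the network for testing."""
    for url in urls:
        if not dry_run:
            urllib.request.urlopen(url, timeout=30).read()
    return len(urls)

# Stand-in for urllib.request.urlopen("https://new.example.com/sitemap.xml")
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://new.example.com/</loc></url>
  <url><loc>https://new.example.com/services/</loc></url>
</urlset>"""

print(prewarm(sitemap_urls(sample)))
```

For large sites, throttle the loop and run it against the CDN hostname you are about to cut over to, not the origin.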
Migration day: the 4-hour window
I run every migration in a defined 4-hour window with a checklist printed on paper. Yes, paper. The internet goes down sometimes.
Hour 0: pre-launch
- Confirm staging site has noindex, nofollow.
- Confirm new site has indexable robots.txt and correct canonical tags.
- Confirm SSL on new domain is valid and trusted (test on three browsers).
- Confirm 301 redirect rules in the staging environment with curl.
Hour 1: cutover
- Update DNS records.
- Push redirect rules to production.
- Disable any caching that could serve old content.
- Submit new XML sitemap in Google Search Console and Bing Webmaster Tools.
Hour 2: validation
- Curl test 20 random old URLs. Each should 301 to the right new URL.
- Curl test 10 new URLs. Each should return 200.
- Check the new homepage in incognito on mobile and desktop.
- Check rendering with Google's URL Inspection tool. Confirm the rendered HTML contains your main content and canonical tags.
- Check Search Console's coverage report.
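For the canonical check in hour 2, I prefer something scriptable over eyeballing view-source on twenty pages. A minimal parser sketch; feed it whatever HTML your fetcher or the URL Inspection tool's rendered output gives you:

```python
# Extract the canonical URL from a page's HTML so validation can be
# scripted across many URLs instead of checked by hand.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(html):
    """Return the canonical href, or None if the page has none."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

page = '<html><head><link rel="canonical" href="https://new.example.com/services/"></head></html>'
print(find_canonical(page))
```

A None result, or a canonical still pointing at the old domain, is exactly the kind of failure that holds rankings back for weeks if it ships unnoticed.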
Hour 3: AI and external
- Submit the new top 50 pages to Wayback (Save Page Now).
- Manually query 5-10 priority queries in ChatGPT, Perplexity, Google AI Mode, Gemini.
- File the Change of Address request in Search Console.
- File the Site Move request in Bing Webmaster Tools.
- Update social profile URLs (the schema sameAs property looks at these).
- Update Google Business Profile and any directories you control.
The migration is not done at hour 3. The migration is done at week 12.
Post-migration monitoring (weeks 1-12)
What I check every day for the first 14 days:
- Search Console coverage report (new URLs being indexed, old URLs being processed).
- Search Console Crawl Stats (crawl rate should rise then settle).
- Server logs filtered by Googlebot, Bingbot, GPTBot, PerplexityBot, ClaudeBot, Google-Extended.
- 4xx and 5xx error rate (any spike means a redirect rule failed).
- Branded query impressions and clicks (a useful early warning, since branded queries shift fastest). The branded versus non-branded GSC report view is the cleanest way to do this.
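For the server log check, a simple token match over access log lines is enough to see which bots have found the new site. A sketch; the user-agent tokens are the commonly published ones, so verify them against each vendor's current documentation before relying on the counts:

```python
# Count hits per crawler from combined-format access log lines.
# Token list is an assumption based on commonly published user agents.
BOT_TOKENS = ["Googlebot", "bingbot", "GPTBot", "PerplexityBot",
              "ClaudeBot", "Google-Extended"]

def bot_hits(log_lines):
    """Return a count of log lines matching each crawler token."""
    counts = {token: 0 for token in BOT_TOKENS}
    for line in log_lines:
        for token in BOT_TOKENS:
            if token in line:
                counts[token] += 1
    return counts

sample_logs = [
    '1.2.3.4 - - [10/Jan/2026] "GET /services/ HTTP/1.1" 200 "-" "Mozilla/5.0 (compatible; GPTBot/1.0)"',
    '5.6.7.8 - - [10/Jan/2026] "GET /old-page HTTP/1.1" 301 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]

print(bot_hits(sample_logs))
```

Run it daily for the first two weeks; an AI bot whose count stays at zero is a bot still living off its cached view of the old domain.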
What I check weekly:
- Top 100 keyword rankings (any drop greater than 5 positions needs investigation).
- AI citation queries (the same 30 queries from your pre-migration audit).
- Core Web Vitals on the new site.
- Backlink profile (broken inbound links to old URLs that need fixing at source where possible).
- Wayback Machine indexing of new pages.
What I check monthly:
- Year-over-year traffic and conversions on key pages.
- Search visibility share against competitors.
- Schema validation across templates.
- A fresh AI citation audit, comparing to the pre-migration baseline.
Mistakes I've made (so you do not have to)
- Killed the old domain too early. Cancelled hosting on the old domain at week 8 to save budget. The 301s went away with it. We lost 22% of remaining AI citations in a single week. Now I keep the old domain alive for 24 months minimum.
- Trusted a redirect plugin. A WordPress redirect plugin appeared to work but was sending 302s for a chunk of URLs because of a conflicting rewrite rule. We caught it day 4. Always test redirects with curl, never just clicking.
- Forgot the trailing slash. New site enforced trailing slashes, old site did not. Redirects worked for one variant and 404'd for the other. Three days of weird coverage errors before we noticed.
- Did not refresh schema in time. Organisation schema still pointed at the old sameAs array. Google had no clean entity confirmation for the new domain. AI Overviews stopped citing the brand for two weeks.
- Submitted Change of Address before redirects were fully live. Google's pre-move checks passed because most URLs were redirecting, but a subdirectory was not. The request got accepted, then started flagging errors in the report. Easier to wait an extra day than to fight Search Console's queue.
How long until you recover?
Google's own documentation says small to medium-sized sites can take a few weeks for most pages to move, and larger sites take longer. In my experience, with a clean migration:
- Days 1-7: Crawl spike, some ranking volatility. Expect 10-30% impression drop.
- Weeks 2-4: Most pages re-indexed. Rankings on top-priority queries usually back within 5 positions of pre-migration baseline.
- Weeks 4-8: Full recovery on Google for clean migrations. AI citations starting to shift to new URLs.
- Weeks 8-12: AI citation recovery completes for most major surfaces. Long-tail rankings still settling.
- Months 4-6: Anything still off is probably a content quality or technical issue rather than a migration issue.
If you are still down 20%+ on Google at week 8, something specific is wrong. Most often it is redirect chains, a robots.txt that accidentally blocks the new domain, or a canonical pointing at the old URL. The technical SEO guide walks through the diagnostic order.
The contrarian take: do not migrate unless you have to
A migration is a tax on your SEO. You will lose something. You can minimise the loss but you cannot eliminate it. The question is whether the upside is worth it.
When migration is genuinely worth it:
- Rebrand with legal or commercial reason (acquisition, trademark dispute, market positioning).
- Consolidating a portfolio of weak domains into one stronger one.
- Moving off a platform that is technically broken (truly broken, not just annoying).
- Internationalisation that needs a different domain structure.
When people migrate but probably should not:
- They want a prettier domain name.
- They want to ditch a vague "reputation" issue that nobody outside marketing can articulate.
- A new designer wants a fresh start.
- They saw a competitor do it.
If your reason for migrating is anything in the second list, run an ROI calculation that includes 3-6 months of degraded performance, the cost of running parallel infrastructure, and the engineering time for the migration itself. Often the math does not work.
What to do this week
If you have a migration coming up in the next 90 days, here are the actions to take this week:
- Export your top 200 URLs from Search Console (by impressions and by clicks).
- Run a manual AI citation audit on your 20 top branded and category queries across ChatGPT, Perplexity, Google AI Mode, and Gemini. Save the results.
- Verify both old and new domains in Google Search Console as domain properties (not URL prefix).
- Verify both in Bing Webmaster Tools.
- Lower DNS TTL to 300 seconds for any record you will touch.
- Audit your schema for hardcoded old URLs in sameAs, url, and logo fields.
- Submit your top 50 currently-cited URLs to the Wayback Machine.
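The schema audit is easy to script: walk each JSON-LD blob and flag any sameAs, url, or logo value still on the old host. A sketch against a hypothetical Organization object; the old domain is a placeholder for yours:

```python
# Scan JSON-LD for hardcoded old-domain URLs in sameAs, url, and logo.
import json

OLD_DOMAIN = "old.example.com"  # assumption: your pre-migration host

def stale_urls(node, fields=("sameAs", "url", "logo")):
    """Recursively collect values in the target fields that still
    reference the old domain."""
    stale = []
    if isinstance(node, dict):
        for key, value in node.items():
            if key in fields:
                values = value if isinstance(value, list) else [value]
                stale += [v for v in values
                          if isinstance(v, str) and OLD_DOMAIN in v]
            else:
                stale += stale_urls(value, fields)
    elif isinstance(node, list):
        for item in node:
            stale += stale_urls(item, fields)
    return stale

schema = json.loads("""{
  "@type": "Organization",
  "url": "https://old.example.com/",
  "logo": "https://new.example.com/logo.png",
  "sameAs": ["https://twitter.com/example", "https://old.example.com/about"]
}""")

print(stale_urls(schema))
```

Point it at the JSON-LD extracted from every template, not just the homepage; author and article schema hide old URLs in places an Organization-only check misses.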
If you are running a migration and want a second pair of eyes on the plan before you ship it, I do migration audits as part of my consulting work. Start with the free SEO audit or look at the services page for the full migration support offering.
The last thing I will say: do not skip the AI layer because it feels new and the docs are thin. The other side of a 2026 migration is not just Google traffic. It is being the brand that gets cited when someone asks an AI which company to trust. Lose that signal and you spend three months earning it back.