Google Is Retiring These Schema Types in 2026. Here's Where to Focus Instead.

Google is dropping book actions, course listings, fact check, and more. Here is the structured data that actually matters now.

By Jhonty Barreto

Founder of SEO Engico | March 11, 2026 | 11 min read

Google is cleaning house with structured data. Here is what actually matters now.

I have been watching Google quietly retire schema types for a while now, but 2026 feels different. This is not just trimming the edges. Google is making a clear statement about where structured data is heading, and most SEO practitioners are not paying attention.

Over the past few months, Google has confirmed that book actions, course listings, automotive listings, vehicle listings, fact check markup, and special announcement schema are all being retired or deprecated. If you have been relying on any of these for rich results, that pipeline is closing. But here is the part most people miss: the retirement list tells you something important about Google's priorities going forward.

I have spent the better part of the last year rethinking how I approach technical SEO and structured data for my clients. This post is what I have learned, what I have changed, and where I think the smart money is going.

What Google is actually retiring and why it matters

Let me walk through the specifics, because the details matter more than the headlines suggest.

Book actions schema was used to connect users directly to platforms where they could buy or borrow books. Google is pulling this because it never gained meaningful adoption outside a handful of major publishers. The cost of maintaining it outweighed its utility.

Course listings structured data helped education providers surface course information in search. This one hurts some clients, but Google has signalled that course discovery is moving toward other surfaces entirely.

Automotive and vehicle listings schema gave dealerships a way to showcase inventory details in search results. Google is consolidating this into its own vehicle search experiences, which means the schema was just an intermediary step that is no longer needed.

Fact check markup is the interesting one. Google introduced ClaimReview schema with a lot of fanfare a few years back. Now it is being deprioritised. Proton Effect covered this shift well, noting that Google seems to be moving fact-checking capabilities into its own AI systems rather than relying on third-party markup.

Special announcement schema, which was introduced during COVID, has served its purpose and is being sunset. No surprises there.

The pattern across all of these retirements is clear. Markup that merely decorates search listings is fading. Google does not need you to tell it how to display results anymore. It has its own systems for that.

The real shift: structured data is not about rich snippets anymore

I have been telling clients this for months. The era of implementing schema purely for rich results is ending. Not tomorrow, but the direction is unmistakable.

Think about what Google is investing in. AI Overviews. Gemini. Conversational search. Every major product initiative is about understanding content at a deeper level and synthesising answers. Structured data still matters enormously in that world, but the purpose has changed.

The strategic shift is this: use structured data not for rich snippets, but for AI citation.

When Gemini pulls together an AI Overview, it needs to understand your content clearly. It needs to know what your product costs, what features it has, who wrote your article, when it was published, and what specific claims you are making. JSON-LD is how you communicate that information unambiguously.

I cover this concept in more depth in my piece on LLM visibility, but the short version is this: if your content is not structured so that large language models can easily extract facts from it, you are going to lose ground to competitors who get this right.

The structured data that actually matters in 2026

Here is where I focus my time now when running a technical SEO audit. This is not theoretical. These are the schema types I am implementing and recommending every week.

Product and offer schema with detailed attributes

For any ecommerce SEO client, Product schema with comprehensive attributes is non-negotiable. I am not talking about the bare minimum name-price-availability setup. I mean detailed specifications, brand information, GTIN codes, shipping details, return policies, and product condition.

Why? Because AI systems are pulling this data to answer comparison queries. "What is the cheapest 4K monitor under $500 with USB-C?" If your product schema does not include those attributes clearly in JSON-LD, you are invisible to that query in AI-generated answers.
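To make that concrete, here is a minimal sketch of the kind of enriched Product markup I mean, using schema.org's standard properties. Every value (the product name, brand, GTIN, prices) is invented for illustration:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example 27-inch 4K Monitor",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "gtin13": "0123456789012",
  "additionalProperty": [
    { "@type": "PropertyValue", "name": "Connectivity", "value": "USB-C, HDMI 2.1" },
    { "@type": "PropertyValue", "name": "Resolution", "value": "3840 x 2160" }
  ],
  "offers": {
    "@type": "Offer",
    "price": "449.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "itemCondition": "https://schema.org/NewCondition"
  }
}
```

Note that the USB-C and price attributes are stated explicitly rather than left buried in the product description, which is exactly what a comparison query needs.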

Organisation and author schema

E-E-A-T is not going away. Google's AI systems need to attribute information to credible sources. Organisation schema with detailed contact information, founding date, social profiles, and industry classification helps establish entity identity. Author schema with credentials, expertise areas, and publication history does the same for individual content creators.

I have seen measurable improvements in AI citation rates after implementing thorough author markup on client sites. This is not correlation. I have tested it across multiple properties.
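As a sketch of what thorough author and publisher markup can look like, here is a hypothetical Article with a nested Person and Organization (names, URLs, and profiles all invented):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Headline",
  "author": {
    "@type": "Person",
    "name": "Jane Example",
    "jobTitle": "Technical SEO Consultant",
    "knowsAbout": ["Structured data", "Technical SEO"],
    "sameAs": ["https://www.linkedin.com/in/jane-example"]
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Agency",
    "foundingDate": "2015",
    "url": "https://example.com",
    "sameAs": ["https://x.com/exampleagency"],
    "contactPoint": {
      "@type": "ContactPoint",
      "contactType": "customer support",
      "email": "hello@example.com"
    }
  }
}
```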

FAQ and HowTo schema (yes, still)

Google pulled FAQ rich results for most sites back in 2023. A lot of people dropped FAQ schema entirely after that. I think that was a mistake.

FAQ schema still helps AI systems understand the question-answer relationship in your content. Even without the visual rich result, the structured data feeds into how LLMs parse and cite your content. I wrote about this intersection in my ChatGPT SEO and schema piece.
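The markup itself has not changed since the rich result went away. A minimal FAQPage sketch, with a made-up question and answer:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Is FAQ schema still worth adding in 2026?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes. Even without the visual rich result, the explicit question-answer pairing helps machines parse and cite your content."
      }
    }
  ]
}
```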

Article and WebPage schema with detailed metadata

Every piece of content should have comprehensive Article or WebPage schema. Publication date, modification date, author, publisher, headline, description, word count, and content section breakdown. This is basic hygiene that a surprising number of sites still get wrong.
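For reference, here is what that baseline looks like as a hypothetical Article block (dates, counts, and names invented):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Headline",
  "description": "A one-sentence summary of what this article covers.",
  "datePublished": "2026-03-11",
  "dateModified": "2026-03-11",
  "wordCount": 1800,
  "articleSection": "Technical SEO",
  "author": { "@type": "Person", "name": "Jane Example" },
  "publisher": { "@type": "Organization", "name": "Example Agency" }
}
```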

Local business schema

For service businesses, LocalBusiness schema remains one of the highest-ROI implementations. Google is not retiring this. If anything, it is becoming more important as AI assistants handle more local queries.
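A comprehensive LocalBusiness block covers address, coordinates, hours, and contact details. A sketch with invented values:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "telephone": "+1-555-0100",
  "priceRange": "$$",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "geo": { "@type": "GeoCoordinates", "latitude": 39.7817, "longitude": -89.6501 },
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
      "opens": "08:00",
      "closes": "17:00"
    }
  ]
}
```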

How I approach structured data differently now

My process has changed significantly over the past year. Here is what I actually do.

Step one: audit what exists. I use my SEO audit checklist to catalogue every piece of structured data on a site. Most sites have a messy mix of outdated schema, duplicate declarations, and missing types. The meta tag analyzer helps catch some of this, but structured data auditing requires dedicated tooling.

Step two: remove deprecated markup. If a client still has any of the retired schema types, those come out immediately. Deprecated markup does not actively hurt you, but it adds code bloat and can trigger validation warnings that make real issues harder to spot.

Step three: build an entity map. Before writing any new schema, I map out the key entities on the site. Products, people, locations, services, content pieces. Each entity gets a consistent identifier and a clear relationship to other entities. This is where semantic SEO and structured data overlap.
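In JSON-LD, those consistent identifiers are `@id` values, and the relationships are references between them. A sketch of a small entity map using a `@graph` (the example.com URLs and names are hypothetical):

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://example.com/#org",
      "name": "Example Agency"
    },
    {
      "@type": "Person",
      "@id": "https://example.com/#jane",
      "name": "Jane Example",
      "worksFor": { "@id": "https://example.com/#org" }
    },
    {
      "@type": "Article",
      "@id": "https://example.com/post/#article",
      "headline": "Example Article Headline",
      "author": { "@id": "https://example.com/#jane" },
      "publisher": { "@id": "https://example.com/#org" }
    }
  ]
}
```

Because every entity is declared once and referenced by `@id` everywhere else, the person, the organisation, and the article stay consistent across the whole site.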

Step four: implement with AI extraction in mind. When I write JSON-LD now, I am thinking about what an LLM would need to accurately cite this content. That means including specific data points, not just categories. A product is not just "running shoe." It has a weight, a heel drop, a target use case, a price point, and compatibility information.

I go deeper on this methodology in my schema markup guide, which I update regularly as things change.

The AI citation angle most people are missing

Here is the thing I wish I knew sooner. AI-focused structured data is becoming more important not because Google told us it would, but because of how LLMs actually work.

Large language models do not read your page the way a human does. They tokenise it. They process it in chunks. And they are significantly better at extracting and citing information that is explicitly structured versus information that is buried in prose.

When you provide clear product attributes in JSON-LD, you are giving the LLM a cheat sheet. "Here are the facts about this thing, in a format you can parse without ambiguity." That is enormously valuable in a world where AI Overviews and conversational search are consuming more and more of the click landscape.

This ties directly into broader AI content strategy. Your content needs to serve two audiences now: humans who read it and machines that extract facts from it. Structured data is the bridge between those two.

Ankord Media published a solid checklist for structured data on new websites that aligns with a lot of what I am recommending here. Worth reading if you are building a site from scratch.

Common mistakes I see with structured data right now

After auditing dozens of sites this year, these are the patterns that keep showing up.

Using schema generators without customisation. Most automated schema tools produce bare-minimum markup. They give you the required fields and nothing else. In 2026, the required fields are the floor, not the ceiling. You need to populate every relevant property.

Ignoring nested entities. A Product schema that references an Organisation with just a name string instead of a full Organisation entity is leaving value on the table. Nest your entities properly. Give each one its own complete set of properties.
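To illustrate the difference, here is a hypothetical Product that nests a full Brand and Organization entity instead of a bare name string (all values invented):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Runner",
  "brand": {
    "@type": "Brand",
    "name": "ExampleBrand",
    "logo": "https://example.com/logo.png"
  },
  "manufacturer": {
    "@type": "Organization",
    "name": "ExampleBrand Inc.",
    "url": "https://example.com",
    "sameAs": ["https://www.linkedin.com/company/examplebrand"]
  }
}
```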

Not validating after deployment. Schema that validates in a testing tool but breaks in production is more common than you would think. I always validate post-deployment using the best technical SEO tools available, including Google's own Rich Results Test and Schema.org's validator.

Duplicating schema across templates. When your CMS injects default schema on every page and you also have plugin-generated schema, you end up with conflicting declarations. This confuses both Google and AI systems. Pick one source of truth.

Treating structured data as a set-and-forget task. Google changes its supported types and requirements regularly. Search Engine Journal maintains a useful timeline of Google's algorithm and feature changes that I check regularly to stay current.

What this means for different business types

The impact of these changes varies a lot depending on what kind of site you run.

For ecommerce sites, product schema is your top priority. Get every attribute populated. Think about what questions a buyer might ask an AI assistant and make sure your structured data answers them.

For SaaS companies, SoftwareApplication schema with detailed feature descriptions, pricing tiers, and integration capabilities is where I would focus. Comparison queries handled by AI are a massive opportunity here.
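A SoftwareApplication sketch showing where features and pricing tiers would live (application name, features, and price are all invented):

```json
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Example CRM",
  "applicationCategory": "BusinessApplication",
  "operatingSystem": "Web",
  "featureList": ["Contact management", "Email automation", "Slack integration"],
  "offers": {
    "@type": "Offer",
    "price": "29.00",
    "priceCurrency": "USD"
  }
}
```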

For content publishers, Article schema with thorough author and publisher information is essential. The E-E-A-T signal from structured data feeds directly into how AI systems decide which sources to cite.

For local businesses, keep your LocalBusiness schema comprehensive and accurate. AI assistants are handling more "near me" and service recommendation queries every month.

Practical steps you can take this week

I like to leave people with something actionable, so here is what I would do if I were looking at my site for the first time with fresh eyes.

First, run a structured data audit. Check every template and page type for existing schema. Note what is there, what is missing, and what is outdated. My technical SEO strategies piece includes a section on how to approach this systematically.

Second, remove any deprecated schema types. If you have book actions, course listings, automotive listings, fact check, or special announcement markup, take it out.

Third, pick your highest-value pages and enrich their schema. Add every relevant property, not just the required ones. For product pages, this means specs, dimensions, materials, compatibility, and use cases. For articles, this means author credentials, topic categories, and content structure.

Fourth, test how AI systems interpret your content. Ask ChatGPT or Gemini a question that your content should answer. See if they cite you. If they do not, look at how your competitors who do get cited are structuring their data.

Fifth, set up a recurring review. I check my clients' structured data quarterly at minimum. Google's requirements shift, new properties become available, and business information changes. Structured data needs to stay current to be useful.

Where this is all heading

I think we are in the early stages of a fundamental shift in how search engines use structured data. The old model was transactional: you add schema, Google gives you a rich result, you get more clicks. That model is not dead yet, but it is shrinking.

The new model is about machine comprehension. You structure your data so that AI systems, whether Google's or anyone else's, can understand your content with precision. The reward is not a special badge in search results. It is being the source that AI cites when answering questions.

That is a harder thing to measure and a harder thing to optimise for. But it is where the value is moving. The sites that figure this out early are going to have a significant advantage as AI-driven search becomes the norm.

I will keep updating my schema markup guide as Google makes further changes. If you want a professional review of your structured data setup, a technical SEO audit is the best place to start.
