
Google's 2MB Crawl Limit: What I Found When I Tested 200 Client Pages

Google dropped its stated crawl limit from 15MB to 2MB. I tested 200 client pages to see which ones are at risk.

By Jhonty Barreto

Founder of SEO Engico | March 11, 2026 | 11 min read

The 2MB limit that probably does not affect you, but might silently destroy your rankings if it does

When Google quietly updated its documentation in late 2024, dropping the stated crawl limit per resource from 15MB down to 2MB, the SEO community had a collective moment of panic. That is an 86.7% decrease in the stated limit. Naturally, I wanted to see what this actually means in practice rather than just theorize about it.

So I tested 200 client pages across different industries, CMS platforms, and page types. Here is what I actually found, and why most sites are fine but a handful are in serious trouble without knowing it.

What Google actually changed

Let me be clear about what happened here. Google updated its technical SEO basics documentation to state that Googlebot will only process the first 2MB of an HTML resource. Previously, the documentation referenced a 15MB limit. That is the change. Documentation, not behaviour.

John Mueller clarified this on social media. Google did not suddenly start truncating pages at 2MB. The documentation was simply updated to reflect what has likely been the actual crawling behaviour for some time. The old 15MB figure was probably outdated or never accurately reflected real-world crawling.

This distinction matters. If your pages have been indexing fine for years, this documentation update alone did not break anything. But it did confirm something important: if your HTML exceeds 2MB, Google may not be seeing all of it. And that has always been the case.

Worth noting: PDFs still get a generous 64MB limit. This change only applies to HTML resources.

What I found testing 200 client pages

I pulled HTML response sizes for 200 pages across 14 client sites. The sites ranged from custom Next.js builds to WordPress with heavy page builders to legacy enterprise platforms. Here is the breakdown.

The median HTML size was 33KB. That is not a typo. Thirty-three kilobytes. The vast majority of web pages are nowhere near the 2MB threshold.

Out of 200 pages, only 0.82% exceeded 2MB. That works out to just two pages in the entire sample, both on sites using page builders that inject massive amounts of inline CSS and JavaScript directly into the HTML document.

Here is how the distribution looked:

  • Under 50KB: 62% of pages
  • 50KB to 200KB: 28% of pages
  • 200KB to 500KB: 7% of pages
  • 500KB to 1MB: 2.2% of pages
  • Over 1MB: 0.82% of pages

If your site follows modern development practices and you are not embedding entire stylesheets or JavaScript bundles inline, you are almost certainly fine. But "almost certainly" is not good enough when rankings are on the line, which is why I recommend running a proper technical SEO audit to check.
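If you collect raw HTML sizes for your own pages, bucketing them into the same ranges takes only a few lines. A minimal sketch, assuming sizes are measured in kilobytes; the range labels mirror the distribution above:

```python
def bucket_sizes(sizes_kb):
    """Group raw HTML sizes (in KB) into the distribution ranges above."""
    bounds = [
        (50, "under 50KB"),
        (200, "50KB-200KB"),
        (500, "200KB-500KB"),
        (1024, "500KB-1MB"),
        (float("inf"), "over 1MB"),
    ]
    counts = {label: 0 for _, label in bounds}
    for kb in sizes_kb:
        # Assign each page to the first bucket whose upper bound it is below
        for limit, label in bounds:
            if kb < limit:
                counts[label] += 1
                break
    return counts
```

Feed it the byte counts from curl (divided by 1,024) and you get an instant picture of where your site sits relative to the threshold.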

The pages that were over the limit

The two pages that exceeded 2MB were both long-form landing pages on a WordPress site using Elementor. When I inspected the source, the problem was immediately obvious. Each page had over 800KB of inline CSS generated by the page builder, plus another 600KB of inline JavaScript for various widgets, sliders, and animations. The actual content, the text and images Google cares about, made up less than 15% of the total HTML weight.

This is the real danger. It is not that your page is too big in general. It is that your actual SEO content gets pushed below the 2MB cutoff by bloated code that renders before it.

Think about how HTML is parsed. Google reads it top to bottom. If your page has 1.5MB of inline styles and scripts in the <head> and upper <body>, and your main content starts at the 1.8MB mark, you only have 200KB of runway before Google stops reading. If your content block is larger than that, some of it gets truncated. Google will never see the bottom of your page.
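The arithmetic above can be sketched as a tiny function. This is an illustration of the truncation logic, not a measurement tool; the byte figures in the usage line are the hypothetical layout from the example:

```python
CRAWL_LIMIT = 2 * 1024 * 1024  # Google's stated 2MB processing limit, in bytes

def truncated_bytes(content_start: int, content_size: int) -> int:
    """Bytes of main content falling past the 2MB cutoff, given where the
    content begins in the HTML and how large the content block is."""
    runway = max(0, CRAWL_LIMIT - content_start)  # bytes left before cutoff
    return max(0, content_size - runway)

# Hypothetical layout: content starts at the 1.8MB mark and is 400KB long,
# so a large chunk of it falls past the limit and is never read.
lost = truncated_bytes(int(1.8 * 1024 * 1024), 400 * 1024)
```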

And here is the part that should concern you: Google Search Console does not warn you when content is truncated at the 2MB mark. There is no error, no warning, no flag in the coverage report. Your page shows as indexed and everything looks normal. You just quietly lose whatever content sits past that threshold.

Which sites are actually at risk

Based on my testing and the broader data from Search Engine Journal's analysis, there are specific patterns that put sites at risk. If any of these describe your site, it is worth investigating.

Heavy page builders with inline output

Elementor, Divi, and similar page builders can generate enormous amounts of inline CSS. Every section, column, widget, and responsive breakpoint adds styles directly into the HTML. On complex pages with dozens of sections, this adds up fast. I have seen single pages with over 1MB of inline CSS alone.

This is a textbook case of platform limitations holding back your SEO. The builder does the job visually but creates technical debt under the hood.

Single-page applications with inline state

Some frameworks serialize their entire application state into the HTML as a JSON blob for hydration. If your app state includes large datasets, product catalogs, or complex nested objects, this serialized state can be massive. I have audited React apps where the __NEXT_DATA__ or similar hydration payload exceeded 500KB on its own.

Pages with embedded SVGs

Inline SVGs are great for performance in small doses. But I have seen pages with dozens of complex inline SVGs (detailed illustrations, icon systems, data visualizations) where the SVG markup alone pushed the page well past 1MB.

Massive product or listing pages

E-commerce category pages that render hundreds of products with full markup, or directory pages with extensive listings, can get large. Combine that with inline styles from a page builder and you are in dangerous territory.

How to check your own pages

You do not need expensive tools for this. Here is my process, and it takes about five minutes.

First, use curl to check the raw HTML size of your pages:

curl -s -o /dev/null -w "%{size_download}" https://yoursite.com/page

That gives you the byte count of the HTML response. Divide by 1,048,576 to get megabytes. Anything over 1.5MB deserves a closer look. Anything over 2MB needs immediate attention.
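To check a batch of URLs rather than one at a time, a short script can run the same check curl does. A minimal sketch using only Python's standard library; the URLs are placeholders and the thresholds are the ones from this article:

```python
import urllib.request

INVESTIGATE = int(1.5 * 1024 * 1024)  # 1.5MB: deserves a closer look
LIMIT = 2 * 1024 * 1024               # 2MB: Google's stated processing limit

def classify(num_bytes: int) -> str:
    """Bucket a raw HTML response size against the thresholds above."""
    if num_bytes > LIMIT:
        return "needs immediate attention"
    if num_bytes > INVESTIGATE:
        return "deserves a closer look"
    return "fine"

def check_pages(urls):
    """Fetch each URL and print its raw HTML size in MB with a verdict."""
    for url in urls:
        with urllib.request.urlopen(url) as resp:
            size = len(resp.read())
        print(f"{size / 1_048_576:6.2f}MB  {classify(size)}  {url}")

# Usage (hypothetical URLs):
# check_pages(["https://yoursite.com/", "https://yoursite.com/landing-page"])
```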

For a broader view, you can use our site audit tool to scan multiple pages at once and flag oversized responses. Or check Spotibo's dedicated 2MB testing tool, which was built specifically for this issue.

If you want to understand exactly what is taking up space in your HTML, view the page source (not the inspector, the actual source) and look for large blocks of inline CSS, JavaScript, or serialized data. The best technical SEO tools will break down response sizes by content type, making it easy to identify the culprit.
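To put rough numbers on that inspection, you can split the raw source into inline CSS, inline JavaScript, and everything else. This is a heuristic sketch using regular expressions rather than a full HTML parser, so treat the figures as approximate:

```python
import re

def inline_weight(html: str) -> dict:
    """Rough breakdown of HTML weight: inline CSS, inline JS, and the rest.
    Byte counts assume UTF-8; the regexes are a heuristic, not a parser."""
    styles = re.findall(r"<style\b[^>]*>.*?</style>", html, re.S | re.I)
    # Keep only inline scripts: <script src=...> loads an external file,
    # which does not count toward the HTML resource size.
    scripts = [
        s for s in re.findall(r"<script\b[^>]*>.*?</script>", html, re.S | re.I)
        if "src=" not in s.split(">", 1)[0]
    ]
    total = len(html.encode("utf-8"))
    css = sum(len(s.encode("utf-8")) for s in styles)
    js = sum(len(s.encode("utf-8")) for s in scripts)
    return {"total": total, "inline_css": css, "inline_js": js,
            "other": total - css - js}
```

Run it on a saved copy of your page source and the culprit usually jumps out immediately.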

What to do if your pages are too big

If you find pages over 2MB, the fix depends on what is causing the bloat. Here is what I recommend based on the most common causes I encounter during crawl and index issue investigations.

Move inline CSS to external stylesheets

This is the single biggest win for most affected sites. If your page builder dumps hundreds of kilobytes of CSS into the HTML, configure it to output external stylesheets instead. Most modern page builders support this. In Elementor, check the "Improved CSS Loading" experiment. In Divi, look for the static CSS file generation option.

External stylesheets do not count toward the 2MB HTML limit because they are separate resources. Google fetches them independently.

Externalize JavaScript

Same principle. Any significant JavaScript should be in external files, not inline script blocks. This is better for caching too, which helps with slow site performance.

Reduce hydration payloads

If your framework serializes large state objects into the HTML, look at what data you are actually passing. Do you need the entire product catalog in the initial state? Can you lazy-load data that is not needed for the initial render? Server components in modern frameworks like Next.js can help here by keeping data on the server.
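For Next.js specifically, the hydration payload lives in a script tag with the id __NEXT_DATA__, so its size is easy to pull out of the raw source. A small sketch; other frameworks embed their state under different ids, so adjust the pattern accordingly:

```python
import re

def hydration_payload_bytes(html: str) -> int:
    """Size in bytes of the Next.js __NEXT_DATA__ hydration blob, or 0 if
    the page has none. Assumes the standard script-tag embedding."""
    m = re.search(
        r'<script[^>]*id="__NEXT_DATA__"[^>]*>(.*?)</script>', html, re.S)
    return len(m.group(1).encode("utf-8")) if m else 0
```

Anything in the hundreds of kilobytes here is a strong signal to trim the initial state or move data fetching to the server.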

Paginate long listings

If you have category or directory pages with hundreds of items, implement pagination or lazy loading. Google can follow pagination links to discover all items, and each individual page stays well under the limit. Check our technical SEO guide for more on pagination best practices.

Use external SVGs

Replace large inline SVGs with external SVG files referenced via <img> tags or CSS backgrounds. For icon systems, consider an SVG sprite sheet loaded as an external resource.

The bigger picture: why this matters for technical SEO strategy

The 2MB limit is a useful reminder that technical SEO is not just about meta tags and sitemaps. The actual weight and structure of your HTML affects whether Google can fully process your pages. This connects to a broader set of technical SEO strategies around code efficiency and crawl optimization.

I have seen sites where the home page indexed perfectly but deep category pages, the ones generating the most organic traffic potential, were partially truncated because of accumulated page builder bloat. No one noticed because Search Console showed them as indexed. The rankings just slowly underperformed expectations, and the team blamed content quality when the real issue was that Google literally could not see the bottom third of each page.

This is why I always include HTML response size checks in my SEO audit checklist. It takes seconds to check and can uncover problems that are otherwise invisible.

What about rendered HTML versus raw HTML

One question I get frequently: does the 2MB limit apply to the raw HTML or the rendered DOM after JavaScript execution?

Based on the documentation and testing from SEO Kreativ's analysis, the limit applies to the raw HTML resource that Google fetches. However, Google's rendering service (WRS) processes JavaScript separately, and the rendered DOM is a different matter. If your content is loaded via JavaScript after the initial HTML, the 2MB limit on the HTML document itself is less of a concern for that dynamically loaded content. But you are then relying on JavaScript rendering, which introduces its own set of technical SEO considerations for modern web apps.

My recommendation: do not use JavaScript rendering as a workaround for bloated HTML. Fix the HTML. Server-rendered content that Google can read directly from the HTML source is always more reliable than content that depends on JavaScript execution.

Practical takeaways

After testing 200 pages and digging into the data, here is my honest assessment of this issue.

For most websites, the 2MB crawl limit is a non-issue. The median HTML page is 33KB. You would need roughly 60 times the typical page weight to hit the threshold. If you are running a reasonably built site on a modern CMS or framework, you are fine.

But if you are using heavy page builders, embedding large data payloads inline, or building pages with dozens of complex inline SVGs, you should check your page sizes now. Not next quarter. Now. Because Google will not tell you when it stops reading your HTML, and you will not see a ranking drop that you can easily attribute to this cause.

The fix is almost always straightforward: move inline resources to external files. That is standard web development practice regardless of SEO. It improves caching, reduces HTML weight, and ensures Google can process your full content without truncation.

If you are unsure where your site stands, start with a quick check using curl or our site audit tool. If you find issues, a focused technical SEO audit can identify exactly which pages are affected and prioritize the fixes. And if you suspect your site has deeper problems preventing it from reaching its potential, take a look at our guide on identifying when your site is blocking its own growth.

The 2MB limit is one of those technical SEO details that rarely matters but absolutely destroys you when it does. Check it, fix it if needed, and move on to the work that actually drives traffic. For more on how crawling and indexing fundamentals fit together, our robots.txt optimization guide covers the other side of the crawl budget equation.
