Core Web Vitals and AI Visibility: What You Need to Know

Load times are not just a ranking factor for Google — they also determine whether AI crawlers like GPTBot and ClaudeBot can fully read your page at all. This guide explains the connection between Core Web Vitals and AI visibility and shows concrete measures for optimisation.

Why load times matter for AI crawlers

When a person visits a website that takes 5 seconds to load, they may wait — or they may leave. An AI crawler has no patience: it sends an HTTP request, waits a defined time for a response and aborts if the server takes too long.

This timeout is significantly shorter for AI crawlers than for Googlebot. A page that is still "fast enough" for Google can already be too slow for GPTBot or ClaudeBot — with the result that the crawler rates the page as unreachable and may skip it on the next crawl attempt.

There is also a second problem: even if the crawler reaches the page, poor performance can mean only part of the content is loaded before the timeout kicks in. The crawler then sees an incomplete page — and the content is rated lower as a result.

Core message: Core Web Vitals are not just an SEO factor for Google — they are the technical prerequisite for AI crawlers to be able to read your content completely and reliably.

Core Web Vitals explained

Google introduced Core Web Vitals in 2020 as standardised metrics for user experience. They measure three aspects: loading speed, interactivity and visual stability. A fourth metric, TTFB, is not an official Core Web Vital, but it is so closely related — and so important for AI crawlers — that it is included here as well.

LCP
Largest Contentful Paint
Good: ≤ 2.5 sec
Needs improvement: 2.5–4.0 sec
Poor: > 4.0 sec

Measures how long it takes for the largest visible element (typically a hero image or large text block) to finish rendering.

INP
Interaction to Next Paint
Good: ≤ 200 ms
Needs improvement: 200–500 ms
Poor: > 500 ms

Measures response time to user interactions. Less relevant for AI crawlers — they do not interact.

CLS
Cumulative Layout Shift
Good: ≤ 0.1
Needs improvement: 0.1–0.25
Poor: > 0.25

Measures visual stability — how much elements shift during loading. Indirectly relevant for AI crawlers.

TTFB
Time to First Byte
Good: ≤ 800 ms
Needs improvement: 800–1800 ms
Poor: > 1800 ms

Time until the first byte of the server response arrives. The most important value for AI crawlers.

TTFB — the most critical value for AI crawlers

Among all performance metrics, TTFB (Time to First Byte) is the most important for AI visibility. It measures the time between an HTTP request from a crawler and the moment the first byte of the response arrives at the crawler.

A high TTFB means the server takes a long time to even begin responding. This can be due to a slow server, too many server-side calculations, a blocking database query or missing server-side caching.

While Googlebot tolerates timeouts of several seconds, AI crawlers abort more frequently when TTFB exceeds 2–3 seconds. A TTFB under 800ms is recommended — this is also the value Google PageSpeed classifies as "good".
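As a quick check, TTFB can be measured from a script. Below is a minimal sketch using only Python's standard library; the thresholds mirror the PageSpeed classification above, and the URL you pass in is your own:

```python
import time
import urllib.request

def classify_ttfb(ms: float) -> str:
    """Classify a TTFB value using the PageSpeed thresholds."""
    if ms <= 800:
        return "good"
    if ms <= 1800:
        return "needs improvement"
    return "poor"

def measure_ttfb(url: str, timeout: float = 10.0) -> float:
    """Return the time in milliseconds until the first byte of the
    response arrives (connection setup included)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read(1)  # wait for the first byte only
    return (time.perf_counter() - start) * 1000
```

For example, `classify_ttfb(measure_ttfb("https://your-domain.example/"))` returns "good", "needs improvement" or "poor". A single measurement is noisy — averaging several requests gives a more reliable picture.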

Causes of high TTFB

  • No server-side caching: Every request triggers a new database query instead of serving cached content
  • Slow hosting provider: Shared hosting with overloaded servers
  • No CDN: The server is physically far from the crawler
  • Heavy PHP/Python processes: Complex server-side logic executed before every response
  • Too many database queries: Particularly on WordPress websites with many plugins

Googlebot vs. AI crawlers: timeout differences

Criterion            | Googlebot                  | GPTBot / ClaudeBot
TTFB tolerance       | up to ~5 seconds           | under 800 ms recommended
JavaScript rendering | yes (delayed)              | no
Crawl frequency      | daily for important pages  | weeks to months
Error tolerance      | high, with retries         | lower, more likely to skip
Transparency         | Search Console             | no public tools
Affects              | Google search results      | AI answers and citations

Important: A page scoring 70/100 on Google PageSpeed can still be problematic for AI crawlers — especially if TTFB is high or important content only becomes visible after JavaScript rendering.

Measuring Core Web Vitals

Google PageSpeed Insights

The most important free tool for Core Web Vitals. It analyses both lab data (simulated test) and field data (real user experiences from the Chrome User Experience Report, CrUX). Available at pagespeed.web.dev — or directly via the PageSpeed Check on this website.

Google Search Console

Under "Experience" → "Core Web Vitals" the Search Console shows which URLs on your website have poor values — based on real Chrome user data. This is more valuable than pure lab tests.

Chrome DevTools

In the browser's developer tools (F12), the "Lighthouse" panel runs a full performance audit directly from the browser. Useful for quick local tests.

Concrete optimisation measures

Improve TTFB

  • Enable server-side caching: For WordPress e.g. WP Super Cache or W3 Total Cache. For static websites caching is usually already built in.
  • Use a CDN: Cloudflare (free) serves your content from a server near the crawler — dramatically reducing latency.
  • Upgrade hosting: Shared hosting with poor performance is a real problem for AI visibility. Moving to a VPS or dedicated server often improves TTFB considerably.
  • Optimise database queries: Especially on WordPress: reduce plugins, optimise the database, enable query caching.

Improve LCP

  • Compress images and serve in WebP format: WebP is significantly smaller than JPEG at the same quality.
  • Lazy loading for images below the fold: The largest visible element (LCP element) should never be lazy loaded.
  • Preload directive for the LCP element: <link rel="preload"> signals to the browser early which image or font should be loaded first.
  • Remove render-blocking resources: CSS and JavaScript that blocks initial rendering delays LCP.
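The last point can be checked automatically. A minimal sketch with Python's built-in html.parser that flags external scripts in the <head> lacking async or defer — exactly the scripts that block initial rendering and delay LCP:

```python
from html.parser import HTMLParser

class RenderBlockingScriptFinder(HTMLParser):
    """Collects external <script> tags inside <head> that have neither
    async nor defer — these block the initial render."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.blocking = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif tag == "script" and self.in_head and "src" in attr_map:
            if "async" not in attr_map and "defer" not in attr_map:
                self.blocking.append(attr_map["src"])

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

def find_render_blocking_scripts(html: str):
    finder = RenderBlockingScriptFinder()
    finder.feed(html)
    return finder.blocking
```

Feed it the raw HTML of a page; every src it returns is a candidate for defer, async or moving to the end of the body.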

Reduce JavaScript dependency

This is particularly important for AI crawlers: since GPTBot and ClaudeBot do not render JavaScript, important content must not be loaded exclusively via JavaScript.

  • Server-side rendering (SSR) instead of client-side rendering: For React, Vue or Angular the most important content should be rendered server-side.
  • Critical content in the initial HTML: Headings, introductory text and structured data must be present in the raw HTML source.
  • JavaScript for enhancement, not for content: JavaScript should improve the user experience, not deliver the core content.
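A simple self-test: fetch the page the way a non-rendering crawler does and check whether the critical snippets appear in the raw HTML. A sketch using only the standard library; the User-Agent string and the snippet list are illustrative:

```python
import urllib.request

def raw_html(url: str, timeout: float = 10.0) -> str:
    """Fetch the page as a non-rendering crawler does: initial HTML only,
    no JavaScript execution. The User-Agent here is illustrative."""
    req = urllib.request.Request(url, headers={"User-Agent": "content-check"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")

def critical_content_present(html: str, required_snippets) -> dict:
    """Report which critical snippets (headline, intro text, etc.) appear
    in the raw HTML — anything missing here is invisible to crawlers
    that do not execute JavaScript."""
    return {snippet: (snippet in html) for snippet in required_snippets}
```

For example, `critical_content_present(raw_html(url), ["My headline", "Intro sentence"])` should return True for every entry — a False means that content only exists after client-side rendering.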

Improve CLS

  • Specify image dimensions in HTML: width and height attributes on img tags prevent layout shifts.
  • Reserve fixed space for ads: Ads that appear after loading and shift other elements significantly increase CLS.
  • Web fonts with font-display: optional: Prevents text from jumping after the font loads.
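The first point can also be audited automatically — a minimal sketch with html.parser that lists every image missing explicit width and height attributes:

```python
from html.parser import HTMLParser

class ImageDimensionAudit(HTMLParser):
    """Collects <img> tags missing explicit width/height attributes —
    a common source of layout shifts (CLS)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "width" not in attr_map or "height" not in attr_map:
                self.missing.append(attr_map.get("src", "(no src)"))

def images_without_dimensions(html: str):
    audit = ImageDimensionAudit()
    audit.feed(html)
    return audit.missing
```

Every src it returns marks an image the browser cannot reserve space for before it loads — each one is a potential layout shift.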

Checklist for AI-optimised performance

  • TTFB under 800ms — measure with PageSpeed Insights
  • LCP under 2.5 seconds
  • CLS under 0.1
  • Server-side caching enabled
  • CDN (e.g. Cloudflare) active
  • Images compressed and in modern format (WebP)
  • Critical content in the initial HTML — not only loaded via JavaScript
  • Render-blocking JavaScript minimised
  • Image dimensions defined in HTML (width + height attributes)
  • SSL certificate valid — HTTP connections are treated worse by AI crawlers

How does your website perform for AI crawlers?

AI-Ready Check analyses TTFB, PageSpeed score and other technical factors — free, in 30 seconds.

Test for free now →

Frequently Asked Questions

Are Core Web Vitals a direct ranking factor for AI answers?

Not directly — AI systems like ChatGPT and Perplexity do not use metrics like LCP or CLS as explicit ranking signals. But the underlying performance affects whether an AI crawler can fully read the page at all. A page that is not fully crawled due to poor performance cannot be cited either.

My Google PageSpeed score is 90+. Am I well positioned for AI crawlers?

A good PageSpeed score is a good foundation — but not sufficient. The TTFB is particularly important: a PageSpeed score of 90 can still come with a TTFB of 1.5 seconds, which may already be problematic for AI crawlers. Check TTFB separately and make sure no important content is loaded exclusively via JavaScript.

How much does a CDN improve TTFB?

A CDN like Cloudflare can significantly reduce TTFB — often by 50–70%. The reason: the crawler no longer has to reach your origin server, wherever it is located, but is served by a nearby CDN node that delivers the cached page directly. For websites that should be crawled internationally, a CDN is particularly important.

INP is irrelevant for AI crawlers — why should I still optimise it?

INP measures response to user interactions — AI crawlers do not interact, so INP is directly irrelevant for crawling. Indirectly it still matters: a poor INP value usually points to a heavy JavaScript bundle, which can also slow the initial load. INP is also a Google ranking factor — and a good Google ranking increases the likelihood that AI systems rate a page as trustworthy.

Is it enough to optimise performance once?

No — performance is not a one-time project but an ongoing process. New plugins, more content, changed server loads and new image formats can degrade performance over time. Monthly review of Core Web Vitals in Google Search Console and regular PageSpeed tests for important pages are recommended.