Why load times matter for AI crawlers
When a person visits a website that takes 5 seconds to load, they may wait — or they may leave. An AI crawler has no patience: it sends an HTTP request, waits a defined time for a response and aborts if the server takes too long.
This timeout is significantly shorter for AI crawlers than for Googlebot. A page that is still "fast enough" for Google can already be too slow for GPTBot or ClaudeBot — with the result that the crawler rates the page as unreachable and may skip it on the next crawl attempt.
There is also a second problem: even if the crawler reaches the page, poor performance can mean that only part of the content loads before the timeout kicks in. The crawler then sees an incomplete page and rates the content correspondingly lower.
Core message: Core Web Vitals are not just an SEO factor for Google — they are the technical prerequisite for AI crawlers to be able to read your content completely and reliably.
Core Web Vitals explained
Google introduced Core Web Vitals in 2020 as standardised metrics for user experience. They measure three aspects: loading speed, interactivity and visual stability.
- LCP (Largest Contentful Paint): Measures how long it takes for the largest visible element (image or text block) to fully load.
- INP (Interaction to Next Paint): Measures response time to user interactions. Less relevant for AI crawlers — they do not interact.
- CLS (Cumulative Layout Shift): Measures visual stability — how much elements shift during loading. Indirectly relevant for AI crawlers.
- TTFB (Time to First Byte): Time until the first byte of the server response arrives. Strictly a supporting metric rather than one of the three Core Web Vitals, but the most important value for AI crawlers.
TTFB — the most critical value for AI crawlers
Among all performance metrics, TTFB (Time to First Byte) is the most important for AI visibility. It measures the time between an HTTP request from a crawler and the moment the first byte of the response arrives at the crawler.
A high TTFB means the server takes a long time to even begin responding. This can be due to a slow server, too many server-side calculations, a blocking database query or missing server-side caching.
While Googlebot tolerates timeouts of several seconds, AI crawlers abort more frequently when TTFB exceeds 2–3 seconds. A TTFB under 800ms is recommended — this is also the value Google PageSpeed classifies as "good".
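A quick TTFB reading takes only a few lines of Python's standard library. The sketch below measures it against a throwaway local server so it runs offline; the 200 ms artificial delay is purely illustrative:

```python
import http.client
import http.server
import threading
import time

def measure_ttfb(host: str, port: int, path: str = "/", timeout: float = 3.0) -> float:
    """Seconds from sending the request until the response status line arrives."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    start = time.perf_counter()
    conn.request("GET", path)
    conn.getresponse()              # returns as soon as status line and headers arrive
    ttfb = time.perf_counter() - start
    conn.close()
    return ttfb

# Throwaway local server so the example needs no internet access.
class SlowHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(0.2)             # simulate 200 ms of server-side work
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"hello")
    def log_message(self, *args):   # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
server.shutdown()
print(f"TTFB: {ttfb * 1000:.0f} ms")   # at least the simulated 200 ms
```

Against your own site you would point `measure_ttfb` at your real host and port 80/443 (use `http.client.HTTPSConnection` for HTTPS).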
Causes of high TTFB
- No server-side caching: Every request triggers a new database query instead of serving cached content
- Slow hosting provider: Shared hosting with overloaded servers
- No CDN: The server is physically far from the crawler
- Heavy PHP/Python processes: Complex server-side logic executed before every response
- Too many database queries: Particularly on WordPress websites with many plugins
Googlebot vs. AI crawlers: timeout differences
| Criterion | Googlebot | GPTBot / ClaudeBot |
|---|---|---|
| TTFB tolerance | up to ~5 seconds | recommended under 800ms |
| JavaScript rendering | Yes (delayed) | No |
| Crawl frequency | Daily for important pages | Weeks to months |
| Error tolerance | High — retries | Lower — more likely to skip |
| Transparency | Search Console | No public tools |
| Affects | Google search results | AI answers and citations |
Important: A page scoring 70/100 on Google PageSpeed can still be problematic for AI crawlers — especially if TTFB is high or important content only becomes visible after JavaScript rendering.
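The behavioural difference in the table can be sketched as two clients with different timeouts hitting the same slow origin. The 1.5 s server delay and the two timeout values below are illustrative, not the crawlers' real settings:

```python
import http.server
import socket
import threading
import time
import urllib.error
import urllib.request

class SlowOrigin(http.server.BaseHTTPRequestHandler):
    """Simulates a server whose TTFB is ~1.5 seconds."""
    def do_GET(self):
        time.sleep(1.5)             # slow server-side work before the first byte
        try:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"<html>content</html>")
        except (BrokenPipeError, ConnectionResetError):
            pass                    # the client already gave up
    def log_message(self, *args):
        pass                        # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), SlowOrigin)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

# A patient, Googlebot-like client tolerates several seconds and gets the page...
patient = urllib.request.urlopen(url, timeout=5).read()

# ...an impatient, AI-crawler-like client aborts and records the page as unreachable.
try:
    urllib.request.urlopen(url, timeout=0.5).read()
    impatient = "reached"
except (socket.timeout, urllib.error.URLError):
    impatient = "unreachable"

server.shutdown()
print(patient.decode(), "/", impatient)
```

Same page, same server: only the client's patience decides whether the content exists at all from the crawler's point of view.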
Measuring Core Web Vitals
Google PageSpeed Insights
The most important free tool for Core Web Vitals. It analyses both lab data (simulated test) and field data (real user experiences from the Chrome user pool). Available at pagespeed.web.dev — or directly via the PageSpeed Check on this website.
Google Search Console
Under "Experience" → "Core Web Vitals" the Search Console shows which URLs on your website have poor values — based on real Chrome user data. This is more valuable than pure lab tests.
Chrome DevTools
In the browser's developer tools (F12), the "Lighthouse" tab runs a full performance audit directly from the browser. Useful for quick local tests.
Concrete optimisation measures
Improve TTFB
- Enable server-side caching: For WordPress e.g. WP Super Cache or W3 Total Cache. For static websites caching is usually already built in.
- Use a CDN: Cloudflare (free) serves your content from a server near the crawler — dramatically reducing latency.
- Upgrade hosting: Shared hosting with poor performance is a real problem for AI visibility. Moving to a VPS or dedicated server can cut TTFB to a fraction of its previous value.
- Optimise database queries: Especially on WordPress: reduce plugins, optimise the database, enable query caching.
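The effect of server-side caching can be illustrated with a minimal in-memory sketch. The `render_page` function and the 60-second TTL are hypothetical; real setups would use a caching plugin, a reverse proxy such as Varnish, or the framework's own page cache:

```python
import functools
import time

def ttl_cache(seconds: float):
    """Cache a function's result for `seconds` — a micro version of a page cache."""
    def decorator(fn):
        store = {}                  # args -> (expiry, value)
        @functools.wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit and hit[0] > now:
                return hit[1]       # cache hit: no DB work, minimal TTFB
            value = fn(*args)
            store[args] = (now + seconds, value)
            return value
        return wrapper
    return decorator

calls = 0

@ttl_cache(seconds=60)
def render_page(path: str) -> str:
    global calls
    calls += 1
    time.sleep(0.05)                # stands in for slow DB queries and templating
    return f"<html>page for {path}</html>"

render_page("/pricing")             # slow: does the real work
render_page("/pricing")             # fast: served from the cache
print(calls)                        # → 1
```

Every request after the first is answered without touching the database, which is exactly what drives TTFB down.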
Improve LCP
- Compress images and serve in WebP format: WebP is significantly smaller than JPEG at the same quality.
- Lazy loading for images below the fold: The largest visible element (LCP element) should never be lazy loaded.
- Preload directive for the LCP element: <link rel="preload"> signals to the browser early which image or font should be loaded first.
- Remove render-blocking resources: CSS and JavaScript that blocks initial rendering delays LCP.
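Taken together, the image-related measures might look like this in the page's HTML (file names and dimensions are illustrative):

```html
<head>
  <!-- tell the browser early that the hero image is critical -->
  <link rel="preload" as="image" href="/images/hero.webp">
</head>
<body>
  <!-- LCP element: loaded eagerly, never lazy loaded -->
  <img src="/images/hero.webp" width="1200" height="600" alt="Hero">
  <!-- below-the-fold images may be lazy loaded -->
  <img src="/images/footer-banner.webp" width="1200" height="300" alt="" loading="lazy">
</body>
```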
Reduce JavaScript dependency
This is particularly important for AI crawlers: since GPTBot and ClaudeBot do not render JavaScript, important content must not be loaded exclusively via JavaScript.
- Server-side rendering (SSR) instead of client-side rendering: For React, Vue or Angular the most important content should be rendered server-side.
- Critical content in the initial HTML: Headings, introductory text and structured data must be present in the raw HTML source.
- JavaScript for enhancement, not for content: JavaScript should improve the user experience, not deliver the core content.
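Whether critical content survives without JavaScript can be checked by parsing the raw HTML, exactly as a non-rendering crawler sees it. A small sketch with Python's built-in parser (the sample markup is illustrative):

```python
from html.parser import HTMLParser

class HeadingFinder(HTMLParser):
    """Collects <h1> text from raw HTML, without executing any JavaScript."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.headings = []
    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True
    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False
    def handle_data(self, data):
        if self.in_h1 and data.strip():
            self.headings.append(data.strip())

def visible_headings(html: str) -> list[str]:
    finder = HeadingFinder()
    finder.feed(html)
    return finder.headings

# Raw HTML as a non-rendering crawler receives it:
ssr_html = "<html><body><h1>Pricing</h1><p>Plans start at ...</p></body></html>"
csr_html = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'

print(visible_headings(ssr_html))   # → ['Pricing']  — the crawler sees the content
print(visible_headings(csr_html))   # → []           — content only exists after JS runs
```

The empty result for the client-side-rendered page is what GPTBot and ClaudeBot effectively index: nothing.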
Improve CLS
- Specify image dimensions in HTML: width and height attributes on img tags prevent layout shifts.
- Reserve fixed space for ads: Ads that appear after loading and shift other elements significantly increase CLS.
- font-display: optional for web fonts: Prevents text from jumping when the font arrives late.
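In markup, the image and font measures might look like this (file names and values are illustrative):

```html
<!-- width/height let the browser reserve the aspect ratio before the image
     arrives, so nothing shifts when it loads -->
<img src="/images/team-photo.webp" width="800" height="450" alt="Team">

<style>
  /* render the fallback font immediately and only use the web font if it
     loads fast, instead of swapping it in later and shifting the layout */
  @font-face {
    font-family: "BrandFont";
    src: url("/fonts/brand.woff2") format("woff2");
    font-display: optional;
  }
</style>
```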
Checklist for AI-optimised performance
- TTFB under 800ms — measure with PageSpeed Insights
- LCP under 2.5 seconds
- CLS under 0.1
- Server-side caching enabled
- CDN (e.g. Cloudflare) active
- Images compressed and in modern format (WebP)
- Critical content in the initial HTML — not only loaded via JavaScript
- Render-blocking JavaScript minimised
- Image dimensions defined in HTML (width + height attributes)
- SSL certificate valid — HTTP connections are treated worse by AI crawlers
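The numeric items of this checklist can be verified automatically. A small sketch that compares hypothetical measured values (e.g. copied from PageSpeed Insights) against the "good" thresholds named above:

```python
THRESHOLDS = {          # metric name -> "good" upper limit from the checklist
    "ttfb_ms": 800,
    "lcp_s": 2.5,
    "cls": 0.1,
}

measured = {"ttfb_ms": 620, "lcp_s": 3.1, "cls": 0.04}   # example numbers

def failing_metrics(values: dict) -> list:
    """Return the metrics that exceed their 'good' threshold."""
    return [m for m, limit in THRESHOLDS.items() if values.get(m, 0) > limit]

print(failing_metrics(measured))   # → ['lcp_s']
```

Anything the function returns is a concrete item to fix before the next crawl.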
How does your website perform for AI crawlers?
AI-Ready Check analyses TTFB, PageSpeed score and other technical factors — free, in 30 seconds.
Test for free now →