SEO for AI User Agents
A new blog post from web platform company Vercel reveals characteristic behaviors of AI user agents (web crawlers) such as OpenAI's GPTBot and Anthropic's Claude, based on 900M+ requests from these agents across Vercel's network.
In aggregate, requests by these AI agents represent about 28% of Googlebot's total volume.
Key findings:
- Major AI agents don't render JavaScript.
- They tend to be inefficient: more than 30% of their requests hit 404 pages.
- They prioritize HTML and images over other content.
- Their operations are heavily US-biased.
Implications for site owners:
- Because AI agents don't render JavaScript, they never see content that is rendered client-side. Render critical content server-side (see the first sketch after this list).
- Use static pages where possible.
- Make sure navigation, meta information, and main content (including images) are present in the initial HTML response, not injected later by scripts.
- Use descriptive ALT text for images.
- Use a correct heading hierarchy: H1, H2, etc.
- Keep your sitemap updated.
- Handle redirects correctly: no excessively long or broken chains (see the redirect config sketch below).
- Use a logical, consistent URL structure.
- Stay on top of 404 errors and fix them (see the sitemap-checking sketch below).
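To make the server-side rendering point concrete, here is a minimal sketch assuming a Next.js Pages Router app; the route, the Product type, and the data source are all hypothetical. Because the props are resolved on the server, the heading, text, and image tag all arrive in the initial HTML, so a crawler that never executes JavaScript still sees them.

```tsx
// pages/product/[id].tsx -- hypothetical server-rendered page
import type { GetServerSideProps } from "next";

type Product = { id: string; name: string; description: string };

// Runs on the server for every request, so the returned props are
// baked into the initial HTML that crawlers receive.
export const getServerSideProps: GetServerSideProps = async ({ params }) => {
  const id = String(params?.id);
  // Hypothetical data source; a real app would query a DB or CMS here.
  const product: Product = {
    id,
    name: `Product ${id}`,
    description: "Rendered on the server, visible without JavaScript.",
  };
  return { props: { product } };
};

export default function ProductPage({ product }: { product: Product }) {
  return (
    <main>
      {/* H1 in the initial HTML, with descriptive alt text on the image */}
      <h1>{product.name}</h1>
      <p>{product.description}</p>
      <img src={`/images/${product.id}.jpg`} alt={`Photo of ${product.name}`} />
    </main>
  );
}
```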
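On redirects: every extra hop costs the crawler a request, so point old URLs straight at their final destination. A sketch using Next.js's redirects() config (recent Next.js versions accept a TypeScript config file; the paths here are made up):

```ts
// next.config.ts -- example redirect rules, one permanent hop each
import type { NextConfig } from "next";

const config: NextConfig = {
  async redirects() {
    return [
      // Straight to the final URL -- not /old-pricing -> /plans -> /pricing.
      { source: "/old-pricing", destination: "/pricing", permanent: true },
      { source: "/blog/:slug/amp", destination: "/blog/:slug", permanent: true },
    ];
  },
};

export default config;
```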
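Finally, the sitemap and 404 recommendations reinforce each other: a quick way to catch rot is to walk your own sitemap and flag anything that no longer returns 200. A rough sketch (assumes Node 18+ for the global fetch; the sitemap URL is a placeholder):

```ts
// check-sitemap.ts -- report sitemap URLs that no longer resolve cleanly.
const SITEMAP_URL = "https://example.com/sitemap.xml"; // placeholder

async function main(): Promise<void> {
  const xml = await (await fetch(SITEMAP_URL)).text();
  // Naive <loc> extraction; a real script should use an XML parser.
  const urls = [...xml.matchAll(/<loc>([^<]+)<\/loc>/g)].map((m) => m[1]);
  for (const url of urls) {
    // HEAD keeps requests cheap; redirect: "manual" also surfaces
    // redirect chains (3xx) instead of silently following them.
    const res = await fetch(url, { method: "HEAD", redirect: "manual" });
    if (res.status !== 200) console.log(`${res.status} ${url}`);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```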
Comment: Happily, none of these recommendations conflict with what you should be doing for Google. Implement now.